[Binary artifact, not text: a POSIX ustar tar archive of Zuul CI job output. Members: directories `var/home/core/zuul-output/` and `var/home/core/zuul-output/logs/` (owner `core`), and the file `var/home/core/zuul-output/logs/kubelet.log.gz`, a gzip-compressed kubelet log. The compressed payload is unrecoverable as text and has been omitted; extract it with `tar -xf <archive>` followed by `gunzip kubelet.log.gz` to read the log.]
SgFy$դ lˑ+j`KH8֖u 9fKiBPҎՠ=y^m%W+*˴w1 Ú@bTAJQ4T LZrM7R ZooG]=&f":*FJ!@z5bSg )2%[# /ud)lP67&G;¥)9-WǞQJhbtS]q떪:y K LU_O<=_Mvz[|'*fdULZX-_K^Ydɱ7OϝЛ_vƗu/.WY3ϭ(AO"Z\WG05&1[(G@9c Z8c] / K6Lmpr^RBϪJKY5bec.j[ §,9 a l's׿-#o bnAx~r?iր?9- bvFOO?<;3b,ڷcԹ,l윚gq>Q }~se r?CMpXȵQ*g zer+g{I1iU6*K} 7$R:XI*%xeaԼxō{=VjUA;90*tD*'D \ Y$qq1j9,{DU8$Z8@s]S:ZVFc{ TG+vP#AjqG391Nm4k$rguk(G7tlߢ\Le+iů=KI-ZE&cHEZQyQI#jb]#ĎFK` ҎOBu~,ѴSaĐ J jVb$˄׌X Ts0ȓpmFzQpp]pcf_5 "W#d{^D.oze)F0Y޿&:Ve>f9q<׽MﯕЅSW%Kj,% !Ŭo{f{wᱸ</dX+Sn9^}D*:Kْ$?GE!-s}+$i kňK8`D"ҽE$ :6L{o$%V k]z)@ظC0OM_"Ė2D(BBe,YJL&0"~ |/2 :s^" /hP,(0Gy%T8 ySMwKI S"!&"-,xGI4 v3N* .S ;}zp.W_?Um#Qy欒 Bý!xJR h"0<5w歌WRZMМʖ 08V -v1Fa'rI |43˦ȁ,e)n߉ S,`3,:RuPigIv4A0ɵ rMb3f4j|tHbaJ軰c;mxV) 6ˤ9LؤaWIul3}-|~.jSN AEER]L-/&C5%(IgI{:Ay@{/7CoGəaR"ZЧ'gvGuo4 Oo#d w&C6+/7@ 2e {w<W sF0C.~ӟ\o̸_@u_&4-u 1Ì}Jb O{?Xwl&8_ fYWFa/V AM/@>+7f$\qW%Rnz`B>~Y>f܋}odSl7>D?`珹20G|T@"J3~ی|V1d8d4J RQ=H tXh+Үxн |íq@_'| K@O { d)$v3wS- 6M]`$(zf"?u ; k~?ZD~^DƮ34ZH;e2*]*SJČ*K("nLkhVȳ~FhG[@d&#QHY5Xaj3jX$"﵌FMFSyQF͗l8<eä$ksB*WU(G I 0ѹ~<0֎+L 2t%wz4cZZV FI)Nީ oS6Dsdx;v-Rqo7f1 YEi{WF~zygtkond57ts*ӚwyPw(t >#$PV>4g˟kʆ Uw~ld)nuyjSST2!koM758\R2vn->+ױHh:;t Ժˇ˧ dѐ13y^TjgQ0>6ѴutaMv6jRI#c2jc dm+fgHy\vUH(]WL\:<`#}=OR2BY  Ko g35%gN51GsZ =̴=GM q:EwWqwB_}p-;UWFqr"=x?N|krmu׉;~LI۱%U#%hׂ[m_vn0 Ip ^44(lC9]| գݠ}!|A;%uA.^~4k?<ѿJn(.EQ"L =1Vhi!P퀐rE؄ A2 ǨUpZ>8K8+F5pv< {^V-1Aw֧1o>)64LmJ:n@b!wiE ֫2p=801TǍU*(r*&:<ɒ63H^G*˥DJNcRɜRҌ; #Ay\gs ( ܪtPQ֘uw!bC$Y,KASc@)K\b(6юOBu~,Ѵ #Đ J jVb$˄ש,PI Op^^ácgQpp1ګ4gyXBUE ܽ/|`\~0O ^1tN{"ÿ +[N尓tT67VBOs?Vxj,r@L4۞]x,*xOe!L̆ϒrTX4&R/6%I~i^'IK,ʍ^+F\#'؇-" XЩjDB݆io(b8{Ī5.|l\]a2TnյX73Ef#A +JSjnHL{;xʧ S@gg@0`! 
H ˥CRp~(kO>% â~6z@ey梟i")b~ǻwIf:aVEMUiÿ,.M*<:b/`gdTJ `ٽy$#7/ ӳ 0H2gDkTCóg"Q8*c(U3 ]U!ɀPxW}̟ѿDf -exPQC{Njklbϩ^sRpܥt6jFAA@xK2a r7Z.mR7JfFVs4LP)^V&$i"Ls Ҟer|+}ۊ7Ӈ(kF Iź&1E}I_}jl'6<5V[[ZttNT{MP]~=o"[訕ƁӁיct2;kHE>Ӓ z&FB[^f+mrz*ep.`Y,!xE# ,(0GHr,YydcOL.DpCLD`YYh6(o r8=4 B׆%ïYZ; h~8/5wYlo`ٍ3gt`ESHN\vSPhCǼ0oelZMplp7[2 Zcf~۹cNTa}08" ̲*ޤGMLqXt߉ S,`3,:RoP:B\uVt a/\(wp)BSϘ8P=x:FЉ"iZ+I^{)f*3V-uaWIul3}-|~.jSNi,%"z@>o> hhpćqBHJPdHҮ>Zg!%G`) "#QD-\j5Sqc0SH ;kw5XJ'0)o#ZA۠FH"I`ci9I1ogMo ^t Z C+F{N:'}[7fϻ"uӭqZ0y¬\8$,a$xTvfLRګϮ=$RA11 Ƃ#+heWG1P@hn.cRI(+&-܃D)5t$I _3K`{`f{;h,0ۆG6EIžFQ)Rbx8 ȖŬȸB`IALUŌR8NbQΎKl~ Fŏ.h7b<+vIQTXgDE mQdUJsAOEJ scB%!&D68iJL w LV1W+L'UrXO; $wTp ܲȒc'%/ HuU{+:_'f`LC(ZRBr$9a" ɹzt̴WSUVnGk:\_1y·(]< 8!Πw BZ"x #z(8\|R>nRUţx,iK#brlMmލa /e<oi[-PMu$r:\⇌'|ZLfUjmY~tbumi B:3[NOtp>~̆tDU]9$Qv4N+ IͭBEH 'EP,6 WggX_1o?/.OZ~c[+\njFb[JuGo5 ۦYh#Nuysl{g9/JnBlrob;ĆgvVp2S; N3\WCwjAeT3wa;?aXuVhHW &'ea2~2f]qla m ~ao~, ]fܺkaRs2K kH-q*+ fUR/~KLҤOCKϵZxjLӑxQVVW`z"`p&*jl'.,ΞF rR 8?NOEpy_ao\?=tFq %IW)8v=tqz5?Ah3t8>9nњ+Inn%۩'C|Kuvg@Qx+8+%Atؓ!1CN4'Cnn^k{:wO9侅?`v`ɏ//#1 _3~K /MMޙ RVl;#s l":)Lr*xaTudV8|4ФkϻfܒLfo{kh!'=zl1Ŗ]˭!Ro&G ҈fl+l9IqLZ;pkj*Y97x٦0KmrvrY[sȬ8?"["T7_6\i]z_{Bھ¹_Nr@>&0jD T̍Z:j1;o֊{T~Ƀ޼See$$$' 2!hP\rRh'LAEFdL ɢut[$)N$(1)!d`$PO1&$* 7B(6VBfd/k|fy;yv:cU o_(|[ `[vlTх}Z=co4&*YU+ޖE TwF^b%' WnR*KghzB#!;͡$hE!Rc1D6kR SV> k: L 驠\A 20iT$zx:ہ#sDH'po+)o^ıK^Kgd6U`Sѷm^ | `|=$^^{ܜz^j\C(+w^gRxzRi|{aCpq L } {KЕV'S.yg+*GNP.$aN4CdH:cD[D{Sl %ͿEp,#Kk%7FPeZJF3v>zsVz|Wvfz}e ] ??sEm1YSFHob┡lA0ILֳ '!*T P ڹ*];JbFݦg>7lT}riUf?*Vu@bx<.2½geJx%f6"161aAX$)Vaʞ% H*MEcJFa@bLPhkQ1Hz+!aXU;#Di*HMJS`) > L2T(!c6-UI֥u CDZ)|l}i.v,R##Ⱦ0dn>Jc"=ogR#F0+=k s\owVD "8?{\'%5yL6u(ÚеQZϩԹݎ01# T0(è)9# <#&iuPFk!(TB% pփcj/NiNN,6"rC&CS]"eWޡ}NZuhЦlwVF^j>u7 ?ӖWMvs3׾5(.%Ed .yI4U`4SG1jccm끛z<eƌ4:zZgGD#ڦgow12 V)*! 
uNHcrR=:f+tt)}*Ui+#sARBEKytv0T48)iT2,: @Zp9KbۢKUrp69tcojn kp6xK̳n&W\i8:e.ݘ:9ePMl" 56ҩujH6ҩtj#HzG3H&ʷ^St4Pz.rM5h4S6 $b4=w9ee =IueND:8!9'^p,zR"UDj10kKlQ>piԚ:\h$ޠ"89NP)LQWFK(wYj:;Z͝ymò<}yY/H$E )i"UrA"T"4^$v4]|&ƹ&ՊB;C+ri-y B[֠BX!5RYP۫9+-C!Κbϵe&a_ ʬϏE;z;N]~vf N5G5)2ڧA}mm͆Zh>ޥק\|B$6Sx;lKRi?}mm+k &~ u?NfAͼ &mZH0޽?پws)+ ޏw7f"߅iJԺ's">HN玵Vy0J(ǡn'2 7ÍOЦ\@6E .A.*u J{O:|,0+@ՇO?o:tyֵ6sZqy$LX;5YuմBU9|T\~ P^༫Z'r[ڳV7cD/whӛ\] ݀~no׶@\0c5Oj|GlڟtMr*=*,Q0 I@66/>?ci?Ixf8 $be$peܸG6r㫿3ϔm_ u ]F.(h=O3/,l-/ڠdk>D!+a`zxn OkI")Pli_ud4xMD*(5`Bm LȅE^kxZmeQQO$!;gyk.$)nrs@/+(dnN1d~4C D/\h#/.^ᕿ7 "ūa8z_^'~8]:ߟ~|yXo8nogR˥kN\L-=_qUVkY>l.,/d^ }:{t~:LbZxo[{a-u{\~Q^Ouj愚˟!p'{}ǙӜS5IyHYEWsQb\ǽNz)aGY}f' Ƥٗ"j:n;W%Y<'ST*DA p~:7*wH-x:-9T;?,mj6.w A]-)>G\6/x뭿}3i9| m1|7}nr۟=ن|7r߄c^0Ԋj]srߪJx" h~4HKd4s85/RWZ8AMx'ߴDkUBS9 B,3.i`u!βQFGυV()5; D锨ȹ5BhS(8@sY'sɼ2o}j}wZ6t3߯?7U/P4P! m5e MZpQ0VFQZAZk8 ;F梚;x@=,03<ߪ\8J&ijb (w[ %Q"69k\r6խL*յگ5WwgvG3M[vrk&#.^.յ`h{ӹwz ?>e9 >]Uq:.ٝyl >9ѺۦNˇw7>;ߣ'x4v#i9}xgͫ;6^5pe[<9}=ov%hLxn=b 9 *OVR 9 $Rq]ؐ*F)xqXiٟ8RZ=jvy*ۂN\?uz9lRKW0-@ibW9XE'LX]]来/-XaE'Td( +/#Z0H\ƳO.e 빱1$˩tM}L@xF?N?P+F}dqkfRU:_~%q{? 3NJoE~76h1j%q9f8}}H [.iA}fZ u]fMKpN)%\{[2UKp .%Ї3¡]eKl -%\Py`Tqp~tu563W6>!c? ?Od6GO BЫM|  ob6kn)󨂴ɓ*'*UVT %\d{LXV[?kh]LG{p.~S)T7hOZ̭Ϛ&+N-v YSK2Uj 55Ouɝϙ.tf}Fn fϚt=r:#J8Y}44uD鳦ҝfZigMg,ELI^5H~->k"GJgM 'E_9k:i^G?g(^7s=P#Άo5x[w>;tndI\KiI\%mpI\%mpI\|OロY {iVs?4lҬ2=j%+iV4,k:W/ۈPIED V7ed,d,d,d,!$ɕN,}&ǽ#} Y%$ "!P1@C jR 3),8,YַqM><$,aIXVz^&xb8/70᫮ {EӱMxV +L+IX֢ Kx[on>T`OAvFS=WٱQY+ص ۲sN:B ܂a3+TBJh +aSw~tu3,/]q8!vk%+N o$ȩqux}2.[u}[xkuNO=';W1HL451sDJ)P5rLvuOv8BOs~9]2hIG%k2>9.h-D<;\pJ'ShoTcg;ƟGM9ScݞQ);rߦ&gvMd »%Uorޞ*#ZXx e+zP!ǂ9'+X[J؄R؊=PkOZ(mzw6lT57덢*ۛ|-':\ e/^?F׽?珗3/T~ˋ˽ ]|wK.N- A;n :pƲ B 5&Iy/iL@=hOqQN'l'.2DhB$<8Ssfd>Pp kMN+x *EH\4qFIĭ; MnMZvλ6Oԕ2U\;Q;"> e'Ã0R!;y !,p; <$… $D I4MZeX}2M3ʊHkl2{O&Peh%B4T8'pgŠ;]KPHKqK+Q:G*m rRaLAt\ru1Du!CP5Z:!O< q U1row|\%QC gWyLK-fh&SCNh"DM5)cZH)0 ;NJ\Y"Ug*m>H+$H8:rDN.jg1)P"Rqݲͽ| ZF@;L@́Qm%$ -]*x' 0)|$<66Ao&Yz;춉T#{sv.mS{4.K7z4@"3≜A0l_Dq#_! 
l4uI&l6LQmeIvA}_iDbQ|W\<~42}"9nu 6v uCp@;E=(+w6S|&u7u7M[M"Dc: ּ:@%_F>tz]Ni n5lHtDxCq:ޝF}#3/YW &[:"LN vdnôf,j/?)3L]j&[w?! EQ3ö킐&x$M\蚕Ny.C T!wu%My^v]6os@'nMrK[>fR(m|V#$e&|6_svJ.Q&=hu{]Z D _ը."t$@M&WLQWعk6xI@ ֋mOfl%FῂQb7ZMƚTT܏r2x<6 ݔwٕCAl4D.UwHحFk_anceKmW}u5o.U^=|2'笔.c (uJ43AY-frh`N 6tª̸Z+blCzQ$#/guNXW ;!hdq)Y )6= !TO2M!LP`p˼ 80#HLx͈\@5'!< y -HG>+ +LKDg, r8bEN"yL"txLB?K'A\]F'ZÎGah||f-,>5O,% !W&f%wUk%pl)׋q.\| D*Sڂ%F|9tušTnr)i&,Bp} ݆$6}An s>&p"IV߾z]H_v0|lq_"<^lcd'MЯ-5z93Ԓ+VjO]IiJ;\*A"s,$tyUWIGpñ"ə!&My73>T5 wIbMG*si2O7ܙ+prhd\L]3 ůr:s9 8sp޼-$p!"tt& -]u?[hy;poayp亡vy]&?6.),ؿ fjv7x0)%| }"آk+nЋӅw7&砎OdzI)"@d/ߣ*.R-cc\V\,0 NfYҩVCDȉuq<.\4WTS]AU;XO Oee/LgD!e!x0k5f,`!0dhj4#͇9uܜ:|bK^-Ō=vbT '3Rnu~+8+f-cƍՇv_Giy2;A+ZFґ㨷jr`.~?&O({3{znl>4B|`KOpcAbUxjY={bҬwd&Ys6٣U37;+x9SݨPɤL{^[Rc|u EaLjA!RǛlw̽^]vբ5jfN_ǮQ=ƥ\Ɵ\/@7>XoITϹH;:$aR/1"X(VQ/5eDdhA #(H8ɐmmc^/jG-q'ֺ߼&^tۋ߭6[A8(AYS%]M'7g)7X =!C{fhp%J7bi1g;J2ðquwm_HT4?TmI߻ҞI >mJN.F.s f'/'ku'[ V',;Kbyy0ma֫M>2u=}u ku(COeJII˭-w':)|5F' 1›A(\v5<ƽOm T&2 R ]`7]IxckնUQm9bJbDAi_ɞLs3 8 @Ts89Ah/ks*B{-X`-z C~? {zNr125>LFN2;|s\7Wr) ႔(c'NL6L jD[ׄZ_l~ ^?/ W$ v?CI' _LyɤGc$ݚE[d`JZc#58B+J|Fc,4gH]֨.fmQWZ]]E(+TWjQñj'b>/*~dkLcڢecGg&hg(/JLB0҇~_?ЛSk \i;=$UKݡ:mcWh5gSj:C)B6eFTqd lZrYe#jϲ I)ZX3RՊ6'Xв_PrE40nhur$ۢ"UvQWU#=G KeՁ]/>ZE^D]>bN]Cz-RW05 DE]EhN;]]RN]BuE0VϺ[|su-*BԹ+RԩW(fI]E%o>k.I*B+λzHT$=a<ǁ~8%_7~yY2a-sf HeF@ .Z> w6Sw| ͘Z*˘^k@<̑ &#8>&8gAxvҴz2O@q70Dz"A< 1[^TR!U  EtʼeAΦk5gN_( 8&W`@ q# u6hT4s@BY&$(BO2TQ#k!FNo oyUBMRK-I!2z)E΢*DLa°FB[e$Q!R dtN%vZm6ĪpI)[u< FmqaɅuD=4z ᠸ\$u&F^(nHȸN$=sFYϲxGO G٬=:ix<;UiĴEU1y qIZRxISP )i'j#X'md[t:1,)`k U cJ#686HF_j ;@z0@bwUɯ z~S;52]NܟW}1 I{& 1PUI6 9ׂ?I_v0|,?%Խ`'V[qSuyTۖ9 m^:gJ(a%tDD7i(d)M .0pAS pyQ?p2ݹX'XzZ+c>" *~h̎s;"w5C+aDxp&@ΤaI'>죥ÔCM2JRC,r`Kq'}qSɩhkCу#JR>@\ 8g8_ISBq3v@ L)&-d85>KlZ|=HR !/^|] 2:Ř'2&&N d=!-4DEwJra'ZC]{rbDfٙWoV~"ۻ-DmAFѰ2>dx8 ]d{8r+JDl "161aAX$B+$vITTT 42SfB]LIFقvD>p<+1k(!J  6)MyԂH>`(PTs0! 
eBsq<9紞iga`~u%vZUg<;>`{D:R wcfIn]FxTHj"h( <=o߼?ֶ ^lggOз`cK{EWk&FS-aO:BImTs*"H9* y5%'|;"cD1D{Z !*'P+`%2KRx ΆǎГ (WKk_ ZsC:e7P[yќ^vLos9:fTef@V9 "!H!p@`!zJOLx+B"6JAsng[ Tꯎ{>b+Nsq;G(UxsH:%&3x Hr֑Na\3Δ%9Ij2Z[U,I#I@[JmYx7GMCVr$9a" ə0zf+k:\WU깭;v E ] uF"Oe>1wg`NEc3NAHKޠay޼0 5h.U={t 0v(w)'mkpplRTMy"䪉%c1e]R2HMlUK/xrPHy9K]NΘB" ҟ$MBZ$R ͵&V8-Fz*(DpB tڃ *w:v~vjK8eKQlX/^CMn Ie P4~.JT=:::2TlM%bZ_h)ԳI ЁrkSϭe[+kHE?WY2/Fh'-,jT-S"^xX_2Fa3{frGms5,Yg6 <&4>aP(r]L[nw7IYi}?T5K~ f7+ 1 j0^\yy߱H8M|Mo&/$ӠOL,U=`5Q[LOhAgjB]l0cL!PzwKnPybή&-[Dmz3'dwbHi SUF yU&ޕ%\r:=lܛk'QS@x*/u @px: uƳj4f]Ely?dq\ZTq2}\\l%٨̄GM_ip!:R_Ke@yW9T bZFكz{ }8 Sz:up\tc9j̄tYo&TuQ944oquPy?g`@Hb VFb A[VԕWLnG)m,2.wLqt zCk񁊞'ǙW6PWog |{,Fk^՛ç~_zTR4MDi婰$j=J{ƌ.1)it0kI&^Ix>UG0s8Nȇ`]q\d1jd.MoF]dQK/VZk) + qe>q3e{~[ IvXI^vtŏ &HX_{w;@0ȉEW0̰]ֆQg x?>}[r]mK_//q |c8c~vZ}tA-Azv$^R+R R1=j~ΓWi&W4oy!?:047''Wm©>YFig*Z+^r~?=ovs(Q.,b`䋲58TQEggXU~ lԻ`*H(3{8i *Ŵ0x^dC#bS}zcu͝GTt\_0?7"uOE<Ӌw(CN\]wKYdsڣ/,Fz!R @E*D [%!oMa~9\V&RFy:%*EKZyYL^؞>]'@gX2|X*i3mFLd iDoGxCOFTvQF2ԅՔ")SRgG$|NS_sZ48ct_'1ϱMBDjA iVDDĈD[O)tLr&Dv[!^8%3xC-@rR!Db:QD%,wVV 【ΆgS-OD}qlzW]թnQ{{:%F9_9+pG:[lgbgh쎮-דYgkj3 NGWܸ v b8 )c)rnaBrԵTt;vm]$Ŝr;=L'-N 4~G)&ہ¥ ϛܞhߌ^~y}s>N6,o[MF:d}rm#hC@gzG_ibe 4#ҥ)쪮II>dQeʢ,6EAf_DQ=Wj"^Mīx5B@*ˬ5bR3[jfKl-5fEF0^3[jfKl-5f̖R3[jfKl-5f̖SafD?ևn9&Ńhֹ֐ /zS?z%;3Eǐʯ>N ˢ[ٚ?Z1{r l|NiHaTfLM^ ՗<5D%!*a ,uR pFx%ЃD\_sa׫Ύ^_ƥ Œi]ڭxqw3\}@Ur3HbીqyiMR(EgvSTYly鷎~&wv^?pU[.[]%sdVy:\󬫄vʼn + IX&Z^~l:Xz_Շ)tQ2}qMj{}"{}"˓^>X|T-7$-3K2RsKot~doy1sG&ai>nqHy|vϹaG~\ߖ"f.O<2IUU +mɻ9ieV|&b'ǞytKsbrKL!MO78ӓ_P/l:mr۞\)$5DBZah`!ؘ=Xȵ\FjAnA#ʠ.gm I,P&6*hϲ1V7Lmv!Hf"uzG)q9]v7ꨝ;(b$8MD*,Xv<$ӨGlh+5IZȃ.;@% u7m|3ΐOyM|RN* h~#觩|5[E>MwL}Z_1/qAVek θK]V|_LJ ٔpg`zE G0xdm#jwzqunM|oxַVϵor{F!|en&X{e/|hz]9/] * \6b b?> =2_3lzl}D.0](D=go}+49uUY$y|S+ɕ hOc̿vţUdkT2ӉH?(7"BvWza$CvJE#!|`RK[o䞜/&LR픬FӊCtL-&"&f \9 ;%k/& J0l J1u{:G7g!<w(r-)V=ٍIC74[SB}q|Thk'Ol c񞂏 1O1ծw>~&, %]_N/?I"F̍ 5*xCF ?+sxD 0Q.GthhdLat%8~yv%>KKmbv6ǣbjʶg>.G_bֹV$Ba$Ϛ{;+YP) %HEFd<~ң:.O71.<уw9 ^BhdΑ%7rvKF2"/w3}vUm6qm~E,OYZAwr(i/qjhL!Ně!( FYY# "M)sB)[ N徾t?P[zv 
FFCjw:o<9L",<28qϳ hbp|LH2E3D ۄN{v;,G \M)wngO)n'Gbgnb2E(y_Zvw-bm~~^}vc7%pngKz7J&ٱ:X.ƶ"!-ѐ]~|}S8l 5j2՟{Xj"+ bD2?Ȃrbb)E1?Yi9&++;F#g & ZL:K^܈^r#ֵАZⰨV#m}e7(9v"GFB@ gNŷƧQڼѿ{@?u1%kx3$Kҽt1~j®xe 40Yو&Q7G&a 5|5"IkxQs27w0۪3ΰۙE-G4: `;!EJ!l2'sNz۲+oG5,nKJB.w̸LF 7PڎXlt^՝EŬ7$E$y),M)+YQ gw7rvh~d"z3v;_@ݕui /gmZ;tzyȽjF9IAʒ $A۔]).R.ds|.MJoUzTƻyz7EuY|ئnW21l/qB'EdVDa> &CR F.LxB9T䈙E }5jeߐemti-ɘrK A$L( X~1R։E±hCžU{C23f"ZI\&D.RРV"HpV(+*U2V%J1%%BcPbz0 eڀ.fvM]8QꂐZem>ZGo?~fRC@&HÙ:].(V&Mj)e#-,FԂ c`) ctmXY`!~ ߉٤gWkiuhW鷎o*gEjroxrUsz}2$HdrL/I ((V4qSF\y6# F}|T)G[ Ѯh4v֗7%kT*74J1VrQ15sx1XʑI9)*:\ hojD>|^!'&gKmv6dK+N@yڇ;ZGl绛Q6}OT\8VgDΜL_*Қ) Qe076JԔ dR=O |Whj>2;BS =(PcHK7Fka<"Jd kwL6"r% -U9{2\<1H`Al4`Rc2^er0R#(8ƜVr۠UPސ;+7r"#:vAesd̃/-;[QdZSٱQQD1iG]FR#P"(I6 sI+ΌJno,ۓ*gIߎ< ΍fn-Yh$] BDYzW9tMi1؈Ti6` 3I  (8ٳFΞr6чjfF2qp!ٍ`6Exv2f}6}Bq*zPuzo׿|xIMN4\,>})Xf/ lr E߆û&L wD1v_T)eC?}{oaY>^p0l\_]=^WJD]_]e0]g~cޕq3sﲍlپ'ӁLƔW$j!0;ٸmdO֖蒏APRIp`9wl]T1YrR sQm [($Oh RJHD2$?$#IGB/ 5V-YQʐ9Nzgo"(U0m!t[(4Df-x *d5J@%E"[1b ss <-tŸd%DIl&&1ԢBelr">=P["RD'tIXE^DAѣ-,t-D7o2oǗmٕ4ߥNɎ]M>h4I8J i;HW > alL.쁝&·*!4{t Ŷ6uE1&w)b *2ǢRR:KkVf^{dOex} @-~r ;Ϯ-!gO`;sїl*-/,-x8)Vl1?Ef3xhzixO6DH C}a]dߙhUO~*KP[SBZ4+j9 Bk3)PIg[WZ0hSXtvAE`UEk> XDh ?-g; Qݍ4{a}y7:[۞3vqr]r=DY#Ӭ{5$2c EV>I R{ m1 \∪mM<iJ)y56 $xTWRTEfٮ+q;"G}YW+l7x)>ˋC7ꖷ )ycǒvE![pG% .iSl;FdT*&yBg8 e1ŗۉ'ÇY)Hos}/D-u|y0n!_KmܶQ-9GTϝm@T`}0Λ*a.x_h\7XٙJybԥT' ,Xs귢oMϩߌӠh c $BQHK4"xthިD#ymd4}nD܈VNN,w)1}6`?K:#j'C#AnOR}.e=⬧ϒ^$%Y![A=G`cx20)~#2Ek$Sh9~L&'^AkQp5.厪_'TLN^TTGf)Ƿ9fgg\WOsZ_yx]ɹ$l}.Z* qZElQJ^vԮŊ'kN3qߺz8 n o{]WX~J蔔V.hB ‡ [@[؀'{oz/4"&o$0fg*A8jrLZE#SJbdBÞ:;ұT2e(IR)ȒNGd8$J-tlsivHM;k ;R,*1OPviŴg;>H 3}P.W^ܢQ6$SP]*vV oL)bSV3Ne\v(Y$2bHlI%:kQ$1hA:(bਈ,s- 95Q"tIZ2bfَ&'ք}nF A7?رzށsa1 ֠j>iޮB0yɶdJ;+T[,Du`^yQz{Qn&k^-Ȥ7oB>*[oX3ψ.kg«*uO3QeO/t'Z݇1WE .D;-%K$rZTw#/g{H.}v>dC.IU gA劎t?(>#ou}0w`)g۷cU[s7Pt%rWvTI5F6z=9\i )i5T8ޯ\zꙙH8 b IKP7V/\,%2aCS'\rhs0h,ؒ')pA).H-C?غCqP6d K/7֒ԅ`>e{VѓZSpNϯ>~]U%'jp&r͋F-fPLh6#$k|qr? 
K@G&Ƌ5*l-hT)S,:^UEI,, c?֪u$Yshhs1n67 i.ֿ=+ =#;EYezszs0ޓe+S\?Oq^pP8ӭR#_Q=͚aS-r{\by<|YF.XRR< :L;5"jv+ݶְ&m_lm[ظӳA{Ls>z+wJVxo^tBYcyMe;:>oؘ1K0s˯$<87w:י^,{mrfvq!gr1ɻgͲRS1eV8?ṿObi u]D tңעe*i^f<58Ny48 gGC6.:r+{"[ N$. O sn2"8߶ ϑ[bOq3xUgwܣ2-l:y[u9IP쟧^GJ(|]>g#SN:;( S6 H+,wPeH?g?k>DX&PIBTOe2,%钋r PH$Y<=c= |ʎmef[ZΎ 9?!!acy4ce`DIdْ/ud|J$sF41?{Ƒ@?#!MrIp{ B?e)RP*W5CJ,J4Hk ؖÙ_=ʱMYAM| :4Y3;|:[GjYNXlANY-J0RXevcPȸ0!F;RI^nI*&"G!RES!;K q=GQ(z;/$ \i~L fYѡj䘨LTWk`P Avk 9:Xc%>LqVuxN2U>>~m(547UR7 .G'?|XOSza~tT}6Rs^5V"Bo0G8Gʪ.I,Iظw*{q{~ EBZCƎ%Nmɵ?8 pq[kker h=h RA zW_E1n"jNC_'la:[E>,:"NlR>w[Q θGck~h8f:dƾ^|A#17QĬ5j9Q:JGu;"ɼ]nݟ!#φh;I{m?ne!7-OzuhIu9׿7V6ZǝST9F qUpN?P ~;C^@]Ofx_SKo`ul0o5M^u$T5M@/KcG_-48P2י9'u^|6F6 ͇ӿ ?m)Ի2#)$]0L)_&YaI˄lvJF݊ly^bxNh41cMKWW^[J*sBlrQ׆o/߭J'^|_k4M!4cMQH΀, Dhmnb[W 2NQK#(r985i]?JwP..#2zK!!r*+\丼Ww+Ylv// F-g7I֫~$_L˛3]A˱ņ]~]O~i,_\U3?M|7?pϺ}.Dkne :GowYP^zRj!BrQvZnJzpe-%\G..{ՄI^X)L[A"Du&fK"<@}7w噿#z݂.N[F/Ij>NOErAP.걝<>}QWG07^8a^3{űcchRXMC.5A %'}z\FgYX}H%n'쫍|<"ػhL]hZclGCTJ6x4УqgS\!RNUlWx1Qzp n \~gZĶQ%;ծÕp>3Jpu/g/ W#WQk^GYWp%zs+;W՘+"]+spET ;++ab w\)wZJ;XW]Ʈr!mKb W_n.F{ ZXolPL!C}CҧL fg.3棕A8S*b0)YKڱrq1,lcǓ&$ѩËIs]12:N0¸FSߢK1bff;8B޷ad]xWg{nE+ <֎W+Rp&,^ 1 Q?PPX7(xku KmXeByn=sABB%JRv cx*mX*.<QR1q6ݦxȄ쁅z/,3|Η'W+C7 oqYlMɉ %Kᄴ Ң*"Iu7nH !/AvхX3-lUc7Z[mǘ=1ǿL ϰuWh}~=Vm Y{8^Ga6 qq@;mޥa4IR<" V3\`iէzlׯoܳLV4 ;>' 5H"sq2bD[Dgo %Pb2x2}¼{wSfήg#S =3>z&8 XpT20$fSՖ'd_]Q7PzAEA(>.NϚOH^켭jmg.0;o|]'}&%k %~ӴrT{?ⱉ"q\bdt(8|KN\|ѕlsB`=#co܍[)/S`/x(~P  /(hlfTatwYIq~0O?G @>T:YRHYȂhdT4*[-1P7%Aaɺ>|kvl>$hcCNS $fWZEa էKK$,\tf =(!!a/d 18c*4A+}1l($H10Jp)!.ZD%5 6HSJNKEBGh'-Dm8z^V$n +b{=%wtL".y/Ef(j%ܲe EF>Aۜy@-듓Z30Z -a 6p%+iO8mDJcRJ *fb`:E귧6gRR6겗hH >1/<6jU0Vrx }닜 C#"gjG2*HJkGtݢƥQlUR&C&MI1%R¿) }gF InVl/O9{E'Ypn4pkyPl)Bګ(Dʰ|`YgJ+0NQeոLDN1WuDHYz#gC9ۺe[?ޖfhވFdsb /Kd6\<$)$^g+ P)) p_̥T8T*sY䤊)h'Fɘ"*rI՛Ok5>xB w9D~J2p *8`&I/(AXPPfph`~v- ăJqM { |D^ZF)(xe@;GcJhh\9`%2H[l5KޚlXz  SP3 St2"[T΀.KQ|6h4vģ?0ƶWom񪂠g )m𖺽^f\|Xu۱>Kg\lDi CY&$ &/lឮ.^tZ-֧Zy!#EڠAR\5jPG@,C-ܗkR֦JbRW#W\j$pNAx{*tT3hb"#X%=hrRq&{$f!ȹ۴H_H]K;fkj}k]:D(9uzqx9sgIy:lZBsrO`-y 
+3&2VDGkc @ܐ}O|_+&=I%vx>^ErudQI,ZTcDIEi{TN;4J%Ƹ7 vB[e,N! 4hDjo01[YV}."gr|E3ҷB[,Wt_֋V"+DTƂvT -seB*MYu4M| :*v PF!" J5 ղ$eyTq^ܢD@@AjF`cVw_yl ca2t<nI*&"G!RES!;K q=U"Q(z/$ \i*L fYѡjՋQ&{%FZP9I(ye ԯ =(i53,޸DXg)FXInYj{XP8k4i(oƥnF v]O~챶&=vv~:v$aB'm'ؽj'nDW2M&m˄镰fͽ3wًs}>_w44.R׷4v,yt:m4HaIhmm5gq[ŠC)IQcWOm (0Dꫲ;Ɛ>M@iHA$?Lgc3sӛ%SW‰ uPܡoE>&8u2i?4 ~3Grc_/>[ PPKTRXĬ5j9Q:JGu;"IېssX8zUhmxEsZ\nh'U88ekܴn~׫CNeЪrśG+Eq'@UN{-cI8EhpfUC3`Lmjf]8;">4ٕZ&W~]#*\JzZJXS- "n@}<|.UQQrA%3ˉysR'kcd|A.uR)wwmmHyb,<, 32/^Oۑ-$[v%j Dݢjv]"(26Pa|`RP+dE|`F,Nk /w!vtI:804 cMi@&+jg$F"ƣƽ&{Jho$b|r5GH29(:+т@ӕ%89F]6]#1bw7ujC dPdVcj:@XbA2]wY78oH,shvӗ>R }ӧO߾MRCP5jwod2=vDZ'BZ=^meۻaOx]ɵb%=FV@b&nX@,T<,Gq/:@w־ҎRDSPSt(k)hrE1wh=Q9(Z[s%X Q,Xc*ǚ}{RoSi-miif@i+Z[f\s̥S!S΋:\$1H^J۳䗛a >(@6d&I67ٍՐ"AKF?Zv%eJ &gd+lN*+<wާ%Wr1''Ŋ\X()eCR,)&YRlTg퓅T<`QqЏu ue~vss{s\m 14 ?ycG1?ﶀ|ݟ2Ÿ?y|K̓j\',íΪ7{X7>Ps01wyvLٿ `M^gʨHQ8D+u]kНVvڋo lgs3$(b6bHTQI+ktde& }3|hnkŸ>~Pf?|c1'AJ\Uh /}E\4,CGP٫.(,|C *&_P(FY*͑m,AEL1&YLah#b(p"cl~N 2ƾ> [,++4Yy||C^O1Z80J#) PVuKtIFFF 4a]deyq7]MtX:cu z`{G(Qg혗)/G0/gwcT|x*c3z:.A=4][% |}lonz}O1Z%+Cϐl 9Ra^R9.@Is (UHч$b]"UIҗNc}@Yw m[O'ޕIJ' Nd B@53ϊllX̾.d!PS53$s Eq1Bײ(crcA5)M)0.rh&wZVВ3X6scy}7?W{^,`dy? _tvʭ%k|\474!?%!\!B RTYDM"Q*潩f4dKJ%AڳPXHMٹGos2(/F_XI֌ٮ7f.PQUST[B^;MtgqQşdrjv5Am[ P?@@AUh0/wv221-n1Њ;\r(Zw EkèGBUmtңIԵOc`"{~/N"-~l>-QF{,ceJbT*H!HM6ҪL1!Dl!ckoFI%mcJYK>m_z>ßaWŃǬJ' Vbln+|n\+ΦF@pIJeǤwtJ汕|;SZ >pK%A6rUrp!E n(]e}kg5L.(Cjh 6%IVǐd ^FC"ڠ)PV*)8!NJ4ƪjdl9;LY#bؾ\t%*vlSOf^h.m;꒬;͕gv'W7kNQhUjQBL!d 08M$fa#\(хHLr;F5ӫ8{{*P|]~IZ@Fh) /zdOR` Fo E,EIi1d`7iZ͠V~' ﰳc.h$ "Kkw]arx-44*&c%OnZƽwގw19V/L>dA-HIaDpͦQ*d(mvDoR6*ԌE>j,|`ă7HxbtQ;9 ${ H&kG5T3ؕfvqT܌C&j۬'yHAqԁsǥkz Q cl܉b4J+օL/^4gZHBce$$8 @dơՒ%޿>áD")befOL=~U]BD4^!@n`!m@Bb(5T*>mbc0&fȑI49[ T $Bt@ *M׭}^7XOOu^y)y^'X-@B H ) g sʌqtRY Cd$I "3\ XھZ9rA[.*_H>-"ˁCt6lRRi3qKO`Y@M&(RWh.4qeV*nn%})R%sci%:PYù!@OZ~gcKPDmK6.OYט@ddEB'˨UBzܧDba#O}ā_K5€yNpeK(HR tr QP:%Kbyx.#=+h;\;J\~V<,LcoK|oӢ8m{3287Iqv= 6ӚEŴE<8N6뿵 Kpqrc$-Q'oܚ wVe/ywǶHB]\ uݷ6U3NO;?AE}?:_w-^Lodqw⬤;nɶ*>&8㒦gﮌ. 
oٜt;l =1 Kze"3{Ji7{GaJi9G-+ŎG%Y!{vD*}|dY!Fucm[3VR|-f({`W9 i]m[O10_H7ץ"qD*B0}(zJOVy'4Y/(@)* G5w9FݝZKh[Zci{}dS?.V?9Xd ]酖VIz kNAf/E1ho+ A?LKQ痻;zI*8)hbI i{ѶfVTʚh-Qk﷊(Ix>76]~$CÉlIp:ZhWz7q`S%"4F,KFBdNf4$VZ'!L׎CғO+W $]},%MMzKY6<}+}0?3UVE ]QkQ;UNJA*1.T\J7VA*}W0gar&qGK"z.]U%,rT1yeL .AB霹*Ŝ-8ƵY#sp qV/Ž՟ӯ־sr3{]9YfY\^VQ~z'zhêCEC]_=Y Jeq:7!e20`c$>sN|zvt6O|bGIyZꧏ'%X qg,iÓs1$Ka<*0)+.h7L-eJKd* l!\} L(;KRx k9е:qw.}޿u{ߵsafLiçq;6=2n}B.~w `4f1z8'=>_gtl' \g=|]k.v+#wn|OwH{>B:,C{&NoiO•]{ C '>,nq}]3hg򜎠6槲͝Z[֟eN]JO%[U 7 [*i0]w_'+=JLp,}I.xW*7hd,::%Ʌ5?}7㻫/Zޚ o쇋K];EuMB Ycɠ72MC>8"Ȥ5HAP0KnCBe2NЙ J61>38>z)8w No+r.g|z݀Yϗv{?ݔd:셴[u|_b@-/oaVd7ߜ%pY,$s`.#"VB+/K7+?0&w;[:#+bGU֢OWT vljg?m2g& |y>|3pc17,C{k`WfXbhiy]1ګ)tߢRk曰hYI֐qA.B)wNRn>Hm8^"5u Mz#lǃ^y X=۷>r(w$׎Ŷn.#D{A!x?LFc]lA&] sQcOPiEYE@'z.R曫c@y3(j68y+9N*F<ڲ X9%Ǭ3 zƘv3v_=3<'^a 15N5fƳj̮G8 9¦Urs6,ct 5'yKtuD>Aq5RЀwq>(SfSE [_⾆2CyC˳Pwª*iiiɔ:Zzs?/6k NOu X k%A,hX?, ȿ,G97zQ=~a09:PqW`:m1 [y;xwwiyZZCka[`ߜȾR yAʂ`e F' aPuDU ?n^-ϬQ$L c>)iO+lґ;a}7UxL"R%-[NI>[+7_AO2>n33F=..' zDb eHч8rA&|r&s9x'SA tG;EW[o O5̗c`v1l$B6bN0&GfbԞ%#  'Ӂkb"l%x-G@|cTZ:h6VU[~ūiVTU?-.FUVN.d>~w'J:Xv'kM[uʽ~ۙZe%iR@JJ {*%c61悑^XuD&2)dQYlʖ{Uɨ)UCrAV]@’S`\KϹIHՖՖ_2F)'(㹲Pԕ%ϿçŇϲ|nヘof7.3!{ɝnC N YMޣɈ YѰ4)c(OElJmnd1@ˌY4vfnVsը\`k1XŁWm DEر,?A.J,>xlus3 ! $wY.ɅB jH1$ ! 6pi䴋#0Şd>kk_y'1+Clb+2U?X1$}Zy,Fq `, ?x:3CsUC\q1FDQ-H@j&f.)ޕ,BH}T_<%EE5>%F4)mmi4!N_WWW)ּ$@T3/^rvdK(,FIIǤRb""^&8<)x$IXҨY+ƙ4-vTCT7LG {Qi)]HoX#gG9+ f?," n8ɸW~, O($E*ss(dP"kEIRV;?fr!RK[%сIoBI6e܁ۤj5}Txz5J[H֨OVpKUY ܰe%&'2jEnG#J7&ɦ= oG.2&ʍA(}:fIk*dj90ENe }6ȡ§.Akb5{Q3{seF[\|R.voj TќxLYkCM:^a 6ؤ(xz\69e($T/0A!+Bqj "++"@*պ @k` Y!T|L{O<+1`h(P.DY*E& 6{8Rg-ךROmi.*`/L T!2k: CSlv?T~YOg8eCihlX^+u;ehPx:|\) Oo5-:::LI8:^S?6CjARdA=ZHŤ$;jx|a 6"Cy]|0eǵrN}2GphmrXE/9B"BAXD:υfB)jcAwjǑ@ REtA hw9;ĿNOmͲ,VQy@_ԋ$9t$ʄ[KKS\In!$ .] 
@H/uQNaP' vzI3qRddJ <(DU%ը i(U(Ѣ DEr{= n_y, &c%;g^+ º { K+ჲA &E)B%6\OYTe[n.ӴS&D-SAJ; JԨ!Dq.wfbZAjV Z"@lԆQA9H@9`9%$Dn1'pegB}s>fBvEZ_?VA|(QŷQbԽgg}Yiu;|Rc_kOTzjvիHH"c|ێ]|, Z?/5 jz~^swcHM6pK~n%r՛9+݉\OƽUF =Q2QzW1ڔWgo-IqzB1'rMTxıvz>eqT~Qzo=cKf+ˑe^.[~<{eƽQfZTg7/CSrH5o%T8{} bqK]l?F8 mzq! CE K&jzd!Fhuet )2nZ*,Jm '˙CS[c &[|@b>,/&gL@t C kAI1g?Y]Lh'^ńCD4IQr:pfXSile\k&rT<x(^&(AsEbs/%@¬m|+WwO8o,8s~ZN1K(>w@}= r RLCWncgVzT.v|Q.@]Pg_>t0~w٪[jm~kK}I^\`~)7b[XV 5Ū9gF"17RR)*TGT:F^8O΋[hap'oT"EȅH!)nb9u^ZX%,%(at\`R 3BDN Y!T nV<g=g_Yos{k5iy;lfh,nl\:V~_Noʝ`PJ#wR~Pݛ?Ql?JzG %+\u~$lQcayOduǁə3)Y(7?|8 yuE2ԬX{^0Q?)Bw,`78VcCp&,K0VYv$rIP&s FƆT0bM3e$8aͻ_s,Ay |2M8U6yRDԱ $uNgAՙO54ˑ?Z+zkK: TCK1:,6t.(1K.0;b;*pP&G@$SBe80(3Y`jDzM*bi #ʡ0Ec n͉B`f]7ȸ^Y+>O4 |fNrY U!438(JIaJQ1xΒj=kBnodY@;*!]gOdSlTq;Q+#%n%sd sigJ>`BB1 + z5av)F^&iu DNdCֶ en,{jsrK.V:xU`y|ёi01##P#-񴋭?H' p?~Elͬ:}fv ZŴ3lv#;FԗyC'];B9chPUĆf3_:P^ 'ކ' F_\wVi** 1$Hp+KTo |TJ-J.919&sJCo~1ipyYtwK;|6މn) B -⽂'r!Qy[iOLm]3a6q|_z%Ο7-:ѐ֤u~9lTK VP‚G`Ŵ+u tXOX;I9< O D0W*m핗-A$i.hU012΅H©I@$gSG'LIExj=9cZ2ÜJkqiG4J[gli&F4}Bt].H5߻[!M<ƜY0T@ uڔmJsMg, i+DH#h܋ed i@~GXHkQ`#(5t_Bb/چ@Q5=7/( E+'ۏࣅ'?ESG,ΜlKy? OT1 ) 8(,x-xB ?[3j=D\ttAK#MJnF}&zcP"F GKsbrNtFӃgK) # -:.ͣ;Dj6~k!#(cQgAmdōp( !0CFY&aȨ䴳=D@j8XV[\.i4^R /yL؆%1rKJ6(HZ=XjwQST*+ڝvҷ`u9o3Z]J[Qro3 ҷLuA*,ŨL2TRQWK!^g~?ڏZNRRWV]rVbU&wN\]!sWWJ֪רbLuɕp)*Sk̹+QѪנ~YZzn'Ŵgŕ,|VyQ@U& e]8ӼYmMTlb?N&d$5Wƃ~Qsi f6P>AH,{5og;Ysfֺv|DzCm0z0#XDk4Uܝj4YqR cU= Z<9BUP.P̨<1iY#8ߪ@}3?'i),f#L?~ud><șh褽 "+h`Jg@r#^$Si]U*~c 8򇨹oaи_ c9~W4C>!dҜ& DBAg,(%8Dp04dsuu7z͇~({-j$_ & *i=ܦE]dKGZ6 GљXSB2%c4\3[rUm \J5\"[ծU}~k̝sd,IZOwֺ2% &XNU̒-1I;״0j ]^u%'µѦd2ƱRD\>tRZ`&[iR=%$B"{9P'$35¦ZvX:5#L#5%g> I Y׀t:`ez͙u"LEt(.cPa2>CÄc,0*CCvc̞Xa?$(D!0+J*|v0s{*Y&v:X(/WľY Ѡ.}ݜ5AbT+yS3FS mm{"ד.(1#L}2:%|o6Z 4V¨jh)FPU |mRƗjR|^E"[f!G3+Tu݆X+dn2x(u@Ku@G (}vGֱ*M@:ξ- *` 8Df¼@ܢƂBąՒQ$ d!KE5,}b\kҼ.`cČJPBp_15lm<ˀ8n3.,dP?Q i #՜ 6YxYc QH7֤:>6.t]r-"$7):#)Ⱦ \{#Aˈ~tC.ǠmXQɃwK|Tbaz})`*H-"<ыv *B`@XJ Xt#!C[9DhRqntNԃ`(.  
.M(zDG〔PK`}vAiTcτP> 4]m:Tv5?K;Z10bZ0pt' Z FZW#AF<^uKj*=mT1\1m洁6[ܢń<ޙi,N)3 Iѕ>b4fTɡELuPDnK?g7{b3$NqyG*F='5 y>J k(h>y%$/J(D $J QH@%(D $J QH@%(D $J QH@%(D $J QH@%(D $J QH@%(D $J QH@/W "sR|@0שgV%JJB%(D $J QH@%(D $J QH@%(D $J QH@%(D $J QH@%(D $J QH@%(D $J Q*-<'%PJڦ@2wD%-QH@%(D $J QH@%(D $J QH@%(D $J QH@%(D $J QH@%(D $J QH@%(>[%r@0](?%аSW +}%KTY\%(D $J QH@%(D $J QH@%(D $J QH@%(D $J QH@%(D $J QH@%(D $J Q(o71޹>xj:F^߮nmͭ/x}~|,+\h|K0>0֒}%X^pé̵:8sݯ{ʷj.tefyx~SB^~\g5Th0S&ȭXr)1}9&eAؠpϵ:pאh](eKE=ƅQt=?5طB.x:^ܶ^߿jcypj9V*b4&7pW]ȥ '>NU֔ Cc]Rσׅ$(%X}=RY=!E!jǔ붶 y1mt_Nqpfu5҇ǒSQ%z=QaHgvpfdyA/|Dl&~:30|WRGp}J99E1Oe&o˅ouS_o~vOㆀln o^~­i~~oWwX=#AHy?9Sr#{o[ |O3t`^ NV39$?KU3s =eiwp &ڋ~u)=\M <;;ϺV#L T'U!b[j9b [=6~SV"l1sggf@]t[D.' i^6CeZ+l/*5ik7b"$E#)d \WQ?q[jk}U8R*VSuJ)Z"0$Qt:qCI“ɻ].}syXr2_}J}W][~\]iֺ:Q L6A`ʳM>*B*hb-JXp,7Սvr:bb/vVܙ_Hg Qn|6x0P8*O]vHmZ Z){kKK-"O~0ѧO% &_\8.e_cT96GI,"cj.`oE(Ōb7Kz,{06O%;/jt4V·eu|pɴd]GoΧW^rz7\({3z\:n$Ʀ 7#Rfʁ?+#bJFraxaS"iZ9!ER""ërz5UuuU9?O&i6ы&#N,SڠT=DmX8F_)tzUc2\a؛¸dAPA^onxɕ{W-xkn8z`D? >[f 8Kw?]ж=wpa6rD%lV؜`WP[,ѥMiNDh TG95G\ -cсOX;qSX/_b҅VJUfvv1U+c].HLLr<]UT3@c<+j$YV Wvtq0K*i!3]/ k^"wPDPOaMGϖ똈 0\@, U(%bA%zi;Ƒ4tGGHڞo(%hHZ*{S;bIm123#4,(#ϛ3lMllBFn>r#ztT[1h&Ov*6*g~;L8?}ݒvYm|msLQn»O=O9[wtl+wW"7V]6I@hi;U^ńګFWϟF[&ٞbmVF >b'oJ0067Az~]x<##Էn؛p?u7p zCƜ5g?kjiguǼeE?nycfpmN#vH9Bx:q&|J{;Tb^XYm+|JeM7.hnOg6] iUb~6UUSfAurY ,Rg2D'(u;&k/%%DSPd]jv˰q$$ Ȫ=푞ʟKav@&MגD FlSYlx~\ur.$sdv sų$EO,XPP<:xˆw^9N2ۈzI v;݉dms$͠v꜏N˝[ wnK |2_<-LSz /.=2 Xޒ,<:snؑtIs#bmf3$5kS9`=cJ)k6i2yfSwBcȁN:/j̆f<PUv;L w ]ÕJ{. :®5Bȭt 3z]RV#TyKM񜘜\og㍽*K.g8^YQya&+BJzݥڟ(V^x4$arL]L)h;D#)`1BH !Ўa꺘f]L{t:Gz޲,.ZR[d!b!1R)1Vhi!C:`[M) 5 k Fz cK"g9QY}`hY5SEWQ1*d*{*p`{c0  (r*uPH_A%@4|&3sKFǤ"9Ҍ;$$" (O qNJqo0QmI%yº ;!hAW2xE|Дa=QBj;i'ai:n+Ӵ q% ^ɐX,0B#J1H},adWt'Hyin!&V(A ˄ש1,PI Ohe$D? "W#QܽOaǟz?))a3LQb_?^󴓧tfGctaax}*R _5W,% !շ~3C߻ O@Yzz?e6,^sҽcݣ1p٣l͒GU/=$SGAf\YF$Bk`_$N! 
1&wbwݥAIc?<4Ѥ |"1뾙"iuER8cWXob}`)L5FW7R$VwW,ý<]S)Wao%P0 5:mqkrT\_gߚu- eNSP=S1[Z>cSjDҾk?>~2AkQ 75ֿ}ѤU9 ֿ76Z7NQeW؏ 2 f O`Ӌ[  psLt ôc-MFhrH' vZxs|r "٨[6WxK2SII 8>yk?E%>#?n5Q`3Mp JJD@UtJcYOO=TlT?P'E!~>)U 9;>?~W>LaV0r\jg)yQ2}䢬w(P4`%g]["g |Xc4 >RdPt`*'f2j[{p?<-̶s@nhs}XTmՓlTk}lfW{u;7g0_=:ޞJ{i+ W=a2'u<}<5_>〥gTG7elom 1"M!~gP\h̽ӠYuκ~g+K5 YmV} IJ'G,ZUుxXbӚi ],i诡Dn(DRV{okz./G#tT yXiJėghCp!:]0oYa)Ιw𳃟> |՝J`nrM狔4ڴ@٥zV~t{B,6IBY&$̱ -hDc(,"XcEEݯ<y*է:Z!+gQi7aZ|I2G+R[0PXe&zGSy@տQd ,ki~>XeCKh X1F: lp6HF7gSysQ  KmKus}uqvz;L6d|6 {SL#Xt .\zw?{WMkn8z$'_v;F?wpaoP>LU`p_ ^-l ޷ii RGCZKrs$]($T#"7pAR[_.K" *U$^'8Z,y؋ bHlE2o)gЉљ%GX}Vw߿u$wPG噳J:{.<"O"T DH%8SqZ8< !۠AvyDK!oת]+guP{v͉JQzXA/>&Pw+xdž4B4$zuL596]jҶn*҂(gh[|0 2{Ԅ׼7wr AHRVa S띱Vc&FMFSyUFh_t[ؐȿZRY 1  m5^1%Q"DAt.Jj!VeuRCRT ]QQ8Rw$*Ű[RNj6rhTIf<)٪meKo6N-kX|`ONMߵSnm \T55+gy~)5j8 ʵ^i+!)4 \Qr\!|T`޳jFd>Z7`;(&zSNqNG%C:̌ ^`A Ɯd֌ٮ7TBWr£9E٦)`/V˧5ֿz@a_NrCXTaYDDm#'1JG xI,F)IOP& f/^Fpґ0BDNgs]c/8碵qǹhminWGLTzRD2k뜴B3FPD#(F.qVԂ2ČV+AI *$!5F@=rTirFv}X9ԤE#f}5"ͬi;CYiQnkb4!T)DXҠ'0$fXrD[6.:TIZ'ig/i-aGysO4W*jgUwgm(|.DdLpE/3,ω\/sص/+?KV`ŨD.RJT*ѩw_ \.E]%j:wuSWP]i.$gL\J䊋qڇ+N]}5 q->z]\ybu*jQWܜA u;ucAS`Ũ+ Wr|)  /z? MKO}FޟC:,޵6EOGe  ^pfO[,i$y{,nY[4KHVXM`pPMe:|C-z[9 hDn@NjM0exaFp9sճ8_za|Pc#/ hr:ilTRdU"1A]4?>ӳj>o;7rV;wj%2=}(+ͽ`-1pdW&_NZlM_ljO(/C5&ef'2UtE{Pb)Q5b_5OK e$KW$L:_wH]Ou42QǸָDphiJ#AEj3Wf\շ7eUy6Ʀm9T'sfío.L)U<3(hzQ×<|)_×<|)_×<|)TRfT&XNfT&אSI5`Rʏ= SJfԛL5P4*,h͢Y4fQ, EA(h͢+h͢Y44fQ, EA(lA(h͢Y/hEҊ4fQ, ś{D2":O$Ci Y$C5O{|i*AuPX96qsI%/:L*WyLGEK{Z_Wk&v㩖,-NhDT(:B2OYKAR/5QNyiԑh\bS,T9TF}G2EGsI^_ZɓFOHyKD(ꔧ W>SSWXky<.Eb3ۘUeK-+eG[d>'mK?_˰<C{S~lنj7uҽr̖u,Ǵ߯%KY U o _LzRR! 
) 4;QV1΃rDb#A'ruGZoy̵ PE1& ܗEq(@$(g "۔?09*LGSv^LIg>l]7ZxrGFҦe6Bν^_>e[Wغn][os8}e37f!墔n=5~lWZnhŻ1o[*'MOtl 1/zCFؒ5_46 ܵy7A;9ڔ]7՞[+89!tR$XS8tL.SA'ԪG'TQI :)Z22{?/?Nj9;%yߡ]l1K9sK;3@w<㵒k.FĢ7ΔQk=7DR,IsK&Kec!pO mۉ c"UBIF='ĻrHFiwoP/ gqL^A5]^f[Y_6[-f^jp_̮BoM󷅀=4Y$ENx1q]6 &eF2 ,Z#ՁHO^+Z/gw-'MdzE'29zڵ~Typ|ߏQ`V"7EnRnĥTL$1&(ř@Q $Rr}9K Ⱥg*毶.sw- jYӑvr yA ?gZZM\\wG$f g]$E,XE(^Y(L<^]J"Y*k&pY=i.,Mǫ8.wLxvK\oyph~nJ*-t4Օ'f\xg~"O~ӂ<'ύ=3{&I\^d۶LV[/;6ad&e:g&)<ǁe gNҊDtakW뫙M͊.o|7VԴD'tp6a4BB[Mʪ'(Ɛ 5 xL),Q.@Q)U0Jw[|oρ.1' N3ةf Ǘ,nͦ߫{lW)US-ܲFJ)Nj1Sh1ZK-.*2|2r?^VA|Ngy!.Fbӟꓚ5OpR8uV݌A\Uh@-_sg|cCtFnΒǣ:<IЫIXfukkLr?-s1B]YoSDܽ{?&,笿΍4[h< SǞ |X_mj_pF5V9F%cWcrvdfufxΦ3)7gq`O-IqzBDB0wXcnPa]Rϻ_F6ښgz_[ԓoqqefif-Z5du6CSrH5<_ɯQZ.8]}m^p[#dIt:B]̀A. Oj3>0m1?gK@jh%^ Se2ɵT%uY -|Թ28,g =~&}چBW~g gLQ0\F(Ԡ@`ZF.\.(V^k_+穝U_gd "+/D}ѩQv{a—ᵮUͪT2^%_jvoXUs+|yruz9w`*b1Uya2ĻJ% Ŗ-hMHt4z4uthZ93 +JH%!6R0Jgh?"D݆Ow*+ ͫcb}`MloS!{ڧNl`-v$e,J,[8z̠y3k1dJʑ$d  `C$nH jkC;it*Řk۱wF cg@^wU\7a>S]ȱaSzKy2X\%O5Skc"@xz}XOthIkvy"$%s*@[B'9UjM3UxwWhlWk4LN,lcx |o22M+;y CqՁѰbZs4u" ;SLX+mJcĆR݇u@vm,dP93 yݤ(G,(Adw^;u}>: K#'"`2ф*CXh8)')Nl~T5`)sbD~6AH%-idQlT\j̽ӠYuzޣL=Wj,ϭ2440%N&ZU`Kb0GYP H~&-:S㭑Q.Od==<$9TBkr2ٕ}ٕK@5AA!+ڙmԜy0M;zX'e/;ݔ(LQ* BkK4 CR0F_؎lw=x ް؀À2zѬ{G\x3GPM7:U1I7JD8ɱ`D+UA>(A)K/?_|MQ0/żʷ  gƓ^?_Fsf^dB$ږf;b5* |Z`hW/%}ӏESЎ:UkҵлO>jo81;<B%G.BVw( GMEDA qPxEMH @,>ou9}bg1ǔ20J1g$GHy\;,$!"AД00z jٰz;i%&6Z*Bd+H~`R$ēh2r$JzEpUl4Bmo.YaXs#RL`ac5NJ̤פ)ыs[[*FK\/|*3-*NtȎJ!5b!Pg֛eJͧ$xOAŢg|cOnυ;Qȹqǫ:T%F;&eݤl5Hs lɒ̒I;2LiR{4 w!vPQ{BՌV]TÁ KiZF?H5Oz.%=~KI-'=n;A8.)@? 
iP+h L+G1R!!$NS' 5G8 u'T +U0V e8AWEc 6 ,kCΌmV?{hb~Hثo\ٺB`lTc 笔.NRD 03L9b40C$(n <g\VD3'n Dpcou͂'^._!Mr㹱;"xʱ'cݯw_0Ib•,RjDhǰ@LwTeGg^UYǫpbTi E#<25L|ʒӍǚ  ^4fy]^o= %sϨ.sݳq(P(zaQL2p=80=W8AENŮ(MbpǚR`aS{^t:{fǬ]Q AYĝ+⃦T x\b(nGRM]+=BH` TʈArg4${FH{!LP`^ZH@',^3b%,PI OH%axops@^!>[DqĚX*g FU:u UƲ\kD~|-,]loj.tK}Rb 6; !S-m]bv3wYbNr~υK}Ḑ6 IER$M1oVȔycޣ4xH,t9 1RW 1LH؞;Z>FIU)D*z7"B$Fbݓ)boWw$'$ ltևFBWtq-E"uwa*E5rsl?UÀQP& -,I6 뻅0ׁ{ۻĔa4k6?o{lRY&DD>SEh[$3V:j-?OM*8_C`+u0Ptj{D-Cv[L>R i tzt}uR#QtSw>9eվR%}t5DP%Q3 ,tFCe+ fOEyw (wDl `8\%VHKr!+<5yYdϩw]t$xxܻu@Hj€{>5OpyF+<-J#yjr R'yڗ]'}]R+R[2E%AZl_nxƼ;@5`߼&nuխăv |]qc|MtNo6q+Ã6Tcm+NƳWnxzMby<N,b=Sgߧ범~8E673)^dIئ~}sfoZy&U}ڂGOxSY'.c޶'VkfAsn\ % v_UO.6=d%&$`R Ba1&74U87fu.HoZ 8yO*&1a!9̐z1ͮ#DU JY~o(@wko` \Mo ,/f*OadnQ-Êf<-\v,(*GrqSCTiOU(BKdHp$>HU$¬EبV:wy˖ztI}LdYg>Vvgˬ0}J!1B@6zz!U$㑅H>H=0+Cʘ69¸P0.}?.dwN>Á6t0mw|Zni{mӧIwcir9}fI|Y ,Rg2DOW]LDX,-,xGI4 GrMR=2|rgu0[!&nCTpb-h;ax-}0g6=_n3{tbY.OX=S<1Sl)yiIw^9NR(6Y2!Ov1ֽ;')rKW[/ ۿ29hu1oY Ъ+[gp `2|nYllyHmy )D%Πܨ2i3q$84V̻x4JK: ":׾].YGbPT8y3d8}|y޵}0\ CZ#/0xJ' 9g)eBK~c`͟SMlWMt< a^tɬdV+BJjOmuV0$NP~~S~/}BZ|ٗ8[__k} dחu[;yuұGZ'AZPqs#&w<Lfª a]´dby sR 5(o+LQA8xf+[p 3M N0p& SN: Ry07圳Q1G EέR01 Ƹ#aREbB"seU)tN-~Jum{JS{ـN4W5Y1yY`  xHd2jX%9UůȪO hÍ,}hzپt !RLyf7RMJfzY4[^xX8)cK4a$Ϣ!RME+ ,C7k%rvZ!r${oO Zپb bBruKDl8S38ƓL7Yn-}op2;ɛx77}¡KɷD\c4w '.58~u?i=Ͷ\ƨLox;fu,p ]ҁ/n<[<,kvmf_{dxo5dyfn\ogOi_ww޵,/0COFә}_0;B9%}pكoB3Ovs`xp;3ܨ)-#.mDq$FF"(k\-hм?lGES܂!F Z%*Xh`cG燹N(VIr&$X7#vf b>tqoBl& 6RO"hϒ:`Yͭj#bdrN*IK-V$ɲER/Q.P.+3&'I֌ɠxHO\L(zk^8ԳർwU6ExL1;? 
aYD]JG6'|* I DS1DF57t5+>W}3Ϸ՘&h5& @ F"9"pcOOlQSy9/áeύ|qS2QotɨR 3øVY4uR"$^.XeVYcz{ {.i9{JW<O gص*GmoPk+sv?ۗ5.΀Wi#?yU0ɴU5pf!d c"|M0VA|= Ʋ@GLtsa|mnk{t٣Һh6y4vcr t L } {Kc(OJg9GGcC#'2A-Q[£b" 2430#t DT_:M #ABH6L"8hn4#2z҄Aj1q(\Mڲ6BaU7ؿɹ&n}Eosi&]q>V^߶l:WRY+o_ɀ bTXb"!koyR(iH41ދzFet1yOQwp@0HF+UQ*KJ{blQQ|a1x/de}A~w{ng7۵9{oХ=>_ӻ"\iIx5%I$¬J*jOwyF O-NC\d{8n+Jۨ{ĄbH$hS08=Jpy*^vfkWIh&BX@eQVICևBiSp&"$vIFȈ (:pk"$ /(|L2A5hW:q8awSc<XL?yaȫGqc&@Ij΅ Z)DEKJЀF@Z1=%ZHW҅nPBڤ4g|`BH 15 {b쐒>⾻ED%՗?zFD)ں&s]nTJgLT#qXyǡ[<:c` B c8G԰&n<~ %6FO\'u<ďe?2lrFjF3`5&Y%1 dh8wolY֡qntύDܞQ51iwz:vvY}:(QޖFx.WΫ Y$o`tznuK4$gݡ׺DvMfT /yZ{U"Rm\JU*VڪT[jRm(5h 4^24R^ݥk'AXgL#Q *R SV1j%8. kHp4ךXx#=K"8`!QT `A*o+ 9n %`{Ђo[1 B2*ZSG`_̡8;/QO?ja>ZrJit'at &|E [+:::2] /fkley_H)"%1 bIRbTb2ځaRZŠV8I =m&0gqPc󳐜jQ[YrqK)88)ߞLp)xR0*Y!o/2MCQ-݄K[%$ iLGiSUVnEe@IE T1y7/< 8! :!-hz(׏E÷rkc= L_ ]BWq[?}8ٻf42XY?^׳+ VvcF f/jK4q*HJR SjSr{T/S*t8ˁZZJS ,mЦS2a U[k4O=qVƽ_J?'}0_ynuL#f߬? *aPHPOYÁLUa9i)mDR1F8+'9'R?D/" i6~R$L(),3q=,'zJ'sߏl?/Eqg^5-]w]f^ Qij7[s7f 2m n?nf7w9ntf{l;Ė ew;r>L nϼt ۚyvo8:853g WzCB5?okjΉOmbWωOTsZBFn56?RlnО6lڳ֞ C;JA3g)9 ?!Cw/%?V%vJTXXQH5NH O$(HM!}Ӆa3_]wA[kf]HHM>G?d|yL0ϣR 2 H%傩DU p&q.[}u'gx{V70ۼp{s}圍=v;?\N+ V; fŗUvsvCH\D"X+ F9hJ)ZS) ΒV۫<4et~˟‡5WlZbPDy 5 3raw>1Wi'Oۄ)4lg.dٶ65zlV)yK=-ae͒\cMX=3E-p`cu{Knz©~ҹN?w&5J,|AZ畗uR 3$by*dN1d:D,a[="% &d$  DڮH9[/ʫ 5rGEwZY,'6lRsR|,&HJ $/_uzPՄ )O9;9IʍFa9@4XR5$*`@?qcPĸ8bַ+۰D@t$"TL'+:)]FҒr'y$ YG~Er=?iFMĜT%FdIIc2w&Qubyr̻EظـY~*|o{ p<;a8}dauEv⤤Snɷ*{}Lp%u5bFV~2ϡ}:M|}W]>2#s:(\n&*]X/ou% pkǡ'U5E?+3^[ԓ~e8?20f4;u🷇fV[ZfrN~PYಫk{-},wg?"(y#%_O}UYB.%/d7;|lBKM ف$[Z'SK7rOB' CҠ:k":pp  YPQ'&VTZ!&ZKY~A2cvU4ʠKݣ.tB?Ym?M TOBkBA2Ct < UAT,##ۣ/ګ@#k҅EAamNѐGϵ^`: fTΥ`Mye.k(IJRs8ј*rC"[rSbuZG08ZBTB<ZƁ:\gn>Xlnks A-gC4 R6~@$d*%W$:NWשׂwvΒ ΋J`u   r! 
[rN>_0s4>wXp"ܡ9XZk \+1-j+=WeGEcL b&8^sdg TZ*(FrBWfζ* .$@hRdg's`g$\:U (jBkA~aW9"q)UBRq*ь$RTY赕ZBu4BjP+.'4I=cd#f2kI򋘽T"(H"{Qq|h G; *c˫˃ /E{mҼW6EqX0Xet4+BmmIC(5031)iR V#Bcc[Bj$/=$kNQiނYO/Ev X9Ѫ r٣{lAfF!ߕjQk`jvM>&#Q6ۥ((q(}@lB&Z1%tqXxFP/ˡ?5LJzl[s4ɏ((ZN(^c>BQM> UYDV;If`\[&Q, 7!3FT(.୷AJPhgHdɱT;ںpvLl}hn,z%CRd0TR*$[xFi.cX--*"=cSN:e lγަdɶ:=p 82!1LT!t*@:, +2}0J&YYˑ}Q|E92k9fcu* )_ь5#"<=TO2QՁYK&x[(NN"C8]owl{!ǜ!j|ѕ_w/ j}s:omK(Fo(F#sE^*GFZHcSmd(.5SUt@ ꂾ}\Sȃ?Ldsȸ'$+;z6D!h 6#]k8t+)tm<;G ;}t'KN/WP^GEz6dw6Рr #'φHM,f:mNВJ @vYgPΨJ.8ZkB$,"U@ሒDIrdߊ?ZFͧj)594wW0nN~˳r A,FcX .9s]reX-F4цҐ#斏Kd|w]LoWIM:5YR3?NH9P4ɝgD~6M?Lg2}{2^{aF8nơn-b:3σ һQ>0ruNomC72)k.Zo{.i&CLŵ<>pfzw^ Anb?`^w]z`G4 `a/nuuj{N7m?ƽu6eK-twmz|΋ KtoNlK7B;ɺŞ8yMϾYx㫷9mHTwo\[]\~;v_Urmm9Nop1q\FBװM#k U:FK;߳ZKvfz~-g|SsP3"J,{KYz:J$PʺYRLsOu %L$F0KnCBe2F +J61z왨6=c{9sE Qoºrٷێ/?97MGz%dĒ~XV <94C4X% , 4W1fDxPhВL{ I+ {w2+ FCxO 4 犡)̍bJwN< 朲qBsJ MZa>ʂpvd7PH~@9v% $ PӇGTܕlǁ~lɋ.x7Bem(VN#Xιq(GtJ%3 G@f#xf4A3J&yL&hـ4AdxP1Y,Gֶy}2 #(yH}?;? Koɜ,F(r  $p+Pς+cd{D#WLs@L1mh1$PT۲;}R !^xW.c)b*R\u *1ֆ$D J)1bb|>w@8꯽}3=dok{_v h~t)STWfW4֏}>sCӛ4\3wY^ 7Z(idٷh?7w:o~Y:0SO78>;KnȢf< iPx}eŚ8ZВHI\Q.~6\z~m#7lS}i*76OM_RQ.>T=f~ʨ4&j:ʰ85=m~^Ez|jE_?gvsEs<پtpHIıl  C,io @,>j <D)7 g =} [M-}nY΍`f}iE?.~L㻡7*f_f˟h9AvW[mg`R{5=_.yLL̂gȼHX֕O<p9?[<9ptg]ΨMlZ1.Vd>T㉟'N] cY\9 joW<ǜƍy!j V@L l}` .bBsȠLYO!Re8 b*mUaOƺDBΠg;Xf$$ IƘ3FWE =hcT}y/HRQ:Sn);R?i<BgA3SQ8pΏoY$Yd-YUEx-.*LٱPGު򲹽wQG3HR=$X UW^Ĩ&6a*E=9Ȝ2c\յƿ:i:V9{=,zWtTm8rrH6[%EI Z%}, *e-Ҝ7h.5$#.7-sT U$TNG'F@pQix{*?v eDXaf)N*&#&d Ld|"eh=%V`Ycr$!jU0D9Q,nY&-ܙFP\`,_.#L1lz9L d~$WʮhP(YH3T wv11Biw4SLoE*ǧo9. 
h8lZM$faE.g߫ Uuu>M;kaz $:ˆjZrfTŘc9n䵿DFE-.}ķ'or7E7KNHq-)5[ ^鋣Kvtfz<^ŒtN\kWĨ9}d G h80\'*}o=E&t٭ӦeIT6#Dž|.}'ݤu>NNr(C55ks۟fʁQ}_Y{NYR,l@c,OnPK⹊aD"޾']_`no >@0b0` ]hrOAhxBlJLz҂!;%I!r!A9^RG (0#<8Wm="?}~ gdNiTU+M3zAhvG-kh+А}:[-{zw/[y;ᣫF7+ޞJR6 XC۫,kL0݇UybGN8Yt'1M+fnQXz0}~vp8pNjxTK?7Qkk/0~M#^|l{* _.k~>a-21E@oO坼l PְEA}GoX=♯?xZSmH6.xy1gwr\qŻq3g'GcxX|z|<4nk W,|i*BOH''59C7^<_V7QmdR4*復\FY <eYJVĊJ`W!ꪩJVZ<4$ EP^Z*Q)-jmxp&;Q'ո,F=Ͱ}<0T/rW|v'VkW鿗6-w=[<ƋOyzA [^pOK;5|S9#&a  =gyAlqcꝆS%F/K;c03mM\cߴ;Ab;$;+ͻ*bQvKc-ͮunIl;=%;c5'тP57}YOd^sAyOpn#yGvx(іه(K&0e@G ei34xwcnexnnj7dM9idmǽbC$7IEd| cLǣ蕗k>?;>y2(^c\uIf1WU,xstu|[ꗮyMsW׾~ՋD_9dzj$/AE[Gt64İk=[lICO7~bm?;0jyHGW`DAt!nFh\}|t(8u˯ךl0B'"{5'D2~c#`;[/)eym0vP%nc^1>!}E@b3L^]\$ *t޸m2⪊k}#wWBzgk6o޴J m՝9į-~ٓ е:)=]6Ras/#X{ > ܋5\bnU`0nȪXb-Q! ++o=eb>NA@|;^DZ07Ɵ߸k~|23p;f.}cRQi|DVTE%iSЖD2ٚf}6Hoy/o5;B0kfHN><51/[qy+hU_g]}Kvk'RT:dd4]R$ l\fy%RLZJ$PL$bmʨ^y6 FxHݜ(6S-"4HqsgvoV/i5 U^e`(R*QkⰥ(3\h'Ez 'Ϗ|5crI_n颚0YJ@R:L/]Qh!`NµK—2XɌalJY ol&ۀUk%i΢rQXW=;t*1ڜ1qNTehFf-\ 2E%d7[SRtXY;C7a*?%Uu.hu,QLk}@fDI  m'+Tpo]|MEw&#%RWI9nXӳ;K@h+ ~SdveRsO.CFgE]%A/k.6-!f^>թ!ѿ:hI'!BX,K@Hv'UP:o3X+&8M2:p ,nme/&("uՉܡ B /{aFYvVan4]8mPYdtrt9|$\: p1[$;঍5Kj,(t_XAjIPI"eL䴢yU,}p}RV8LtaX휟9 #"Arѡ^F"tKw<om Ut`unV$ 8U٫૘wFTܶM5xYN";>m w *#O*q>e[}%0+pm i,{Хt13IbVnD%\ \˅@GK^HS@ʀvo15K72#N;0]xdW6̬D$K&K N+]Kڀ0(Y;icm\g 0S"PlV=;#֨W 0$1dEH! 
cȡhͥVab+,vLkԤ$BYéf@mJu)Iӷފ{)E8jZ7+ݭEPAT>o2T${1dJ-6mM:Y1Χ~5lr2LƒB`ơiFQFr ncJ`:°ۖw4ENiVnYT-G(ZthƤgzhK 3B*&%<%2` r)1l>#F $ѬY.j0)#S :( ,jCB6z Dzkz m6 qV$b.UJ<*@j')E^2`X01 :-FNUNFB֭_VtsPuNc-й'`~u-\1ATO  ؘzጙFj}Ҁc͚ B$o}Ϥ: 0 3YBYKy~5(JS>`joKH؝L!=B:jWgqFe^ѳ//Ԫ/خRj9 nlr͛G}DZ\J#E"&@iD#岊p6!gUZc(Df"H*B'%PQ^so@֊GVV}J @fgV@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X% J  Qus%PǮVJ/Q $}@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X )2I %fo@J 3(^a%З"cg%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J VUQPF?J ((^ D}{@_H+&+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V R}zQΣNm|sԏVO@stsH]r~T,^q̆s8?/\˫i{R>]wj<c9+`m<%v bQ}I|K2ÁT OiIKV LWU_wNJD\a'z41J0竇uNn17ViF5/_\b#;nUF~I ko!WK_م,<{@HM~.5&sUopM4n{ aYznFhOV{I&㦼{ß\`jZ5fW}<\g)po+N r>xɛqͭBD0΍UhؼAqHg7ôD)"  \GbVq:e3 _a "#Ce\aeدQhqcONB^źoFcwE}2w?~DS}&+AkVMsKa@x˃vh}[e y-PlFآD:0g=jםx'"~n iVXo3 i͠0?P,sɻ;Ö2DkMތLj_ٽZo&w]сavj̯_g| hlhۓ7oMi"V6Mɖ96G+!!׶a>6xY뗋~Sb%xҬdBf> 1< U_xo|VSQZwWZxk-P̤N=޸h MෟKd?Nu7 sV6bx;/*ko] ᡥvr[ⳇA!l OGc>w=hL>S%*? M;x;]nXUipdܕZfd)I<[k>/pv ʿGq܅,m$Xȿe58yT-t@12/Б#ÓNz 2%PѤ#TC[\O\LU 5s0ۑ}0ODa=es nh к;Xa{o>gj7O~djpnp^gM<{{I17mxr<۷z2 ^ 8ڮkWa!:#,- B!h47"\wh4w.Ӫ=0XsFy21AH 7c{ĒJI^_,[׭H<~,Wi[,> gw^ Pq&(lɯ^ }t2xTM<7Hؗ_ͬy.]:\ڵMՈą{μ(^]W&O9fT4_;2hH>J n%5Vubɕ[k 3|>1.'nmky'݄Xw2[GvGmG K@yԪ=}udyIvy+y&@hn%< "tW"eQ+%P"ɠrnUl l&!v8z_оwa֜mH댷J!}$F옋Ҩ ,gKaʰx|ګ@ˣ&NKWcdV J,zwAd+uJbOܨCc'cSEF~=@׾bCF;6v-ڱ۳]o=͆}֣͛ uh%g5ר (1 ש@ܲ!y辕 ;jg=hOV? 
,bu1kK1#%'CT9PecIĒMNޜjB!dGqq^FY_|]ZK*g#))2$+afтM,ѫ827.^y(} td4Mbyɧ ˝ g+呸 cTW`2apmĆ c27* C9!G5;eOK=5OKN|g*Y^g)CEnIy[9YTUXrEX;.5'(ka'd8-+Dž⨏>x&8AXrA&ò%2,(>d"Z]EE֝5_LIK 1t;]۳3c-gv&* Tzgk Wlz2eJȴ݄er%&olrqZOڥh \GZD`ZP0F)p&X )gRAR;tfQ;\*򪖱F}ኖZwJ5[XM2-Լj W=g2ޱ9?-;Ѿol~ .c4lg_.!P!.跁ʐUPXCѢ%@րd3d"U-F"{!)lJcQT lAnǜ/BT(ZWbW;Libծ&bEo{7f52/A/&#kP }CU7/@U,p]m,4WT)9mMBdQRd PdRщE$ΫfUպs=Hr# XjqE--bo27NY壳 *RgOQ ֚%pz[ZD8c*Tb,C ٙBtlPH1-W*Uպs7eơhf\YMJ]T]^x/E>1'RŃIi ȎaOVcl >~Hynw5?0/\0{?>~..-brs ʌRGzDI6!'p#QYazGw4'xr%zchN1 dYZs$bX1dD A0Bpdž`W⎶m!q}IlAH@iP`,'sxtD{nOɩȔP%fxp4i^]X0FbHЫ'Im31=^DxN]fZ_T2쒔q=Wм7_ᾓp&#JyCTE\s^jGdr ϫЯA<{d_% K\['v+[9f)PIcc1l4Zxɀ6ʔ`\2]y!* +}u瀸YL'*ip~ˍ[uB 20tSV?*k$!(U` )g8cL; ƽpZ$@rxrPq 윬gպsuzfN. *>tAuMxf8!J!8+\LX@Q*PS|)0'UJ'RY["K"EU1E0T(I=Ԫl'}Œ:Ϥ w9" \f&J/( h˓{%hfzQpyvs}2$d(/UIJ^Ǘ(ل#Cq߅&BP.IsmQj@`fAfS` %k[47[dRTcUV324+YȈ[PdKQ|2hQO=V`줒USӎj5h=^t4K}H!yrϤТI.NTrE xtd cd>i((G?Q}:\QBFN!L; N{ h +׎ ʼ>T)Վs_yF Y_0,s2U@V,UPB9\ є29pY)M9d/c wH .%"hS(vjX9բ_bAH11Xi)IŃKì**D_ikJd8 Yi-  R3%>.`C9 }CԲПGBr8?{WƑJ T!n`/v1` FI); }_5yMQTK ""xn6HF_dSz{ Ô)U1} g| ] ݩ Y9_ ӎ$ML٤| S*>^=٭3:z"tD@ 0VTĩL .YTJĒ*K("n3y?:LX2aȄ )]R#W8(#SCTE*~ǝ5Fkf X`T9F$㑅H>H0*1тFQ4`pHY3pLLqd:{})SVaC4'n><8sښNƯMK٫ӌ=^+8.4G(2 ]Nb@`QpXdQQ!9A8Bc<kb KmaX!VM###`Br)[HcMMX$ |71,Rg)cOL sg,ðqgppo+8>"ϜvҖ~ɻ=4j-nקw(+k|F_˶|v’H)*qy*ҁ&p^1x @bHFV$Pk FHi2F90}>#>C_uhqf}nԝ@Qߒ>gA`7'Zp6Re+94Ea*&`DQy$R `Xs/):!޽Z73[amG k(:i?{v43c-ϪL_j6m~;fSu|)\OkXeƧ:`v"d'DE;O4y0N΋YۢX4ð m4^΋?Ԟ6 _W}|ts|KVxQ1ks<{H> v^=-Pli ^.܏zmY|ve>'2TmW-<[gfӗšIRvA_~/Gŏ(u_\.Ý뭛d V{y؝c"A,nĂW@,R:qXn ٶY,N3_37`6to;Ee^? JJ&ll^7ָ8Ƃhޔvu6mo7]{;ivWwLSS]~'?+#R-{W /GYcj+w,Lθ9gd:puٴ! 
W4]Π?941LI{F]ܙvUi}0PMr Ptų sʆڄB8oAr+X5_h`Z{EYҋʼ:2 |z2`縓`2RiLNI5BiIϜQֳ肔nxQUhf_z$£UwGAoJu^D1њ*tztמ+QFRtJ)5\\9WRki6ׁcyŕ,UpC)v-J!b!1Tmw G0Wŕ48z vwlZ.;E ֫2p9*pAk qcAUn95U /_ulmvkknI^$\@4:&uZ EwZI%` Ay:Eã]㼖4\Q`N DX{j)_OS)`oC`zAI{AԽu˳j]<> WfQ"L×kK ҙ2Q;}07: hx2jJHgf |r4J,8wGADdx8qXj"-,f*@Ls (+%Mv \I@<euj/ԣFV}/rb~ӂ-}ZŞsqD_˸K k2k3_#0qOtIJN .!TQec&`@_`#08Wa@!l( ?֝#S*\Fuhm(C}9pWyꋳU.=CjEZXT; sWL#,8ZI}YYWw;%-ϓ~f\X^l(o>C'zc@{ "߽&D[.tD%ԖpER˔Dc2û iԳx=++2u]֯mܗpkmFZOҜ>Zx&DaUi.aZ_9VD)  KJO+Lgu?zȋz;vFI8gQhmp83ir O#aGI9DŽRn&x92J#0( -J)~Q_ֳWoùݍk Q`FsΉ7 BB@i7t+Zs~ fDGs.HǏzCR0>>&JZpɢ%ȨԘ{#A u"g=+{P? ^YYT*ùA#IS %d-*@~X%f1Q`/(8tơ #xBe;=Uub[#E>2ǜ\+n 1bbfz7j<*g}-d.ĒZK-+ r^;9s ?3H0 N@7 APNk@KT >]s޶mrCS\@SV<2&+k$aI [LL1%L(]&ESX_xX5``1fȇOo[o ^&? `߿ ||>+&JM8\n5,F>j Ǭ llW'%ʇ9gc~fY !oWy+)뻟NO['~S.=p5V}Z(nUpؖi"Ap "o% P`@)/ IAFX" 4W=+ 5$zR+ H*)n$rfrR=c5qJ5_XM3v/|{kʽ/ 2>[>f ͽ/hq<^G+ؙHI%c,̄3DYmI#M/8Ȭ鰺tRE=ImJcQ(6cP.)3̂ګyc$nPCմP`xL2B1L.rYh\@cMr|0]F#A$t]fY R,N<&'i6TxjG=<en*Aْhe!TN(֚ 3Ydh uUpƌUčVl,g|Hd B &-QLU>8{ˏ^'صK:jZ_ԕq2O<̭ Ajǹw-!M9dBF1(@ qiǮTf7VD8}~|p6U,\>bZpRя΁%4\:DR[8VwD Ll)B˙ ,eihr\@$6s/|T4WK\@aˇԨwÚf۸ps4|* 6JyNOg|Sl z `K@̀W_JdfϓJ' mH`V   o8d K;>ed8˽!+$9cilhLqELCѯڊw*7ziԗ1d6 ][-K74~kY9|o9G 5!2QAx->!g6nU4粪i]@evx>jBY)c@8NJɐ0ɭ0O\k&c]8u/ֳDu_1y-AOB;UTMʶ,=E}oKqWg^G #nt8~]*{l/7!`3ItZ)7c7zCfH8ndQ6-W[^vidNvc}sc [e|9|ʫQ)zM/f!ZEGg/Kk#>0JYw5 }1hOso9_rzT58bSO""rڸ*u, q6 e/fn{{S/fXoKc׽SS)E%Yb}rc_Jǔ0~қ\o)o1.e:]EZ$і4Xn;Z]DY`Va4H8:S`z2ZDХ(ݎ8vou`j(%JsS˘tON:}.M|"ߝu+~t-SyrmYSaO[?kRevh~x3.e)sվ攐rר2Q='Ħ/R@'u4OM)]gYlqzHOIZT ℀1_ \,Δ\~/LJ/@@AhA갻(b]? ƒh*+c d\`LJctWݜqvoWh^?!krw[pMi-x"5=뚜~(b7?rm`O銰doW-Hy(+YͯPS+n=rXoP>̦s\Y##$}M$Y h^uAOs:+ϑK!V%:vRf\Ku\_J!~?*>M7:%@k \ȿq)d+XV鮭{uwi2l[l_ hfBwp2{4 r(_Lg [1܊Rp.qO c5MA*eX6Y(YdaF%.խqȌPS)ŢTޚt@hpd&/<[!&Sd fD*}"~ЀKʝWDo4ˤ 'ssKGW Jl uF; ]&(3l gʘkTen*+v, v/`gkF㥜ܹ<_1kJjhU(5 Kq@|J[z\ms֖>LM!iB'TFka! XRA!J3ag®Jj]; =w]'>D<@#


wXp,F"@V"\:EpmN+0KfhrF2\%Bvz XO!,97L(w9!HʕZy萢P=o5qqŇк>Nh|e| XrE ܋#[fʠ|01FJĠ*%B>E! &an gFg9Xn5mo.j,xȌHN$'F3o {YDx]BI U3sJ+0шR\͙a H:&KdR`J6 '+Y5qig]YvL~4n`c`)yB*̞Y̆s58)$mgD 4_/# 9!ȥT1qˌRf AejZC2& F EA4iZՠV4g MF3wsvЭ @_Z"I!$ft"taYnX`pUGȾN~X9aa:¸"g\:4 TIzۑ0n&VX~O^4(1Uh(֟E罥B$2L3Sa^, o·1^FɁW% "x=`Du}ec  G=8ɱ *Fc2^oSVEĿw %~ޏn.=^qEJ!+TYsn3znUycz76'b< D?b,ѥXQ6MM&#J]Uxjw)n? hD 8yU8o\WˁV9U/%%Ӡ\jrQT%R!yU_ӵNgJ شY0=ft~hpQ0U.aϻjPNj)P`:\lCCϓiC6ո WxqI먮Csh!pBGV^^?A~BF93.u5l]Zs0l;WZвSZ/z^69~J0wW$!40ǥ=ٹYtY\f=^ ^9Зmf܍5}6Zyp.Vj+ꬁd LӺ5p.*3U{ljl6r92Z/KA1U\J ;wj\`C:I7,e Rk92@^%M Inug.My& vZ췧7͕|R/WyV&e(&\"^$Ay\p)I4DJ;ӎ=<ߚYj/SZ1qH5#F@r`IiV^BvɃp݅>FA|!8.YrxuƐDJ,gJTJzDQxnp_)C 3N&+erI}~8, MfkQ,rT/[Oo.8t뭛 V{qHQqj5 8b+BZ mN\ =:z6i)fW!VнSŞZ`.|(9`Li2w6OtO&ׇ*ifG|FF<7[΁}J1w,퉇0W2Wqn93 `4}Kቇ4ChAxxl`dÄJt}>0ѴӮAeEAal4h+W5.w<0lwq$#R!zPk;Oe}VB{Ϭa򤳵b2o|Z k;<]V#&'Fr:ip$*ܜ2Qg|9$uj:׊C % L2J QhXLW&%c-.FO{1ޑW\܊kK2z9*-HT,03k9iIc,` "Gpќuf`6~.'yfM籺[Xeϣ{V^ =?ke(7{TkCdUo}[}H4ӻ=٨5 Y |EgcH-63|_\.~YoaY Y̬4.hhDM3N9Hu{ \Ho_r7_΅iJԶ7s"7wh@Jt+qF Ո\h.d|d gEr]Q$9BwXD0\i ?\{|*4V;@uQ;Udoqfx"F'pq7Jp%Ñq慥Xg P0㻬_zTV4!r(\XeUefTØFc 3bN=h}vEG1<֨z7拻I7ݍy7sNT5]0w ^umkA1l] ApRrGK+t pls >f7lˁ*SݥgZxV]@EA+mP!$3F3.t9'21iD D$0Y\W/Xj@Kuk ]J9ҥ3T-fv֜@Weu3 UXfuRr%4/b^sV%KyA|HpW2 ֫,揠O7DdaBQ cO<{I">[{'%f&~?~Ѹ:^q0Fhemo} 6eK-TFɊzYKsuԊQc9Ah<䅷&'%>K4Ohc,{7:k "'n5+.%Q-]֥Hi-^`HZcL =B\iYmhm8嚄>2#̤rAxQwebBX J>:} 4_gy1lAm(RH}^_hA"(;~)TDѲ`&}8ihRfi_^ԛ{fK :<(&MC}+m&%ܗw 3S*(6BׇUpmq?N99\fNkxv*ߨ6`)1"|4DX$G  Z1M1R9iﱡuu1Pۿsu _XxπX<ܦA1vxd>EQ?}?IAxvo:0> spۢwD'ThOEvd[Φt Zo}k[[ڙ3!-k'CZ }o4E ƥm +jD`Ō/$*/l ǼI6'Ϭ} L gL * tP#O=lKdъoĦ:';Ϊ8'?jEwq#AbC_%9  XT:g3} =՜l$ڢA#jȪ鮗[ ̀ҝJ͓}8)akŔֳ/͆Hӳ7klQ/5juX/5]n2`pDQ 8`^+I}V:ĺyW7$*eTDEѻDIÚ`5E&A )R H%U@@^7C 6Q 12ɪt3ogkbF}B%+uA|E$NdJ.ZUAd]!{jyY5c꧔IFN*LK6nCͧ]kLZOcIm'zPB@uw;enZ8mSHjro6(,02K]|&K&5)ѢAPʠJ&hP$Ly.o-28J%x{(zrĚљ Hjr9YRs4Ĺ3*괵Ҍc}l L:[_\Vd|yuwuէU܄4->V߹.@j%xҀ!("=>#KqQeCL6Temw#xP͏=JRm*jO0>k3"ֻĹc[1&Z mgA#AqPQk\r"־^:OhGhFhGhG ՠd Æ,8cŘ1FLD%E 34D\"®X öDRD>$M0|hIF*t+qUi^ [3'«F gŇy׸LNaŌ sOZc] JS(6gbL2i݉~,r@Gd|bd=u* 
j0V*v6$Y7**dVBDuJ$SΊ*7zl(9s,&nx>Y7v..mh Ɉ`e0JXGY2|@x/uWh+sV7*ƔE$HJFчSҰ!)i [0&B(.LxfֳAu߁1WÔP< V?7mϘ)z ~eO]rl'MqE SdtbJAL&h[Tʛ~T:nJg$ak>m֟Ղ}0vݔRTi]:bĮ9;ܯmYl =NfsiKJf+uZA*o(8ZPhS0=7#נ'S H^mv0ODa|Ҝkw5MHɑ굲(`%ȰF{3y8o_ 7)ot1VR'(Yϧ|gqճ6i @fk\- AWϨ:մQjxȷm%OADᔫE=%K5!&%( D~AF&4nJ{v55L/JL]g.BAldBC`Yrs49 B?uISﻚX2m_4pv{-_~a|{byѽ@(vqgݛҢ[r>~_⳱^{U/osY7u~]MJ3悋4B\2j|=Cow!?N>{~`4|읾F{yRb~}'pH~N8;+/f5Y~f\>muWU'n(F'JE7ջde +|^ 1Wpubq@tcuw<[;z|/_iWwrcJcX|wxƿ@Ppmn7b<5oToU'* y89'V>>.ϼ?3omJg)v72%s7 n=OX/뚟R%e<<ov(UG<0wc!{hmU__oی7Hzu/+?'`a}P؅k>gn)<"ZP;dv#ې“ Z@{jm]/.4좈!m[rl5(8 AF&3 (*d%d&feg1]ʒ0[s ]g,%4:#gGܞ2h>/vjN:z"t@kjQ"FL-S.̽"ek5ݧ1nϏ+þ~,G<1ŖZu|X-XAK"(-0r!abPm[? \֐cd%XJe(x!Nfk_&tHwl꬙8hc~ӈ p{Ya@=Q(ov_W%VeL!d;&7FdmaMAT6ߍOv2 + X$ IX#+Ex| RFUV$I%|/qOL((Sꊜ֑LQ#T0CHJ4iZ͠VM2dDL44ACQ@AG4~UvtL&v RED3UbV"$,SMRbb!<-ele"D0k- *SI=yBP!L [D] mci&5VAz#SX# ȹX'cPdZm@o颓K4CT9K,M92T3͵AO.?OE&9=S~HG:JAƶq C[gY r#tΏ,^5!jY]u|Vt#u{6,JC_qSu dq 5ꔸH }_ARlJ-;cwNH# $g4N)bxЄ_Uy]'K\ c? A'K{o˗c`wpsMc*o+64S׏I#_0tBe/8[$?Yvi8 %l^ߍ' EZa]UqNQ]j 6P{nokw^B*T*`(TQt*qe,:u~߉t*Sw^*V6,3_X46W(&sUlt~Z(λ v\Ksʕ1J^{=y1'3+A#IлZXdU9KU0O)xb/Qard۲2(:oÚƌL"^ˈA&ZIUeá->`fc-Hj֩V ?bIOgG)aԞK"pJfS&gB p.g;,xL tRy6gu <].m't"i-}^̌oJ]gS43Mn5wRqj=VtRB&^}Mk (3v2Yu,ބ8[+{Lpj>e'+smٛB;EC>M'\EbfQ4BT  JWZJ21WQ,/q$kS|ao8`z=ȊnWWS@[.vQk .7Mp9(?`@]Ƌ٧µߐ3=H~mPNVd^~o1 o|/U+Q ez1v;69Z$fiw mH԰*1b FрG!eLDc?cgiyOjf˧ tD| ns[ksZ7iQ'_[~fAN3zRǹt5q*"(0$ ,TYe@3Jn |%Fu"i42 0&$4 +Ę }Gn"ugN]y ɻE!,ߟ޼䮬^yWڿ>%([Tܸ γWKEh9A<<3+OOf${f٧cl41>fdR5yF)ɷ ר1oH0<{QEkSl>\\;gR$}W? $3 gp7Kb!GjB,0dȂ:@n@=Lmtt)vC/9>ǵmqc{ObJwvI+3Ui-E\tx3mS6Off~d2V *{%[|;0>_[e_SBwKd=iGw|wZ#t6Pq-j/gS_i8VqZZMc|=`hRaJF|jy㧞`֡aҧEt}4wT'WKNP5Nw 5>P8pń[0ࣲ77aM0q>.(hE"aQke>? 
u 0nYH=09$u([kjIϜQֳ肔=% t"?6ƝU\lvnU5ӡ^xoAݷ R' ݉Vs0Mu"ҥ $,ki.$1,S}+JW]k+i-D+]K6P]4sRjbfs2 M,qc4fZx+7~F{\,Ap( EWQ1*d*9*pq**dQn9}yA7pخc5kĀʥDJNcRɜRC`QiƝVc  B' L7 bqcٹw.8h8]Io'Vv!bC@$dqS Rlx `q5WTDCLYB2,2B#J1H}Y,1I螐%8 b^Zyih D~2uj Ts0ȓ35%v!Ҝ+B7@1\d8sW>@?>^ϓg8'Xx(89ynzZ8O$,>V ]~>f/Zx(֕KIaϊ\wSQyq5-}sF"6sL/!;n&DɷEqyjJ*Ê`Dh |!gqr4&EXd,0 '4(1Ը,{=N)wY"1"v FV`[%"a SMBn`peGȔA[  BN[lB\:$UhX.`C[ƋXguR23%&/#C{^ď(bUaSY\ߚLF6b/ڱ`Z]t=l[2L?R_Yx_Ojh KoA s#0jqezŽ2ը'@ ʨ!*!QeAPA oi4Xqj9B"WO>_@q~n5Q`@][X4U3%PR!JʭRS T<NܩG 'ׂS:U sۺO} M*vm݊=ڱd(H!vxLU=(<S#L#) 'u8?b+A^[hݽեѥ[]zI\]pTƒy+0;3l˱d? 5qa jkԷKUnc:Z^ `&Y@}:Ž6F~W=Gw@b _ma59d3/olu0XJM12BD0Y67NSeotMTu ‚q~.*煌JpV ~* _ ,;d/~-˴2|b[!uAw` Ez?=%^S<`s-:RCPBYxG*ǹ`1XMG^5ﱷ miVK`RGSJmΌ6>~s0&YqهlAY |(Qs-S xAf,eOrUg8j P[Bn>'C@I04[YOENIDkg3HNJ<>mn'NԀ~,* #Twm`{P(rtVax Ƈ&Ř; c%=6x8#mɋ@hZyX쭛\ʭp8\Lf"&_E>m!Gr%\Ј ̔ g)0HqةY\E0/{ZVTBYr +-}H(n,iХ X/^-͜e~6jAd6]0BI(}t4su| mTciww6ݭɣ6DZev ƍLYU\2EzxX{o+axǕ߅m&:{myΰ%<\ EY,/^#&i#ƉO;Y6⋂pMA~}PFP Α3** OE QR&*oSl)-+tHYVzYB@ ?ܷdC-a^_>볲7vv~$@őWX&^ߎZ?!O# Qqب^Ѩ%:>j^jN*%ԲmDYq0;m/{] J8ƌYV(+ qT<` qCj/:;/țV,y8 7D}>Qn(O'r{rDD=-QS.]G\9j_wSͱ'G- DKHco/~yEVפ;"K.ˬ܏ VUvBvy_wT_jo,/1[~_] 1T)b+t '/d~rVk/@SV\TknVdԲ+S T,64<.݆l8NgoCZ!)Ǐ SDe߾rm}7?ޏG[L K.#Hcŏf0_1϶MKxP< r\03^rYsU ahP׳WYyW-ʍڂ6ECT|{6HuH1US}j+a~US=,\(wp)B0cF@1gz4‚S8H>Uވ.V,<_Ĝ_L=5W-[ K@="2QO.8NS\U΢PR"TYBqc爻Wr܅b.p'wqXL9CBEX-GR HMhFGV ʑ DI:=cM!B*Caj=jX$"﵌hFs+%3Ά'hB3汖 4^k$aAL" ™vs瞸'n*=L?B({F;:pEP ]73+ϦT6 C} ߿L2 ΋a!yrƸɭ.OTGH?h NmQlǶ+{ `Ǝg:L>-/&A45 eϷb!,lenͼLC40q0@7H H21GQ r$kP0Z $- @ww%P~yi:6 mrM7O;<)n?c<0Ri4>͖lXdlyUɪj-:\fyKr%i>7殗"e ߺճփ9XwFOL2kvwz}FE#n4Zкjww9tf,:b*jm}z+=7cyvC{nwrmh~_aϪ=WYvpOi: f=y.즞n'~tzΈғ b[XۃJpM?Li{!OQ+O[KGZܚnLA{FT:5=`}l  ŕ6+ʈP' iI@^ 9w.dz\UL]OvQoB`8v% ˺# -s8oqs48b@q^u 3IEQF L1s{h"^0RaAhUF& \JB@V1CA*IKp3"اLIY,-,xG0lX0t6ad$>! ߹K[aHݶoRw[ʒGܕ vLja|Fa:OԂaIX$FYǝD)x c,ud O 8#<t~{ѳUs<2riyQ BQ*iĖQPqsܼKa%x/y:< 7Lt? 
K=׃9s,#F"-$"Z,U0FJˀ1ʉ=@=wԋ=ofUhg.CC$PԵe7XdGx !j xPŝ}b?(uؑ l n2L4TQ`7 /P@tyXQΡb"L ^U|X:3#_| g ȕ,'J+߂2: $}W.G^g_k-/[oS+Jm\M2p=y{uHIb ^#4b bAKYfbٳ@)ֳY,NGB`6t륐 YޟS5cg@vhFRuwHCQ+C/>Xn_c2`q=F.䢬)YzQ?|Ge^ߛn6:cU5]|RZS6 7u? _\mс5y[,o;Q%ǎ*&p*Mt4j,š`|.(hE"aQNTtvb (8?9RG>TڃA=5B7Ig(YtAJYR^u"xTɣh㠒G/y8i:ą'/N9хޝbPJ /A^P N@{BU wkGEp)BD#:p9G Iv T_PWT{dQR 7bܢt!Dl139F Az&y1Z}Dnq$' L9l)6~ `ў_SJOJ5@DCLB2,KH=RFe c<8%&3Rt LPK+/$P  S<.XA% a#v!=8qpN||8Knh|`ݩSO|38|_~2:UO|q|m jbB'0.:?17eqgw0'= Gss/aN(Esfͭ$G!X} wȋ:5?zf#'"f3p  g0 ʼn >'*ʱ6 eLkG E- l#0\).)Dcppu)tBQ-Niī}4mgь{ =`}^(U;>.}!)5L牽@Ĉ4QҜ ln Źܫ9 PHZgPuXG+K5 YRYe81h$i`J*M`!`l;A)Ȋ>%x vgWoe_`.PC̗ qR&bw *xHyJ?_m.ĜZs- r6^;9s~𳇟' ?Q88ĩ`Z::nAr{ 4*-xٜUY +jM$2xځ?w}!7YgIi =Q(iDc0=3U]Ԣ#Э4h.;>E=})( c,4H`6@Y*NywS*͓-ƃ6B`J@|ש'҄WU%dDf&ZU K؃ qg}$r6^m)vXq bd/,-o#ɄoiX^׊![/WJ^su PIQjFRoi~Sǻ3k2O/uWa&\{ i3Vr}ۦlޥ]OPBB^,6qZgIcAKY㴭֘VW5m;ͻeNm\~k~'%W*?*G]wWEޝ=ųdz=ْZ~_9f tzy nmT9c-feրnm_m٘h['iRސ֭r|oZ$bݣLJRoAHǯ(++ ϕ+ ϝ+ ,b"w9f#H1&,$t!g-‹t`(#v]n{\盅 7'? \< uc@ydD΢ZҮ#B4|\Q7F]w/ %[4r7Oy>Ao:<~ ޢ7ۑ߼1r6GiѠoOOilh;sqvur:C1yB嚚VOl ɗZT|\4ͷd&&A& >t$gC;,GSq%QQ@$srR覰L9[S$ $hP!E $8T^yE j1ll6'&=)[@}9Y "4*Y JI$Vq>'u% csV}Ȟdp^V14l *~mJdX%vRccGm65̆bVyV{Ҭ&lN.Amآ{ mrlh SjeB6c޻O_^Udes<.z:;^/]jCP DEHoK>#KqQeCL6Tim˱oC%{P!H%kTE>N/Ry1.n5#vN<:%j1jAAf(BP/%#Yg(1`h|!rQ,M[[u 02mvkVeDU[PH$TgG x8Fd ǮQ5jcD|l vY˨( @JlF$^XD  XClG sl_UaHq&Ǜ-81FfR3t u <|4ܕTt ;6QײY}μH8~X#)j-uaal[(m2Pԧ񖑻wEkۗ٬&Ju]* OF.kv^}dzv3/ 0z*44J_a}/uSȱ84qq,̿wڲv|Җ,W䥵*/4)TPpB+*!󼯢MGŽ$_9L5 5Ycڑ¼ Bi)&i=Y9r64Ʉd|^+j]V S~ZCޚ\TcXϗ|fnYpҐ0ָ* A3z, lc&$ޠmiQ8dS8=ȷ3Ȅ8A}Ķ1VF0 A +<0/3xP)=Bې,W0Hc\ vCd;u‚@w]i"Icݏ]?~Zn>butvٽ@(ǿ~vir蠾W?tW|ݫ;zՏH^^]˿ح6<.gƄ oAy*ՏO Njƨxq9]m^B,9H|ÆixotKxG@FwΪnZr^,8XqHx;'ZխN);*W4I @'J̻i'2|}4\e+童d^S-Mb!℁Ɍc:(~:]zՋ6D(8ʹ݋㧦[d!۔9:`vEj]l8O\;LH߸𜮋k̓~9wZ3}2 F!M'})ڊx_/߰P>"ڶX4*Pq"`a7 (]x‡DIDWj'pyԁk;`#ރ1Aeo*s,dKZbAŪae;#>׾C{lmC[uq6jAφ|30;F:RKUrN I&kdDL4n57n {i,F]sR.H(38~փA!)ƥvKhzHqe' KIlAKYZHYZ0F%% і27wDILz 6`( 1R(غa[^9ԥoY\qd [}1u0NVykK%6a颓K4:>p\cx.蜌%&eb4E~Ue*,=4dIF~*j5¿6𦼬Lj}*:fzGW@wgƅZc/|anPV{w|9,-T1x)'ĞQCFFr gzjs]V>j1&E(xr95e ȄD2BdFC.dϦoj!) 
Y1iɯb>YI%&Na:0nj8O/[բ.̥i.c>g,]f5[a'fYbur #[#T55ژ Qr `53sm "qT1)lL4occhYlr`5h%hЖBe0ue"l"\pѢsZCɒږ`1+=SR:FM濱ufyz@rZx[ Bg2(4 S:)M JlxBƨ(m|@˘BVߘM؊*8šFm]ѝƃG?;N؞$ IX#+Ex| RFlUV$5JZ/xι URB4%fPVǂ9;Y<+pP'*}0yQ$KQxXjE%t &AZ/_/ ۿ.%JAgJ{8_i[ ,fƻc01`yJnM,2giD,**3e9cJVPU-X Z0$8TxhCɠV8e<Db "d~Uz:IL M'E[(RVxZN$U<{*>SkyߥLB,LFz(::cLɄM!T"Pj0ν6vyu&zTo2yho{-HiMdTM-!a5u:-tӅBCZt|UA/WTXK:?73nuldf{fZľل~}Cf:N2?ώyhڤ:Z!'M.;{{<8dk$ F47w>Zwq1X愆etYnYtR_1g}no6`؇JkC- Y1BOC6oA 75/׋n^ξ_4Հ/f듿|b{Dm _Ӣ:fzҘ8ӒX'lgQYw/?킾ѿ>{'owPAnXr7,I^>o'( ;uMж{ۄ2DtFC>mE~Q[]]k Bc³ws`AAj,;Xnm@~6։J:KuyWZEBX g'3{v5uOx V=[,}+ng:xITl)5Oh*dosz?`XFi:üWwokJydNG39rX~?ɱ%JZʪt9En2k'E&m25yK؍}by{{Bq.<^m^p&9Փ,BD]=6^䘬VLv$Z0H=뇚}T9e#CӢ%:5HU >\UZ|Ғˢ=\?6*R\?_nok51HqSZ[KCU![%kdDiL QR*S)Da2j-6# @1_Rw+Tm $e7M#b;m&Tg̘Ʈf`nf :#M1E{) +P9X'&lQCVGnYV1+!d҆IVh4IUxP0wL2@o1rihDթ|/d=;[_:GBF?„[F] I}:]Ү޲Vb5'=u,s.lbqMOԜwϛ&H"hj*uoD $Y1ۢDwIGoamطJ0e>,QڎM5j14@kV+V &ƬW|Vh4vUK5'Eէ@ (хk§tp}E DnJ3HTG.IGV JTIPk 8DxfPy 2Q[ݫ N6` Jbu|ԁn#Xe~R7@&~( sUJ\, ^4L0(ƍ8uT&t/Yi{ h*3[!(Q.(ʨqƂtdH^Q@$6$( ^i;+Qa4uYaFMJhӹX8(VS;2ZcxlukwᗢHO̺01ƄULl^v')A"/ڄ{aV9 Sdu]Z|ou/ ئHFl `9:p< (}rT2Mt%CHh|pL Ρ&qc2 ЃVsF>@$5T^Ue^i9U8ni=i /!!dS3d:B[q;*p d,XN5~T}%y E*ΊfG6Z/kI|ǍU6dpC*K%".Czv@ @ R{pUb3h NmBvogcyAP @IG₞k?n,L$&|X0&a!dEx-Q6 ?Ft26Љ6k Mtґp64IxFDj=o- Rti[ y/GѠMJw av@i/^q5*a* bV-6mz̃vOwv%.y}ݞ:^҆ms1d5jd0u1ĴV$FOB=V`&;-,fwm5EZSR@ΓDak7vS^&l؈-''  A x آsE5* t#"B-| ^Or%5p AP`AH &ШV De7kzD-ecW,BvVEKH\k<)@FA|?C bZ0p0) p#X("#ÎW=U4v{ أt%h#*`hAgp `NmS; "7 3 ǚꑂ=ɗ Uw&dd, D ᪥hsOH:'/GN? 
vG&zfH֫YcTqڂi#bǍ>N` HFWP ̀zreR^H O30%7aFF,Sat9 LPrm⊙4 V̯^pH`"mM곩\4rUfx\"r.YxY 4iU.dS((`jI*\ m\z!wrt o0@57^ݢ/y8xz<Ssz`;M9q=v}9},5/A'׹ˀ"ҊnT_;yЛMڶUD|J @\)>'%\|@=%+<]kTE rJ V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%Wr8<#% 0z6J 2כZ"+b%WrJJ V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%W 9)`G sx6J XԓWJJR @b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X JW d9)Z?%5(`3O^ +Y u* ڰ@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X (ZݾɆW?ώ~\P)^7 ]Zu,/X ܙ{|K)j,QАyD^HlYWQ{53~ו}R/b^cs'kQI%j \7zn~n./NNn.UZԼ}%<{1ˤĹ\AY-o3#:_batRVb;<דvx)ZeZ^6.Ji}Eo/8jyڶ~4_&σ?[qopc^' knVR ج>. f ڤmk&mCuM( #5s27J9ugt ?m^FH!7#.MSts>ҢY=3jeۛyMqZ]I-~Z?1Nayz}`<\QR0啑~z~r=9=}uF%?oNzS!DϢWEs=Ѓ7 ,kJڥbS"7RV;sK}#Iwx -\`.u/#}AXwp[iDJ}?ױ>ӗߪd<*iǰH|Z]ꫫ-uT9zYt1X-u5-igU_Q941VU&x>VzIRbh '`E>xǮ [umk|3V'Mˢ`+U[0YρcaxB|: ~Rǥ wUnN Pe?1s!}}QmnWy)Q̉;#x,FȃlgQB9T5#goѠptKSؖͽQ$91€Bw ݱD\i?B_GM3oopQɲ.??lZpϑ|fUjоim^YHXj 믺iÅw!50ƨU\Gu '3Bb/6e nnA^^^0`J5(o:oG&T(x" )le1ō6leI1yvAxq~}~.iҲ3eURm4Lm2ZN2pzM{…6PVbF%NM_/\ԩSiy3{"V7؂@?lNgաkQ=EZٕ[lH}wf UrjP:Y֢-?:OOa4#cmpUj O̯]Pְz `"n}3?ti8Y)pF:_n|e_p<'p~d[~<B%!TG<1}oXDYA">|zD|5@IO*vc%(SGq/]*PQhNU V۠BL+ RXKZ`H<)7E/X@kqk '.t=adGܝ [r Tỳ׫g7X1t*p۫4)P7[`x 7ΊbEٻ|sɻv`n>]ÿ{Xoh%uu[іyv7 6[aAm z:ԎmBQmz!He-RdLuC䀩 LIX/KrI -IK p&m1ΙJ9Ɨ$TIn<g=guRpލgM]PL_n? &;`@={nQ>g 䂐uh_n*uXՂ*g#tdQp,yh7( 3Jc{ϟt0*^b0(CdB[J5cն"]Ͱq5xAd@SAVvPMv|' TGpudfA*оBZz ɼmƨmEՈ_5Á 8Qr'|hϊ:&P4r7˭.kyރdv'SݝGMgTYGZ*Mj}h2*cg@U6JUE˃w׺} Wt:2򕇽:Irwku`sH{7roSX}9s(k}g`MFنV؆JP?.,tUa۷YU4zx^O۪yӖg/=̧=w;}>!數fTt}>g{<%Wv-zG~;uog*ӢdVo[P4m/i5mۘ~>nn?MQ{FJT:~=`}}g6Ϡg @\$fƺ FxPs%Y"Js*j]/dz8SMMI8SXףUaCX=Ζm}n!{GԸW΢#/,/9R 70o3B{m!) 
l0{=$laH*B5G# -7d<01cR[6)Y PZj B;@Yh«)$i31P-2W!FS&FL" F989SS'OR0߷.Coy1L:@O3ݪ< @`P]Q(>El^S|>^3˙9 HaeRRm%K1i(fT:Ѥ1xUXO'8ϗyt3fWgVbޖX`RT󺨕טTYȁy0F{ǘ1qƐDJ,gJTJz`DQxn7X,M=ꙛ7-.ݳ3%||MgnRJu A8*2p͆i-j2iTQO:vtlx43b#PSjcȵH7'` W5*HJ,/ƣQ ń!&Jh5,3CR ^NE'fɿ<Ϟl z 0Nuic#094s%QX) EpGCZW%6j`z0=ߖ'78*Ѷ?S i=“UtqGP€ a/^(4^ =۴,?DȈ=I9pV\r L=l>@26 $AvC;.k<f 67iҊklHke5-øy)yĊZ:X1c8@,K<+I ;@'@xaǼIܳcK@IPYp&8ˤOA@rI0A11pI]ޓ\J8.pR$YJ6 mJ Sk ysZV֜gw vʩK{;ޜ%D@i~Ⱦ7 {'>݅>FX>bVY4櫚ךa*RQ|?n~ .;2Y"qlݢNt,oE8`2-Iͼp`QϊWyV{{ yO@ٿMX+z;f`eK&2hrz㤌9Yy-YtHjDC~-=fwO W'_z| 2Y:c@/DI3l1?}̼SFRH>RD/JG/eH)bqO MA$U&*@ 6ihR=O~iLʃ=Ȟa(M?nqU lYM(CWx('Pwq4ŷk,:0r-p1XɢVgY٫ SɲFQnF=ŧfƿ[ X;)}7~{]uSNgb:okg|7q݋-y{vF?SحTuxk~ ^Y~|tuy ñ?n;(lVFnx+}I_GF?V7Txݠr?cT,88J)vH_}1#Egj?UQCx+tŢ~#[GYFPDx}\%,gڪqVQ߳*srq adћ>[Aic6aWwÃm7Ϩ3ƋӼBv[# *Rq\J#\,}<Պu+]clRǡB`Z U{]*)2Yޮ1״2'}LGwtrt0_ -v7_XcD{uy՟8L<&):ۂX pcG6 R:U郉|{+ܬ?np@+ V肎*@VHB"GSh,W4- D/1'1HcH*pOԵv45͚`!9V);>]RCH)S@X*d9_)9Ķ"T:*(c(!-S&TI&2+m5.;jt<$"3]]ڬ.G3;yu9ΖˡDUnYD場.x)-)8iM{ i|s[lwPQXUcd!AMp`Lԥ΋kmAgK"Yr˛FF rQEZ*= *VEU~Π(5X8H͚sdlUUoeB6Z9Gx˰j0xfr2?d#vaRedgg"TpeoTŪR|҈dfTt|X[U?>Pf%tJ6ə _yWl]!m)q`CAu mlc`aPfIX 9FT^* \lI5yl-RӬ9Áԏ50JDlfFD8")"NNfY WQ."-gdb'Jy9d(Jd4 O p07&k y+ )5p{I_3iebzif!GW9..sXg3+7.qLqqw&!N IfY)ig>ļIPPl,RPq`j}k\|!lSbs-.Kz?>i8QdOPگ.OVeѸ eAp7>>R^DJs3;>R;|ǃS=V;|T&%K V3ˆIKZTXJx+(߶T+KSa[Ҧ$0-2*m3 UC.X%(;]ZV͹@!%5) |>bk)8y~.߆5O0ܜO ЫGSsUj1^Bxz:p_(ބ3&T@VUKo,qoP x#YcYx%!Yn<, { g[Y#qh!*Ŭ Vg :k0*#zi*MUx^ߝ W,um<,sي`TZ8$BRޠ7Ir_ꬦsW%M86)ڰA]kCaߙk%Hإ(\K!uo^3-S2Px]wec)MfB%dkzbU;mX>S&-LTݬl@kwxvWnۓ,-7+c\ͮܫ,<}ē'x۫=܋miKk$vXixjb[qu5q(}ZQ䠜ӹM,F/}\!!v|{$y|d5v(LK v0H(D{ AAnT7&=2J &+"|OuN]G=彵}ᰐ<îp(N{ۨ P*C`dH SI( _EJALMQ/>{c$:Jt@ZS1LdDki7^;۬9;b-S>2/R4)Ioa J<ġ2P;_U 3l)iL8o7r L ֺlNԁ@)-zf$_O~2 vW(ɬdvIeT|)U 9*dr`@ϐϩ=6Q][k xP@]JjׇK;?<|_ >3v:jyXVLC7ng9~޼g]$,uoh;쌶b|g6p{Ӳ9t3R/~3 [i1iy})D=bvJ=+M"H=A?񯸱?|vcҹE?k2'|Al[;9?֐zѯ}j(fi>ޭhyԀ2TnH;ˋb^߿[ c^pcx{sXkS&QGl>g8ͻu/u~I3i^k{1ȥXrRՏm >6ѓm>6wR{ rN|PE7x~ @ d#gj{ȑ_idmŷbrݻvo9܇`P$/Xv2+ʒl'ȴ `DnwU! 
o*[zlی*ɥ(ZTHQk-qi”۟gAF'ۿO&w-J'3##8>/=nwxEp!\,EfyL1u1OKk,J3aEBrS&k0hfWŴ]z-~CV(ϐ/HcQ]p_\]CO*BVϫ{ڐ\8uj:0lqqv6柎o;Sj,1\ۋ~D6#zh{oaJmi$/*}j}O4Ֆ7 [H&nEYhR*MڊDkBPB%|0ٻ 0dI%SJQ[*E$ЦM5Q4ԊµZ7fxҍ.{zl3mc%V jǘ2e ?@%>\ 9{R%,G7 TI{3*Sa3$F8hO$s^⬑5dq39k͂'wlt=:)Pp_RŇ*W\G÷z篙ȯf4ƛV&Ӆjn}W|{0X!B "&C=ZL1MqCUTg(ԔT[l(0pnC1AsZ-.^{CZ;KUToM=c?vӌM}!N|¥D5dYۃy]r?`à  YJ$[3 ƦUUA՗`!:+ ;@N.걍khd (hnM[v [gd+bf*챻sNrbnawRll JG L>2*j9b%o `r¶3 CmEgA|MQaz R]QYH5]8_4mqkJҏM=#<Ͳwq[t1 gDO1IЀNQv]=bDV1MX}ЦL*)^Ø0iB1VtY# P'qn\۴MK6_;76ɪ@ͺCٸu!MJ!k$0:bD\v~>nڱ?%-#8W0=79r[ E3k]#nvS[,ψ~Nu~M%2]< &ڟM1YXzT5D0fhC c₣V1mfxO~;-w[+dMN ZY bRձQ VpD:h vm+0 lKEF!U'0PLŒu_zŭPoK5 !m jta7 v|KI~VKmlf}yb@-FVWy fW^|w"i%Tx 6ཱུy| O{25Ame):٠:72APL6YOo P@6h} :9յQ`TL@5TS;kBuF@,$SCur5Sah,,0aEyDU@8k&$[hVQA2/C.毫յlp9tq梄FeH(ULI+%@Lj w%S#A\m.EZG0Đ xGVT.e Pjوs/QٴQ7`:.az*EգwߤuY\OMj}W9yxˎ~!FJ73S]ՎS]ڂvTF敤.`M9;!9<~ޔHƋA/Ym0zEjvݕ^D1%䩵W ʍ8\n }b/%l.Irie- ϠRPPM,ɾ K=ޓ|td5vQ/Car:S/";t..uK 3ZM QtcAbMiӹ[{=1V8z:+X@B+ R!(:鵴 $à)v]hϪBV |T< 1٨sѪfDqzY^gRhqҝڻ(5&'0RVUJC:jSؙ%&ꔪ'6P`aPv5OaOy3ĂDWȿ4ppmR<<}Ue_=aS:/0m Vƅ__2ǃni;så;>Gz/O$OO_,ņ _n:?Zt7t,Wx qz#/l6 oO'鋻_|4;=o9όàz8l)rˠ_ ?aqnx)Hy:VG@>Qf_щQaReD&0}fwYneD p =9x$@|zv8lQ5oOoks~nw"\|1aqw4чa9Gv xMMvO8;qJG)Iga*gJN $߰!xNRA 6;[TyMqb>iE4Kβ-oX={\mB]HK {!._JN&(Frx, Ed2]i:;Ny;mٴ]6{B\Y{ь ޔr_?rf12/?y@K <7~(sʣĂ% vֽv>N_MNףU˛B\u>k35T[GS^h=B >8ِm%AT, UUQT0rz1E M#mUmyV\$k0;,6dʤנּ5 zu'4T,V!-..4j(LYnZNBx*T0T[Mn5rc)lGncN [F|K-աUQ]tBmuɭ_fv6E6v"tYogkwGu/P Uhݰ]۰Bv-Dw߰vP=:;m:kgmF7G.;bs-خБ9ߴ^/_pI_XpEW";~M["{7͡G^@9PUnL4&oEזaGpvs !W[2&6N M9r($v̑Om-oL YT0J&q rxŔljEP(a8gP,7.%46ʅu}ssҼCM}8pT˸hCG_5R#ҬtJHYe::WnVUDn;~\oog]@kT^jǂW)R$HiQɌܺ`s2G@!*2&gAX9ox"lgYfv7`f3EIQ7 FK͡dŜ䈨x&z-cԔ 9M'';i;5[+BZ"gm%b&9aHWPr)XI; jO >8:X[Bx~ȕ!D(P Z#TО1hƝ8|zv5{FMA9nxߎי97ZjuJY,ƲClyRZP#oLם[Nxlh2m1n*ag[^]<DE(蔭2P JޒE."-@`[̣y,ز6XG7 6xTg6f:\j[.g/hQ(] 6 x?AU':XSkI0[}dS.|Q'{ HX81S*) VۊQ ]t$~9 CDD1u~UtM#/cGmkI* +Q@< p,2r QU'I)ͥA|׺GX .g{۸Wi_3n݇ƜAv X`N1E2lʶU}DZmwY"]UzuD@zc X }Ģv2é!D)qZR!V1X P}"Zdut<y:#33z25 9dmdD QKSJQw #xǃaTGcW l3Hwg+,3Z=[| ,(YVIUBQ.q fX)8Fi)` s2Y&B*s}t,IMG)/[D: 
7r}_Mo&VXu=5C+4xP2MC L58)o"|{[ofo~|L?M 6*򯿪bqUW0my@YQDԯэtT41Ӫ먦:DV9__&^E.zT2'a%ZJ- d>u8`/Sm=X_.uJ_rI -IJS Q43`Kš}FD}k{2i;ϓ xXey vQ~zUMa^^SU8W}ȗ#ui5ʔƨ\rQ{"AOIrzLr3gށc?&YsJ0ưr@pc(X{2six %qM;Sr))!jS^P#[*D,%u;f`C{Nc++DVԚFk1W ?h'I  %%Dn({JqUU!gdG >9/~at6b4)pj:_<_mp:śb)~^..`Y,BHWz+CfJ jqcO6)*Db"2 8SU`."|PWe|Pov|>fzݴGpԢ`XE `|#tPCy7Q7zl -BUO)%\3">)jWW)j۪f:jvzhb~SLFjwnmԪUǃ|fKvj<Hn»dY{vF[w;VMtkrP5MF2jl۳#CZezwNms5e -Ý#=}Ow;VsCϫp֛>q#nWy>#ڬwx:.f7[jWm^2oW~Vd-˥ƻ ríaK+Ya,-mZѾyo?NM~^dpgxREE*ԣS ypuľ;1r,gyPa Az6oh6WsM_X#k`Yڭ5yMF8k?g=~hUӢYz>:e2-E#FICOJڪUnXЧ%}3:UB?G“&#sOfwEAZ??'ќt>h7y.k3r Y6sK*z.OY%ٰO}tE`ˆTzeN0o7J^믿VbˁzY N̥Ԋ ͻwKa)Y5xx %KbNi pl.i8*eY(H~q3x<*%"eNeQLU dLr'j"Q2=ߍ6=cޅ$Z(QIy`r>#vq4>N괃(*Qަ/SEBe~8REW'0oB$<6|X6xu@/#,[m1|K!JmrڤTvgy2`Bi5p e TMZdBtEђ F9")НS)R .Co٣H_C]Gqgҏ pZ<}Dx>`Ǥ&gOQ|f/޵9 I0M9γFkNcbS;7x*ɃS?7EWnZ7\oLu4_~MbYO ^)Ӥ9_]-DTS5׶yrg˫k29/1m\= {5sƐi7m`'F2UuR47LG7NY1}G O?/)%F{޿֘a i:kN- bYn~ܹܭfOҴUzbdAHWH't< 7kx^kgkY\w1nEIn2~\:.FQo˶#iA%R4Kځt3SpU*F:rZLg3܍klrj o wr n %‡rs!FםmU'kty6/>OsW]0 #TWlDnX^YV<&]>ZUte=g?L0>~mCj#Ȼ- Wy Az+昴j͞]>;]Ds%]Nrn vQAh0B)7eh]wP( 1!Y21$`Ű^)^,bxY70jܒ∨瘣;41rZ6L6D6V`TE$HZUãc>Ppp/$5 x'u ̧dX^)>/~Ќ3ThKo*sY.&hYMnv²SYaξBu˟j ŭRͣZs"c~Mb!ԹW(ب5 Y |$wx}S"17G}]crQJ{^t~][{ &dz4Yiv8xEOZSNܻ]nm=?+~tޚ΅iJ}O}fo*'Ԁn6.( WUʏoNȔ4eY(Bz!ɻDRM2\iW o`ݷ4MoiKԡb7̍~^Gf1`Ѫ=Mmu'zjJm:.6qEC@hM 5]w3E76䕛ί K#r\[BwY&dOQh5DwLqt ,)DgǙbɯ@O^w⻬cG~/e*+Sa. @Y]10x̩S~3qڸx_jT=][7Í+ݘ Zड़?Ŵ tMJ-c42ps..п %9 M\u!XTYGb/K/@@f%Jbi4 .iZHFTk1UNe9T,$Ӷ?ֻW$4W}MU(_VML q4̒i Uq(ͬե6oƓ-WW`lO*;JfG- Rm ڰV[~b1z0ܼ^ύdRLc6{X ˀX(P,iTd(e:P.ز.괧YoN LY%d!,2L&Le29AyI`r߈YOZd]"+xjOވ7Н͋`>{>m> ёZ`>Mb.mޢ>NOv8=}rSȫR٘;+&u@Lo M'@QQ=#jv̑#I5V7JV=|͒qNX,ӱ Y{3]w: فiQG#s`fQM. 
Fl2*2AOq52cSʌq_ĥ`k1|kyA.7tV0Mr*ϲn($}^__jgƶ֩$o꺡yvvj_Rld̕K8uKs,8SZqr #Jp2Dpc0.QeOp\`V,C:Ny㒀D2<|x=h eɱ%)H(c-adgJL[ 3҂HR!rYaztS܈ 964$Kfㄲb*싡,C\qx ǥ0ZRꤲI_X\N㏊u L{NpJ|1L4ˆ7d^/i}\Lo3xQU SuR.~}xͩ3kq99?[0Wq̾ eWק n Cߐ)=h7wtWql:41k?׎mKGiZp@ۧӿaHHNg~uOnz''kA4m@oR)mS FjV V(M4:q* MT\vW(|8]TF$1[MYqO9-6 OUZys)开MY3S韰Ntj0n?TOΆ/ʹ\=峳n]nd |^#5L,1xۜW>8|Oj'{FfGsL ӓpm#ogފR:y-Okp?wfJ^=uבc֖TXFl3C :Y^aJ*w% =N-al`: G#}诓u 6DN`zXf74dQyq<ӂiiC.Xpʥ\Ӣ| 'Ի)b-f#~GO:*+P $Vijvj6oeٷȘ=4/wi+Ӿ[UYuwA4BA(H^ , q>1KV^Ɨ̳vM6uzM&jr2QUA3(#ݿla i>7reeZEu];oѣ7Ǜ9̓Z=.Sdq^b3V, i*e7_j=sض<;3h1>L[g;WoW9`|_=AC>M*:bFϾƼNA#%PwOW"P&@f)vhwlfd%~#%͆7UD{L-Ƃ7F:jR`,1Ѯ7l{tTV@ڮ6$jӛ8\m+ɕb]s=IuLZ\Z+I:u\u:9s!`w89~딒RiO+ujُީ"hSaNIj#I_׍8AŰ>TNrt w0̤< &۾{-< )`Kp\\c֍Jѫb:DTa1)\ 3Na'B!vٵ/(Gx~i%y庚2LVn[(d~kj\05 6mkL4:r;X.[cGɦp6~6$NB}곸x\"{ѓaƄ?R4V?O 67a:JE,Υed4GXS=G !7*wZ;H2vW, _]^ʍ_|~o"N8*[_@"ň $hw2$`&]guq ۵SJdo^h~1pκ DK`F=r{E. Vέ#ee./1*Tx?\'(&=27G0+ )B . ct]]M~ iε>Ci["5eaʺZsQr3TT 2M H?O*A9ۨط:4CӂnJ~{7;|gr)6p y_z ^Go~<;Of;3•d44:J <¹9H*R1NdYf*-DH"+RY(B<ܮ]VJH(zҬFxnAXlDM涖`ۥa٢;~Ҭo`_>jyd|2͎L}<[ڙXBBI…և$U4y'lD|AbL2(Z%A(\vdYѓ om8(<|I$t*Te`%L(e:'9 yꄲ!7VIU2:b!be,͵]PBI@]0M%|1!{cBӡ6ĊA`n k:zr0}1%ntdHayZ6Y)U.VE s&4ynj]5V}7㻣]禽&w(GtE(I@3"M{CIudg4r󔶐Sd[%TJ8IpQJBE.xRRq$L`F%3QDCZ-hrv`=tإ`%dd/Y̧dx=>(KƓY]0?ZuYlC>~8[2I8)NIp^Ʀ1/A(Oު^OH[s4Knr u-G]U#>/p^鵩$ a;Nc`{i.9,I(os Moԅ7I0/dgrTN^2rfX_z/ʒGC d@zY$l?@Ϊ`Kv mήF/R7TY|bcR5$$BЦ;R&{r3MАZinNBњ '3֊iwQi\rhɓT';P2Xdh#-{R}_Gmm{8!ӡ /WùQz3mf3!\6,z;{DUENdQcn+m_kWj[G?RC\;'[&,M)g@Hlm8!{ }(R<okm1ԟ ' {K2aÛrA$dMo"rP6\'T*褽ˠ62LF>Z&N K2,Yq{lbܒ/M[b/ &XpJ#KvS%z}( %4e(2{Db)XoZoؠ48WhщCK0 +_̞nVIQ8fp@BJJz᤽Q6F:lM{4 {/M7B-㘰 V\y{[F(`zxq^Μ (K틁$?>inx ^vB|xbڄuP=rulpҮ ^.z,'sKFR3E*4c9Re8Z(IX_!I$Ld^z&8gJ$e'U VwWMCZƬ* # >;DsQ1)}h[Fh+а ŎByֵ %9So8 ƅ/eHJ.p^!T=Tvt:D{i:͛(d =6nR 5̤ie- /I/!%nrឳ!Nw=:&ܛ'=4@4LFp{d":8z0 ~dcZZqOj6yyATMzhvv#xץC|MBC~S ${9s>bC6YMqr<Z &Y«Jn] DR]y2-hK4-9-X^" `\6~Kb7;wpFt!x<Of4 ]4eX:rhm$$&-C+4s^N'պ19.zQroEM55vX{N,L@і9j(:R5!閖y%(zs(uq7a_2{@%!LjggiخlNS;7{%:mn4ЖAu קlMA}Y'Rڛ$:mIJLJIl N"Bnp/M+' QApV'd9 G2n2 
9Vd%fZkVZke26;oﶧQU=UľwͻX7u5yV]-jJ.Iq^AXM=מq7*:.*5+<7))2;)2:sl(nPqZߞhgZw#V[-LۨQ1mbFMl3AdkL6d|_[Q$s L aK6,CLiߺEPjᏆt9oq{RQXнmt u9o͞ܪZWSHP2$~}p2XiEۺ>=GcZժ4$|v!2 Zv n'/J'>Mnf+4Js)IK ᧤(OU*\ U׊J-}~o^Q ή7` E[Hvh-+E0l[+6CN1V:hrD۝C3amymKkObPZ` ,Y4 n\EL;tO9yΫ|vC.wk+jY91ېn?7eUlɭ !{Eq rXM'قnroo;+P#3V7' mTPAnAJ!J`++ @kU<'ٻ6#Wx>} ޗ`I@h7 Q PclFWe~YYYYy|wS@b̰hJ4%_Z$.svwRϕTv@|TDo{ T?dt4 p* 8hZᄙ*cpjN8L 8-!ӣHފ?LwEy-?vZtMid:M:I.|E=LwΛT[ئ .f%OhZq%!cIsXPF$G=j܀ Mmj~I9 U5dh^+hݡڔG:;9Gyyj$R'כt_Kۓ@pJS{D!Sk.6,C:Ǭu$L *Xkiv\1S3n}8XQHoFH c,1ޑU&4t)%b{p#sM5͈!Drf2( &q`)̢YYT4p[|efvNaTxSy #¦{_Z J@g ~u:߼VnDmZ@>FlC4p }+6$kubH͊/ w W/#?b(vl#X)GЈ;lXrtBv" sm(VFMJeh<Xx1^W(椝Es'U#JwһĹ^\AIUl_fY+qM.I _,a)%HD;p^ #ȂL*+]tRDlzMY7~c<"5!Ս+%uJeJ/%`z8tI݀".*$v.DF\XFF;Ƹ׻̮ݜ{H%&G=m(jԁ'}[uNRv ( B^F8ͫTQEu-}:Ow3N/R{Er_$Er_]*O2`#3 Q, A& @^is^fWy*S{˼BK2cyYIIFvdh h"`y(=,Ȅ9oЈ0&0C1['>SKk*4{ M@h\>3oVOWO%KPhPP0t Ayu_)bs% z:(ÔXKcc\}r|Sr^GR z#6|w?dԊ^%Gx́>L.'w:`,CIzԩe},ݽ '.|0Eky2`eghw%YjN6LmwOk>pUDyNG5N/{Q_p "M+fr:<Ӈptk45b[a=ufsDZ/!K%+]FNIuЫӲ!s272=ƂkI^K3z5}5QX^hi-O\7oKLӏm&0(8|5U74w!FԕGji/󦟣'Z E1ؘN:BN3+9_4AG/{(K$/Uޅ̾O9GW֫kmʊ8E@67 X=qʯB A}L޿MSgÝD)\- ׺gdz8}a<#gnV$JR k͢|6wzvܜ@H #zcxE؜#a %VKhlE挻B `ԷGd% E  k,9#+&`w\sWb|w{͵rO`l{ IVOrUIŔ{1*A1R{CYM+BՏӺ=8)Ջ1  Nf `nMJhWT  3Np05şWݯx"\j6`AA/5)Qm CmNW:\pkS*PH\,q[ӸR3"nt«-Ral"#iB ٜ;gQu6oRݍ{H|" E]u7Vʤ6b,2F(1k c8wNљ(|C>2Fp6&$hϚm.m~1nKĸͬëi|K9#k,5gχ)a9d|X^)Qy] f2-#&q-㡚ѻ!=[M= wi;p.'KөiR߇tq>L3ws[G>c[~fEuR0k0b+7`# z "9XR)dKq,a:|<*<vsbrbۈ9d O A h7i i쟟b9mWT_4dy_2u@S(Hqqb6q5ukbW}NgJ4>+9+s{FGr;x$ (ց81|ߣ1Ck?gǨ`׵s+͓WjP)?ܕϟ;,1ƫX05؏77Z_}p2cӬZ~mo j 5#ꪳj Bj>t2zBP.OP/J";dt$awtVLspzr-CwZSZ-v`ϗmtrsuXT=Rы'k"8?+ʬҊ A"Je"E$uG Kb&FiDN Q*E6*CF/ 2 ǥQZ:PQDi2)B.\-zi 1-z)NfXzG|ifT쬥٪PY יcR^Yoopl-4 5FI?Or0=S ] Z}t8qI凊s7+-b)P&Ij^{EWyֻDwz9 79F2fY/yωf*ᫍ'OmSYrVZP4CqY-(,-(Oi҂(Q(eޣ 4)8ߘ9Z҈s˥;&(-<2 \7'Вn 6b!`L).d=.?tn/Jݔ?t>ş&Q>n~;*?R~.77zy鲯tǮWoMWf8K)az`{k2ʏKy,s5(w-I+"M' v ڭ9S:Fv80[yLֆr-)~FUڭ9S:FvS`*yoڭ\ֆrݐ))Sx,YrCm/dJω$No$鹻n?Ve钯aY|<}6Ğ>K;؛g%`: qM^AXF 'æpN0Hž18^&x4fӏL2qLG2ɵRpjRgS= I5(FM.&;丅˭'5,< Le3 9/tE+r/RV /VYIovbj:#9ͽ 
h]sW8iiux<3iL/i4ՀxV;$/L;Z ($%#i v]pQ44{Thc,)aV=Qonx0Uj)!)b8mu/̷"j}t+0k~)<Ժ C#ZnsPHOb,t<< JS3mqH ty'_݌L[+-5zO*ӻ27HʕdԷl.s!w3`1S{M|<渟|OArOL;o`5P9E)L-b31mkKKuNgU3s8*^/}xF2[sOHty]bP ]]y}&z-Vt3ƻ>rW٫j?W?7Y.w mu٫ww$}$K ˓oDլ&,ZwH3 +DQUhUI&ɥd c \PeAqI.^fI:gzʀw}d$4wi Z{EPՃcxcXK<=[%d33333O2&pꔗBQT "\PI(m)(EHA| YہZ]BvIBYJu~}9 AUN-i=rJSj21_Hv˄eAǡ*&x5cDᕥD,eEuR6Dg#S>[MFz6aIBHxƥޥ q$dBnaHGQ5.q"BP+_wR+hOv6&o6{Zx7;W%`qa}mD8st$f4Bʗ:lG95BƖ(YG>ǯ~ .zeNol twcr9DQUsgS;dQ |dZEq SD}Mj9mE*8B!L>[VnR sS(XN%94w Jȷ Bl0J5vP`Kp>Qf@q@r1{r/QU=ڴa^$Tc;NsgtVW@L7k+޳cJ$iHm-})(CΥDBЁ9 5F*/N2/4y>H?{Fkހ5o%3LXȉsz9Y5ABo',T꬞60P#Bdi'T!<̎4 }"dc% |hkbQ,;Ӑ?Ap߮@ Gt+VaC)zDQ t^xS":IVKDÜJv Q4f^3㌲rZ]y"n:M(@#h^nA/wUYI2]Ծ]m,Y˷/m;V8i瘟VT?)9y( tSR N-Ǫr RL#k%TΕcB1}mOǪ!#7;:s<!kuv F2XylFloh.f|-:P^"eR!)P8F}ESVlxyTT&˦9!͉*XSv@O.B)PGyƖMHOe?%̐)3I/wq\:#,v#;)qRPl"P󲃀T NZH) v7ųp QyɔsYrI*,kfu8B)Ft|MaWX7Uw(5Tbsou|LjMÏRɆ?Q2 $q22KŔ̛Ǽah7TRJ @OS-B(أFx8 p'-c.XOp4)-]sS3TP9d_C Աvl$ڿ8i3x,0JOl㔬 c))38h4 E %7&E#'q_b)Z zk@ `PEKHhi.DmK[L|~?lGJxgçz urH"d|R*P뜑aj@uNґȲ')yv 7^k~gG7b jΝ4nO8_&07odV.fe!s7?* 4slٛ`Nd7ArD{cs2{?+J) W8y߬{nj#fj6]frEQ](kkəà ٣{q,n| {Oo eQG 1O6L C׉*]ʷhBB ]&\9?0d5! AHm”7*YPN̻BGal/Dzӣ5j.QJ@s_6àcVJ% "2J)e|J(spq&lZhzJ((8jI?C)DǝNw;]Ho9zAOwi_J^Y+MNY}cL^ZŜ(=[9!;l\ z~w_昖+r^rɕB JMΧˣl>_wzqnsG7|~Ad 8׺_8R%3g{!ɜS/Yԅ,!iZa1#2s(طlʎ:=9(ǎ}F'AF'd=fsQ4kX UQu˞=,n Թ:rֳ:\9><>_/ q2ƑDwC8LAؖ1~}n`M ipq<X~{NJfLo1wxG3tܓD-9 }s.}ݻ6^˳\>?y=/g>CyV|ڿe-ӭJVj'U;}N^ Zіu.mVT+"领 YԶi-tLɩ `I@{O!\wKӈtx4bzO dc:Z+y΅AZCmZ/d&u4T~:>k NxrJ9"syKe! 
QoR)\T) ‰b (f:0lSQ i_Q3-P @ a{S`84sFĴS SbW:͇JI-Pv2v:'#ui^pڑE f'P^?BUK|0orN2򦤑-1 GϧllI7, /吒 jR6A(6Nl!a FO8D/mTX/Ӊ^ye $1{>٫j?W?7*+.7eܹ]g~~oG{ oᰗgVƯIIs߽׳iqP5tqwލ7h> 5%+qQUJ)h2H1 2hJ$kpiu7Ks֫p\UD8&"sl A*[QRWHN D TTNQT)򢃭B"!]XVo{|yxb+qY(--JҔB8+ (r]2Raӊ!%,t=^^isx-7Nє& H)Q/G)}7!%c#ɢzTJ~2Sqp)DTpzՃ႘t."rR44Udy|p4Jɥ#ƕ3IĪV=|Rb;?HNuW>Hsk$% `J>0QaK%{PObf>8(rؑ0[aEP!xE6*6Zc¹Z1؀e(,t dw;INc[T";F%]!)O-Ṱt)8cX:Pr*Lt)Y9F^NAO 3m4^J H4|,Z8=_wNPYdl┥|FկW%Ơ~7u*[(lv.([:98wǼ R9۟3* +'%SbX&TjRUR,#r6+,)`DJV R9T@-uB.|^-D_fWcIYہ.W x?b.;5l211ŵ ai=lMBX֫ -ؚ%ó5rjaB8Sf(ˉDي(UgkCUT~+-r0u6&DvSg,*J fb붃8uCW8;?{֍ K/[ƭqQlO6v2/Rjs#rlI䡄Ë$+e5h`l&cҷd:SOKXJк8>* xPΤ QZɥEԹ˷VgW;ϭ{  ]x6b䚱vXdiwj.Xg#MDV"s V8ke`VdFyQ&(H펓-yKbLbLbLb:)A&pf,#X[toK4JJCQ)rV I%:|L2\'8 tѼ+V9VrXc4FWh R }GQ82ubI t!4T0i02iHZ6# rctқ}v2#哼ly6L}]H+,Dc~ 6h0&QtT%>#.Z&Z gwq0{ YwzRr'%sjF-wkjI-G;tP0/4q>]龹>@Oy忞OF:_  ^j^>T~3'rBTw߿eo~"=Gza;zۆ6Evj-k5*B6,M^NQjh/]\IL5)VIFd5()˽ŶMZ.Y[e Ym't4V! PMhʁSac$*:6hCjC! Bhq)ȗc$E-$c OdZykPa0 /$JF  N,Y3A0 MN7l+6݄m1'?!Yxi"UJӈ!I>- c$*=Z,lv1``3}=jkU"-{D3iN2 f#=mS`ݫozW=׵ڠN{oarEQ o&74;49=7{C<'3^ Sw]NzStO{σ1 oWc7]"E0\ ]sqlo,dϑ,68@П~{ )?6%ݱSR#?|iCF˛7\}eyT~<7rfX-lۼb~PXt h^:hP7ϊ=-7{COpT4|V~jVBn4N*o~4YFܦ4c[ao|TSEL>mwǔ>!1b"n4gYWJG 6HڈZ*Ę*! 
PAI^@6ChMԾMrK^dlj>MX^+jͭ~gƸt@JoK5¹GDDՋzIg^rqT_Y>0S0'@˸V{j="txjq hGo]5s=x3?\G?XXej{ڰxZn~7:MlEMU ]Kk]nv//9si)qmoYJk̈tEf"\ӟhW'CN!MWyygYA^߽^py;˦(hG<\|C*LGqϦ#]𒜰xZ)E/8:7]]ܧUu__߮Uk0漶+h&m%:{ 7HC~|RbV^ meedI%Z+aЪ~4ݸ1J.Hv0x чK'1_SѬ H.GrtїKeEb/,ʨ6Jy:JYz'$C5ߓd^~45/j )hc) \%mȝ0`1-5>ոLi:wq~LO_L OO10ׯ>G{1i/ Eo{OL3pZ3oyIly ȠTe[tDXbPRcȌ%UK֖[~hn^c97#ǸU>T+P0ۓ#P?%KDV;ҳLRjڣ\&A$GS@ mRl mmrF$1RC#wZưq&&k1`cqB wpXj"ĨHq"k(B9r.kG`<eJxs+#y8!xH.WD`Cn8{b+RO.vsUd!YTE@D@h SĠhtVC%KY ƌVh(I!Xt/3AYϸ{<8et!#2ߢ,¼բ(, Fnڞ*"K;0{pFk2TM)XiA\EVjI:/质"h؜|'RAtfZ4M|:6E#CڤoW ZHd!_*\ &_ƕ2qkb[uVC-aj JU}`muK S䮕 "rѫ5 i66U[~+i=C'R$cO\DF%~GrE .M$EGZ-5a,JBzW-eh#'|sDկ&(k-[ k%9*}Fd'%beCՁjkΘ*3oQ(sJ(_* vHJYdہE=wPQxm41&T$WC_9 @LF*N`b%v[&&P+[8 iy!x ep5\f[tc}agzw9RQf{)_uZzMhP si8Myƶ]|Ajln 0DYk.0 wG ٟf#ۇ}eOK 9y)7\w#~|`iiV?[ₐ.n d j #+x,W;W;p{AσvE<[+EHgpBXZPXē>Ds^+yvBO]Yj ak5Βk-ģ]i1q6U s}wWKYJ;ۺ\ r^w>B}5ڭՐ]뵪UJG< ^}~R|o~ݩ8rW5>uZP=ıEY-X/1Vv3ny`Aw,*ğ0!Os2US0C|֍O/}O@w3Ӟtwmק8r/# UN=ɚutD6@KN[smh'3uj[W JT;XcFRײmݺW)nАo\EWt }TNp8 1>ygY4̥-q+$N,M  owm.<^Cl x4wVuػ0n(lhGտ@D%HX$TrHzAhQCNdVRvZIٌp,M$l4\M=-3 )P[88c | 4 yāy8r@c\#-DrآJ$xA )Iy`S4Ѿ& k$aSbB;Jt0$|$Db H%ۚ$!=/ "f 2T&I&# cju 6/dA%*RN!*1NGk)O,ӇHHQ޳j[q u$d9cǜ)! 
q9i)TfpE}YKY#Ij[=pz@[=ꁗ=Dkެ5lpƤh-vr PPDu) p@o>Wp|~8pUu.v:c-3'F:qVyî@.0sSFv\œ47QNV3v,2JpK$h/㪺[g\Ob2jʸ*Q*[Wq+jYӁq.Bx~ <+dY?7 `N^U1@#$/u ~pWu Zj9 'b0`s ohjC=G'!g~TR)g6bh3R43SƏ5$($ķ1ىP0(uAolB,0s[hfztd^M`?Mhaҏj>?4dݮ8H~ > >*kI_F 0r$ٌ[ַv{?-Ii2gv\ P{  .}yC3 ^` Nl1-r<8fy>F=w4{*Mv!_Xfh%/kν1C^ro<~d:ئRϳpR@!Dח U\P NoG5O007&[hZ_{8.ӸUZ3Am{!^* ˎV%e6>xGV?-mpVfܷiP4P cF.dM,YP.Ε gWv *`D[ scBs'Z&w)b_=6|e/&^)zօomVתYq{Ln 2H6edͿč\9MW*r.|ooK$f/a!CT ?Pwlr,$ 2(.p6(c758k7_!Sи}1moPj-ZXšZ eSzǙ6PBR9G;pCymHbow(wCU$ߌC oޜ#g׃;@\gPPQu _60߆oTk ›Кٛ?(uƷ`}wMfm ԰t۫ L@09cz[`pYBn"QX&[p1oCus @\:UeO/1nڍzeEyoET71?C ,a!Pa"S_)GSaZ/‪֡`_4a@af8$?gzPp3ܗF $uR|!k z&_0?N ल18["\ ,p[$6lKqR7 :E&p@cB.:; yꤍS~'1=WjItpE ClV#; .-h:1goʟ6)vbkޮjq¬Ur=?'7o_<ލV#s}vVI'>q›SQVi[wbf53a$;ư?ʰ#3,0"ȼee1+uFe8àL:Q8ü?SsfCԂm`ם`s.P7]ܶ\1 õǩ/!nWF]k /=ŘjMQmhK7bfuJH1Vzُ֙XL%814: 3?!OJ3D$MP3TV$)kP26;='G(rcwjp N^ְ.}S(BդkSlt:Oۿ&aָNڸk ݅8F'7"FA8BlgI%|O)$'aĖc'm&:?\G{p2\B/0\03j3erΡ!E 0?Q˸C@]x`(m>p&{LffT0?f6 {|>M2#s>E_jndT6-Z&*߆go9FXݤjcӭ2qu4 F jv!#z@ck FTDq(ȀPB JDAIK4eɝZ<JA5T9W=@6KUc~ieE%QbQDA{qk!smOPٓɰsdx'0km`YgRSJ\߮@[גI7߿Ao^QC1 Hx5WW? J2=@p?o?iš͑V&7rB$G_{ㇿBiG72:6BxL'> y WLh<;L }wXPH[A J ۓ7b)od@:M !E Œ22P^Bi] KVJ0"m 7l--ZxXILT2VXĦ f_gwpEc:\BZҬELLmi4B[d oѠz;,bj)}tb,b} W=/C/?}Vz&-?W@9HA9hYGG}P΢%z@H2²n\S^32{=᯼WaX?'WtBj-z\酳ZVˇcg}6q$Le^@a/9w'\xwղ^Z $ᵭQ=Z.m`&AA#=z%|L;_M?mVZx5L}ϒ}VkՍz<;gS*dhua@@8HDN$&Wt%y/1Ni.6|F/2l~ݰ{|eY9jr5J[k<)?_Y|Rk#'F#v;UhcFaS9n Uy.@ӭsZ(Aѱo\\'6sVv2KA-lZݟGӺ LJN.v~,. Wq43 )!1xj$`0@3A_2@!!"̠s_r7 q.*ooOUV թ. rJ+@5nqM?w:Rt~9k(1DG> Dy&u3~i}|hѦӃ7fF"(vU5m~re?_恹'H>p}GJн$Q_@8ȷ4Nv\w-;H@@"dB2 <@/(Q*"zV #.#HfC7fsؚ@^ʶюZ+\3m vDK: N+2/8Z.]Kf~KRHpVAv.ݬL6֋p{GY$ݹM"EE]%_j<VezϮS2K&'~Gx|jz@SX9<${:Ab! 
} ːƘfhЌEn'L7ȿ -kyf:oFOoVLZNJ=9L$c)TEfGd89K| Z?7uLrR7ZBHOWC%_Ey˳2LHvV_>ןOȓeIU༪l^ #6j3ngudvV 5k,~zHwg6iڋ yvX)Ӟ֠3]Ws:gɇu4{DjC|{ERhpƪzG#4i\9 QWa0G,p}!Y=(-Q"YLA4QiQ!M%晱:ϝ| 8S zIwⴱ+G)hf :S|~E0w8т6 ZfnK;2Z; lD4y u2X5 zM(BCECz S3݊?H;|I.41M 3@z"G/i;4)hLM:h)Ҏ^]:|Y $ /@KgJ+Ճ4Q0T7 ]rÇ+̺7T)_}\Xpab?FXMa+reAw7eAg;,h DoMlE5},~0]k:nm/EO 彗:ݫ6s,c`L|wT$,t:,x.Qu2҃V;FV;]_M|DWi5TW'byG9?, Q`k-3r 70/;5H<8,?{ KۉߙnrLiܳPCZaλeMvTi6/U7 wdɞ:w775pzD늈<ݎfb|ɗx9zC-88K4dGp@ Qh[% *9V#j32[H;H4Oa:0fV]U.xuT0Us=_'?9&?ON'WorzVLӓK7y~vHZ=R7c?i!3 s?Vw'rqn;{}>9uM,2mсP4 ̂ B-`9wsxv}X<cV}3TcHFxvm>nhXߣ[<Ǜqqipgy.%68F8Xf8]%16Rhs7k c%m$O+r\!)l%,m} B/Z{[tEnxƫQ=yuچܬ-C9++`5/RM>v~ss(Z}uŌv&JV;oAۗ}fFz$\jK jKy T$={Jݬ0ѧꄍdrĤ#&RsjED,A5lCV,grV,gϚYf$C:Y.Jh֑V-4 -:_EHnj t}P*]+aiףk҇՜YJΦ+W ^ <&aEVPs]9"@5OVkM!`Y#p) cO[--Z^cKPW{ԥ4g=ZC=l/ep¼G?FE̮/#}?m9wFak׺]q8E7z`=8. qnj+kdC]p<&vG+ٷ8%|ע,GhtCUO뎫rK Yc Z1j%$ }+DCK~ ]<鴛1Ч.ܞ gYp{ּp<8[d-LAh*2SG5F^0̴ FLn][ׇ,VWXÚy޻rmTŦ\%KruE~)W+N p\ȾRE=:>+pۤ:,#_Do1p>i5z;jd~DI,AK*"wN1ZsÐޑ^@&T=ӳ왞e,{gMϴX6X'>I^ ZX_CFEdM+>; `MChE+9w^b )H$XJq:ATh!ϓ$^"MVth֢?~n5 K C SM"I 1ښrP` ='G2*fI.N#xvB-Wt]U1i%NZ%m[UzW'}7~[S3u &kzU93>%,`9QIZሁU慇 CTL-b 5 N$MjGn ,@.sbիmžf?&OO?ߞWB>iAr&ޞvЂh%6GS'Yddyy@9D,c=ϡӆ$͚.BdssLj2jHϣқ9O[OHQ!j`'mR`^QǓM `<,zL^U&fBU׎ Zcp5*6҄Kԭo3(\#&#DݏEfn㶯e}!Uv^sp |brh,TЌh,3AЙS%J?߬ gWWDZISksm%HĹ3T)`P(e41XJv@pQ1Fㇻ8T׾ ӳ}6^?-{ކf Q1ɼqLNA]1ƃC~ {|g Z.>ln;.r!A~J ekDp%KnD* IEyPtAX?} 9%͇sJ^$$GT~9<kz|w\y{6P+ooÇ]xqOoC)0wmA<9{p"+ r{CucX{Yc_yA*tNZ b);<}K||sw#&Rxr>jB>!s\.hHH"4{>DJ1G!tYknFa8,)(\Fr&oURXD<,+ą@K\R8Zqz|aA% 9|o$DX1Wc,bQܼڱhE'%Fo4!5 =y bh^*qQ@5BkB 0'="NP%\; u|Ov>mpr @mR{nׄtQ5\xUϡLvm1n^zfM7{X36vqlǾz04~.-?}&p7nM%i"\$9^ZŬUؖϵ(C(M\Ԟ˝]T\ƥ!o\EO)Hġ*i\lS޴2҇C[Y[4?[@xI9|5`ӱ l7~4TH[ ÅʹT9_4LD>*)Reqp3gv3V${KQ1԰2vb'$i0~V-A;.J ^^4~^K s}}b)كR3 H{kJ&Rjڞ.)p+vlWCc 8^Tr]~|~fnҧ-O>̷#u1OL_36Nd͔6#75 *Ýl:L4jifQ戕|N?-C>QvpTjrq`L F>5RLv̤$XtTȳk9uoo;fPlG7'Mz޾_$HʱDGLdK*Qc\?G@ec@7|=Man,tR Be*&)f rSb$g8X̿E!+Fz|5[IN5" 1B3ce! 
j 9@g8GCd v`6hF̽áQxu@Rj*rE{0"bBC`9q`ZdGb\)е=nFBPIass 7p  '3/8xR>cA ۱GׇH`XLFqٔI0!bs/ gomN񧻮V OBig&(4>7574b^W|Yg)ì!De7ktE.PHxOBH⤑v{-Rd=~Ia5=8Q)ߍ4aTr ;?;$sʷv?_N,jvK EpTtN acf !MNZ$Z( }o!dD8wG]8+3|a'W놇 6VA.v2wL7p񵜹#]䤐Bgʧ#JL*M2$*l.-ue-L!/n0-&3!pjJB/n_yoUE q>%\2LD!r<;uЯ]q:!;:{+vK: Pd絥[[onnՂ?d+ - -BTL1BȔH:%p ͨV_6~ˁ, zz`u5TRX~F6\0POC_?6d eK9W AM7CΪ6wCg/gJ~|;nG$ n#L">|.L{_搐bHw; +~zѵמa Dr=s+ gp9}bA1&` Ή >T ͹?ābW6 BP(#se%81_M`%mp? GBb7|t̋E[Jc@g);acBt@ :=@$ӞhmJԚ%%N9[z`v\ rv;6k4s#kSw_خg+'O42dFrgk( i2GH!\řuamh[@ږO4Zo9qƂr'Xcٷ^ofz;jv5X +61#ZH !&F(# <:$4X#EQ/i Tq8bRNըUˡ .FLHLeb`}ĕȉVbS ђ1*Vj w1ʷ&=;o1t9Flq xYt7\: 8kZl[Oeލ{/ƭPǙ<<.م3uD|?Zp)[5^ux=0je遝L_3s?:,[T #2@Ddf[{JXCHs,SrKaQmjM/uaZFBk3(DLRgɪaZU&4xVvTN/G \j3l͈F6*2PALBxG}<#Ն3eb'^[VOcEwrԮj,iooIpB9bxg:UC8Çݮ[Wt}jL޲<ɸ>_:1I x)|`hPu%h]mm$tUЭYAI,F/f,-yLy?}D[mg0),S (%rS| |HÇOr iwA|^_21%5D h.h5j>i;Z]t5`Od~`*OzkNe5 $\Arb΍6#` + 7-f~x.V H~~ŕM.q+->)¼ Y~XT?s%m) 2H:hPfmpb5ngUAaQ;. }]7?^u@2Z&&^vi+Ra(YgLuj\ +ܮ­ODJ`-/OVV_[c?ؒ]u_.HklQCQCܿD @ۮ09JcFCtL zR]ʉOp&F2S5Kqөs/)4cu7!F0A'nZOҢ5O/`6ǰ@nof-0Zl@aK)\X#4::'[}WO]KAVO]Ùc?.W-x} ٻn$W ,XɼYl`3F2x,',oHnnI;InUXUHno1&8=Mו,c3 tYNsGL}.Ai8㰝8_Y#_V L #Nq{vLvSϾ'J=iTIROI .{d::dtGgN>{ٱ6쾍=I*"<1R0D|6գWJ\=*qWS9'.5dix^OA[F2%'Η}(qWI0,S_J-l%EEo[>h9cMO7vSon}w}M脁1t"H&JK0m}ʍ#3TV2_KmKm]0F+oT_Ξr"SՒ8>YGEQQ⨭vV2)J y %ա$1 !#PdvAhJZ)B94"HJQ\誳ھ j69S#ЭYWBbkM+1C1sz)3f!HPFAώ ɘZP.b\z~I^ fx/MoynrV*0NReQUt=1ەl*Uxt )ѓ5}oKd# =h=~K U*|[-6k&#+/jhUSLѸd9]RAiRBC4NkQuO L;<؋ҩ pCZ.` */U#wֻ|A2@C$qKkDEJ`x3O8L'H\F:7޾e @m ݺW0d <~M֫lĐ2qK8u)t+Ҷ)Jkr@jKm\=Ed;& rx9]9Tj];5 L8_LNJxj|o#V&PMI &ϕ@ARI5%QGuE52Ҝ& Z|quzB*RyC->] %ihhJKNlsߨw eȥљrX9`hbpO&ⱜ# vM|ooo(X>++YCMCO8$8)%y0,Ă0o^'ACTq'l딵IyA:f$>2'8@\&|9K`{b%bpb 2͑1KȸYf | U>Q3^ f>9y=xN)X xb G"jʍbG9K$3!Ak>R'F VDBd!a3[ߴ/)4QBsV WpwZ ,}L!@LpfNRڶZ '+,BK-)|{@5Rv rof@XeXaLu(prfZ_uM)./[d ] v֗TZZfjՙbC$.>o|n;fBs.b:7u|'y $q 8g|Sdu/~YS4n8`[<6c4?^u q,i6]Oonf-J{.eqQՖtHQ(iJ%25 L6c0=7qnjk{&^6NQG;UG=#E`c?%w9{@E`cܫ14=1-QMДȴKz8JT,W}k{ZJUh<3FAc0,P~)=<,E6>g p +lK(e]z;|@?EFۧ=O%/O&ؿB2uZ7vCƅOxݯfm}  AkmF .b` j7 
jij\qW#)o7xatSߍ)=*´zٽ6CZf|r:8ţJ[fT$n-Y)E{6kY׬ͦY` O z5k\Uk=qZ:!=>Mj/2\]UfQd.nRϨqˍgingeWR%_K[5ۉ\Y{~ޜmLFS\T#=ZHi:MfoϿM3 ?f7\!7dax1/|,,LVh84y? 8Mn)hp&~~dgk|h7?joa&ߞwAjwXxewK )vWǂ@T"e 慰ߧۻKB3Pͪ.<_ Q MF7p4#C^Q ƥ]`'$5%6%0C p5X 3}҈ i=:i 9ͳz!HrU= Â+t׸qk9/t {?Q캟bZS7!_y_?4]}+l1-=lߓd˱̈jǞ$t$#Znl3EFi-S8YтQm0Pw4AS8*$n5lOwb@!T'Bx:P8Nln-^>.DiQIaXniMa|V79oFwtX` ÌfҼ],P ^~,KaЕs;[=[_i^zX`~`"hCq!USϛS)zڅ[֝9حu^n;Uo_DF $7}wR@$TT|Zo>8H.[^Jؚ}8^>8q'רRolIPc%3,Tr|} a87-,9c~P~?ߵ% ³1uFX׃g`B}0FX@}=: }RcvN@$59~Mr2L<%hʲZ_5Ji:cmeR3M跁ME^Zr*7nVtPJs}X^T<ޤ#56eN[P`5.8vO(+x{~^\Ţ鋂BUrh{]_<(te FoOnX R6/jc@߉Jv۟)dcln# qsC.PJޟpy.fnjwzS(+–R-8QW]Ҿ_ֶQ8{),%'+˦^쁾-SԞFWjIM?@ъuAIV$-ZY*M9V.äG'[n.9Wu ?%h?pc Fc[w|ovR_\Qv4H܅&IrGZm<}YMsNZeىvѼh/IR{ͫnҍP.${ΓOkK9>˽sTsTsTsԶs`3}r2GWOM(3(Q-fiQWP=4}_^i|(0.PXNTZ!#*qC FGɬ&F.| ɹhB@0*&JL!mvVZ LhVnvBʅ>8*B ~FqfiU`kG]mT°=y|GW` 7ާrGJB\jamMAI{>xTi`JEfaAEUquݳA( xUݘQ Kv`VVCWt,'ȮRVȱEs2 VxO/Xuqz; ,N5xO:XxP  # H9W%dm~ڒr6.?eHr] A͞tڡ+A5z%SlWi6 m 0ن$?cb6_ }t"Kx uxot@9>OL(w_h)GljO9\/?]Mf4]80V!(8p@W'=l|h =*5~LjyG_UmTGW2xD߿UmRp,2ϫ_>5^ih~$-zꑛ֝Qzړu:N?R:C>v┩:W%5!Ϻ}^%>I%āhQ.fČ4 ThLʔe4$%qZEU%#OuQf!e@㭱1' "襌J21I ^q!PWRȸJu&UcjX]mc;Yt Wkmkg4zTWkrPx*S6qƽf,TV*zR@(̀Q uZoMWa( jßl\9c_N^Z /6r`$g}=n[&Rs,ȏ_HV%.CmLDCIc|4XHk@ilD45L;6hULm65s Iz2_^8㑅}(q@C A&ߠf}jQpw8M{fww hA$vW3./i]ȟq}DVO෻qCBϡ?9$p|m[- wNOQd[k:v Jc{\Y4AZ9vr<4n 8Dco'mT0KZ.u ks\Rr|TNvMHF YmB@_ڮ\Kc=mp0:*2&+AkP ֶjچJ鐼rkk1J?Z>e1ogE7ᷣ)IcUHV7KN6%S 磹hnTgme8<6xULF5'/@SFTmnUD|!=xRti#>ͦkf~5&}R7K:8iUٝ8^4UaP=G ~UF$ڴvt5DA3^yeѤZ,zêR*^C^ ?]AYA~hV؛#M,5vYܮ_ʏa(4w3SB7 7S8:=]h3ޅBŨ輦Ѩx1m4K-lnUx2hk1&2GϛX7D0$0XI8BH|iiy =\FrڎeS] L5ZA/exDs5&Mrf#A[ƀ6==/nx߂`n>p" 6Ow0 yuG*%U2X3>2TE9rv!p:Ӂ3RsZZ>`R1QQi^@;z9}:AqiH4+N4St)@"ydmȥhcGT>] l0 YtgM f)$ϒAm (ȁVdcB|0AdWb gl(t XLZ뻇 ȯ,fֳ7fU.ZB\ڸ6&N⥊B8n y s^&,1-OU@ v#4m>=`hUuW:̰"P9;h UH\Uq]~lcG`ߢO7֊o5߷Oڻ"sa}?Ei\4x,oͲɸo/&qVU}/>}f[s^m̚%fqζyx''_+!3>>J(bLՠt9FfMQ tB{^vUD>wD6NVP9,wvl_f|:qixY3ӴYI=hi'?ɿG6G8FqgXv bҁ ?У!H\Nx -n2}p=C7{!rF` azq+;H1Iayc,9gmS"|<\oȗ/%$k#`km!08xY2Q~Wl['N\t ؃a ؄v"'zi nuYE!% rXD)UAn<8%MQk7%7SY%\.- R7LB92Yijs;棢OT&6k\ۧ/ط7>W!N¥^P2[T/ݗ?ভ 
ۏҟ۟:[e#1妙6M9٫[+M`迳hϚC y9Y뇳?V~Wj7^LЉRUmWO[Lv|v"G^7J;t8ٯ #}vyѪֻ`x[ pY_&h','mQQk(w` PJ!4sh燋$ `8ŅUǟP_lrL_݄ԍTl 6ܙd:<ϣ 'ЀWbFz]rhS3[EqcOc}Mt^!O.h\7J2hF3k`)&TRq;*bO:.|rݐ'L_Yժ?ɾ&4gh Hx6v$9Xjqܧ ="<2}?ł66g5Y` 妿7 5S,VU@U$U#6vv-zAo y62pʱ#w׬I;ZDv=opC.qe )ve( ഁ,P,w@jwfY<|oMx"mCHlguN@eNP| 8]5%l_b'0E١euh 549&QZy,r`d`ryx 7&3s@͏f{p).!7);kW </n?Oc)!hy~,%jU׌j5^1ՉQ|lcXIб 1zz:3I (u(5PIZ? ]n:5DWI2IJ,6o< Y,|&3S/o|ΠGD8*a:F-ܘ$ XuGF݁Ҋr;9;5KR AK)SjQ9". \|ѽl&Ky0$ҧZjnϚ$Fglq}0Lj aEځ9eCrp|t6F3 UXnIo: ~2 Ȯ+eQW)k"0J$'HT@Oaa-~{/u5rgReNZa V;F&Z*_'B{4.‰VG@3񯈗:ߙO^6| nC`ȌYep" T,j+ wzlcGJݝ Eݝxe_@tgLO"|)bJ6hb d GI:kmuO:6H6ph~}{۟dc*gFZ1P4OF9\7c5-tᒇm ƞ>q3MϕLѤZW~LxZd~z.rf m~!d]v*UHfױN]m%)t~^>B {c8{kcrW'({vrڹ> pEBmUʔ`^`\/0Ip2ݙwZ' p;u,hZ25uѲMrР 6%S:JDHX0QYմDWrmNjRCreF΀.部ubyqcҢuO X7RY3V,1+z;0Ƭ&amYX95ꌉ߿Jhc0\'uJ/F8 +6%2xՁ/du|LaT۞fV7Žf6EW~;H bʹԎA$JQ$*ШFm< vbwb("NI׻W +`40DLL%+~c;rxх~K6?W MkU3 /M'~ Q#0'7S C"Q0la,0w8<8 UXwYۓ t)v>ܭi3Ox%#@);¼~He UIcesY7u Ij`]WXd]G'z)%ihV"&MauP/n_}ͩԳyqOL*(Hz<|,}+'u|qu^s9n ap#B?d. lfC bY^2K(*ek6ɤ zuU౅-as2(<΅^nGy[϶K76?~'_V/`Ot+GXGl #,*!iMU+ %7:ThrNUޤK#h浫ж`C GFKF$'J:Z&  i0|Qޕl#wD'/֛O <`ft5P;fl w6`f"Z}SUQJQݠaoGіRBY1Xf2"#=ۘ@CS@D%䁒MҰB[F 7 um!$j`6 JB㥾\L͚XvtYȣ˨{*A=IsݗV`Bf%Rf0)P騢k/7sxI_ZL?֥Do] @˳bg]}$Z^ DĞakΞ7a΃dRB& Qkd)mJ?6ő PWJÕ"Jp" ʡ4S_b] EP]=$k$PXg8iLxr~͛_Λђgz9XcNs$<RPc rչi1ƵgY! 
VdYMt-45 Q6H*FAR9~q[爜p 80\#BHu[$9eyW4gkz \ sLeQ s%%ֲQ0,WxųӿOBRR?|]Ax߽A tLB~0W.X ά *JM8%–DD2CXI-"LcU U[?P՜SEQj{wH4fZ**º%av@~waR(%4O' (9 tl5q̸H'0zh%hYtFe {i$Pq9,jrpۉ9\0`fTpƤ%_JBk䜣.ydHtAAM\+p`f93YZ>3")#p|?`LPpGAA7j|!jsЍ ƜH#8e)Vka,Pv4smuf5.୯.q2kA%$}!``lF&rbR -z%HWXw1[UUf0:*ܥHZ|k_g;YuUYZuU"ܭNw)sEǰQ#y& !%!Ks㔏Qd)}1dK'vl!+xӎ7O_oi@,,,lߊ85%oP)tf}r^:wsT&bbLв H3ipe3iܗ]̙5S$jf .;KZWssφ1DU/LRrm7J5R(D/J͍<&<01)0@hXp^jbĈ@8 [#q Z'8P>%TaņU MvK1hKJq,Nk2hq4 =,HFQ-{&26;# vTmvTm|q7Z,ҧڮW.ɀzG犰$D^]}]>~1|MXr,7<2׎󠜠(!J6JõZ(Nq+>RHɵw|3Oka,Cl%ˎe`6dT.3l)fv Nf.{^#]_` Akm(K֤ڽs8 79Ӷ{탍v7?ngσq/|fF{}ckC{ w77O;hXi}G_?Mw|=4vm{w톑 [_ Ëk[;9k\\pr?lZsl{׎q5 %3tuO|Mw;!<"ŭͼqҍI3tisn:}ܨn4~mZ~9=j977`cH>|sΡW'l9\b/'A_9-O'!$jAk"gyN[?]jbDTպ@+}~_?!t _mr /k[W G\յ?R;gN fÐ !iw4(oxkNt|p^.==6zZ'НAa#[jouzg_OE~j> ,C>tgx4\z|@{Lwd?owI~; %1\ ̇"Y*IL4e~XSSg 끞|e3\Uk&*Kk&*^UdUDU8}Yk+_C'xHSAri,b' 2,D#%/5E^x5O-Tnxl LYBHg(p-x UV( b)G&W AܘI8m#-&ZB7R9ī|uk,,J-Bg*|, x%Π!ҡ,`z@& c6mM 5k[4EuwY q(Д65(*"b(j,Fk<W2h.dy9fuDG]Y=ER3--]؀4KR 7ND8z]G1R`"UW58#M䎵=r H[l]r'+Q0dU.K \ZPrROM8e?ldU>jWMDM% J 3dQZD54dj)64˂֙ PLY3.6_f9Gb30r\#TAK,6_SШ)m4*c)m|ؾ!@Iө\Q/2txtxi@.k//Zk@/Hk@nZ1bI0-p1_lv ^+ʍfS$d]1↑TN8_%)混 #f>LfŞK / %i R:'\`}c,^2%G=}iUlҔO-UJ"դ)7 z2ir_.xiJsS؜\=H04" ʈ~2iR_.#/7MsNoZihƖdZMr);kcIS>JR#IL2۽] sҔE؍=k\b-{>c$*j.kn&T>DeZ /tpz ›3L;wZpkQ=wkk&9~=*ͻ~^8GҜiۉ )Ք3rh!tCe7bȕAFG1 tW$؞Sb0AI̴ _0hh/5nwnmI*fgiM_8 cs$y JeIhquHˢ(:$E"yU}OW%^jwOR閵\?Ng[GepagvCiήZ#RRdw˾<9*5@$h,IwR̎< GUj%D.S+JdYDVK i̓!'㙉თYB.rE8F!\o'+\ʬAM3xLnw/z^+ucͶr 8lbl_r50{v֐s;#%wH1^`r$v=۪न{ rOI[n|UE%sZ~on A˘$y`x8ZUL.75u2JaLdeg ZWbqom>˳qoqoq5fT{p\Pn_8倞,SZ:+Qwt|6gci]xH՚KNE:¸_Qf͆#ȴ٦!mCmBZ8֯6mHې!!>%[9UqF/f2ZZ$ CY%I|DG:v>ki@O ԮH߉Y! . 
{ mV(lNK8TUV_J2B #ceWm,\ ]q6& `ȶm#FllhUOLm™LWPc`V1@1Q="Σ4[_W(:}\1&>d1e~;8=9G0T8j¹Y;lxa"Ş9~M<}<=m~i~S輒 ftb'oc}M;!Y4YtpXpq(tE Ab0lHS\u*uU2ɪriAئ |؂-Hт-HтrREG1 0E 2di^@ZijB҄1L1nn Wk)Ǚ M"Ǭf}j g2?=St7 X,#blk!s\\4j@g;,5TfJ ^,YG T#E<MA}=zO+Xrٻx&3ܮN.JX aP!iUTeLT.+m%\dZm6&zv}\gQva"")U:f(Gy#yې/knOiLm̞6iz W{3~`L``O#s޾K䋋)X`mFzYQcUT\~ZjL6x~6m``oگZlxx{S̼'RJ3zhgW6xFPF ^eh#;&O8{q8mNm [R,b!d,M̵ w!R0} !&r~D:4j$M$ 5j[|nK; k8y}~q.YtQ'Uc0Fvf#81-|?Ynᅗ7@Zkj͈-:xI &_˯/zݥ3|q Twkd={؉QT;,;WQX ={GG|uA}ц%ff=;a5H@hnxtx"݋fAheoFaqAX.,QIT@$*2 68Bu9h'}.~GpﰄA!{6ǭq\m_'oL;1s{UJ >B{:ir |SXeGFFQU頒+ C%b*;t Kzc[uZ8)/Z8[8[8,1̳~u.[9~8LJŔ"L K%Bdk-R[hOi̕'Se<9#k{k:QNX ,J)IOYHF~.ju1C8 4iUkOF?~Iu2;p`> 9"@ Ubjpˣ<тۂퟹ~=}|Ao8÷ uu+(m81 &;^nk4IERMLƳ2b9ye*!DbJ)Q mٓmWnsn"hcÑ^쪺Ojx`vaxNY^,߽" 7G80:#>zS2/'N\n6RyCðwW=?c8A>[9߅)_=7>>ğS\@}!PS?_/opwGio>s%CDe|j}5ICF.7\Y200d-)IPF `Xbdi' \2`-P,e$ӕ w,9\ 燉]-e8)'6g;_Wg<᏾#K : _gCREoSmp;ZvmZ kT3p} ߤW~S?'ͼ;/_?~Vh-]y*0?~xUUUdzG"w@NfdR?᤮EKHT KEcQ:SZwO1T^\>Ïwx; ;Hޱ٠5]sgxX&.:E_1ʯvΕi%0nvSL6?~yiwY0ř E}&U/! ҧdDSRQdrpbʃtDf`q>p}f[/&լUX]rAwFQKo뵈HJ`(}'^"K- Tx.L6xEKk̥ f7W bL#+rne{5$ 6RR*`]/ uI>a7HB_r@ȈcA跑hQK\J=5KRn8uA撽]OW=U]]ͥ|nluz1HQޖ7=MU9CK멎bA`>#r:j}V^ZRcH2Bd)p=9]3D ;ʎ|}qy͋] %Kt?^]oA&(B(WM"dT̄]52EZΣvJPa.+Irr{A:j-̂ Z 8DۇKe׎JD0r9xOTvil~SVm}>I! 
bIIpѬ &1Tc9|^3˓zkO>> &]ՅT}ǜ9dG1I!4XU" eҩ(^DV@=|'$ 7ONC=C T8fJ:{03_Y˟iV]!3\JwNZT먼6rU'/]q[T2 EJP%,vc6pwN);Ca;>;tf}09y .bA#@셶jS}Tʵ\R B/U)_8ߋni-> 99q2"^jS_Y_^/#?2 |O G>:`>1>?у\*_ c<3`i}%&!<}rҟ]^oR7T R{O~Glg|I|rr}:zCIeo >n\ܬ߯W+ rbY=JNӓ9i9k#fe@yPJsjJ8p7Y{dLSUbjDN3| *'`Ғ|hOaϋ!tF1<D &"-Lҩ1>P`N a Q9#m{rmB0|83q%@CeSQypb^;(EtNJ(}# b^OQ0__u,Se:pO\0ݫ-7Lr+NU4_JiT%R L G/c6RiIH.D r3`cȏ=C*OŗEGP(VEW#rolKd1FE"TKĬFW/F}LܵTOE!j+<57|8\4Xhp?T/M4E609zSYKzA2csnDƠI0uڟYG&>D&\;BS!v֠N+41bH.#=M%KFqUqq\"vNo-dM0|pn穕Ѵ8x7ؼ%4|؜jUosPo~$*= \0%W)P{1ܻ󘇶oT[blv.w޿iᳳ>uu.$bS}6.WC)`2,3(M+R B2G>3'UDpaЬ߽4Eô =C-Bx_)) PlƂ@@*2/K.Itva"*pv$F||-O`Ͱ46$&MFꍫ}TEJDY:>#s \ee-uby'keY6O/Pze-N`l_U ;$3RmyUNBNdz[ÓN@{ `~8П%S}h4ÀCItQð$eX:T!&VAJ`%%.9è;cUSIvWƒ|` DW L|.}WIlI R,@S _}8(VCζ \$)1@ .eɍ_ɉZR,=\E5璫"i ϳ@Z <@3dJ^bU1.[\Î2ʢΔHߓ`{^o) G z+Y6D5০>y 5,X@9HۇK;E(ԁ}49YW>("\V X;tN}GU9?}* I$ )ƂjN[e~45J;3kU}H6}F~D=Y,' 4mqvCȦ D OR@k^`;FqbZTosPP&FAyG&{_eX,HY0 B=$S,T2I)r1i|D=0~ڎ,#nx~>RJ#x/_˕ӳgy*5 ,xeシx³Zpޗf뉳UJ~ -gyHPy;9rrG()z;s&Ǥz*?p}.hVbQoTw!.z8=0N-n1Sv6d}q곳:JaG3M88AT̋w)?$彇n"ChoّSefnAvd\X== ?X|^T O4|/]|XGP"z]/7ݻ7 ^th}?eڞǾi|{~lwZy'|V~W|X|rJ07{MfX@0>Ɨ>|m\>5V%%GObLS ,[M+8ѿ]}$W[rt#ESc93_2W:?d'LkM?fjˍT8*ޕ6r$BelIyp x<ƴ yhQ-Rj{7J,,ڀ%uSYQ_DFFDĩivv>Gv|[T uHX@mA=IoA~89Wg>{|Y,egO|y~"nؿg+V k ؉2 $Scm 1Rp,A'pQUŀut/QK5$yc2k|5\t0" Xdŏ1Jo1PC82*A8E]ݻ$dݬfSK!ԙ\Z AToֵaqn0%wnts->|_N. 
?x.dqȎd0O,ol, 2VR77u@T oXAz~\C̣D 7 mcMq 6 V]P+l)fN֮Hnx 95y/+> UF3N<<1:5,ͥإ(e wϧJT#IYImP47Pb SA53-V/eyljPD 醎ju"gv:Vߋv'hold5Oi$ `#l '$o΃ZӼ +fc||`bE q\E q\C+MWDBфk( g$$b(%g k<=%磇6~LO+kqwiAne|hqJ][1\JKqI^D-zEԢe-ZbY#% r\kI1&i r93'#m}P-)X8׷ʅueBJ62XLުX)H0FR+xoK,&8m\PLJ7;A^OfJޡ1IM",(pt2A^ [Sc*VA $+t^i +ZEJ{ E# ",qn(!S ),к!4Ef VI,[5'3v'b'VCN O I~3JHJ(>2pWZΐȗGс!$GިT[|{,R"}',K;.2rI-G%7\ZchۗZMjNd[dQ܅7C.w#娩R@PL n'j%P{s5vBAqbU6D,- 㾫+ ?.Z/oL[X -#,tbu'Snl%+.eV\'a` yNv1]v|[1++vF`k+SxU!=*ďW+] '⃽:nmtXrO 18ZLm˥ *] W+Ckvo ԊB8QYRH5i xYy?]O/o8ŲS9@uOWj?𩴽~rwʡ ~&y\66<C!qݎz̒]<7}˧Q3Ǔf1x6{K}zzZ%J-(7 JhoRl0\GHFgl2g7VgaLj3HR{pcͭdlN HC#}h5YIVDULiPʶP">T4klln#)H˅ýv0aӳNsqtsuk\XWeM(V^!Y *[%a* JV X)] "~uT:RRگUg-bѽllV։jFDB5L9'% a Zn5`n% FH,EtJHۘ{Y%{|,а(8{A#+^1IYvcR_80E'%|1] jw.t9i=? cyiwBڟ~9N]y|3nk%ӂQ ν*>vFI+wE|'v1y='cE?$1!1 cntīvlJCȗ >4YƆƝ2'2mOb&6r朋))(/Cǝ ̡RMb.r0U$>($iOWJ]S5Ǐ64nVd$&p^՗5̉ K}OTں' HҸ[5`Lp諉{b&C)Ddw 8ʡG󱖆!Dʧ{BsQ%Jͩq)mO@jn=eQh4 ,֜`ŁZ4@)x @֚? 2lⸯhe1`☛ `"7Ɲ1D dy"0,%,Kq|]8!Nj{[Τ1W+9y8''QB, ix}uAai8ϙ &|~p¿|MOfW1:=#m.  
+r/ t1{Ywb`F(U<,!Pd%B(A9H1g< )y[|է{yAEn Fz ` $r Tz@@cxȭZI$jF}S.R<3( o=\YM5\s" Z*ϔG8TC ЄPbl-SR̻xQn~?*hOGڅtC!8R: f$Ĉ\ߏp$ ME}d5 ..pysc{ BF>K{Aw>SrL-T 6C[ElB睡Dxo D`7pztoJ< Dʎe~IH%o~t2$%a4&uo 9(l6!-uh6T6)$.|M,Ǐ{&ABMJJ%5 KZ`~TT ԣ ˪)ֆňU#A#ԆTɂq0$a}u[+v("ZS?5MTA[ ^cfkV܅mx Ì/]ZPp.(VGv8TkRtvD[g7ԭSoIZlopH\n9k K|Yu,wT ".dDC* 7G֘ 0ǸIIM•+@N)`+)\QB['a#_8XHP"VP$$UWd.EAnSs᳹}oLJ抈LsV1 Lެ&npk&pM+7^h3;>TAb?ITZIᬩH]Ywך0d$#Jp+g:C.\S"]' oz_,<;nzS|ק.]N~ZߜPJ'v=Z9Y]Ơ_ɴ?$)O+㘢HT\ѧVLulGMKpK]i\KD9;=f% u 2~u(`#1)ܧ 6l̘_ژ u@ @"__OA;8J1[{rSE)RZzykfyۏ?_>8;v7B3]]^v//ߟ]=4 4 4}?/GPpy 8aisѢjP*/fc8q8lj]#f5䐡I$-q+ 4UP%bQ9KFQkl% )$Z88ϯԚvL"c=\L}ՌZEhqTUBjn EE&CNJe3UZU]R9l_jU2?GC)1,QX[^< Rb5~/X :D)_,7{?\|K̷8 JiF$OU5̵]]ٻXɭ?-y-DB܅̯~ 36xq7ˎQۧ i(/eىz7aUXŚ%˥]uґQQ,DS;`S@8 Ah+gvk.Oٳ92sS|٠* g!z%d6s$@R('9 rQ.9#gte_"R#<$)[,c~>]-6mk+-?AhsJR;£kBGطidHmވPonGLa9x=G8B='MW~īѯ,N >[w}0.4wvqVǿhdص g=E~ogDX_Z,DkHƭ B ɪa($f*f(qlp%HIʬWYזӷXS4hS]AQDs_BFg+WCa-Z %C5/'NJwbd^%{\u>A8Urr0cI>3WG쒝͊`&nQZ Hݧdc 6Z+$lU{pg'ƽXA7ו3H1Pa(gUpE1*!\IFΒjLLt}c&2]9[b.݀1o#?&]Nf{O~~RۇjOp'K;럲.S)씎{(, ݋zq1] WU5GieuQ bJ+HTFZ\_|͍Y_ Jr:&$@TLD6Dn;NCh6<3# 8K}4QrR$$4u扇{ ۡK*NOq)9=Fx.VK4#?߄P ~I6~DL- YOYvV~#gw kED2/vjJ JS4y,u r,RRzX< rQ $0Ԁֆer<=ң-~Fz8">98 =6J[=a!J޸lW?@.z.QN({)!o|7ҟz!`*\xu%C'O)|cQn}Z F&MP㒸!]\%vqrVe#hm]Q9tav 刁Lbi,&QH*Vt)J8a'PɡT@TF`_^)"s_vX(/P 2ܙW2"BΤUslƤKH^nKk#ĜBQ fB_HOdd.weͶn fKI^ʟn%,g'awO1Edߎ1A B8c!kp r&CF[ .{7go`$Ä\~ Nd5#@"[i9Ө"&7hj9Qd g} up"ՐbՃ!N0$ǫĹYj(1h!%7V1Ő -t&"sMgp83AeY7 `PxI@@8/~Y6~DCGOJ}[FR۽ʗ݌I1lѵ5w" Z嘱,tu,q i ,"")R_N1#7xԑfbK@JѠROD42$zeH-!3KU]B#n9c ?zgVq!X# p?쓶V<[MĹIlq4x7Z@Xl}6Żm>j^ɻwY{`YoG߆nV*L;Z멛zcT/f5_;7n߀ItY-;WϺ"l~7^εFBXP7W:}t~KM_{ZτgZ{8_!EvQK~dĆIB=L(!);}OuSEIvKM9Όs=x{>] YA,AU<; ȤD|2JE)nqL UPL,g"V.4\sg<fSf㰶\ҸLiqz$ESk:~W=^@7ޭ*bЗov@|4wA@L6Ǔw8?W#~{o?qN44V:6M7[Ug&D-523+8jkor>UF]+-)9ƻMwzآZoB/r(㋔kӯ=~1<"v<# hy~)I]\>pFGEǛ'-4ġXzUOhgr~v Ӡ-ݗǾ)̽mxL B[3  3w1zjaTW-ޅjfEk}jo_^1v|}/K tJl~V&U+κݣw=>5GKCOG9a~{GIQ |LVi5wϧQ0|i ^Ao=lm'OPLoֿ@zx<}Ep/R[P8P o?CO/ӣe؀ՇYh꣔xtF>h:x}k>?X}Vרi?^%VՈoZ~R=_5YjX\s#+h0Քԕ:jeCMRHsťNВVx_IJKKp 9tp}ZӃp AΟaaIydic].Cb_p>^4":⭼nx- 
\^o^R(pvHZ{O7Aߓ{Yc}Wu6g=[U{PtQ|SX | ͣPч8Y=˿g_'cvY=>!mhZ~/"g,,:*})vwhkoR&pPhΪ#>.p'A$Lώ~'dLoL+-PMf=-c2ᓣ 9 Ӕ(-(tInk:k.@4gpxs|9f8^iׯɰKu_ԯ(8ŝ%>!/yn݃ʾ| 6b-C=H1cdА֔%n{ t)C*D"pC-Ppf-a:)1>jѴr ftGseA׷ȌT~'VBv~Z8$80+%64l8@1<-Mem(nSh aG nQ>ql$ee.AY7X *٬2, kUR>, uK!l*py)VG] Rڢjk,+ħ8}K6e}G&Fc˵&1Hr\^TVHQRl\MH5kow lH"MtmLJ\<}@L9t‡-=&`Эog,^?&G7f wKh' ԱazA.3ox'7 ADŕ7 Jnh~r@ 0<;i|9KJR+LqWv+dY{#kntgw^(6.`ꮅ q̮.Exu7>%3T*QNیB;+SWd&pyyUD{78wP]|?MM+Nv<$%WTA+ϝDPUbjgwG(}F`_Vgf uRr^T\{ldA JZR2v>LK٧3뤏.ht:8x=uE&scZaAO<ՊzoɴtV?1d e~5J֨*YdQ7C1\eRP{o+Iپ}75_ {~%M:[[F TɕM/Ȕe e$;=zo+ŃQTu6l2] 3 :OaMdi4)v*(cJμ^eȺ%T."xJ`*tr (zfj0]]%l0WQYcro31F'[$Îd郚d뫒d#ȹa$u%Hv9fLe6tMaTb]JhTADP' [L">i*4cTrt2CMYЊDa TdwM%;.Xi'"e08:y25\&T4[C#J-2Ob/wZJjXQJZ^dD/gQX?Eç7[@wCp Ν:ÿ$AeZkxgG)2\> CT2"㒃}JpuBߜB;d4F`&*.Bpj'4^hQ E 'k#e[ɹmJ ;CN(]b-l0Ʉ()Ze+Iɓ#hOxAIFTv Uԑ$WY1c\`QB,=}vy{d,=z _˞@K- 2naO-Ru@yT|+[59sph"KìL1 ˆuuYdoq(%=H6>™T)) -rNT"W+y)feow6B;v?OBףL?o~?cz%x*9N)+ΐU'2-w1 zWiC#syn\ijйBb]milM%-45ݣ&vI[+/#knLSYe%3I:["0 _NK&ʖ7rsbf&ʜ{eVIž9=;>H˦:|6:U%l~Y+?\YEetR_ᅵWL-f9 (yN7?ku֩*#`z/v~Hz ){Ŵ_'7.!c7,lpD$ ]GxQ6jfP5?{Wȑ Yrـ^c0;czBm:ERȢXŪH5VTVqeDrdL^%XRzhGZ}Z5qdϱf)6@_ )I *9xy|}8Fyv6zX:. 
'ioaaȟM:3UfJ٣/.IȩKe]tz: }MR^_LyQz,up]+nِn{R9P+TQ|AVA1DxUwdW4vшdz>CIL;>[(GKăU׶&!h}?|4!;|Oqd6 Пn~ZmIݽNJ *+\8ȹ|2N2dց%2㝕{őNԥzYCAKz |e-wtP>(႒ڄL N3$ywqT(r6Y*rG^gON&#V%O¥`2=NfsO{ȍJ :xK0&MKLơtN%aI;Kb?YJ:j3$GKTQ9Xn0I h6Xrr~J X$;.#9}L˕<ñrTPiڴׂ)Uk9 I68޺Eui^ KP,ncoՉP"Xqtt8֓"묙" m<T^OmO'#6ˣ$/KiHqoݢ7&W7\+)ǼZ?AAWH rl3&]i&L̖RЉJڵ8ް6Q_M`q Za< QH+r.8gܟl'11X&f;K#,W3.e}KW.KgOpr~Nn _3+ d׊)ʭizso#9 r+i!??ɞm6+3tNKzqe|vlRۋ4~*9˧l;4TP+[ @/2^ TcKiM]{kVAxȀQeF㢤&yXnx"m*Ί_!Y%!ˌv|s}QHO#ȶZ%]A]z pptr%+EBu1)Kܻ^OFsĪ,O3ϖ {,[ r` GRl:rIɝbs$2Q~.y`YuALQ${/µ1L*s-VDӾVLj]brRUkvDV19i-5%&l+S@28DZ"QЭ[S vr撶V~N.8X1snK oD}$OHAIj'6B6[ Z:MmK: IբD0k`)zc,[*І.jH ͢&% jkDFpGQSV4$kEҴ -3@[YI-/Danݢ,:2ֺHG/n- 5pQr0{<0krov#V%[Y/NtN&'ߪ idd` iKB{H:*66L>C*tC'9 |cGܶBT{d *j0t6"S:vN}r9<:a&c '5}DHO(+-l%K(`2ʰde"#x'pđ·M %Uz:OX9WR"Z- Q[!h4XZ&32&{…}ܒe^Se2yvs_>&#a)ݏm6~ܒm vCga*rE[nўVKQ0G%\JSյ)wXy-j1%xP[K ۺK^pnYrgHs`Fl7C47 k9_5a'cE0b3(R%kMj5z @cS]R*U͌T+]$m7uJɪWd^UT1z~|[m݊TQ* k>u[^0k͹-h" r}B0ɪI>!I+UHΧbk$^X@XVlJڒ7lj%GܾoڝLˤI?{ײg^;(@Y0gi@K˻Rz-m!ޙ y_QR >Hx>vgiHٗD<䟣::H,HY$bDf0diHZ)SVtZȔt-"zW9OEx&y:Y5a5=`TXO 1F酓5Ec ޡI.eƫϦTWs6|Ϛ<,'!dn[~xoS!:BEK*KK1( <XdZ s2%B m 8ÔZO&lĚɄ 1Fxx\|\5:^69%ji&qpcOj*ɤՏ?֛ǟ4uQ&v5;$]d`bZb 4&"\3uI0\3\j(4)9y?ےF Wo:92kȔLn1O M߁PY }|v0. k !+_&_p"_M; 012+p99ƫvV%4yNo5w1'=3p(uJ`pFDQi(!FDqB)u(@[ߙ+)}Mqt)'-*_N wOmq17y?Eo/߿&7L97ާóK!`N)١˼)܎ V[> )F+7P*s)l 9(;XK=}ՂV=!Q"1gŴ!c ifW,~hнpB+q ~ eiQ ÎRT#l3!PyXkt|Mg. 
f}IpX<9|RX WXPY`}7r֓)¾`9 kg<#$'r˦ e.,Ju u_)˸Cl?N#:u2D¥_-HrCnhŵ=O#Fju:w5' jN9:[+M8i%`+_u ḱ$2^Q Q+əxau)|*\3R]kTC݆/*(LCZ ]ktIS,hڦGHQ>Btr.K9#DTEG~L&?ܹFml?xqffP$(6F~E.#K߾[ƂdK-f.LA,8ɴR!4ɤT%wX8ի]04 ڏn4Ug V IU`%BS!?O$Q1_a>i9lguGt3]?|Lo`ToC W9~2qf1{qY\ֹ,u.e:hKݎm0 bV)0/N)WF)Z[nO(E!"xy_ax3'OyY>}YF䈎Ot&QF_ꙣiUMb,ƌDBK _^7/ r\,ƉN=)J)؄)Fe D61(13Z cl,]I>|w1naW~eظNס*rj~ͷ!eI_eI_.[Nl WLt./ӘYI@.&QLE"ۭ:mY"1VkvęVmܐmf`^oAyۦ23ek=EZ7eWɕYr|ye\e\r`YE3b oIJX BBOFk>3m{镦ʬ:)Զ36]Fnx:Z`ojlg0d^n| ~=Aݽ>;?M?0cX=[7 n?rsJ&2G?\}0yN *gߌf}?vW蛻sTIr{1E{f\F.->;~>Fm6%.f!jEތnncb5K%#+dzen WxQc0pLihUk싐_?YLn&RWVg:2e0YS:1T{ɒ7A d>ϓ[B:@@ ($8a"];{ oΊJ~ݤgѯ;G:ϜGƮ;g]wF\wmmYzYζT?26qyI %2$OUZVdL-u;ת:KRz祠gerc/6"ۉDJk#ZB#=F// =+yh9v?NmW@`ZX-3KGE3x4B ʄ+э4a"cVe^ Pd~6YC:%Vt/% =JQv%Z)k)hf%DKU3+~%?K%e[j5S@Re9 :)O:OKRn%Rap< /%R!!f(s㠚C65YLݎMM> ;L5ƈ{XJ9U m+ady4 aTY^gEimIG4eVpXk 唇m| 3 ܑ*Wn:,j̧,Doݼ-ftw~k)å}wox #UHH+$$z;~}bE5oAއ\8.R~bOWwj:zOS3Z`]x>`g?wo># J]|> ice9D:8XEfQe -C }FRo!CV,SfKaæxrobٟL*?\9,Cnzj*MLt &t趷>wS|8s<^_Lfoz{;~gԄ~h{wEȳ9/^-;oέr/rǘS'*܎ܴU맳U2•=QַQy槑e6,K);/c|Zzm~@?ufVSr/\DdJp֍x -u;`+=Tք|"L} v&kBȵGH$**K΃zYS\O5RB(g]*"S>٧_$'H2Vr`Lbȅ}X5/c.%L$] jAxpJ"6 XNϘ Z[xp\RFg ~,!Nx ,k FªDq&maj0t0dž zf ~| i)1'8gur?ti~8QDw#DXMn5ꋏ\Ib>v,|q4=(iM` ՟;B@@ M1U).9rx{Sp,9;e :K1DYw/8QO/6EsI֣hG)9}GӔARHruxZzNSLjNShL Npn,ܛ['oti٧T| `]؍<<9#+co| *G6;|snUƭI70Z gNǺ,늎ӺXW>~F6W霗pxLp)f.jMSnQV/}iart:Nd9QpN‰ [WFO`xt2}:h݂Jm.9k򉄜bL|$C_w Br(\s 2 $PL^FueA"?0Q C7(gx׽T@(I2` 5k=[C}a8ƼV1uRq/p Czۖ0eC^Cr2NǙ?ˀf!D Ӧ1 xKi4o;"zDA*ըf|߿Uw᭥)!<%ak{̻wY̆IvğRl޻Y/,7wa-kgIӚl8:Yߏ MCc#(6I}!$J5Sg.=H{rXP@nvʇ~u) ]{V X#7 WoB[c@<V#RouBYL&O]5+)#Нظ\LFG-ضagy4 dHͺ'B:4zFs,p$ZDv:@)NW1 ֫Ĕ u\Һ5lFh"xi6ɽکOI\Z׹TAq9&pNavKA:nv&ydK<;G %aBìڽbڗU`3yv>؋g|7OD[|}H[V|ou s>T؂>xg3VVTо|fk΋xE;α>BWz^qY[ wYיp 4U"=vEY9]^z vWkzU~W" "F+ïHT*ꀄ:s^ GZ K~HA8Ӱ1;f;QVP?@-I~oE0{; Gq8_@]6 ~݋ش9!6'.gXʠ saw>~ZmF f8ۇSvp,3{P*7S*z>άTaC/. F5ovs!byRKecaX">XIYx F@ ZPa=kRa0a:pټ/k^ϯz]z-n#,57Yl`Ooኛ0JN!+֟i),60{3t*~?&[<؋l]5[aIc_=yA}Scp18ke3[!,kByg3{~ oZOοU A>Y tR^!sVT$׈\|<i#<_`q ҈Ơ$+jo10hxwt>lJ3v1D.J$]1͵|4 ^r,E/zd2w. 
[binary data omitted: gzip-compressed contents of var/home/core/zuul-output/logs/kubelet.log.gz — not recoverable as text]
)rBsa."3,*Ke>EZXFQ+6X6k3>{:F[ZSKէݍJ DΒ2e4MN#,yJr LzpY)g,.f3Q:@՞GS1CM*osL).by Q؞pNyɓ T<ξs٭<><=[1xIf-1`*pGD gV[ 3% +rqS_7l>u+[ ͐o|?+&it].M)D]et< 'l-QMfU4 ^lEgǥ0ֲ̬޽,иa -&$@tӬ~z#L?:+uO=)rDL0ͅ4uw d6&a_" %e"4S.I'!#eɞA^:w Y8ތ4qQ< Fɒo6wvq].7ڣ/@̸"+/V3bωp$6Eժ:ee+Ry)kiM yJ:p, ^%z$ I67Nj W(0:{~a{gm㜒g~#JayGgE_8lXL2zO&ńn*%300RdiS,Y/>Lo K0l:|XVQֶþIǂ*[BaɺxL[B3It(Q#qDYEq'0OO={f*NNmװND=6&iGvqLw,>)<"u$pKi+lUWdm+":Xz Zď&M+rt{7"(-W|No4*o-u|IJ$+ex3Mj9䆠ku[NB"PqFQ kokokoko-F #3yRqG"% EʁF e— G0hQu5\wH2Kn.n/u1(&."* ˗+8 Q̈vR@SC _*MRh#T()IH ]P,D\r ,DX //p@Dr0q;t`0Rmq5d"e0U˃« @ 4!72؆9c0SB5xHFЭ 1ךZpT<"8a ׅ B (QehxF.Y&i=/2#89 ygQ$'`.>lֹkA_]^"& )GS5F!A9i I%̦gK,W} EGìaUnd͞dNP *%Il$ N-ZY%!q<C6lqNo8q;P#/fycq(͌.$.#)I%\VLhϵA1j4*ۈ[=8͋&,UL%B^"lTꌦ;=Ii)RIbkkJ3s!i_ f雋V#I>gd0gOn6xnt1 w_~Cs!0I#z*R_-:BcԗMKH9+eEPǚZEiL'͘%줞:2vUzS8iRZ\"JıԌledW6ZCڵ .Rk f>K<*arX9Nbnhy[ÈA& 52ba}0#$rT(O9%cL#u$ȵQE`>7`swX6=LpD03$ U4ሇF2nVukZh5EŒ=M0Y57q5ٗĎ&1*/)<|1{sƸ۷zWW-li1>b[Vv|IM"\>,$j ͐_}|XB;3䒴':iE"khב`oGw_'0Lpgn)ƭټ[;:jpaXjrɹ;b98DaJ[_G |9^Ǥ6!8aЦgW[wvϵ-"J'ý028!4r2޹C:>>CtkMp!>N6AX_3D>m/x+sWBCǚ& G`Gnޭ9Vw-q漥5< 1ZÄFi3oQi/ou3\vGT!ெ*&tᄿΜ W¯A3 HDQÅ"k% C6h5f (' ,݄TrTNT`)(µD…UQ**JmIMVIm#+,}e7p>]Ω8~IJ5/H9)±% :Pv7iD7RX0UǮC%sj]BaLzbXPi|̼t3{mHN/?KPdt{xPp-*c\Z?b4ℽ)~2Cv59`~eXTt]ެS- Y]!!vW8NXB8oG3k!Rb{3~4EϺ6y5k/LAh*vf)hϘQ^Me~Oi nmX3eTķg߂kW袉w8vz[w^ .Y9$!'.I2%qڭv;X'hO%W!j$E4I0?p-I}Fv,nkZFj:$E4I>CO5-%1s@84O5/.>gPiI Zyξ_+A ǒK%~~B+W nϿ+ GRʼn-3ɈP*҂L5fl#^(ֹT9njueY)l=z9E!sw/o[DRK/[ڟnQ\3q٢߳`=堳sTCUIE-;8%8[hY._mۺ/g 54YPp=N.f)=q6靼/e7qώ*؜WG,)F ngQQ=+ %5zV EM_fOm:&)=;e9m1(p!qx;񤨥` Pu _ jp>9/"_]wdK*8$1 %K'6 mRR <7H 93hW PeaG1otG cn_*VFVm)_̣ϮnɨkL1( ک5W44 ,ČJs;½ɴV YTxW8,}|FQpF1ͭՍDULPRK`PU wdЈ3r;p#.r|#DV- K#)1ksL@F9*+eV!1pLRsC,vo&dor#II6DN$٥+Y-1h(sTI WHUaXVz쿉о*h TbnC\_dAeN[-T Kup1B3 )󦫦 dȒd<$@Yiݙ6THjsK3aj _ 2+j&E옒)`G1%wbv{r'曥w*e[K9?WNCΚFw(g2P%(#eFItw->zy7]/y*J1.'(m09z'YcJM+&rL1)9D`# Tc歜|o* խv2JU7QWiZv_{l*Wݤ[,+&jNs ,0:\r'ҩ cH6«UZGD{udvu9 -S1郘*- <%5MGCBN\DdTUGxS ԉ%ĈN3h7O[ڭ 9q[(oRVۋ^EzOs>^_< LlO7uhf3x x-T̈n/K' \CJ,m(B޽k (_A ՝+-`a =u~L6 
,̓3^#x}x)R{({^eɤ7|pad#NfLX.&2M0Tzy 4fZbe$/+S>)SKvoܙW+o5rJif:Q$+|=Vキ!:~|2|)sΖ]\]7fOz7I! ˩:9Y)kpC5HE3kSE1g@}t0±dd{X!Hf[Ik3\eFKDwߩUx(_*&n4~rBj1;m&@*)WF; @˳/"˖2.- lQ^#{潻{8j~6YVNl Wh)ii!.mn|éY@3"k23 p9 9Ok8==>|ksH͐wehRkWk.#҆g4|>D^HHm;O(PrTM2sKcT!Dʙ%L\98!Bo;dT( MEL&2ȾE|4Sdb Zk 3{O)H)m,'.9eg B^JW E;?ޟXʠ{M5!L6.; -T0ŏ5G FD.W#g Y󠷸"ؗaנ /#3IաbN̮?-@rR}پvf.CO:,0'be3ɞI1KdI;ރbIϱB b {8:[FDX"ˡ&'9JXq+5hbPa&:@acV8g#aXA9.SBT :"DBsjvҐ\hŅАa5b2g cp"i@{Caμ3mȇ ͗؃0; A8^H}m/?*^!ߠwϟ_.x;vG?ã  _~~{tOW^SxS_@Mχ~g4G槧_F[S-wvq? @ﳚ7}qM`S.* FmCGX o@ыZ]N}̴Wt'^|۰߆ͯQR28/#̍#T" NP`Nd8>oz7REiC/Aozdc_Uwn޻nK}U-PKI'KI|\)z;\i'J{KC?I+_nY 0fqJpycaZ-{ͺVWPbzw~q=>[oPApw]UpًX5)03kz¬~:};j侠hef4v;PMUHTp4}SX/чaw.U-+z-QLTdNeޮ6m4hbT̎groA ?%.H#U;'0h +(0Ncy*aXoثzwqc-WU ]r:XPG _|0UKVRNPWkⰷT"0C̃2a`q*7lrs♁Rfk u ,- œ.WF%[%pLe;A1B@6;d{0۳, VWAЯCg[&!ۑY\2UmY B2u  fτ`RJQ-n)|AzKcJPK] 9v8gCBE|fRpJ͛R/^1 =m5:̀"@ uZ溇 ֵ6!CU~PbXp֣נaR5iN% B6 'jG̘Z2ӓIJV# isn#e}PJhiFƆhRrl%ylęLS?ZdZp!2r'FtYkn kƈ0-^`-@jsg"XB!ω@1jgE$K.Blq$}Ei q> H?ӷ#뮓 ``׊, @)pp0K8Ak"-#UͪA@HJ=RtPM;|%}CAyd^pV' ׄ'$ᑒ/r\P˵+ (icՇΕz9DABMBwI/GPo&n&/GD4^S JK3]D"f.b(${Tn2|]b_|HG̗Jq˿%*l R;a9@1Ú|lZcz: kM}Y} uH߼ #ذL2Mqkaٝ4)N.>Hoʯ`zSw확65'Ϟљos#=q{pN.2.{?5[P7W5xWQN /&tvR/&/܉5CX!B+}$i!fpbP k%Ԛ"1Y}ϣGcr!7/J.|෋m~UA{e*#2c5~zK}}w_Qx5Th@a ޺p~-<`Q]5[pƝ3.ϰ62#VL3j,o+go_ ,01MRnZ$Y;ek翤bԫbBx1?{WF ,"zr{Fy^y-:<O$xIY̪"d,fE~ ˵罳k#I$oM&GYd })Ӥ#օ<!pqr`~_o|`qBpjFΡv ѹ)R*=a~b^wExo >lTpVƳDZ68/A31;D I~MMvPUw"׎=M~NsNU EgQ ʈP&08xp@Ied RxWZ:}RunIF(tL{U XJ%w1N RSŬ \%8HA D(k>Jr{D9#lO$Rvpsތn7`Oɢtfb>܅uV=OMje=C?<^j JNqҏ)no?k7w7e]ǥDx}oixï?Gܫ&Owr}ҽ ~ٱ&Qxߏչ.n1iD)a?&zIjufaGoǏK?kzk!<_{t.h2mm39+fMPqMQsQsXE)¸yS hIr^Sڋ ͸0 `/f^ &Yi 5w'4sTby8T }8zV+͜$&%ryA#nrvq`qi~8H5%F4kC)?/{V{qʐS~7{;jSw ~8kuXzWcGĖ]AXyY1eН^?뼄j}~qqOZ'F:6 Cc;jSLS%j/qMS3nTǙV sUj_^Yρq%C Mzഭ]GuvxPhX1LDi}6 In5ViBHTL{m1SI`&oTUY@nE7sϹv[+('aȌ!T;ЂPV i% *Y"lTxP պfٝ~4=r V$Eغۗq龽RZt4#HpRK9m~;MP-0;F(+dQpeU .; ZxC+ƣ=XFmj9ٙ0ױRǮOo7qŢڸ []:bݒA2sF.X$yz.nw8ʘ6w5`3ԀhTKdI"MnڥIBQn#+N-Z#nǠH(VIԥ*cUnj7.#n zߵwTi偐]զ93ZC3z*{[U?5"BIqWԌ'n[w7}gU:0 PS[CY5VhC9CorEe'oY"dGěgU8t=O' =jسQT>+J|şiQ:Q%3Jܻ!/ 
9#`F*;Rn@WǦijb)ND_]JwM~kDYWO&XyWRH+tAۥWTUvWMXf:F!~5$ʐa}AJOlզLw]mD1Ne.c'?+Њ2BKzKׅ\ N+຾u!^*zU*K iX)hCv!ʮ Y//Px`QAeu CNaH) T*V'J)$ ѲȝJW&%"BF0ܙ/*җoHTj2ei:H+1N@ J督Bq%scRzj .2'e;}s<:e̓0D Q-y|# x qT^ pǼ.%ߨJ gՔx RL_F.vhDg?q~x%9/_7$( kX >)\R!YlR yF !u0K%e{pu;띇J_]VAiF:C+Rۃ۫RdR+q1?إBOgM"y1_J5"AA6ngCu8s;t A)0A7?dlsuD&Y"_ Je|VڒJ(MXme!(Üֻ_DT]DFFtdµ_..bVqrux9nL:&: mD1 Mva袺3t2Ϳs{ޱvav&Et2ڳ~$V:'ija4,qeYwKý4'_7ƕ{gn2j\ـhkrJkW1WXjOi-j-^2hثgѰn|p(ax |N1];EAT6;=a ;v"14+ED],` gE0c?0PRڮCqA[jpso d㶮xv%O$Q~ )[WWqd8T"rZJOk0 i#,fIʭt_ᅰn<֕b M@TLZډ(q2 QLp*޻H㕠XA82 E,%02JK2ªpƫ S9iH 'Л۸ڋ{pڸfvk(_|3-dٝn6vM7 M Iڢ,7T+M`Aq%1H^i\p1‡ 8)6 tt3KTETsAu>q? =MkZ: uGK0wG>ʍ|oA5|KaH#xjF$. W_'[ >fG>!RDJo ~)W"`A5`L3TȧZ65J'b  #h:A M*zb*O?):T8ᙶpp_J1# IIЀ6G*))(1Lk:E~ Ĺ^S褑(zNw@c!GPc G|t^qaWke4I盏AHSzy!zmr( {¶~۷eYyAK+ -<}oaRR~(wVY8h˷'E[8L$L!Q$VΥC @Sj\{ ӻei8G9 pZ+A^TD{Q 4 [d>g@/ BYRΛbJi@K!2t55{V <'ȓ 58E.L>Y"n6\]gc7wd5∃zqw{3M9Ǔ\-(?eg/9of X>S @d_ŃB*s5 gZ I;Y Tf}5cAP~\!(?4o )Of][^֢H' R*i߉iT @q+i*H[6UdA)-*(u8ӱD{OTPiz3OY)'q|oüWrx|u1~zQiu8z.k8<=dR8~J)\zn&,t?.7'D!4fpWwq2~n<|8ǟδbMQGV&+PYf fM~0|N0p;t9cԒNk"l_Է!x:$wb>CL&G9\LMIӋ@ETùIs2JZFJzUF\|p$([#*V6eQSS0iU>bA(UXSUEj#t&ϮdrܭͩHtyU@B>+q 7G FTA4D01] Z؈xؒjdHdpkp_nYN F]&L|O;>,رFgB?io:8o}.ͬioֆv4ȶP:χ94J✭4UTV|7X(sT ?ŕ [ZWN``.hT<,:+mJ"-LA%ר;D%2FooSwHeQV )bTggKB!VrL3@a"Wx?cƚeM5Bf.Ӆn'ZL_) v=CPn `EH~4)(kۑ']bݰnT@zEfP qFRv|A+<g\[h2e(km+GEΞ;zY`]t{z^C&e$'[dغKvDS,Vůl*φoN>G$ )5/rVMKA,)]}~v.gSw}yڑZG_̔tcy:Y~CU?6>Y1]-^b"N;οmz'vvz?]{~Xtc*cqD\%^C`;}z{'CỸV0 #ɛxWķ*=g;{w~DC眗}ʁ_8XQ !M7Լ}n hRg=zT]^ܽ(g<;v^4Kmrbymz{f9`t7CTo:qZݹjs+kɄbIɾX,csťODF7qYĴн?TxVϕo14)e:!A߇~%\N?/džzx:K"Vʨ f8MO(Ht"{~q;T4ʀQ&#+B:4i S1Cߪa̽bW*w[T1hT$FlQSMWLO>%&Xo eѽ/rਸ,E$AsvV,Xp&գ¦CQF A:.bo_b= js^_檶D2pָbdw &,-A=J6ZhĤ͆GӜ,lߛ:v ٙN4 8gr?gd;N$*AUֽ\tyꁯ/п,g'd_3Sd{x.u 7x1F_';vK$ n}FΡ~3pcp'@OԌc"CijHkDRs9|BSB&oqL ]sPywCZU_; "l(CL[hsW6y;h9Qd|v# cvcH^:1͂c8fދan_ J -~ylM?iU`-ҏm~F55#wA-޷|ݿzx Ɉʇi=1cTqך8<~a`(BG ["BIeRa#!1g mi;c[㹚]0yꀭichDny '5cÍ^ATC' lr`+ vx 88>T³Nӻ\U C*d=7dA>Y4!C ֩&'R QC2Cdм SA?E Ω)SA xxdaZh`xʠ Ti&8+rG|p|GhXkAdT=>99je)X|sAJ`Tzm\iW^6D,l*C, 
Nr"*EP")PN s7yK /eYa[?Bmh}DEࠃ*F낍V |zWTT eG ub3w4dJX&S:ڭYvd3`+_)׻Y 2_vػʭf'uӫk{N1B1EUNUPJMm.J'BK/{͘̚GvցEBآ5%U>O$mJ/B/]_k͵=٢+;Vg #RP^6TWORdO)nIfGaR)=Ky 겄UGqb zdFxNm%I2fA)&MRTU + G\Y(E Q{br8q{ǎzKs:zw^ YB/b@YaH- 3(1˥dck: nYbcEH0b=!UtV =j 1"]_;@7xqJBW*%k =#.X[jMVḇQ%:@R"0',`1:#3*C2㡇 ٢|^{W(,5_?kP^x.t+WW@#Tg9iǟ(.5 t_=M*Yշ`b,U^gmP)>2e^Wr+$H!Ie޶Ao>%$ ]J2FFe(]($4PZfY$aZ)Íy#sV)&Bj  VcykX:٣8z*Ϩ Yw PԂjM8vVj>j]zY>w=F@c##DZ>Vy+h5Bl5WͦQ,pUDKwe{HnyslH%f_2IIK=EҖގwgTwAߗw]Ngbjgbڴ|U/FWjhR0d,,/}yT9]0ruB<}DU)QzRRιhpEOQJo~ Ǡr2 xWi8ޑ`m0.kEmms?[UȉIbU*#*}鹡w{[RA1S %',LGLtb5De x|qVvإ/6h>i}\Oן?' '阦&5g_|6uW[YՖηnyw<cyB'LvySt(+ZNVJU0Gi/xsоZϮh˔ZZ˭ D*EIFc(J>)TZr͵nH=xn%`!1QĢQTc/% BRhom6%Mz.Eu6F- ס O)Ԣ9% H[Up޴UZkͥ6ʺXn*\$ȹvZnk$ݺo[OʛMKZw`~(飿Lޯ5!V&"g_>Mt\m-O7t~ ߙ꿶ss'Ɇk,=|'tg4~&s!]?f?=5H3HeC>Oph #8eQEr Bq1J,oZr\L=P2Bwh$%X"SZC+muj x9J2*'WS;evRewJtڮT+-\StyHnjyq@yMT Me&f=W_|w<^X?~wv~M}p#zXoqH]$.nU6O!H̭鼜$oo2oU A {7^!(!>~ZVo%'F'בc/w-dǤ9j4 c'Sb;vi|*KdZ<0+5pJxJVSgaPcEpò5<Tڱa/G ,'Ăq ,%%C9ri3?+A+J4\{I+:EE|'5Gejē=Ϸd@WmV BԽmTܘy`_@ Kt$Uk7KPҵ%x *O~ElJP{!Lf38Ւ(GvZ K}pZ066':91F,Q,1$u%ow2" 0ӔԇZhiƟZVeCf va D"KK3i+Ni`O~r v]ݬ|ΝA45 ")KoӒ̕g)u%pJ*1s$V:&_KTRlZE$cJ0;l4-3 c,`bb.Jr !rHCm 4%+Rq|Gdy`OO0!P]tĩkLfLKPRy}5rcEdU=;(yRb4WY`\k#>4P. 
K|` 0iѺx J$He' [a Dr)ސ %"(!.(I,AGNct)`E WYZ֫RMN 3΀+څpx%qN @UHfmځ eB+qpmPcb#^v.I>wemI mC|̮΋:Iؼ & O @:˪̪̯6K5,kIM[%W"srW(`GL!.#B%+Վʝ)Eb  }ۡDK3BYe 0 a42 gZvRҼ>0Z+|Ru5f!c,}4QC4u\o,)DaƒbroT}]a*RUS%B4Ðn'sp"]eYjDմVFV[(ʄrsAj%P 2$xU,ȹROžⲸ0_ܸ*,b5_g@ 0 㡟)f )K tq?|}{37g/Ͼ5/$`8bqѪ~֨HRbw4o!%Y,9կwjޕ ѺVԛ |𗰜|8J/dLx 9 D/Uz vE:s03OogN -N3`b>2 : >ur$yа`l@\  OYX!DCIwM{(n7PZ4`gIVwfu%m# Țp׼v#,z!C+xBdS`\`4uzN'4y:]4vFRxT!}ѵX/FOA5(JeՄ bD'C $!qw,L[a HvSjV%H۲)Jgosb  d~[…:}X w=燣 Y@-c7s8KOSdZq/r%H_vqmӥVM~9s$ XPteAk8g,݇WHs5弴(UX}`Ć˸zF5n(a_3Ռ!8ŒG"dg@FdXHQE* sif§/Nuܭ4-6eӟtl gp@@+g5Ҩ'}r5ٕ15 b<.h!K9|R.CS$e"7q"#>ĈF#.QR`$LS;b((mwV}{5~S۬܆¢v[FKJEJx4.%FF#7B92H,=PRC|&vyw@^4-V"^20$^je*|KĄ@pPZ W8(Z~Լzxה E5Ut}f>I"[k~' _-Lx sNrŘ{}A:8XI::5¢`SpNfj tV|wGqy3O"_uftwws67NòbƇh1֍W8hݺb3tuۣd4ֹ֭NcBBpuSB=Z hbhXh֭sGs[#h1FFbAM3A{4()kK@CJGiPKpJeC'6ZVvx`?8%W}rޙc 9-`\(M̮-YHL%vO;A+C{{,GV&xUß#+ V)b 2awuigu彘R3yX=Q֣{ݖ PbC(pfPÄ_6G|P+JaDP\#`,\99%o'o>\,f%^:Y k{PG+&9/+j]T"YWމT.{ڗZ6 ;)EE)Pfr2uR5lTs>ߺfnm1x} `%>DӡKX=GOw'ʴ tau}Z1FX sB/DU x0:)IŃS!)!8_5GgZ˼h-sZEkvwΫ)AW * qwd'ҝ˄6/FrP}HO?_ AX=Ic$Փ4VOcu+z`GP8zb51K"a7XZ$'Ly_:FGI8ٸ5/,lrI|珏n@R)h15m|wEm]նY2`RX%:l ̇2!4 6ZzoT{.ӗrqFM2y3R}.8)IkCs7T":ř$#;Yؒ BNw2d)% dSK!|0E $`PQ8hKY&3'KtPLQPgXq5𔡨DD1Hp=J)2.:) Bѭ>व $%U]/'|[-XW鉿YE;ˎO"<-yGR!Ȣl U⃇/usDU?o/kݜ$I΀ZCtד?c)|ˣɹY,\nAv%v{ \ 5RKDS0jOws-52p hB>05JUX :Weo"bj"E\>\W=If7X39&Y.baD.4`B#"  eVv馛ׯ^~dfJQ$#2׆qa6*k"`$9Zu]ٳcZM(V;ر[/}>_C`U;Ȗlа`[޻]aD (vځBFf^tO|N =tf/kv!VhHt/p-KT! +AK[ mKJ*F"T\\/*XxqU'x_o6 /Vg_\&PՀ{70/1V6I50/n[kir琟B9g2s4%?{ƎᗙYL+_e2X,%Dߎ ۷ض-W9zIl.~*b}1>Xy>us<JZCERc`Iw)Ҕ 7+sUO=ڂQSǐBȰAR1HTAgi2q l3A;<sߋwYVF">hFbpHSMpDI/Q85BvRpT|<|s{KWNpG$sqҫ 16x/5ފlP,LH0 OB5'8d"%2ZPy0t0RDLNapF;?Z{xRgȉFsx82X&1 zƥkF&IDhF]&+-P{σﴑMjG:-2}̱@ *Gi*&c~fYXnNP)u:Pu_PL"1Kds "dLSkSE< ϣRb"nUO1OuH *4h@()n)̀Y boAjmD\VvߝW3X; Πu-(l+٣o" ,8z&d G71ݞ)Mj\q8LF c!9ϥo 0\מHTQ*R\rT˚$°(Hoz 5UsyyHc;FXBP\S aXj{  k~*P#fm5ikAB8B}vsxHg  B]SIkzx/˻;2[ܐ~,ধ.Kڸ3 rWf9yI-\~,nch} %&_A]mӥ~R/>&ynWBB,uiiXpa 4펜;䒃!\[i#,y("Kq h!N)P$4~j3:> Z%'U!TDPX(X M!ֹ:0 !" 
BV#Q(ػPU Vr3?SItlcU7GrNzk1\{w|  Yim \5d`HFY3T7L &gZJ-b{AnjcqXE ٱ=(f'_{2Bh.'[x՘hѣXRx0Nכח2S_8ӢS6J* *yq!5iڭwYn3Ώ( CysDP:#ʭLkҿ-Ub4"/>?50 wrƆ޺4擷'pՇ|6=~ΧKD6]^lYƝ4Zѭb)BKF$k7*>hhBM2Q#j B$A ~(=lFӇGwdbhF Jf5YͳޓG{"23L”='H;bmB^vVl^@{Hc]~G0^ìҁ'[yDmp:"/mXr}|dNwGy{T>]&NMH$#Zƣ0x=AReMU >08}Y0a,D-\=RUb))4'b`krceiW&N*i9=vk[NUWjܑDWzJK+tQ d ա}u L&WWH 9N}UTX:]hz@ꪝ٘A=M |ࣕiCpD0Y(LK^lrY>0tR H \uyހScឞ^ Fdkݺ72 O@)ePBQjcUf\[*נDJ#2pI-S %Ng1czLk:qGd=$ IsE?~ĺ94h@٤j/C9z.[?w7<32z̡P(J`L gGN M:{mg2Mu8]i"pn "96 LLig8r_Qh;O)̤eLՂJB*l0ce 0WGVត(Ŵ. "`J.*ż!; V[K  31 )cҕvykQRƭ |Ti p'F pRXI(zXIRo59L,t$Ÿie %CtP:TkM4ct>M$GDa^m'0sY"B"$cQ!ADj-#"~{ B |8"Ff`"9~i0R=YAY-=Q4 8=H#v ?ᝳ|x3Δ/;Vhl%N)ޖ>kKkYMgd7߄ xi5Yd&z=V7:sٯwD铁ZٮStȲf>G"OT8EPS &X?lGpt ? Ϸ(hDWrû.<#@Vt l =1묬.[J!|:P;>D?eBU1[[?XxS )IB XC+7?0˃Ĝy8q˜"ʍTmthv"*NjNY3}Po"'{MRn T'Mo vD7CO=̤U%:byb;ӹw;㙝 MHZ /qNV[ޅ M ZF#qbH"[I`,!WI=m-C{:sBiSCSa;0O<2am><2af=Ą}!M_l-gMܨkk "cmGz|t5=;lK q#lTYӣ+^CܠҫF/І+5N>bjPpXwpF ćNACzorVX#OD1WIѨڇGb#+"5"ZոJܗalrlڬ3(m%&"/yquEAuS  Ä+)%o&1sa"`! .N"mbRZAAp,FRze騢(J50ȭRT6q>URnW_땢t Nd{)~ջh`WdR·k>ODWf9eYcqFkXIjL2;O=R\N[Vfv'?%ۡfG0*'};OG9Τd!B뉉DZ~~ jv#g֋T]۵]hh_j f}mֿ!=HwŊݨU/<;({ѹ[]EEw\wTugɗYrGLK⯿6FZ7 ,h^XfMYa)6>BOx>IFq=2n}t7g6~%t [Yjx+Jilh= _ .NY%O*y- )<W]~۷?{i+P,(uу|1@+ g#X9zQZ ~qWx(AՑC8hI~cV8YI;#~gzF_V/=: Nf1dwщnCʙd率%NIgEV_uuUlWͲoLk |yZ>3xPuDor]!ڗ Q&QrʆnV*Y .ף?v sLЛ?E,n_)] [FY}&5dDocYn6#g.W@\b *0g(L.ug40g,b.?xj8b6ccYݫ@/A' ~\^'>biM܋'fv{Iٗ^4_|Ybb\ { XxRTvzP˓OUqԐJU霵Y“^;GZ%&|5/4-؜/oZ9͛ 6/}[Ywm.@RXmbg A͚f~y}y6ͶbE^z#wno rluLŤ[H"ujvr|B@²K99\{si%8;yf0q㐷cMkքUJ:<%!e\T"{4z}`EZÜG 2+V8`B0z6 vΊ+Yv_ThnaWYφy?#>H P3G`mz H9Rv:r2tq`Nm9CO̫^b- CXO J0:KRl Y<d:8%P)Hbs16c)ߤb`%3lp$^.ӞeN 8J( Ú9~"в ӳq@eZ'5ukwάsaa_[ekQ[QɃĴn nlgUKuY^Y=7e}bEA@ <!ГA8X h Igϑ5b^=ߙq ҡ\@/):{9T@LPY[v&<^yaq_f|)/#+ʚ 0X5ț7hT&ﻒP 楕Y#+X u#BO&jo"amR }[}\v/Ec1\ o_S?JkTyys+GG[ù(1jt{Şdžw3B*5)\I/ 4tWBhA {QWGQb]RmEuZt) ;CײH}@ 8kp!-wsu &$ 3{6W=(i"Ø7WTe2wa\/FEV 2*U?[K ";saGM.;|+,;\6T=ʳ]QN^KW=*®#YM9*kW3$².6 sHyR<[UbkϷV(Y]ֆtw2+ C,F[j;V0՛ oՊՙlB4s*8|>V}cq*2Վxc{M7朂UQ1׫o,Bt g>"EřbVy1Ah%HOµ8Zޗi\aA Hΰ-r$ S\s20 
:䣓ˍ/=IV4V(('ε?TVi+M+Y. u[ǿ:vѴ]'{ {F{?8Pp2!"^OƟxfhmTQd,– z qyŬ-diK#Czd2e!⮎'])0(f~ߓE 'wYyϯm/R]?}nP8! 3. PkBHÉO6nDύ: j5#;P@vw Ej d:Dm 0 Bp*( RF/C6X.7CNd2%>T?t=*A ɄRdۻחZ aV·=TfÑ uD9@r!綫( CrBk%!?B @\#.K$`=h׫@]Īsv@0vgbLKeȉN380yVio?;qiքUJD¡(%-*Å\T!pnԣ[L4$=flFUr%=v[nbfk[9/῜ xH5͂Q o.YlIŃ%e|gny9k[asYN9.g9|nK RP2uiTHɈ <2`ϥPpe,UBT8b#u߿:$a ?@g B 14C'Ba:#*o9ژi^Jbz[ B`gZڲ4we3/Z) I֢j)tXo[C8wHsa\Kڗ3H TCɺOQU1 .Z!j3U,pvoLK9U|o["\ Pl\vK(4]F^8!B߂@UĴKa ;b-B(<=~g$ PE$ќb j"JȆ9)l,$S4M34 VhPS v!QIs4iJ9Bs_ n=]DFSABB )AJ" &MllO%T?fͳa/dRܹD`k`k$ t6Z8ֆQ ҀXă R[|ԁ!&||MGڼe ^[MâlW )A e(0@3P(M9 U`j p(0>">Dr;J':G3YJk"uTK/1-t% ŕPKM > h(yHSRV %#Fh:teyL]4s]?^3Ɵqe' &hqޫ5vD˕S;}r6]`vN@̣5%GP(FqS"!_~}40VhM~@ѓP j@bn0@ڎʬ7:DSBJk$zɵ{W6B/3Ⱥu{76f&k^zQl%JKvD)(w,U^U%SC˪&e3,[ȌQ8)s3_2^rW&R-:If L\6v2>MO? u,Mo1xqǏ}pk+$=\+&M^K4Meo/QeßsO/1~Os*͊S^_⧴\;<>Fx}F qJ?aA3T j>YwSl*sP) f\$heLsQ: &1jkh A{(5V8dsZQ#V)1ZOOʲ\" x+s+Fшb>TZ9}1`%`d"fo,&?cYra8 gp&83@t>57aQB*!_TP#6h$ҡZ01d- ,ahU/s\ـT\̌'9J|LVϼ5 hrҒ]unrR{lR0gieSq 3=:24:%R;#1L'G7OT"Vf\(0@:֑&T%  1!^R:\|m2auz5=y`4:k2UMZv*ۋnIRD\40.T$PaPXz0n8­W_KWWW*5ݬ"U:sA:LpR/pY& ]Fj9!g{W=5%;#-LPavCTOk)(Ȃlvk:I K2P O0[ؒtęb}w9vp:PIʨ| 4TT ZȴH lbgCr@BPO}Wlz5>RjI1w=y27#$uOGB)bxoon'u?oA=rZLC笎p$i Z=:G٬PNtΡy_p oLOȟ&<s#BA36>}9vm-L&]3qMgw2z7&B>esVJΙFAA>JGG{q^d -#=.X0^ V%p6#ytt_lP%Uaȹǂn]r ?ț:Y:-@md#L~au M 77]#,7E2lۮQ df &K~vQ}Cv#yZ~w .g4ZcwUkjPn= ؛xe F keHe{pe{c$Xy_Y'"pVY`6w;NスzPI`IwqR5'W5Yr&r1{6Kv6 'jΤ 9ƒHqJf-*͚2%[qЅ>QϳUa 38b<3ȣ;y\vJ­\{[\RCرq9|#}|$t ':e%:  ,rK sJS5^z obG\ݗ(uZI- IBjU[O{/f#Rg%H'hAX-ThCQJ%1RA yFHQ1%QrCyT=7{ˡTjTP;.}>W2ϬP@hP  ۔q9۵xnh ,ԃS%,ʉ3P/L ϔqjf |u.Ym [G '攠[<(/ U* N4*w5y C$N=ō1b7ɩZnK/Ǭ]TKt("L`ZJX7+RCjT(SL%|xOT_l <חJd~Hy|R C#SoDdVE^cZmYȥBܒ `UuXI?{_Mԏ)\jr7w%qyS@pPX]!ZC,|!]=#͓t\ Д %+OEFlШ(FׂZ/4c`Qoj(KUuj0d5-Η "Ek?F;-+9(>>J[f p0"DGЙer0`R*Tq85A`%(Jo,=L1Km$Hm3f*M]9a^3 Մ>ϯ%x'4\-߫~ $F5gP ')ڦYϜ:/?M + )jZ~\ ʯVQ<#Fb[}Qׅ^ ;\&f_%gu7UTRFXj5ιHf]d/d,봵Λfhsm9ڱ fd/;5X,6CTjo11nPMBޏ36YtD3$: Gc84j >cfavpfPpѡPB)N-5-I,5Rk}})`\0cz(-*8IҎ"5BSK+g C/ǵS^ -̅M4ЇjP(X0A T8{?rlT!jKca< Hט_#%#B311c 
:J‚)6DRPY#QjkNsEs}7NmIL213k63rZnFz+-Ę1 ZRϓmO'o>GS8t5Gv{~φ1N"J[H޸3(Ay'B0 %)жJ eK*PlceE1f^ZJ4iks(Ѐ [o@pYY,|O!Xp8u5ټ sr>׎0Ylن9Af [3뉆d&|y+ Na2ϡoY?=!Iܔ ৰ0*p{ GلGFvy& |@vFKe:%grׄ7_ZFpv,SvH\tN~Q~إp,ZhBDvLK?)~*@w yzLF<}l@o{DZNIj=tt~G3*C鍨 ]6Cc;+ HlFn5_:͜mk@~$UmliøX/^bhz4x/2Iï`SP[g/2@yOq q<%BGl jv\|sև2 4Gf1֥Ru8yɄA@$Z%9Z|JC@l"# UѦ|^?r'U.A@ֆ!4ue?]L@ dJ E ޱ+9Z9BL`]ɜ »2]]Y0E@Qm1!~e]}CҲ̗ 8-.PpZҧ#hrHy^F@Z Г6=9+}Y*k:)ե"RB $lOxppUZeZv1"d/_TtCy2W4wn"TlA_f7NƳI*Ng.oU x[Y4Vg:RҨ75Rk5#2'$qoGS\-_ʵ01&SYceX1pnic&czE >Vuvl]8EUϖ_nGq\ Wc) oJHVyAC%Z)F1g{W=5%;#FuQ!T+aQP~b-9mkU/3I 'f/=EuPPizkzef\ܶB߭ʙ֡rx/@#sۮ_준rZLC,Hp׎ppK)tף2O̓#z[ n|0U68Ӟ1rTϭ;7=^PB}}vt^j5:uqn]|of}'eTjOL{C ,}=\8wRhXUn# UiV&-^F;T0aECk*^|^ zq߾'>1Oݻ79_GDK*vcR\]jKIm:-YN:a h"۰~?9wX?ٻ,H7^5CAoǚGt)92b)%=F?o/vAֵ4?;KqAɨg7rI[imEQ5v4. B T%i{ڻZs]>wLafkE2HfrHf]$2S0=l;6 hu*HC߃#ȒL9 *^Ȍu.5J'ءN?6҂1题?_%Rѵ->fL &*d>^y{חW_fC_'RػO%M Hv/&konF_eJj$U%lMmm\*hX_cH×4Ù! ]."AL׍Fܥ Pa$:wlC>x9!]hz7s7 S~L|pxgEPj^xtrQl;f3^yO"eݻAtoO::ۦux@ ȃȇfA>>?+n,4F-gdKK?$/zӞwsw)<"Gp"0;3_ջM$;?f]`\sq.E8ł`+jgΌ6;Cz*`؏w" O?Z_wc,E,(Lgao¯AQaPWYlSPQݟg|TmPɷA%Ua,c)C΂ qjܦdKhL*M-ȣIF0w9O:N4MduJ}P'z!ti}?3yMj?|ܬʇ |w4t(\?L*& !ylg>J,[&=C) +拋KK8A{=#]YWf+Wf^ݕvf/y͍+) ^.6>Nl~ucU NTsW%Nsj({>~բL­v[(ޟ`A@ZfQ IUs.yi+xTIv_!,8Ne㔊mby7NyS|_t=/ϕ#>.ŎN^$k.()fU ȗ-Op7LőWLāxVY!VJ&gE^Z7B?\4uV3hu!jS|Vqs OTȭ&Q14NFSxo~p?5FW+IqG!}=l$O4S4qhٲ%"ZH5̨M g٠drgZ)es-ĭSL2Tb E@`"Q $=gc 4[+@1k72VLp#+Ζ< \4*WcBMŲ' ^U蒸+ Rf-ݐM=Tqr*^qnf)|Ph%l¤Elx$fMmI/yFYCMɰߒX~HcV*&S44vȶOiՂ~Z~PI PIegh~vJh3EW[hxШ&YȨ7Tpkx31Ѻ]m'ye&Qgd^_>G f!G)p-M-4 EN;̣۠qҤhk~@AP8e(^C2/#82i Ɉ$D=E:;WDURΗ+%P˄ &U{lh̸4 !^89KeBLh+ƹ)IL G9<9c)wOe;k 6y}dJ 9RUw6 0)I"ޔ݁74DŽhiG+n~Uzs&Yr;wkD$5)prR٣`jmOwlO&\)tJkSw8ST* Gd-Ss8Ap#:juDkr -bBr^}d*6wV1VAAt*Bhw0Bu. 
T0.gdRcgS*E7X JOIHC^N SsYqdKPe$\;AȫVGW"63fuBg[fSU&aU{)kV#/JF_mYÆ$O.˔`cFRɄ-~<;HL'ecXFa1Ӝ V௃UfZQp8P6aX`;+XvYoL=9ӆIwLu,id,i D5pL.‡"7(g[dF> Ms8&C(W4{FCb=KiGY-ʅ҄]]7h}D\s*w曤D3n$QoKL" }o҄3J12fұRƔkb!x"M3hrCc&&ZS LDbM(/" 'ZnQ;p k?wA`:Jեv9}_PS$(|ܼbV%@>Y1#/b@MQgzE5GuL6f/7hԄQ_ )0]D]%z9]Cr0IuYm2XN 2SlQ^f ] "yxb&lb^n8 (^88|(̇b D(#wb/8$R'[ &`"acؤQ'0ÿ>.i`rE$fB4]q͈j-Td*񏽡` uYPfj, xYP,GV L ŕ>Z|7w~w%f|`b=`3U?Mꓷuy nʈT@,+oY1 N8y^|UMb &2&a75Sܓ60d:}2lj!)^eΩNm._̃ο>\~5:77 ^t=Q<{Q܍G. xO@"7_]3}+}w7yw}LLufloN73,2:~>"weh+0zNPIa4|P^5fIO{ܶ d=|UZۻuN.9t(KP 9xJ[p뙞~,/bҗ]ctnV£$uzU~`8}05`~@ ߠ#nF'ǧh _&A8/)t±n$ݒ!Lڥ49x~_oߜt欶v̹.&=_ =P͞W '_Kzrº:rr9Yn{9Ift\y{O\o=;c«Ib}e99Y_L&ˠSXϮìꇋ\dw$o_|."OA_Cp~{rrs ꉓ-9tpZ4r'MdkM馋IFsl sX'}n'[ЕEsٺ2\tm3]9N_Fz1݅x9jHQ]_1B ?qvf6[|q!gz.m.N^xϵ,f9e>$FV JyBIfɿszj\Ӎ 4MHz)NT9w`҂E=o5Ǝ3m*0$\Uhj\ I*hmд[x?S=icD0&DeҔύ">,5ߪP[M4\ootCi5[UgNŭ!"ͅ^IOK iA/WVѽhHE RpX6}MSHOHO5jB [(' ;(Rb.a*%%z=B:j+ U{=:t Rǫ&@-Wu:&H)e:FGTQ$uLP Tk1I4F!k$ ;mA(/͵CQ$RX/%jɘH VF{db9 $gqd&eW|3q`ۘE l,fmaka)@ +q%B F&@aED8< Z4IbAY^lcýY_v=σW~Ks] }*L{.@ u _I c]q?;:^lE`N% Bb7XZc]8ӝr#R &v4|}MoPcZ˭&4JAEtuK'YQ >9otդ%3Me2h\Vi5 ô'jwQTj\N=۝jEk$U fDgVv S}JRHVJ\ZL +")9KY_ 3/NN%/a\XY `HDgANA<#8R1 5PeįI%lεgr4XN {&z2Ҡ v:Nnp{(s֛e)ּ.yǐ! ;SJ{q}[m Z ݓP|=.2+!ƽx\VDT<. io˼u OF[uaمkm ; {g=/GL@gェME7jG߷)-x3UCs'RIin =K;wg7,XUp@R%C;L:d0D^k]gu'WW]θћMҮ\1׿t^/@ ̮kxOt#@ m'\H]]d+ xڸ tB<!%k)[eH`qh:&-why,ܥ7ά619NbjyBFA(Mb(V,4Dr疰 H\+4nZY0&%U_gקZfz0’%%҅#yM@jh~}~iE0֬=[=ܪvl.X\T{~i/7s6,gߝYXi:r4]v;*G3Pz-zk8dCHV GI0Fh"D4BGGa`J z4VT} ʄ0 Y 7cmʈ q_@lq[Y{aG6Lx[мql8B}#-J-҆1;v5r0,j~?%huAuٲSx#˭Yf%h:ׂ`Cף'9wNpMHuA" 2"W. 
MԈNEqg8?HQ7X.AmRqA_U갟B^%FצZrءL}u8.&ɭb6.KN|"I@g&`F 4 Rh6W}s<}z<޿s)zļ+C[!zm9 ~-\ͽ X+Ҽ?tk}yAa ;|}_FH{!_/uhXhhB3>= j?vu#-o`;~tprg %#H\%SlEeZn|f-y%dzy)"-Cgc:&%We`XiKۑqKzjD9HʳZ$I<+A9k/lOP45FK}$=\~ RDlTKB::2'3F2zb e*3%:2)v1m }YMDι9N;s^9@;H`UB+m(t,WjhTjfYAXPJ)0QF?e(/'<4~!/:9t#z6{ȯ,wvI'B~jTk\j=M]{`tPI6o&0'Yt$o'&QQ$A+X(H (.d t^w3UBJbJ&/i:J1hIw㖱| @g) $;_2Tg]L1mrPj6'y%] :ߕȺ27ς⛾mNQwBzdvxYNC[Pq_` TŽ PwՕ^~@+HWDm]&5/]&5ev(Z:5 ժPv|Fۿƶ[fh+&yxo~UxעFcc2ryYT( oImKyqf27[1tZQ[,(`q**`RĤMН&.a0au$'4T(F,$L+Qb0.'IHU@Օ ~I&:2bC,a $ZK-,gRsI*Cl1np⁅ Xxa&n51WF8Ur1X`!k0/ؔjSn}ۈB2VeP /:~=?wUt{\;"_{4֣:N[Sl |uS<dA^<exVyvɥ7]Ο>"`?|Wg޽}hLr%=^_O]N}caRͯGO OhjWe庇.[)@/9 g@s}u}Kٻ&7n$Wz%3(B{Î㝙{8-&c6oGF%Z -Q/D"B҂/l믋}wנbx iFM$ʊ@Y 0Tqk!qʱC'Ub_FTsDvנt{yw:m@']?;Mz۫v @[vI#.ea־Po3ńz /X< /79תh:joʺt+^kt\2]0¾*G%%/ ю|* S'( t䤖U2] YĈ= i!E]耭qN-g %p9)*L\jX"OϾFhʇcD"NWr 8òhlIsMbOQK` g`Jfبww Lj@ki=.|Q}5*+Jju;7W|́.^CUx4Q.2J z Az.F[W/&^ԁ*6^M5%]Ҽm-yU[(]U-c̨R}z_CZˮK^!B}FY+]+ëE 0'о?f;_QR>\>,&Rᙀ ~ЕTJt㩕oTry֓SK`R e{{$%/(<6S tm?s_yx6sd4PVDEi&뺜.:QB|OZAi̳]uE[! !Q7ajS,hӁu28CS(ٙyѳôd7 z~Wo]b]tԸj]+KfSr|E 7TJR߅#\>`ÕA,nHzJ)~҆+2:ҌxlG۫t~kp&Mx,Dxy$WrL|:`w;1s0̣94)E%"{T+JGStVC7p4R;Z$wK4U)єZ:TtJ! Y0-s)P~g i8|s(q圛+gfsk19&^G,"{@Ctlҍ@ᩌ#fa5 #!]F=YC{74]-@SN~SRM[{bnceN=<a*,҂v.:sf* ֻ#h7}c+޽dKn+tVE9#9npfb(#rc%rI9f/ ѭ]~wFmC+BSAw[~\3jE|,9K㪯"oW6FdZW=Uv"EIׅM5uT쪺b䒐\UV\cF+Vi1" 3uy ؐ+-I$H.V?k1YR)@NsV(e.?Fu QMg0o2/7i]t0Ú7C 2xIsW|ZϮ䤠+t%_>6/ƨztOi> }y!j%ej~R|F<;Au3Ixϗ,VlF`D=h3g4Wߜ5H{MwwJJʱzf&=&])!h缴SnB. 
T@0ʦ{?ົaV FJ8[tW1qpsu;5|匊y=_0&u]fǒtW2ОuDNa1)=KirJ&WU F$yUts|N76Uă6gBjwn/7(YhWQOQwȤZ"UT,yGn FJv=iw|i1DHM6-wNcU<~|vi|fҧHFHO>We9 ̈@]FDntR 4hךȩƤ- ۽jAm+m@]';{W xj{vq6\i6Oۭ41P2X *R"i] v3_Q@hvg$C4J] ]C@Kk6zG4f*InQGFmIm 9둞ݶG/Q*@㵵LKc@`g!p4䥖 !C[J C߿T4ZoPOo riw1Ԣ:MK(<YaHVp<ٲxՒzivv-&%''S-)tv]ŨYcΫGoK|65s7keo]j/T/~g1_d7x W?b`9M_g?5'NJv*SǤ\MR7LW[e=TUf`?>,cTo?fyR"n?_/A+EhyǨ!7㭽w)e0̫mcIz}1,]_KF zR*co}1,yzg=ߥc_~!B|_F5f.XkS~| +ߙ6P+#S={k=,su>,zx_ϵ2c>tPr@ؤ\cC($?-wb=.{8$/t, kF'>g BƉ}]nXW, T|[w0 +yp|η;ߌmB0#cN_fq/Ӄ3r7gj'JrsÃ3s'7c$"˻Zr 'Eٌ\3@=Z:Ӿ0^>Q`DFah^ǀpeAq|$CPUܧ JɖN{7yyf]~fu*By7O~f6<&Rm;eT{=.ݼr[-XݩյqTWAKp;N}T `3$ef(TK0JJж7_}aipQoVlTwy]DT۾jCѾ<垝 h"5^Gk U+B5 /:%Fvg[O A+i $w9%3jTF+I?`䔢˂K܂v[2   a[]7MF{pv)l6wh"a0"wy]4pʅ[i`bڝV3w/G `@X'bα*:1c#yrLU a+ذUk#eU4'&8@ &K4;3Aa$5H=cfZn#EX,tyRlyǯE$Ψs}?z4aocvS}|%pts"nԈ}ǨO>;ܜZVNKУO?n dKqek2A Gh#9;MGWr+V?0;٣Y gŒbo J164:^,nKf)}t$ NHH9Jd W9ar)$/*(%EYq(K{?,vϹB53uPS2/Xj8,AJEsAL 6ƞTPFJ'#%P(vGb&z"J"H(Y]#_kTSʙm36)J !2X!]YB`ύ,ےrVwzS (zDɉYd0@l5>]0&)}ւgScS کmB^3Jw )ROʄFYdaMI&^(dɾXwid=n3qQ*]>h?iM=f/GZ_BC^bM ;M:ͺDMX -'oS}b*z;l#} y΅ZrgU?Ϫ~U< d-kYє`%"k%rG6X#0W~oVU_}fC_vV]f_&ns6V߭ws??o[ң(ØEQLznhyo<NצxU|.\ ewFWgXGl`BTLuA ,HQyu"FצR~'A`S9`87ќ'T'0Q keV@Kxrk?,9u957o3{ { IQ~A'.wn V=]@TB/YYZme|3;%Ym'!S MFi3߇^\,>;l< =- ttt0 HxiBc)*?>_ O0Sԓy:ڢQ^[|Cw0;A`~[WAed7':l Ɉ$pNZErӥDIφj [%Z]DJ~dF*ƿy[ͬݛT17+^[WmݮE;*N&zx8- sh{c oGMG<i"t=r7[7wtGڇgz&q/(^aVQr=_~x+Lg~í[Wv(7zvl[ň?}=*R†Ψ N4rޯ;1޳/J6$6dщ$%b@-SNdxq1;j=4fQg TB7%-E2Vj&FɣP{Da)$o\$3{XRii5![S KX'$11i+\^kK/~|!}]xIRF c*ܳxHPءDe L`ى"*[2ɢ0@Y̾scHSxY\R}daӒxqTq'VuBߍjQbDyEH `2PYh_uES 8# Y7j^:hߢА09-X3ul&Ls0a+ټ0q{6sLg,rr-N S=Pd:S*&*){S% e=>ouRI!DK5 s(ͱ`TURiJ?ijQYo[! / AEbhg[L£~`SCLr-x:X9N¢1 u'kr96='@RP!a±&E?E|X_O¼7Ussáaޢ*ѦB1S|ob!@jNKʞ]Cjd=F7]Ti/!N(զ7eRd 9ɚؘ8؍LE'vN@>Ccm! 6S%zW69-._4?t/$_J{#jt/چ8l +Ib6mvys#39`G8kauCQyl/ݓ/͋rU_g#><̬:efo툫VMj~J#D׼f읶a??qקD<&Ͽo=i+t;wbIY8\M i}[i5oM Դ&Q#%:5*g?tp>Ks2i\ZU}}CKI5IϼU#.WXNRc(ѾG^7]QJF͓QN N۾LP 7u-&cHgaε5[$ '(+TbҊG؝|SG-~3GƐPo R^/_jV"6M0.:-#8nRl')w<^K/v"_ԩ oLXyogVeT? 
5&;nݰd$#m^7 isWh WE$ v&{liV%&_8^#:ct |D=YDzWb/9\ynZCR۬"n3?( пCMD +: -ZvS2lNtdk?N ]wMu'YR9GzK~EhW'`+8=FTV鱯>˯&NKЏRtJOc: @ߘN%ΪJU8*qUncrhAz={Y9%AI̓@2"TNVf Jt,,G}ѠC__?gC?ѯ2=Y&boF}}ڨ UG=;=z񡐚;ʎjvI*-J?|'HvA[3_.?|CHJA`mWMAh߬dDi2s @HcY7 9"&ҥd- U^ka%u6Q&R?O53G-meزI߾ [gV?S=0gjk3躿&:G[>8BS[GF+gu7UdSAk;t@0-O>la,K(9_}̭ifTVD#OhWϠN24FNx4Zؗ҇bnI1RtMV)X^h ν¼qʌ[ySS5FjB4 Tl_h c0q2/Sѵ/"24"D(1ݤ8p#KcT !dKXvve1z2jn;f\1#s騶ۻ-\v9^Pn8]7 AP^ {s@IK݃Ys jI`׺ܘ 9ib[$/. ?>=٥yE֊7Iٟ9y͖şϯyvu^\߽c^zx{z*;Gugz븱_16@<\@yiWG,),JJMv4nJNpvZK#S+SK7tB>]87^9څGy5oan ,+xc\IeN!:gD=tW,f#h*@..tuX+ T9:=cmxRZVFʑ߮\U넽<$LkKmʲd8/8G9E7im][oo{ga\mN2>>1\q?:S6 %.?~[t%D rQ!ೌ2x;F͑I:UCPϫ?^[$0bT2a+0,?h]_ۻ5wc6LBH*pX"`S*mGN|dcJRse./ZEK lW~-oy|˻[ލ}XYm4#E_KMQQS9lp!R>-K_H|@۶y^)otu~-^T::zt\iۓbZ$vhar)Hmm (>CtARAs0)ee#~`.ugKhk#Qbj?ճóO,YՏ,rF_x;O6&;69Sm2>AeOMT͇_yG'U~PhIJKw).IdQT'aē$l2M(l\ *qE9A+&hoKHz#CKO$n{~ԡt (W5OYKiEl=O 5)![_>A&$X^(!keݣ~ "F}PjҼ{"t!j++SK>L2؈~`#l'wHbnr ne_~ڲC-_z˯p9syM-Ͳ{O8_x+d||нg`{GOix&,L 6%Ψk.q^4bٌݰUnE_B'I5VO@DHs4F7%[l]]N~bn޼~THxo\Ad_R$‘}e>ۧx.Az7AcJ8AjEeb$O^4--AkCS6~<pxt]g`mdb0 "J-IdX `q'BR? i8_f xةxvg:3Mt_{|CIaA#1!؊lY9NͶ7uRI{TͤRИ:̉`!py#Ec=>OS ڳ\iνϟV.a1^5@lYYq]mY~ZJVpCCdZqgZ2T8VHI=#zN8'!,V߯ tV%!#mM|4P%hhX*4V@Lb .8#mL(j qUIFӑ 'B3 6^QpOP5jj7TuNIq~l;; u]+^izKhC|X"iGf=˧+҇% +J=QU+#V@hӼ>[*v=ebÈVv2a+Ɂ2r~2{o|N6k#(,кoH3% |΢reo,B9a unXթ謨CɈQj9ƪ'</F Q,m*]xdH7"[q ȳx’. }A% 8AW]Qw2G :EeXMluiqC<TXfǛ jgRmvz{yPVW],6As!Ƣ7k=o{%~yqg2me?`Ձݛg Ws9 8ui)"d"ds=CT1DFiuo5wx|xP!P]PE%V7fE9+\+3EФ/!"jժF b1>f>ԤI:&Amm^呅Gh+]s+粰BRmM8BF/j-*!!Vウ:Q!1*EtLeUA@Ĭ|^,ҬLZEB$Lμ$QhXi%J*dbm,p2j-}; ֨5IgStJUKM5I `!Z56. йP;dp3dMklI4T#<j#y|: jv^xp]fM(dmdmYC! RSjuZfԨr& @vPx3K 1Vi1H Pd"!NX!<@Nh#ڹFbĪiuP;Y8z+eMT`ֆ}˓Y$/ěekd&vpUphP J2u>Ԯ5Ό B%S:zRZ[NRjdy:J [Unm@o΅r,xMUk.Z ?7".)i_,P6m MlRTB,TK^ #Y&"YVW|bP _;kacHWD, ( [4 yh1HlU(*Mq PڨZ,^Ȝa%4lue)5WXH%t_J͌C`.$!j*J/gB+wi*"1xڝqk$M2 83#{,IXH(Ll3-6((csȞ 6A "g@6z ,rj=E%EH9i Yl6H@fJA* w!bR 5L4x$s0u)I8!m<ܚPKAfn$4-u 9- ?Ž*PД "E(u _)K,A"a@ffB$ >gWXR! 
eVKQRw-$֝0Z8=pzypz鉼OM{ {;ɟޞ}Ɲ$t~0hJ JWoN?wv;`׮쾕v#q=f\NK4ZGj#S4U6u'HO<'D161F=O[]|sgӳ֖׬P%ɳ޶<>:M|8?y[9}IbE_;BJ%d&Rmg-ái2H`kc%z{=]<Ӽ-3*T` HL(Zp*$Rx1xc4\J̒\d7'uDj@H_,rt5YY_ȩXSZD ;s'KmR+#%೚,mR | _ү]deΥ4($C?x@ e,iyN5.4?B+NrCL[j}ٸuȠHfnk_!HKRXk[ exa}?^8~4Twm7廸s>wG~tBWW,s)%\Ͼa3@{ w9B]c8Fjl1d/ζW/1OOXVs]nk)TO}L0eT X^asw7koVfā}T~̼؄˵du%ڠVI''%91IbmݣxYhxc9RI?rpڻػ6r$r]WlXdO Dv$9a+-K1-9C[-vHUzA]^WOwEzb;5^ne_]wӐ%__ykK"NK'/E鯻1Z9~?f5MnȄ>r;y7~'k?4ƅRIY NЯ~sTa\(ݽLHYkǛKdlK${5KKq-C'nd(>_5(;Mn x :xXylW ځ>)L].Myb8$w-.|\p:13Qps8i,eaAPdbgbҹ)y rpD]}XyH{;W3~85}u}Q<vz.7wi~HF\_\X]~5p"{㙌*Nf2ŃވwנFPAK$5 3 ^H'k> /u&Eb !XCa ҘHx$pg%%C` ZhǘJyQY1jӁ%Ђ-;2":K._;$5"Rre?G=TkeRM B(fn: ܌ 4 Y*f}Q褋k=EJ3DqT;*5jt;QߙxXfc#=ɉ'mtJFYϞj):I-"-3{lr{\9>\G8E p1 J`@ROMߤSX!u)nym{ ]7wbi;A|ܫI/[ߕGis Bo*uo2@"=︉>4:ؒmb Nfx0IL|Aɚ(ZX |oM%MJOk(g-9:FO?'[;qϲ}e`@JhNRMERG˸ F aώxbdg~^ZxN|rl]/z>[};۷1۪}l6nT6hz}Jd*+ŻW(V(Gv#ܝƙc@܊ծP DUh`X^8N4rCnƵSkGKWoӔB! 紦\_Ϲ|kԶzPMIN?LuPaͫw14S4D;C$ {KC"Z}Yc0-2h$hG3`Y,|T)q]̳|~Bj٘Ooe/GT@U WHcPmjI:p`dXlݢW t |<&xcM ИMC(h]_{ gm'N]yW|HE'2uN ;ZMMr=HK-'!PiQOLo(=nBnw:)iO=N"T=<́"B|io觓~6 N0z_ѨH:>هH) UR*>$bO\-PmZSR 9.Z*WIݙXjG4Ȇ2oŒqɐ Fw-ڇyC]h] %i&I1$ᖄ  F"z'3Em#)jK%7DhmKFQ6B$= cBb`TэV { ` kcf} #rД3/``sе_46ɁočI5'ehd%0UK\L`ՁR OMx7O9xy|JAv߂`w&?8٘I`u|qr}6ߪ>ŢCx`!Gր.<9*VI=J xqGI#U1(AVaT#B!) 
@s 'UɑuA{Uxvꯩjּ4++^};׈9vh4}~G=%h <{R)Yu{ QS* qݾd%۹Jk4wcD}kGr8Z>>-M.N/8Du* v iol$Ƿh m̸1YG0&۽$x9WyzaWb6ڥ`v hM8;Aje}b>/CX![ib"\sVhJL&·KL Lh丒Lh\G{fppwl~8k[%:8Xz;~J6"wD;n~IluqM6ԃgC[`[DJ1RiW۠JԥS7|ZKI) b-(u1lg,βYu͢]R"`*&*R.FA(sA%yE\+Ă%ǾhTwɱ//ܲ\~b;!NT&uyXR l, %= n(Ziq\(\ס*%qDS}|^N;ة7BnM2U{MO_|w5kpQ^{ݭv.n.6}4JKq8d4S@0RZzWUQz1Ve,-dEaUJ7*ˢvYv`*^)31X6rE!h˒m-&|[Ω)\ _3Qnb\>l4*ǚW-/`kDSMd\s=XEP9LG?b60mE4QP-}bS$RSmDW$rꯕ|Ḡ*4]χ/vdԪSUDGrkqZ3ԹYO8ݠi2Z dBChˣϝ{a[>CgNטBKȰ6|tOxrU?jy5ёB=U +u9F_L5CA'$ϖZBc/Qm`눰n&-2D@ޠʯӂӋ\~jC\z_?{V_ ܣ1Fw[-p l%_g>{a$V$t3/o>_UtNbxm>ˀ*<㡀2o|DxլS,(N_ه'y}-/1h[-#< U.7nlȓ_I*~ޭ JL;x#FS7(nCX7^6H>Ոs˻ih{ح ƞhqnnX76beGOu9,aOC(v;$!]ΒЉ$_,,͗z$'rB#S{0OMnQ{s<\\NWl##A3ŧ;.o~i5V]im+w=Y'.E㊨&/OpmL.6ey p1v z͉hEf`܉BҊlvBzЮi#`{AbZyxxpޣMYtBtߴIծU 7nJH؉{XiNG&w\CR:_Ϣ$al:[S?1X9=GiU!2sfg="ɵ9/m-(9k'弤=$N ۽(1J;r)lڡTlڃ)@JlCPhM e%꠽1hpAuE { ָF}R]R2DfenjDI$;8*ǕҌa xJ㢛wwXqqZ+˛\(޾EvUhӜ0 S -1i.rAY#sU;Ƴ.Y-vk1#?Q^AZjMs^.B/^Yf0R?{6GN8{Bm䍟'.<v*s- %2nAp-Kizu`&)dᡮC2)i$:."y>\"@6N=zGP$0:z>P/O[d& H I&~!:!U՚#\KHc !!-%Ɩwd712#,*Pˆ<m\LU.rQsa E ENЍ=}|YQjLܨhN, AHTPrQ냀NlŘ&y y¸%rO#{D{0#d_țdžBZZrzrȵ9.ow,]=} Z ܃=\˻"c\V5L7q^:>xSxKa>)v@]>D]n'-" kRJ`Өi\- }hRP(24o?D$PA-+@?0rɀA%˹,!*Zrc8GUQ7a7NLKHىqplPpuE:q4x1SE 󎷧\i-qR+k9PٗIQs*HgEFJ82w>& @\q#jPINQ2 xrG3O0uưqMc>%6R=}66%,#5__ʧ"դo*ym=1Ǧ=ߏfWfPр M!W6{aIg2?'g~]jq9f6]f`GW5/|њg1%-FO;'O/WOS&16)􄌁诛lo @<8SlɊQTY#sP*x`AHnAhU,7j%N[wr9U(c]%UP̐".Cȫ,1!UQ.meȫPX]Ji y^9@R9@^j/KTdJG/>u@IXwvr-E8 H/ :VmʼjD>bEԨ.4SEXeʻ*ӊL~.d=pG |R,Yem "`R>`\2> rɹy %IZkpsj4?k9#&"/lb5&jzkHWTwWb%޵+Ζ FGWt(B0E^"&JOQXEw d1h5rdܵMteĻkP-83>5m)p&BGw 6{Fw|?I;gyW+|5 32fv+HlAkN?ǿ; R3GERVy蓆̸5BA/ zJ0Nt.>%ƅFХ P)E qܨ0nQV=x2(LZ@qyjЧoXӁ~-K; (8Pjy)iŌ98Է롔n z&%ȉWgB#5W!/g57GސAF<ЛW)RkQ_j7`%%Ke*=LI)ƹÔD.1t[s .:]xOwZC9n#jcSC]`DdIkBHjIŰ]}ւ1 €ށ)l4 T4J5K =6P$RYp 4D4)qugq-Nbm]MG\yQ5Z}wGv}ƫ YjK K`ֿW?%ߗ\gOyq_yR1ä́[_;Jc8d˹lm$ۢEMb.DN(MP\-aaP lh0(Y7)cM3?$d/m'',M^L;oK3 ډQI׺E{.Ǡ0#BeK]LuHl&eԚ ZQ(@E0nX <_2BhH:2^&Sb6c-kHƩT]iUyMVیrn_s\GhYm]>BJߡcBSyس9QJ`+*ǂTP Pĭ=Αٹ>0LtI fo!@6q@-z i=4Q6̮[ZVW!0n6;Z :J^3%= Rn ^'WryPM9?O\]_S 
WC;uk>H2YR&?P`&ԉxGB6=2]h-8Ayd{>,|j`W->[PHNl}ǫ;"VYIb>63_fn)K([]3nz~bܿaGm\2 s #(졥Zbv;{ډ@zAۏE|(zG ;^*W!F&dL a2Z?ڐ{'qІjInTvwm$~3ff\a.X*dRlJK+h9v~.`m=%k6U8x w-Lkm}@3> sZk(i4yXbW z<ꦞw YEQl8U J}7ՎZFɑ1 D&7.=w?j7CjWe&vQ7 Eb}F%Iܟhbd'ȥpR/<=y&T E2Ha6;I^Yc )@5PguӑؑAκ餛 2p싱ojwށ:zt& i5/3g'p4lQm"ԗwJzVʧLh ӌ|0KǼ}KǼ}l.iMmØaIAze ɒ5B`1&D}1vNUN0ȝ+3rB(+m_!GưD{$ SE1OEn9綒}HMn/7>/wuqH_Ud==$p n6Egș%CvOK$H`]_X7qv ҃2Zq6k6;jO~j{Vg7EK}? =#({1a+&p&F.^řܢcsd!L@5X^O?req e`J8=rhELj]o.xGVZ_"sf6OY591lQ-{OΫgJRH Z )G"vR8hаykt ύg*OHr[*CfhO' \Yjs8w[mIfcľ$d/_|8")eϝ?Io߬gǻO}UJAe>D{~S݇L۳~'f>x{~Qs%쀄DIHnYI/q$#/a$/"PqJ?oe%eۃ~D&JL+ 6F,sP1`Ε#$H KRs ʹ 3{_J#㝆1)Cx3N3#$ +7޶i,@Rm .xՌL>ĔIXB `fśn F+N?¥d,Wlʽ]'{#Ɍ8,`n |˰˰c,( QDZȔadYWSHzYms"n@g|>ݞ?3:zF) t)ڐօ 9'Ȣ)>BAxiJyhԠW8u *2:LPW} I4Q~I7 Q)ѥcx\Ή-^BX/3zXQבJP)Mz25)Itk\@/BK  9)ƏWw/AZ1'*TaWB%tu%JUO{*w{^Oz@*4*5xJ8Q\+C-n9mhɂvZDdFOH)P1 3MXttE'|:YBs҇QEV pLE$]1`v"/AP) Al-S Âf X5n~:T_Z-} [+IYwK9 p!ԫ7%)Z䚔mUnQ7ҲmhlMRA+gQ %5뤓1S 3-)pgVcd]5[TGf4=CñNjmwso|h@uE[wv1X_/\eV߬j|#**Za? ,$~EoagH'ZfxweG\pdq%oMɌ 3 De6n{+Y}Fj EBƙIXf,.,sx eW5x]c|zU$gĢ1^Q8Qo|}?||<OHtLwLˡl "хϥ4KrICQM u.Jc`re-gqɏ_gRڼgVr.[u;͜HIRFM歆REcR&ldW-6Յ u&X>d4z UzѮL3n.DSZٌ "c&\HVac54oXe4Zvfs1 xQeOvQPW<(d/|41*`*ԂG#UNa0{N1@Ž'M@ZI!=iA\PR0 VIIO7xaN{ BQِ 0SǯA0d:Q`wOOT A^NX֕QDc{~D$b&@^`&Uc5vAPi%[r#"Nִδ7eʡm# F@>wO}67ϒ[1- saQdNpIkg h|wbFX^73s&m]Fi72rS2Xѓ uha-ʥO:\1DZ>j wX5z8/ג"}L<*)M}pL3KuNȓ2J?R+kn#GޞpCy6E2xND7JR* _"H$2S]ӸmMjӶBnH$-+ôl0d/ΗVneQ*M=:s[}Έ8XQhv9FMv̊|s7KUl`کd>`Qg1^^vPQHQ)p]u֮霈H|:`|XCrٚ dEiR7 ފiET,A0qMym 氨4~&fjzSꈘPPr#3ATyok0F 1L>51b@MIvиzbAq tmGJb̓={<j&hݺam HVYw ,\/kSd4 S8Ypk"ļR!uz,U>:qXŴY]p0duxΪ嵢,kַ^-‡zx ^~oǚj^+Fy\P׬iD|oTgUs gpj.-)G%5rcPZ_K&QK>^%L\@RsI R.'/$ ֕8jes3TU iX!*iVU|rڌ;ǺFܻK5JBps^!jD:`OT[jv4Nwq3MA*+Ӻ z*Ciȼ*vXr4 W[ʵwR?.df0 |X5:+ 8Bg HgZ10"獁HDJ.խ((Ri1e]TEsʁ7E5 hr,[v.(+۸j.E*F(@^1XK%C؅"7'luۅ oj 8=/PplXP:ows )R5H@0J3Y2]KC`2.r4uPMU(QQϣNp3&\%~10ͥ dcLFmLE@jhBhHTR2e [榢8 PLMX(JJ݄KBCDBIcaiSs"H|+~IGzei$ruc3?3jMخh03K]V3u3qb Tk|NswXn˥.?܉Ua1wш,a "CBR- Pa K&$B1L%8E.'c5ERVŖT(TlN1 +%{:Eİ0 
map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 13 13:57:01 crc kubenswrapper[4898]: body: Mar 13 13:57:01 crc kubenswrapper[4898]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:14.924981673 +0000 UTC m=+9.926569952,LastTimestamp:2026-03-13 13:56:14.924981673 +0000 UTC m=+9.926569952,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 
13 13:57:01 crc kubenswrapper[4898]: > Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.435929 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c6b287d795e90 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:14.925102736 +0000 UTC m=+9.926691015,LastTimestamp:2026-03-13 13:56:14.925102736 +0000 UTC m=+9.926691015,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.441169 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 13 13:57:01 crc kubenswrapper[4898]: &Event{ObjectMeta:{kube-apiserver-crc.189c6b29e5776e3c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with 
statuscode: 403 Mar 13 13:57:01 crc kubenswrapper[4898]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 13 13:57:01 crc kubenswrapper[4898]: Mar 13 13:57:01 crc kubenswrapper[4898]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:20.964773436 +0000 UTC m=+15.966361695,LastTimestamp:2026-03-13 13:56:20.964773436 +0000 UTC m=+15.966361695,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 13 13:57:01 crc kubenswrapper[4898]: > Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.445021 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c6b29e5787021 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:20.964839457 +0000 UTC m=+15.966427716,LastTimestamp:2026-03-13 13:56:20.964839457 +0000 UTC m=+15.966427716,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.449254 4898 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189c6b29e5776e3c\" is forbidden: User \"system:anonymous\" cannot patch resource 
\"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 13 13:57:01 crc kubenswrapper[4898]: &Event{ObjectMeta:{kube-apiserver-crc.189c6b29e5776e3c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 13 13:57:01 crc kubenswrapper[4898]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 13 13:57:01 crc kubenswrapper[4898]: Mar 13 13:57:01 crc kubenswrapper[4898]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:20.964773436 +0000 UTC m=+15.966361695,LastTimestamp:2026-03-13 13:56:20.97679212 +0000 UTC m=+15.978380369,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 13 13:57:01 crc kubenswrapper[4898]: > Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.454467 4898 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189c6b29e5787021\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c6b29e5787021 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 
403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:20.964839457 +0000 UTC m=+15.966427716,LastTimestamp:2026-03-13 13:56:20.976852041 +0000 UTC m=+15.978440300,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.456609 4898 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189c6b27062e52a7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c6b27062e52a7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:08.628728487 +0000 UTC m=+3.630316726,LastTimestamp:2026-03-13 13:56:21.876616881 +0000 UTC m=+16.878205130,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.458491 4898 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189c6b271398acd1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c6b271398acd1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] 
[] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:08.853802193 +0000 UTC m=+3.855390432,LastTimestamp:2026-03-13 13:56:22.083996564 +0000 UTC m=+17.085584803,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.461500 4898 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189c6b27145e2f76\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c6b27145e2f76 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:08.86674623 +0000 UTC m=+3.868334469,LastTimestamp:2026-03-13 13:56:22.09242852 +0000 UTC m=+17.094016759,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.466490 4898 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c6b287d7785a9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group 
\"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 13 13:57:01 crc kubenswrapper[4898]: &Event{ObjectMeta:{kube-controller-manager-crc.189c6b287d7785a9 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 13 13:57:01 crc kubenswrapper[4898]: body: Mar 13 13:57:01 crc kubenswrapper[4898]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:14.924981673 +0000 UTC m=+9.926569952,LastTimestamp:2026-03-13 13:56:24.926947356 +0000 UTC m=+19.928535635,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 13 13:57:01 crc kubenswrapper[4898]: > Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.470348 4898 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c6b287d795e90\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c6b287d795e90 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: 
request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:14.925102736 +0000 UTC m=+9.926691015,LastTimestamp:2026-03-13 13:56:24.927043028 +0000 UTC m=+19.928631307,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.476120 4898 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c6b287d7785a9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 13 13:57:01 crc kubenswrapper[4898]: &Event{ObjectMeta:{kube-controller-manager-crc.189c6b287d7785a9 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 13 13:57:01 crc kubenswrapper[4898]: body: Mar 13 13:57:01 crc kubenswrapper[4898]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:14.924981673 +0000 UTC m=+9.926569952,LastTimestamp:2026-03-13 13:56:34.925598071 +0000 UTC m=+29.927186310,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 13 13:57:01 crc kubenswrapper[4898]: > Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.480835 4898 event.go:359] "Server rejected event (will not retry!)" err="events 
\"kube-controller-manager-crc.189c6b287d795e90\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c6b287d795e90 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:14.925102736 +0000 UTC m=+9.926691015,LastTimestamp:2026-03-13 13:56:34.925655033 +0000 UTC m=+29.927243272,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.485379 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c6b2d25c427ed openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 
13:56:34.928445421 +0000 UTC m=+29.930033680,LastTimestamp:2026-03-13 13:56:34.928445421 +0000 UTC m=+29.930033680,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.489930 4898 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c6b269da3aa8a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c6b269da3aa8a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:06.874811018 +0000 UTC m=+1.876399287,LastTimestamp:2026-03-13 13:56:35.115195469 +0000 UTC m=+30.116783738,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.493782 4898 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c6b26b020814e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c6b26b020814e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:07.18498235 +0000 UTC m=+2.186570589,LastTimestamp:2026-03-13 13:56:35.736330494 +0000 UTC m=+30.737918773,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.498948 4898 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c6b26b0f08c60\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c6b26b0f08c60 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:07.198616672 +0000 UTC m=+2.200204911,LastTimestamp:2026-03-13 13:56:35.981696086 +0000 UTC m=+30.983284365,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.506122 4898 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c6b287d7785a9\" is forbidden: User \"system:anonymous\" 
cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 13 13:57:01 crc kubenswrapper[4898]: &Event{ObjectMeta:{kube-controller-manager-crc.189c6b287d7785a9 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 13 13:57:01 crc kubenswrapper[4898]: body: Mar 13 13:57:01 crc kubenswrapper[4898]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:14.924981673 +0000 UTC m=+9.926569952,LastTimestamp:2026-03-13 13:56:44.925695304 +0000 UTC m=+39.927283573,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 13 13:57:01 crc kubenswrapper[4898]: > Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.512157 4898 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c6b287d795e90\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c6b287d795e90 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get 
\"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:14.925102736 +0000 UTC m=+9.926691015,LastTimestamp:2026-03-13 13:56:44.925830897 +0000 UTC m=+39.927419176,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.517685 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 13 13:57:01 crc kubenswrapper[4898]: &Event{ObjectMeta:{kube-controller-manager-crc.189c6b31cdbc6f23 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 13 13:57:01 crc kubenswrapper[4898]: body: Mar 13 13:57:01 crc kubenswrapper[4898]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:54.926380835 +0000 UTC m=+49.927969104,LastTimestamp:2026-03-13 13:56:54.926380835 +0000 UTC m=+49.927969104,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 13 13:57:01 crc kubenswrapper[4898]: > Mar 13 13:57:01 crc kubenswrapper[4898]: I0313 13:57:01.678353 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: 
csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 13:57:01 crc kubenswrapper[4898]: I0313 13:57:01.738690 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:57:01 crc kubenswrapper[4898]: I0313 13:57:01.740775 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:01 crc kubenswrapper[4898]: I0313 13:57:01.740846 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:01 crc kubenswrapper[4898]: I0313 13:57:01.740867 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:01 crc kubenswrapper[4898]: I0313 13:57:01.741878 4898 scope.go:117] "RemoveContainer" containerID="f9e87fbf50c4eff43d663444f50a95a72aca9900fc77a3d96f9c7cc7f66196c5" Mar 13 13:57:02 crc kubenswrapper[4898]: I0313 13:57:02.043282 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 13 13:57:02 crc kubenswrapper[4898]: I0313 13:57:02.046561 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3"} Mar 13 13:57:02 crc kubenswrapper[4898]: I0313 13:57:02.046739 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:57:02 crc kubenswrapper[4898]: I0313 13:57:02.047740 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:02 crc kubenswrapper[4898]: I0313 13:57:02.047772 4898 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:02 crc kubenswrapper[4898]: I0313 13:57:02.047781 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:02 crc kubenswrapper[4898]: E0313 13:57:02.374452 4898 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 13 13:57:02 crc kubenswrapper[4898]: I0313 13:57:02.381406 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:57:02 crc kubenswrapper[4898]: I0313 13:57:02.383408 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:02 crc kubenswrapper[4898]: I0313 13:57:02.383549 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:02 crc kubenswrapper[4898]: I0313 13:57:02.383639 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:02 crc kubenswrapper[4898]: I0313 13:57:02.383820 4898 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 13:57:02 crc kubenswrapper[4898]: E0313 13:57:02.391950 4898 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 13 13:57:02 crc kubenswrapper[4898]: I0313 13:57:02.679758 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope 
Mar 13 13:57:03 crc kubenswrapper[4898]: I0313 13:57:03.051686 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log"
Mar 13 13:57:03 crc kubenswrapper[4898]: I0313 13:57:03.052465 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 13 13:57:03 crc kubenswrapper[4898]: I0313 13:57:03.055317 4898 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3" exitCode=255
Mar 13 13:57:03 crc kubenswrapper[4898]: I0313 13:57:03.055357 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3"}
Mar 13 13:57:03 crc kubenswrapper[4898]: I0313 13:57:03.055426 4898 scope.go:117] "RemoveContainer" containerID="f9e87fbf50c4eff43d663444f50a95a72aca9900fc77a3d96f9c7cc7f66196c5"
Mar 13 13:57:03 crc kubenswrapper[4898]: I0313 13:57:03.055637 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 13:57:03 crc kubenswrapper[4898]: I0313 13:57:03.056957 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 13:57:03 crc kubenswrapper[4898]: I0313 13:57:03.057142 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 13:57:03 crc kubenswrapper[4898]: I0313 13:57:03.057270 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 13:57:03 crc kubenswrapper[4898]: I0313 13:57:03.060408 4898 scope.go:117] "RemoveContainer" containerID="a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3"
Mar 13 13:57:03 crc kubenswrapper[4898]: E0313 13:57:03.060995 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 13 13:57:03 crc kubenswrapper[4898]: I0313 13:57:03.679006 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 13 13:57:04 crc kubenswrapper[4898]: I0313 13:57:04.061626 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log"
Mar 13 13:57:04 crc kubenswrapper[4898]: I0313 13:57:04.677101 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 13 13:57:04 crc kubenswrapper[4898]: I0313 13:57:04.925797 4898 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 13:57:04 crc kubenswrapper[4898]: I0313 13:57:04.925941 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 13 13:57:04 crc kubenswrapper[4898]: I0313 13:57:04.926034 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 13 13:57:04 crc kubenswrapper[4898]: I0313 13:57:04.926283 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 13:57:04 crc kubenswrapper[4898]: I0313 13:57:04.927842 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 13:57:04 crc kubenswrapper[4898]: I0313 13:57:04.927925 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 13:57:04 crc kubenswrapper[4898]: I0313 13:57:04.927941 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 13:57:04 crc kubenswrapper[4898]: I0313 13:57:04.928551 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"ac7ea5df99bb3004d7582910739297c1ee1913de5399a5be9391bbf4c96be32f"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted"
Mar 13 13:57:04 crc kubenswrapper[4898]: I0313 13:57:04.928670 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://ac7ea5df99bb3004d7582910739297c1ee1913de5399a5be9391bbf4c96be32f" gracePeriod=30
Mar 13 13:57:04 crc kubenswrapper[4898]: I0313 13:57:04.971774 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 13 13:57:04 crc kubenswrapper[4898]: I0313 13:57:04.972039 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 13:57:04 crc kubenswrapper[4898]: I0313 13:57:04.973541 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 13:57:04 crc kubenswrapper[4898]: I0313 13:57:04.973590 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 13:57:04 crc kubenswrapper[4898]: I0313 13:57:04.973601 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 13:57:04 crc kubenswrapper[4898]: I0313 13:57:04.974242 4898 scope.go:117] "RemoveContainer" containerID="a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3"
Mar 13 13:57:04 crc kubenswrapper[4898]: E0313 13:57:04.974472 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 13 13:57:05 crc kubenswrapper[4898]: I0313 13:57:05.072632 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log"
Mar 13 13:57:05 crc kubenswrapper[4898]: I0313 13:57:05.075491 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log"
Mar 13 13:57:05 crc kubenswrapper[4898]: I0313 13:57:05.076701 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"ac7ea5df99bb3004d7582910739297c1ee1913de5399a5be9391bbf4c96be32f"}
Mar 13 13:57:05 crc kubenswrapper[4898]: I0313 13:57:05.076718 4898 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="ac7ea5df99bb3004d7582910739297c1ee1913de5399a5be9391bbf4c96be32f" exitCode=255
Mar 13 13:57:05 crc kubenswrapper[4898]: I0313 13:57:05.076840 4898 scope.go:117] "RemoveContainer" containerID="11d7141d32caadeaeb1d436b8ea9c8ad8a7002577c20befa44b7c64ad366bee3"
Mar 13 13:57:05 crc kubenswrapper[4898]: I0313 13:57:05.677303 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 13 13:57:05 crc kubenswrapper[4898]: E0313 13:57:05.843844 4898 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 13 13:57:06 crc kubenswrapper[4898]: I0313 13:57:06.085697 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log"
Mar 13 13:57:06 crc kubenswrapper[4898]: I0313 13:57:06.087723 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"00526d41ebb304bac57e24a91007919427bb623e8cbce6cc25d7b1a5195871a9"}
Mar 13 13:57:06 crc kubenswrapper[4898]: I0313 13:57:06.087871 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 13:57:06 crc kubenswrapper[4898]: I0313 13:57:06.089395 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 13:57:06 crc kubenswrapper[4898]: I0313 13:57:06.089454 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 13:57:06 crc kubenswrapper[4898]: I0313 13:57:06.089477 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 13:57:06 crc kubenswrapper[4898]: I0313 13:57:06.399134 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 13 13:57:06 crc kubenswrapper[4898]: I0313 13:57:06.399447 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 13:57:06 crc kubenswrapper[4898]: I0313 13:57:06.401478 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 13:57:06 crc kubenswrapper[4898]: I0313 13:57:06.401537 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 13:57:06 crc kubenswrapper[4898]: I0313 13:57:06.401553 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 13:57:06 crc kubenswrapper[4898]: I0313 13:57:06.402340 4898 scope.go:117] "RemoveContainer" containerID="a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3"
Mar 13 13:57:06 crc kubenswrapper[4898]: E0313 13:57:06.402664 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 13 13:57:06 crc kubenswrapper[4898]: I0313 13:57:06.678531 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 13 13:57:07 crc kubenswrapper[4898]: I0313 13:57:07.090834 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 13:57:07 crc kubenswrapper[4898]: I0313 13:57:07.092225 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 13:57:07 crc kubenswrapper[4898]: I0313 13:57:07.092270 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 13:57:07 crc kubenswrapper[4898]: I0313 13:57:07.092285 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 13:57:07 crc kubenswrapper[4898]: I0313 13:57:07.678267 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 13 13:57:08 crc kubenswrapper[4898]: I0313 13:57:08.677213 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 13 13:57:09 crc kubenswrapper[4898]: E0313 13:57:09.382722 4898 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 13 13:57:09 crc kubenswrapper[4898]: I0313 13:57:09.392884 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 13:57:09 crc kubenswrapper[4898]: I0313 13:57:09.394632 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 13:57:09 crc kubenswrapper[4898]: I0313 13:57:09.394696 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 13:57:09 crc kubenswrapper[4898]: I0313 13:57:09.394715 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 13:57:09 crc kubenswrapper[4898]: I0313 13:57:09.394762 4898 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 13 13:57:09 crc kubenswrapper[4898]: E0313 13:57:09.402990 4898 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 13 13:57:09 crc kubenswrapper[4898]: I0313 13:57:09.678230 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 13 13:57:10 crc kubenswrapper[4898]: I0313 13:57:10.676642 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 13 13:57:11 crc kubenswrapper[4898]: I0313 13:57:11.673002 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 13 13:57:11 crc kubenswrapper[4898]: I0313 13:57:11.924860 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 13 13:57:11 crc kubenswrapper[4898]: I0313 13:57:11.925067 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 13:57:11 crc kubenswrapper[4898]: I0313 13:57:11.926345 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 13:57:11 crc kubenswrapper[4898]: I0313 13:57:11.926400 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 13:57:11 crc kubenswrapper[4898]: I0313 13:57:11.926422 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 13:57:11 crc kubenswrapper[4898]: I0313 13:57:11.929378 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 13 13:57:12 crc kubenswrapper[4898]: I0313 13:57:12.104769 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 13:57:12 crc kubenswrapper[4898]: I0313 13:57:12.104875 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 13 13:57:12 crc kubenswrapper[4898]: I0313 13:57:12.105831 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 13:57:12 crc kubenswrapper[4898]: I0313 13:57:12.105891 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 13:57:12 crc kubenswrapper[4898]: I0313 13:57:12.105928 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 13:57:12 crc kubenswrapper[4898]: I0313 13:57:12.676566 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 13 13:57:13 crc kubenswrapper[4898]: I0313 13:57:13.107088 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 13:57:13 crc kubenswrapper[4898]: I0313 13:57:13.108098 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 13:57:13 crc kubenswrapper[4898]: I0313 13:57:13.108140 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 13:57:13 crc kubenswrapper[4898]: I0313 13:57:13.108156 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 13:57:13 crc kubenswrapper[4898]: I0313 13:57:13.676582 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 13 13:57:14 crc kubenswrapper[4898]: I0313 13:57:14.676175 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 13 13:57:15 crc kubenswrapper[4898]: I0313 13:57:15.678535 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 13 13:57:15 crc kubenswrapper[4898]: E0313 13:57:15.845037 4898 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 13 13:57:16 crc kubenswrapper[4898]: E0313 13:57:16.388026 4898 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 13 13:57:16 crc kubenswrapper[4898]: I0313 13:57:16.403326 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 13:57:16 crc kubenswrapper[4898]: I0313 13:57:16.404871 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 13:57:16 crc kubenswrapper[4898]: I0313 13:57:16.405036 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 13:57:16 crc kubenswrapper[4898]: I0313 13:57:16.405131 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 13:57:16 crc kubenswrapper[4898]: I0313 13:57:16.405262 4898 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 13 13:57:16 crc kubenswrapper[4898]: E0313 13:57:16.411805 4898 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 13 13:57:16 crc kubenswrapper[4898]: I0313 13:57:16.678097 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 13 13:57:17 crc kubenswrapper[4898]: I0313 13:57:17.679150 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 13 13:57:17 crc kubenswrapper[4898]: I0313 13:57:17.738646 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 13:57:17 crc kubenswrapper[4898]: I0313 13:57:17.740131 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 13:57:17 crc kubenswrapper[4898]: I0313 13:57:17.740519 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 13:57:17 crc kubenswrapper[4898]: I0313 13:57:17.740625 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 13:57:17 crc kubenswrapper[4898]: I0313 13:57:17.741445 4898 scope.go:117] "RemoveContainer" containerID="a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3"
Mar 13 13:57:17 crc kubenswrapper[4898]: E0313 13:57:17.741738 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 13 13:57:18 crc kubenswrapper[4898]: I0313 13:57:18.612618 4898 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 13 13:57:18 crc kubenswrapper[4898]: I0313 13:57:18.630449 4898 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Mar 13 13:57:18 crc kubenswrapper[4898]: I0313 13:57:18.676855 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 13 13:57:19 crc kubenswrapper[4898]: I0313 13:57:19.681156 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 13 13:57:20 crc kubenswrapper[4898]: I0313 13:57:20.679368 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 13 13:57:20 crc kubenswrapper[4898]: I0313 13:57:20.739574 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 13:57:20 crc kubenswrapper[4898]: I0313 13:57:20.741362 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 13:57:20 crc kubenswrapper[4898]: I0313 13:57:20.741419 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 13:57:20 crc kubenswrapper[4898]: I0313 13:57:20.741483 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 13:57:21 crc kubenswrapper[4898]: I0313 13:57:21.225149 4898 csr.go:261] certificate signing request csr-lwx2n is approved, waiting to be issued
Mar 13 13:57:21 crc kubenswrapper[4898]: I0313 13:57:21.234825 4898 csr.go:257] certificate signing request csr-lwx2n is issued
Mar 13 13:57:21 crc kubenswrapper[4898]: I0313 13:57:21.302056 4898 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Mar 13 13:57:21 crc kubenswrapper[4898]: I0313 13:57:21.528498 4898 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Mar 13 13:57:22 crc kubenswrapper[4898]: I0313 13:57:22.237001 4898 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-05 00:25:13.610928412 +0000 UTC
Mar 13 13:57:22 crc kubenswrapper[4898]: I0313 13:57:22.237082 4898 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7138h27m51.37385236s for next certificate rotation
Mar 13 13:57:23 crc kubenswrapper[4898]: I0313 13:57:23.250658 4898 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Mar 13 13:57:23 crc kubenswrapper[4898]: I0313 13:57:23.412403 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 13:57:23 crc kubenswrapper[4898]: I0313 13:57:23.415859 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 13:57:23 crc kubenswrapper[4898]: I0313 13:57:23.416002 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 13:57:23 crc kubenswrapper[4898]: I0313 13:57:23.416068 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 13:57:23 crc kubenswrapper[4898]: I0313 13:57:23.416495 4898 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 13 13:57:23 crc kubenswrapper[4898]: I0313 13:57:23.432682 4898 kubelet_node_status.go:115] "Node was previously registered" node="crc"
Mar 13 13:57:23 crc kubenswrapper[4898]: I0313 13:57:23.433155 4898 kubelet_node_status.go:79] "Successfully registered node" node="crc"
Mar 13 13:57:23 crc kubenswrapper[4898]: E0313 13:57:23.433178 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found"
Mar 13 13:57:23 crc kubenswrapper[4898]: I0313 13:57:23.439405 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 13:57:23 crc kubenswrapper[4898]: I0313 13:57:23.439442 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 13:57:23 crc kubenswrapper[4898]: I0313 13:57:23.439452 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 13:57:23 crc kubenswrapper[4898]: I0313 13:57:23.439471 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 13:57:23 crc kubenswrapper[4898]: I0313 13:57:23.439485 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:23Z","lastTransitionTime":"2026-03-13T13:57:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 13:57:23 crc kubenswrapper[4898]: E0313 13:57:23.458003 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 13:57:23 crc kubenswrapper[4898]: I0313 13:57:23.463741 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:23 crc kubenswrapper[4898]: I0313 13:57:23.463782 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:23 crc kubenswrapper[4898]: I0313 13:57:23.463797 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:23 crc kubenswrapper[4898]: I0313 13:57:23.463818 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:23 crc kubenswrapper[4898]: I0313 13:57:23.463834 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:23Z","lastTransitionTime":"2026-03-13T13:57:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:23 crc kubenswrapper[4898]: E0313 13:57:23.480491 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 13:57:23 crc kubenswrapper[4898]: I0313 13:57:23.486247 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:23 crc kubenswrapper[4898]: I0313 13:57:23.486286 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:23 crc kubenswrapper[4898]: I0313 13:57:23.486297 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:23 crc kubenswrapper[4898]: I0313 13:57:23.486315 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:23 crc kubenswrapper[4898]: I0313 13:57:23.486326 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:23Z","lastTransitionTime":"2026-03-13T13:57:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:23 crc kubenswrapper[4898]: E0313 13:57:23.500539 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 13:57:23 crc kubenswrapper[4898]: I0313 13:57:23.506089 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:23 crc kubenswrapper[4898]: I0313 13:57:23.506134 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:23 crc kubenswrapper[4898]: I0313 13:57:23.506148 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:23 crc kubenswrapper[4898]: I0313 13:57:23.506198 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:23 crc kubenswrapper[4898]: I0313 13:57:23.506212 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:23Z","lastTransitionTime":"2026-03-13T13:57:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:23 crc kubenswrapper[4898]: E0313 13:57:23.520448 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 13 13:57:23 crc kubenswrapper[4898]: E0313 13:57:23.520683 4898 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Mar 13 13:57:23 crc kubenswrapper[4898]: E0313 13:57:23.520738 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:23 crc kubenswrapper[4898]: I0313 13:57:23.534721 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 13 13:57:23 crc kubenswrapper[4898]: I0313 13:57:23.535022 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 13:57:23 crc kubenswrapper[4898]: I0313 13:57:23.536552 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 13:57:23 crc kubenswrapper[4898]: I0313 13:57:23.536609 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 13:57:23 crc kubenswrapper[4898]: I0313 13:57:23.536635 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 13:57:23 crc kubenswrapper[4898]: E0313 13:57:23.621287 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:23 crc kubenswrapper[4898]: E0313 13:57:23.722015 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:23 crc kubenswrapper[4898]: E0313 13:57:23.822402 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:23 crc kubenswrapper[4898]: E0313 13:57:23.923453 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:24 crc kubenswrapper[4898]: E0313 13:57:24.024088 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:24 crc kubenswrapper[4898]: E0313 13:57:24.125164 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:24 crc kubenswrapper[4898]: E0313 13:57:24.226152 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:24 crc kubenswrapper[4898]: E0313 13:57:24.326947 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:24 crc kubenswrapper[4898]: E0313 13:57:24.427816 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:24 crc kubenswrapper[4898]: E0313 13:57:24.528516 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:24 crc kubenswrapper[4898]: E0313 13:57:24.628817 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:24 crc kubenswrapper[4898]: E0313 13:57:24.729885 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:24 crc kubenswrapper[4898]: E0313 13:57:24.830810 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:24 crc kubenswrapper[4898]: E0313 13:57:24.932007 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:25 crc kubenswrapper[4898]: E0313 13:57:25.033144 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:25 crc kubenswrapper[4898]: E0313 13:57:25.134019 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:25 crc kubenswrapper[4898]: E0313 13:57:25.234411 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:25 crc kubenswrapper[4898]: E0313 13:57:25.335419 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:25 crc kubenswrapper[4898]: E0313 13:57:25.435580 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:25 crc kubenswrapper[4898]: E0313 13:57:25.536985 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:25 crc kubenswrapper[4898]: E0313 13:57:25.637939 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:25 crc kubenswrapper[4898]: E0313 13:57:25.739059 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:25 crc kubenswrapper[4898]: E0313 13:57:25.839968 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:25 crc kubenswrapper[4898]: E0313 13:57:25.846136 4898 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 13 13:57:25 crc kubenswrapper[4898]: E0313 13:57:25.940994 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:26 crc kubenswrapper[4898]: E0313 13:57:26.041853 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:26 crc kubenswrapper[4898]: E0313 13:57:26.142215 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:26 crc kubenswrapper[4898]: E0313 13:57:26.243277 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:26 crc kubenswrapper[4898]: E0313 13:57:26.344322 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:26 crc kubenswrapper[4898]: E0313 13:57:26.445396 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:26 crc kubenswrapper[4898]: E0313 13:57:26.546294 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:26 crc kubenswrapper[4898]: E0313 13:57:26.646861 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:26 crc kubenswrapper[4898]: E0313 13:57:26.748042 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:26 crc kubenswrapper[4898]: E0313 13:57:26.848528 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:26 crc kubenswrapper[4898]: E0313 13:57:26.949521 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:27 crc kubenswrapper[4898]: E0313 13:57:27.051474 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:27 crc kubenswrapper[4898]: E0313 13:57:27.152062 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:27 crc kubenswrapper[4898]: E0313 13:57:27.252852 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:27 crc kubenswrapper[4898]: E0313 13:57:27.353741 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:27 crc kubenswrapper[4898]: E0313 13:57:27.454056 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:27 crc kubenswrapper[4898]: E0313 13:57:27.554947 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:27 crc kubenswrapper[4898]: E0313 13:57:27.656045 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:27 crc kubenswrapper[4898]: E0313 13:57:27.756457 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:27 crc kubenswrapper[4898]: E0313 13:57:27.857590 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:27 crc kubenswrapper[4898]: E0313 13:57:27.957766 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:28 crc kubenswrapper[4898]: E0313 13:57:28.058817 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:28 crc kubenswrapper[4898]: E0313 13:57:28.159754 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:28 crc kubenswrapper[4898]: E0313 13:57:28.260090 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:28 crc kubenswrapper[4898]: E0313 13:57:28.360296 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:28 crc kubenswrapper[4898]: E0313 13:57:28.461158 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:28 crc kubenswrapper[4898]: E0313 13:57:28.561422 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:28 crc kubenswrapper[4898]: E0313 13:57:28.662338 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:28 crc kubenswrapper[4898]: E0313 13:57:28.763561 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:28 crc kubenswrapper[4898]: E0313 13:57:28.864433 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:28 crc kubenswrapper[4898]: E0313 13:57:28.965528 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:29 crc kubenswrapper[4898]: E0313 13:57:29.066576 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:29 crc kubenswrapper[4898]: E0313 13:57:29.167474 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:29 crc kubenswrapper[4898]: E0313 13:57:29.267996 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:29 crc kubenswrapper[4898]: E0313 13:57:29.369005 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:29 crc kubenswrapper[4898]: E0313 13:57:29.470955 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:29 crc kubenswrapper[4898]: E0313 13:57:29.571481 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:29 crc kubenswrapper[4898]: E0313 13:57:29.672622 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:29 crc kubenswrapper[4898]: I0313 13:57:29.739448 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 13:57:29 crc kubenswrapper[4898]: I0313 13:57:29.741563 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 13:57:29 crc kubenswrapper[4898]: I0313 13:57:29.741816 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 13:57:29 crc kubenswrapper[4898]: I0313 13:57:29.742119 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 13:57:29 crc kubenswrapper[4898]: I0313 13:57:29.743357 4898 scope.go:117] "RemoveContainer" containerID="a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3"
Mar 13 13:57:29 crc kubenswrapper[4898]: E0313 13:57:29.743999 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 13 13:57:29 crc kubenswrapper[4898]: E0313 13:57:29.773201 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:29 crc kubenswrapper[4898]: E0313 13:57:29.874588 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:29 crc kubenswrapper[4898]: E0313 13:57:29.975567 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:30 crc kubenswrapper[4898]: E0313 13:57:30.075713 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:30 crc kubenswrapper[4898]: E0313 13:57:30.176513 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:30 crc kubenswrapper[4898]: E0313 13:57:30.278021 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:30 crc kubenswrapper[4898]: E0313 13:57:30.379434 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:30 crc kubenswrapper[4898]: E0313 13:57:30.480196 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:30 crc kubenswrapper[4898]: E0313 13:57:30.580695 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:30 crc kubenswrapper[4898]: E0313 13:57:30.680950 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:30 crc kubenswrapper[4898]: E0313 13:57:30.781738 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:30 crc kubenswrapper[4898]: E0313 13:57:30.882486 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:30 crc kubenswrapper[4898]: E0313 13:57:30.983460 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:31 crc kubenswrapper[4898]: E0313 13:57:31.083619 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:31 crc kubenswrapper[4898]: E0313 13:57:31.183778 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:31 crc kubenswrapper[4898]: E0313 13:57:31.284618 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:31 crc kubenswrapper[4898]: E0313 13:57:31.385539 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:31 crc kubenswrapper[4898]: E0313 13:57:31.485732 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:31 crc kubenswrapper[4898]: E0313 13:57:31.586776 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:31 crc kubenswrapper[4898]: E0313 13:57:31.687760 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:31 crc kubenswrapper[4898]: E0313 13:57:31.787991 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:31 crc kubenswrapper[4898]: E0313 13:57:31.888967 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:31 crc kubenswrapper[4898]: E0313 13:57:31.990208 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:32 crc kubenswrapper[4898]: E0313 13:57:32.091503 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:32 crc kubenswrapper[4898]: E0313 13:57:32.192699 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:32 crc kubenswrapper[4898]: E0313 13:57:32.292984 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:32 crc kubenswrapper[4898]: E0313 13:57:32.394146 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:32 crc kubenswrapper[4898]: E0313 13:57:32.494960 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:32 crc kubenswrapper[4898]: E0313 13:57:32.595602 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:32 crc kubenswrapper[4898]: E0313 13:57:32.697066 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:32 crc kubenswrapper[4898]: E0313 13:57:32.798474 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:32 crc kubenswrapper[4898]: E0313 13:57:32.898614 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:32 crc kubenswrapper[4898]: E0313 13:57:32.999674 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:33 crc kubenswrapper[4898]: E0313 13:57:33.100664 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:33 crc kubenswrapper[4898]: E0313 13:57:33.201435 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:33 crc kubenswrapper[4898]: E0313 13:57:33.301678 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:33 crc kubenswrapper[4898]: E0313 13:57:33.402811 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:33 crc kubenswrapper[4898]: E0313 13:57:33.502932 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:57:33 crc kubenswrapper[4898]: E0313 13:57:33.578457 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found"
Mar 13 13:57:33 crc kubenswrapper[4898]: I0313 13:57:33.583032 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 13:57:33 crc kubenswrapper[4898]: I0313 13:57:33.583076 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 13:57:33 crc kubenswrapper[4898]: I0313 13:57:33.583087 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 13:57:33 crc kubenswrapper[4898]: I0313 13:57:33.583107 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 13:57:33 crc kubenswrapper[4898]: I0313 13:57:33.583119 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:33Z","lastTransitionTime":"2026-03-13T13:57:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:33 crc kubenswrapper[4898]: E0313 13:57:33.593942 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 13:57:33 crc kubenswrapper[4898]: I0313 13:57:33.597032 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:33 crc kubenswrapper[4898]: I0313 13:57:33.597071 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:33 crc kubenswrapper[4898]: I0313 13:57:33.597086 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:33 crc kubenswrapper[4898]: I0313 13:57:33.597105 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:33 crc kubenswrapper[4898]: I0313 13:57:33.597124 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:33Z","lastTransitionTime":"2026-03-13T13:57:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:33 crc kubenswrapper[4898]: E0313 13:57:33.607878 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 13:57:33 crc kubenswrapper[4898]: I0313 13:57:33.611952 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:33 crc kubenswrapper[4898]: I0313 13:57:33.612247 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:33 crc kubenswrapper[4898]: I0313 13:57:33.612465 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:33 crc kubenswrapper[4898]: I0313 13:57:33.612632 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:33 crc kubenswrapper[4898]: I0313 13:57:33.612792 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:33Z","lastTransitionTime":"2026-03-13T13:57:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:33 crc kubenswrapper[4898]: E0313 13:57:33.625274 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 13:57:33 crc kubenswrapper[4898]: I0313 13:57:33.630286 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:33 crc kubenswrapper[4898]: I0313 13:57:33.630328 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:33 crc kubenswrapper[4898]: I0313 13:57:33.630345 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:33 crc kubenswrapper[4898]: I0313 13:57:33.630367 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:33 crc kubenswrapper[4898]: I0313 13:57:33.630381 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:33Z","lastTransitionTime":"2026-03-13T13:57:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:33 crc kubenswrapper[4898]: E0313 13:57:33.641167 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 13:57:33 crc kubenswrapper[4898]: E0313 13:57:33.641287 4898 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 13:57:33 crc kubenswrapper[4898]: E0313 13:57:33.641319 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:33 crc kubenswrapper[4898]: E0313 13:57:33.742085 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:33 crc kubenswrapper[4898]: E0313 13:57:33.843141 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:33 crc kubenswrapper[4898]: E0313 13:57:33.943282 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:34 crc kubenswrapper[4898]: E0313 13:57:34.044091 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:34 crc kubenswrapper[4898]: E0313 13:57:34.145085 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:34 crc kubenswrapper[4898]: E0313 13:57:34.245829 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:34 crc kubenswrapper[4898]: E0313 13:57:34.347720 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:34 crc kubenswrapper[4898]: E0313 13:57:34.448444 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:34 crc kubenswrapper[4898]: E0313 13:57:34.549711 4898 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:34 crc kubenswrapper[4898]: E0313 13:57:34.650788 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:34 crc kubenswrapper[4898]: E0313 13:57:34.751652 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:34 crc kubenswrapper[4898]: E0313 13:57:34.853139 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:34 crc kubenswrapper[4898]: E0313 13:57:34.953992 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:35 crc kubenswrapper[4898]: E0313 13:57:35.054416 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:35 crc kubenswrapper[4898]: E0313 13:57:35.154981 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:35 crc kubenswrapper[4898]: E0313 13:57:35.255602 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:35 crc kubenswrapper[4898]: E0313 13:57:35.356542 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:35 crc kubenswrapper[4898]: E0313 13:57:35.457566 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:35 crc kubenswrapper[4898]: E0313 13:57:35.557708 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:35 crc kubenswrapper[4898]: E0313 13:57:35.658134 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:35 crc 
kubenswrapper[4898]: E0313 13:57:35.758406 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:35 crc kubenswrapper[4898]: E0313 13:57:35.847271 4898 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 13 13:57:35 crc kubenswrapper[4898]: E0313 13:57:35.858559 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:35 crc kubenswrapper[4898]: E0313 13:57:35.958720 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:36 crc kubenswrapper[4898]: E0313 13:57:36.058886 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:36 crc kubenswrapper[4898]: E0313 13:57:36.160127 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:36 crc kubenswrapper[4898]: E0313 13:57:36.261511 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:36 crc kubenswrapper[4898]: E0313 13:57:36.362685 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:36 crc kubenswrapper[4898]: E0313 13:57:36.463021 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:36 crc kubenswrapper[4898]: E0313 13:57:36.563645 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:36 crc kubenswrapper[4898]: E0313 13:57:36.664254 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:36 crc kubenswrapper[4898]: E0313 13:57:36.765402 4898 kubelet_node_status.go:503] "Error getting the current 
node from lister" err="node \"crc\" not found" Mar 13 13:57:36 crc kubenswrapper[4898]: E0313 13:57:36.866010 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:36 crc kubenswrapper[4898]: E0313 13:57:36.967125 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:37 crc kubenswrapper[4898]: E0313 13:57:37.068192 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:37 crc kubenswrapper[4898]: E0313 13:57:37.168670 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:37 crc kubenswrapper[4898]: E0313 13:57:37.268958 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:37 crc kubenswrapper[4898]: E0313 13:57:37.370041 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:37 crc kubenswrapper[4898]: E0313 13:57:37.471028 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:37 crc kubenswrapper[4898]: E0313 13:57:37.572355 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:37 crc kubenswrapper[4898]: E0313 13:57:37.673575 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:37 crc kubenswrapper[4898]: E0313 13:57:37.774177 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:37 crc kubenswrapper[4898]: E0313 13:57:37.875332 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:37 crc kubenswrapper[4898]: E0313 13:57:37.975461 4898 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:38 crc kubenswrapper[4898]: E0313 13:57:38.076440 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:38 crc kubenswrapper[4898]: E0313 13:57:38.177430 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:38 crc kubenswrapper[4898]: E0313 13:57:38.277547 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:38 crc kubenswrapper[4898]: E0313 13:57:38.378304 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:38 crc kubenswrapper[4898]: E0313 13:57:38.479248 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:38 crc kubenswrapper[4898]: E0313 13:57:38.579599 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:38 crc kubenswrapper[4898]: E0313 13:57:38.680141 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:38 crc kubenswrapper[4898]: E0313 13:57:38.780528 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:38 crc kubenswrapper[4898]: E0313 13:57:38.881414 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:38 crc kubenswrapper[4898]: E0313 13:57:38.982292 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:39 crc kubenswrapper[4898]: E0313 13:57:39.083434 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:39 crc 
kubenswrapper[4898]: E0313 13:57:39.183927 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.237448 4898 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.286716 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.286761 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.286774 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.286793 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.286807 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:39Z","lastTransitionTime":"2026-03-13T13:57:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.389378 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.389428 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.389441 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.389456 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.389466 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:39Z","lastTransitionTime":"2026-03-13T13:57:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.491790 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.491877 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.491890 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.491928 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.491940 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:39Z","lastTransitionTime":"2026-03-13T13:57:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.594625 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.594680 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.594701 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.594727 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.594747 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:39Z","lastTransitionTime":"2026-03-13T13:57:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.697979 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.698034 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.698053 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.698077 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.698096 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:39Z","lastTransitionTime":"2026-03-13T13:57:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.711239 4898 apiserver.go:52] "Watching apiserver" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.719064 4898 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.719509 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c"] Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.720040 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.720066 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 13 13:57:39 crc kubenswrapper[4898]: E0313 13:57:39.720147 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.720107 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.720357 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 13:57:39 crc kubenswrapper[4898]: E0313 13:57:39.720404 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.720611 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.720985 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:57:39 crc kubenswrapper[4898]: E0313 13:57:39.721048 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.724225 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.724511 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.726060 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.726385 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.726529 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.727223 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.727388 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.729944 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.730070 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.758033 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.775860 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.776197 4898 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.791255 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.800696 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.800780 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.800811 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.800824 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.800841 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.800854 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:39Z","lastTransitionTime":"2026-03-13T13:57:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.812355 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.822865 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.837572 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.837619 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.837635 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.837659 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.837679 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.837712 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.837735 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: 
\"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.837754 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.837772 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.837792 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.837816 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.837834 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.837852 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.837871 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.837892 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.837929 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.837950 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.837975 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: 
\"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.837995 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.838015 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.838037 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.838058 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.838078 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.838099 4898 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.838119 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.838140 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.838158 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.838178 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.838197 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.838221 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.838244 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.838265 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.838315 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.838338 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.838359 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.838378 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.838396 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.838416 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.838443 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.838461 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 13 13:57:39 crc 
kubenswrapper[4898]: I0313 13:57:39.838480 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.838502 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.838523 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.838543 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.838564 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.838589 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: 
\"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.838612 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.838632 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.838651 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.838669 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.838690 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 13 13:57:39 crc 
kubenswrapper[4898]: I0313 13:57:39.838679 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.838710 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.838796 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.838823 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.838860 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.838806 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.838881 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.838948 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.838976 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.839007 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.839039 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: 
\"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.839073 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.839112 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.839138 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.839143 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.839161 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.839189 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.839211 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.839236 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.839259 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.839281 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.839302 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.839324 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.839352 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.839377 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.839409 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 13 13:57:39 crc kubenswrapper[4898]: 
I0313 13:57:39.839431 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.839452 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.839482 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.839505 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.839526 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.839548 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod 
\"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.839579 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.839600 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.839620 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.839642 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.839802 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.839943 4898 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.839974 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.839997 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.840019 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.840041 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.840100 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.840124 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.840150 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.840182 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.840216 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.840250 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 13 13:57:39 
crc kubenswrapper[4898]: I0313 13:57:39.841053 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.841091 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.841116 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.841140 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.841163 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.841189 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.841211 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.841232 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.841258 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.841281 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.841303 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 13 13:57:39 crc kubenswrapper[4898]: 
I0313 13:57:39.841333 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.839155 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.841411 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.839360 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.839524 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.839662 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.839690 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.839735 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.840102 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.840160 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.840222 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.840283 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.840547 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.840583 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.840644 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.840998 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.841114 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.841170 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.841332 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.841365 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.841488 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.841834 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.842481 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.841369 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.842740 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.842784 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.842833 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.842890 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.842941 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.842948 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.842983 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.843039 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.843093 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.843103 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.843166 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.843551 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.843809 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.843823 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.843894 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.843992 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.844049 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.844095 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.844111 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.844167 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.844221 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.844274 4898 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.844332 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.844387 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.844440 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.844485 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.844529 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod 
\"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.844572 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.844617 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.844654 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.844692 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.844729 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 13 13:57:39 crc 
kubenswrapper[4898]: I0313 13:57:39.844767 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.844805 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.844847 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.844882 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.844951 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.844992 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: 
\"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.844115 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.844365 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.844334 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.844419 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.844482 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.844541 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.844734 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.844762 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.844982 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: E0313 13:57:39.845044 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:57:40.345008753 +0000 UTC m=+95.346597202 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848077 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848118 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: 
\"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848148 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848172 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848191 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848208 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848228 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848247 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848270 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848295 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848319 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848342 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848368 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848391 4898 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848420 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848445 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848473 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848496 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848521 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848545 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848565 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848590 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848613 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848635 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848658 4898 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848687 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848708 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848730 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848753 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848774 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod 
\"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848805 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848829 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848853 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848877 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848916 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848943 4898 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848969 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848990 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849014 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849038 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849061 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod 
\"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849087 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849114 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849140 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849170 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849195 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849218 4898 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849244 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849271 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849297 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849324 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849350 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod 
\"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849380 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849404 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849429 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849457 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849479 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849503 4898 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849570 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849606 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849635 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849665 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " 
pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849692 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849721 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849752 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849780 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849808 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849833 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849866 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849910 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849942 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849968 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850047 4898 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850062 4898 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850075 4898 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850089 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850104 4898 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850117 4898 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850129 4898 
reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850141 4898 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850156 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850168 4898 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850181 4898 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850194 4898 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850208 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850222 4898 reconciler_common.go:293] "Volume detached for volume 
\"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850234 4898 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850253 4898 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850267 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850298 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850312 4898 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850324 4898 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850338 4898 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850351 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850363 4898 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850376 4898 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850389 4898 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850402 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850414 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850425 4898 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850438 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850450 4898 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850463 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850475 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850487 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850500 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850516 4898 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 
13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850528 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850539 4898 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850552 4898 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850565 4898 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850577 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850594 4898 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850606 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.852116 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848195 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848206 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.845092 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.845348 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.846084 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.846422 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.846420 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.854653 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.846567 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.846701 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.846846 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.846872 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.846402 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.846992 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.847126 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.847191 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.847242 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.847336 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.847390 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.847455 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848346 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848462 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848861 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849087 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849355 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849348 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849691 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849724 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849750 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849918 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849922 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850058 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850054 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850140 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850137 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850233 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850631 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850639 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850840 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850875 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.851160 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.851372 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.851656 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.851678 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.851604 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.851711 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.851816 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.851877 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.851907 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.852101 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.852129 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.852142 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.852016 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.852157 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.852158 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.852225 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.852361 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.852502 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.852531 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.852647 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.853152 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.853175 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.853197 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.853408 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.853450 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.853471 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.853441 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.845052 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.853537 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.853745 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.853772 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.853774 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.854115 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.854159 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.854193 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.854571 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.855202 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.855332 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.855359 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.855598 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.855689 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.855724 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.855865 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.856100 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.856349 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.856557 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.856644 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.856873 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.857253 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.857352 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.857487 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.857620 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.857628 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.857640 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: E0313 13:57:39.857680 4898 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.858040 4898 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 13 13:57:39 crc kubenswrapper[4898]: E0313 13:57:39.858220 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 13:57:40.35811483 +0000 UTC m=+95.359703159 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 13:57:39 crc kubenswrapper[4898]: E0313 13:57:39.858353 4898 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 13:57:39 crc kubenswrapper[4898]: E0313 13:57:39.858456 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 13:57:40.358428777 +0000 UTC m=+95.360017236 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.858586 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.858044 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.858783 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.859282 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.859352 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.860315 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.860372 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.860682 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.860788 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.861005 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.861370 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.863626 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.864146 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.865316 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.866487 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.866589 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.866665 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.866746 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.866982 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.867171 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: E0313 13:57:39.872979 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 13:57:39 crc kubenswrapper[4898]: E0313 13:57:39.873028 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 13:57:39 crc kubenswrapper[4898]: E0313 13:57:39.873050 4898 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 13:57:39 crc kubenswrapper[4898]: E0313 13:57:39.873143 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-13 13:57:40.373113569 +0000 UTC m=+95.374702018 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.876948 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.879579 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 13 13:57:39 crc kubenswrapper[4898]: E0313 13:57:39.881096 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 13:57:39 crc kubenswrapper[4898]: E0313 13:57:39.881129 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 13:57:39 crc kubenswrapper[4898]: E0313 13:57:39.881143 4898 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 13:57:39 crc kubenswrapper[4898]: E0313 13:57:39.881211 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-13 13:57:40.381192202 +0000 UTC m=+95.382780441 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.883284 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.883535 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.883724 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.884999 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.886245 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.886943 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.888392 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.892255 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.892391 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.892847 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.893168 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.893472 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.894188 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.894475 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.896490 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.896911 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.898105 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.898146 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.898212 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.898493 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.898892 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.897439 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.899012 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.899040 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.899148 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.899176 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.899259 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.899335 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.899208 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.900434 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.900476 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.900661 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.900703 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.900745 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.902735 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.903093 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.903188 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.904399 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.904441 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.904456 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.904476 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.904488 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:39Z","lastTransitionTime":"2026-03-13T13:57:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.904818 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.921596 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.924009 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.930439 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.930540 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.951661 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.951761 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.951836 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.951866 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.951921 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.951940 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: 
\"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.951950 4898 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.951960 4898 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.951972 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.951982 4898 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.951991 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952001 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.951992 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod 
\"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952010 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952075 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952086 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952097 4898 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952106 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952116 4898 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952126 4898 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952135 4898 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952146 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952155 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952165 4898 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952175 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952185 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952195 4898 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node 
\"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952207 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952256 4898 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952271 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952284 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952296 4898 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952308 4898 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952320 4898 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 
13:57:39.952332 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952341 4898 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952352 4898 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952363 4898 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952373 4898 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952384 4898 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952394 4898 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952406 
4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952415 4898 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952425 4898 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952436 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952448 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952457 4898 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952466 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952475 4898 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952485 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952495 4898 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952504 4898 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952513 4898 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952524 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952533 4898 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952542 4898 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 
crc kubenswrapper[4898]: I0313 13:57:39.952551 4898 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952562 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952571 4898 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952580 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952593 4898 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952601 4898 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952611 4898 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952619 4898 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" 
(UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952629 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952638 4898 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952647 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952658 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952670 4898 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952683 4898 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952694 4898 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952706 4898 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952717 4898 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952728 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952739 4898 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952753 4898 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952765 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952779 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 
13:57:39.952790 4898 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952805 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952818 4898 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952829 4898 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952838 4898 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952848 4898 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952859 4898 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952879 4898 reconciler_common.go:293] 
"Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952891 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952923 4898 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952935 4898 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952949 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952964 4898 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952976 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952995 4898 reconciler_common.go:293] "Volume detached for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953007 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953020 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953030 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953038 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953047 4898 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953056 4898 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953067 4898 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953076 4898 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953085 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953094 4898 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953103 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953111 4898 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953120 4898 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953128 4898 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" 
DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953137 4898 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953145 4898 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953154 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953162 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953171 4898 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953180 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953188 4898 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953196 4898 reconciler_common.go:293] "Volume detached 
for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953207 4898 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953218 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953229 4898 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953241 4898 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953252 4898 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953264 4898 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953288 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on 
node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953299 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953308 4898 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953318 4898 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953328 4898 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953337 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953346 4898 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953357 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: 
\"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953366 4898 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953374 4898 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953383 4898 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953391 4898 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953399 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953408 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953416 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc 
kubenswrapper[4898]: I0313 13:57:39.953424 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953433 4898 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953441 4898 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953449 4898 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953457 4898 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953465 4898 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953474 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953482 4898 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953491 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953499 4898 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953507 4898 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953515 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953524 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953532 4898 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953542 4898 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 
13:57:39.953550 4898 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953558 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953566 4898 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953575 4898 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953584 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.006836 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.006918 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.006934 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.006953 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.006964 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:40Z","lastTransitionTime":"2026-03-13T13:57:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.041063 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.057052 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.067832 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 13 13:57:40 crc kubenswrapper[4898]: W0313 13:57:40.068264 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-bb4404e056ec8640f37be83e005144bfc074c9bdee979e67e580a60d209c935d WatchSource:0}: Error finding container bb4404e056ec8640f37be83e005144bfc074c9bdee979e67e580a60d209c935d: Status 404 returned error can't find the container with id bb4404e056ec8640f37be83e005144bfc074c9bdee979e67e580a60d209c935d Mar 13 13:57:40 crc kubenswrapper[4898]: W0313 13:57:40.088908 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-db6cec27d08f373602b55043f55721f443dc167006efddf537b37281b8349270 WatchSource:0}: Error finding container db6cec27d08f373602b55043f55721f443dc167006efddf537b37281b8349270: Status 404 returned error can't find the container with id db6cec27d08f373602b55043f55721f443dc167006efddf537b37281b8349270 Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.111060 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.111454 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.111472 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.111506 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.111519 4898 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:40Z","lastTransitionTime":"2026-03-13T13:57:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.213839 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.213867 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.213875 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.213888 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.213911 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:40Z","lastTransitionTime":"2026-03-13T13:57:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.277945 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"db6cec27d08f373602b55043f55721f443dc167006efddf537b37281b8349270"} Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.279068 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"bb4404e056ec8640f37be83e005144bfc074c9bdee979e67e580a60d209c935d"} Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.280776 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed"} Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.280820 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"d979c605efac024151111eac4d3ca28abe678c91e8fc936c9ce3917bfd6bfb74"} Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.316577 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.316625 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.316635 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.316651 4898 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeNotReady"
Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.316661 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:40Z","lastTransitionTime":"2026-03-13T13:57:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.357026 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 13 13:57:40 crc kubenswrapper[4898]: E0313 13:57:40.357285 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:57:41.357248194 +0000 UTC m=+96.358836433 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.418587 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.418632 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.418641 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.418657 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.418666 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:40Z","lastTransitionTime":"2026-03-13T13:57:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.458083 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.458143 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.458168 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.458319 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 13 13:57:40 crc kubenswrapper[4898]: E0313 13:57:40.458349 4898 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 13 13:57:40 crc kubenswrapper[4898]: E0313 13:57:40.458474 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 13:57:41.458452756 +0000 UTC m=+96.460041005 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 13 13:57:40 crc kubenswrapper[4898]: E0313 13:57:40.458487 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 13 13:57:40 crc kubenswrapper[4898]: E0313 13:57:40.458372 4898 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 13 13:57:40 crc kubenswrapper[4898]: E0313 13:57:40.458531 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 13 13:57:40 crc kubenswrapper[4898]: E0313 13:57:40.458546 4898 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 13 13:57:40 crc kubenswrapper[4898]: E0313 13:57:40.458546 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 13 13:57:40 crc kubenswrapper[4898]: E0313 13:57:40.458579 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 13 13:57:40 crc kubenswrapper[4898]: E0313 13:57:40.458591 4898 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 13 13:57:40 crc kubenswrapper[4898]: E0313 13:57:40.458595 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 13:57:41.458573368 +0000 UTC m=+96.460161647 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 13 13:57:40 crc kubenswrapper[4898]: E0313 13:57:40.458628 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-13 13:57:41.458613159 +0000 UTC m=+96.460201438 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 13 13:57:40 crc kubenswrapper[4898]: E0313 13:57:40.458653 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-13 13:57:41.45864155 +0000 UTC m=+96.460229829 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.521145 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.521204 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.521223 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.521248 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.521265 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:40Z","lastTransitionTime":"2026-03-13T13:57:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.624213 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.624268 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.624286 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.624311 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.624331 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:40Z","lastTransitionTime":"2026-03-13T13:57:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.726587 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.726647 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.726662 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.726682 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.726697 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:40Z","lastTransitionTime":"2026-03-13T13:57:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.751859 4898 scope.go:117] "RemoveContainer" containerID="a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3"
Mar 13 13:57:40 crc kubenswrapper[4898]: E0313 13:57:40.752085 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.752283 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.829256 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.829296 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.829304 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.829318 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.829328 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:40Z","lastTransitionTime":"2026-03-13T13:57:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.931883 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.931942 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.931951 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.931966 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.931978 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:40Z","lastTransitionTime":"2026-03-13T13:57:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.033731 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.033764 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.033773 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.033787 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.033797 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:41Z","lastTransitionTime":"2026-03-13T13:57:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.136411 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.136471 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.136489 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.136519 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.136538 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:41Z","lastTransitionTime":"2026-03-13T13:57:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.238949 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.239025 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.239047 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.239087 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.239110 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:41Z","lastTransitionTime":"2026-03-13T13:57:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.284806 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22"}
Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.286965 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac"}
Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.287371 4898 scope.go:117] "RemoveContainer" containerID="a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3"
Mar 13 13:57:41 crc kubenswrapper[4898]: E0313 13:57:41.287488 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.306331 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\",\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, /tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:41Z is after 2025-08-24T17:21:41Z"
Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.322858 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:41Z is after 2025-08-24T17:21:41Z"
Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.335213 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:41Z is after 2025-08-24T17:21:41Z"
Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.341100 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.341139 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.341150 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.341168 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.341180 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:41Z","lastTransitionTime":"2026-03-13T13:57:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.351322 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:41Z is after 2025-08-24T17:21:41Z"
Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.365983 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:41Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.367177 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:57:41 crc kubenswrapper[4898]: E0313 13:57:41.367365 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:57:43.367339769 +0000 UTC m=+98.368928008 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.378701 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:41Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.394542 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:41Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.409836 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\",\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, 
/tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\
\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:41Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.423375 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:41Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.436283 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:41Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.443348 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.443387 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.443396 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.443413 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.443425 4898 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:41Z","lastTransitionTime":"2026-03-13T13:57:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.450537 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:41Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.467219 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:41Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.467799 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" 
(UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.467871 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.467911 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.467959 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:57:41 crc kubenswrapper[4898]: E0313 13:57:41.467999 4898 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 13:57:41 crc kubenswrapper[4898]: E0313 13:57:41.468070 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 13:57:41 crc kubenswrapper[4898]: E0313 13:57:41.468092 4898 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 13:57:41 crc kubenswrapper[4898]: E0313 13:57:41.468103 4898 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 13:57:41 crc kubenswrapper[4898]: E0313 13:57:41.468080 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 13:57:43.46806235 +0000 UTC m=+98.469650589 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 13:57:41 crc kubenswrapper[4898]: E0313 13:57:41.468570 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-13 13:57:43.468556101 +0000 UTC m=+98.470144340 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 13:57:41 crc kubenswrapper[4898]: E0313 13:57:41.468601 4898 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 13:57:41 crc kubenswrapper[4898]: E0313 13:57:41.468635 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 13:57:43.468624293 +0000 UTC m=+98.470212532 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 13:57:41 crc kubenswrapper[4898]: E0313 13:57:41.468688 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 13:57:41 crc kubenswrapper[4898]: E0313 13:57:41.468700 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 13:57:41 crc kubenswrapper[4898]: E0313 13:57:41.468711 4898 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 13:57:41 crc kubenswrapper[4898]: E0313 13:57:41.468741 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-13 13:57:43.468734585 +0000 UTC m=+98.470322824 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.488719 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:41Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.499326 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:41Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.546959 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.547027 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.547040 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.547058 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.547085 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:41Z","lastTransitionTime":"2026-03-13T13:57:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.650023 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.650074 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.650092 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.650120 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.650142 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:41Z","lastTransitionTime":"2026-03-13T13:57:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.738823 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.739119 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:57:41 crc kubenswrapper[4898]: E0313 13:57:41.739264 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.739529 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:57:41 crc kubenswrapper[4898]: E0313 13:57:41.739608 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:57:41 crc kubenswrapper[4898]: E0313 13:57:41.739762 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.744428 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.745865 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.748747 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.750634 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.752395 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.752466 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.752485 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.752510 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.752529 4898 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:41Z","lastTransitionTime":"2026-03-13T13:57:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.753824 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.754652 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.755531 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.757298 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.758536 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.759941 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.761978 4898 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.764189 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.765930 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.766859 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.768526 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.769309 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.770720 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.771428 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.772249 4898 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.773752 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.774536 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.775326 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.776545 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.777633 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.779385 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.780246 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.781646 4898 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.782425 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.783972 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.784668 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.785323 4898 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.785486 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.789170 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.790350 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.792035 
4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.794557 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.795514 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.796827 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.797852 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.799878 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.800582 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.802180 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.802916 
4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.803874 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.804343 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.805411 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.806095 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.807315 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.807802 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.808657 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.809126 
4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.809675 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.810708 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.811212 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.812002 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.854725 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.854779 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.854793 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.854814 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.854829 4898 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:41Z","lastTransitionTime":"2026-03-13T13:57:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.957697 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.957772 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.957791 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.958135 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.958171 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:41Z","lastTransitionTime":"2026-03-13T13:57:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.060349 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.060394 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.060404 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.060429 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.060440 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:42Z","lastTransitionTime":"2026-03-13T13:57:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.163084 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.163123 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.163132 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.163149 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.163160 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:42Z","lastTransitionTime":"2026-03-13T13:57:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.265975 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.266018 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.266029 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.266045 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.266056 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:42Z","lastTransitionTime":"2026-03-13T13:57:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.368389 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.368460 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.368475 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.368497 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.368513 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:42Z","lastTransitionTime":"2026-03-13T13:57:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.470650 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.470705 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.470721 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.470754 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.470771 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:42Z","lastTransitionTime":"2026-03-13T13:57:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.574685 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.574773 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.574791 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.574818 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.574837 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:42Z","lastTransitionTime":"2026-03-13T13:57:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.678424 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.678480 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.678497 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.678521 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.678538 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:42Z","lastTransitionTime":"2026-03-13T13:57:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.782297 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.782440 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.782477 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.782511 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.782535 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:42Z","lastTransitionTime":"2026-03-13T13:57:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.885236 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.885315 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.885338 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.885367 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.885392 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:42Z","lastTransitionTime":"2026-03-13T13:57:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.988045 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.988096 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.988119 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.988142 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.988159 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:42Z","lastTransitionTime":"2026-03-13T13:57:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.090327 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.090434 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.090451 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.090477 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.090494 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:43Z","lastTransitionTime":"2026-03-13T13:57:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.193414 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.193468 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.193482 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.193499 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.193511 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:43Z","lastTransitionTime":"2026-03-13T13:57:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.294728 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5"} Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.295779 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.295834 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.295855 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.295877 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.295919 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:43Z","lastTransitionTime":"2026-03-13T13:57:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.315970 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:43Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.340060 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\",\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, 
/tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\
\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:43Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.359951 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:43Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.380636 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:43Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.384382 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:57:43 crc kubenswrapper[4898]: E0313 13:57:43.384750 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:57:47.384630343 +0000 UTC m=+102.386218622 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.399329 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.399416 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.399434 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.399492 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.399510 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:43Z","lastTransitionTime":"2026-03-13T13:57:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.411659 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc4f3def-518e-420d-95b2-e9d7f3caf3b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088f42f6f19b31579419481228562c142aa1480795b6ffa4bd1cba05cb35cb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2720007bcc3ac54674e20042cdbbe64be6cd24b9556c841e609d541065f883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d062f36316872a1ce1d211650b661771b3ef7ff25e420a9db22c3191160fa9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://157541b623c7dc0eeb157191f93fb07af61fb66891ac3231f90f14690179cb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffea82d1334de10deacc44ca75008d2dc4f59ca58355bd166f2034d24a0f5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:43Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.428043 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:43Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.446997 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T13:57:43Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.461617 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:43Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.485616 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.485665 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.485690 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.485712 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:57:43 crc kubenswrapper[4898]: E0313 13:57:43.485839 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 13:57:43 crc kubenswrapper[4898]: E0313 13:57:43.485859 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 13:57:43 crc kubenswrapper[4898]: E0313 13:57:43.485871 4898 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 13:57:43 crc kubenswrapper[4898]: E0313 13:57:43.485947 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-13 13:57:47.485930038 +0000 UTC m=+102.487518277 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 13:57:43 crc kubenswrapper[4898]: E0313 13:57:43.485984 4898 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 13:57:43 crc kubenswrapper[4898]: E0313 13:57:43.486103 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 13:57:47.486077561 +0000 UTC m=+102.487665840 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 13:57:43 crc kubenswrapper[4898]: E0313 13:57:43.486145 4898 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 13:57:43 crc kubenswrapper[4898]: E0313 13:57:43.486262 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 13:57:43 crc kubenswrapper[4898]: E0313 13:57:43.486287 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 13:57:43 crc kubenswrapper[4898]: E0313 13:57:43.486308 4898 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 13:57:43 crc kubenswrapper[4898]: E0313 13:57:43.486360 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 13:57:47.486331407 +0000 UTC m=+102.487919676 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 13:57:43 crc kubenswrapper[4898]: E0313 13:57:43.486445 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-13 13:57:47.486430829 +0000 UTC m=+102.488019108 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.502036 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.502085 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.502095 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.502108 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.502117 4898 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:43Z","lastTransitionTime":"2026-03-13T13:57:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.585839 4898 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.605006 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.605053 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.605069 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.605095 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.605111 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:43Z","lastTransitionTime":"2026-03-13T13:57:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.713711 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.713778 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.713796 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.713822 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.713840 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:43Z","lastTransitionTime":"2026-03-13T13:57:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.739205 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.739270 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:57:43 crc kubenswrapper[4898]: E0313 13:57:43.739359 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.739371 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:57:43 crc kubenswrapper[4898]: E0313 13:57:43.739515 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:57:43 crc kubenswrapper[4898]: E0313 13:57:43.739628 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.816927 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.816978 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.816995 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.817017 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.817034 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:43Z","lastTransitionTime":"2026-03-13T13:57:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.919259 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.919356 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.919379 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.919405 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.919425 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:43Z","lastTransitionTime":"2026-03-13T13:57:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.977832 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.977942 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.977955 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.977974 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.978018 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:43Z","lastTransitionTime":"2026-03-13T13:57:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:43 crc kubenswrapper[4898]: E0313 13:57:43.996257 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:43Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.000532 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.000576 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.000590 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.000609 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.000628 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:44Z","lastTransitionTime":"2026-03-13T13:57:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:44 crc kubenswrapper[4898]: E0313 13:57:44.014997 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:44Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.018825 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.018869 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.018884 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.018922 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.018936 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:44Z","lastTransitionTime":"2026-03-13T13:57:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:44 crc kubenswrapper[4898]: E0313 13:57:44.038688 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:44Z is after 2025-08-24T17:21:41Z"
Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.043973 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.044054 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.044081 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.044115 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.044141 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:44Z","lastTransitionTime":"2026-03-13T13:57:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 13 13:57:44 crc kubenswrapper[4898]: E0313 13:57:44.066126 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:44Z is after 2025-08-24T17:21:41Z"
Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.070743 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.070839 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.070859 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.070884 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.070947 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:44Z","lastTransitionTime":"2026-03-13T13:57:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 13 13:57:44 crc kubenswrapper[4898]: E0313 13:57:44.088377 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:44Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:44 crc kubenswrapper[4898]: E0313 13:57:44.088628 4898 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.090963 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.091029 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.091069 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.091097 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.091112 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:44Z","lastTransitionTime":"2026-03-13T13:57:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.194352 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.194412 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.194430 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.194455 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.194477 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:44Z","lastTransitionTime":"2026-03-13T13:57:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.297253 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.297304 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.297316 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.297334 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.297346 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:44Z","lastTransitionTime":"2026-03-13T13:57:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.399946 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.400009 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.400025 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.400052 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.400071 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:44Z","lastTransitionTime":"2026-03-13T13:57:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.502618 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.502674 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.502690 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.502712 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.502730 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:44Z","lastTransitionTime":"2026-03-13T13:57:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.605361 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.605402 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.605413 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.605432 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.605443 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:44Z","lastTransitionTime":"2026-03-13T13:57:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.708642 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.708712 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.708734 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.708765 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.708787 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:44Z","lastTransitionTime":"2026-03-13T13:57:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.811285 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.811344 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.811353 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.811374 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.811386 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:44Z","lastTransitionTime":"2026-03-13T13:57:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.914392 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.914462 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.914486 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.914516 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.914536 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:44Z","lastTransitionTime":"2026-03-13T13:57:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.016978 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.017085 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.017111 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.017138 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.017156 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:45Z","lastTransitionTime":"2026-03-13T13:57:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.119777 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.119841 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.119863 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.119889 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.119933 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:45Z","lastTransitionTime":"2026-03-13T13:57:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.262872 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.262922 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.262931 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.262947 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.262960 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:45Z","lastTransitionTime":"2026-03-13T13:57:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.365638 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.365672 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.365699 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.365745 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.365757 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:45Z","lastTransitionTime":"2026-03-13T13:57:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.467753 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.467809 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.467825 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.467853 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.467870 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:45Z","lastTransitionTime":"2026-03-13T13:57:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.571167 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.571229 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.571245 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.571272 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.571292 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:45Z","lastTransitionTime":"2026-03-13T13:57:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.674975 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.675061 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.675088 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.675117 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.675139 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:45Z","lastTransitionTime":"2026-03-13T13:57:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.739097 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.739101 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.739327 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:57:45 crc kubenswrapper[4898]: E0313 13:57:45.739322 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:57:45 crc kubenswrapper[4898]: E0313 13:57:45.739431 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:57:45 crc kubenswrapper[4898]: E0313 13:57:45.739499 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.763462 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\",\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, /tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c885
6446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:45Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.778346 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.778423 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.778447 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.778477 4898 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.778500 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:45Z","lastTransitionTime":"2026-03-13T13:57:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.782442 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:45Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.796965 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:45Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.810749 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:45Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.831941 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc4f3def-518e-420d-95b2-e9d7f3caf3b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088f42f6f19b31579419481228562c142aa1480795b6ffa4bd1cba05cb35cb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2720007bcc3ac54674e20042cdbbe64be6cd24b9556c841e609d541065f883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d062f36316872a1ce1d211650b661771b3ef7ff25e420a9db22c3191160fa9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://157541b623c7dc0eeb157191f93fb07af61fb66891ac3231f90f14690179cb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffea82d1334de10deacc44ca75008d2dc4f59ca58355bd166f2034d24a0f5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:45Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.847350 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:45Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.868537 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T13:57:45Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.880426 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.880489 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.880504 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.880525 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.880536 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:45Z","lastTransitionTime":"2026-03-13T13:57:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.883414 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:45Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.983377 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.983423 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.983434 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.983452 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.983464 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:45Z","lastTransitionTime":"2026-03-13T13:57:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.086089 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.086187 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.086207 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.086677 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.086880 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:46Z","lastTransitionTime":"2026-03-13T13:57:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.190694 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.190797 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.190868 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.190940 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.190961 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:46Z","lastTransitionTime":"2026-03-13T13:57:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.294015 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.294084 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.294103 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.294133 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.294151 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:46Z","lastTransitionTime":"2026-03-13T13:57:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.397659 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.397742 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.397768 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.397797 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.397820 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:46Z","lastTransitionTime":"2026-03-13T13:57:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.500788 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.500844 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.500856 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.500877 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.500890 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:46Z","lastTransitionTime":"2026-03-13T13:57:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.604458 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.604503 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.604512 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.604529 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.604540 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:46Z","lastTransitionTime":"2026-03-13T13:57:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.707429 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.707484 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.707496 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.707512 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.707523 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:46Z","lastTransitionTime":"2026-03-13T13:57:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.811011 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.811086 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.811103 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.811128 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.811145 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:46Z","lastTransitionTime":"2026-03-13T13:57:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.914690 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.914753 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.914772 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.914799 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.914816 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:46Z","lastTransitionTime":"2026-03-13T13:57:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.917308 4898 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.017825 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.017887 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.017944 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.017971 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.017991 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:47Z","lastTransitionTime":"2026-03-13T13:57:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.120499 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.120557 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.120573 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.120597 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.120644 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:47Z","lastTransitionTime":"2026-03-13T13:57:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.222894 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.222975 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.222991 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.223014 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.223031 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:47Z","lastTransitionTime":"2026-03-13T13:57:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.325298 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.325357 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.325373 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.325395 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.325412 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:47Z","lastTransitionTime":"2026-03-13T13:57:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.422459 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:57:47 crc kubenswrapper[4898]: E0313 13:57:47.422748 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-13 13:57:55.422729623 +0000 UTC m=+110.424317862 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.427335 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.427456 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.427604 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.427697 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.427783 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:47Z","lastTransitionTime":"2026-03-13T13:57:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.524370 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.524465 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.524564 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.524621 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:57:47 crc kubenswrapper[4898]: E0313 13:57:47.524811 4898 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Mar 13 13:57:47 crc kubenswrapper[4898]: E0313 13:57:47.524964 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 13:57:55.524889777 +0000 UTC m=+110.526478056 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 13:57:47 crc kubenswrapper[4898]: E0313 13:57:47.525109 4898 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 13:57:47 crc kubenswrapper[4898]: E0313 13:57:47.525231 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 13:57:47 crc kubenswrapper[4898]: E0313 13:57:47.525336 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 13:57:47 crc kubenswrapper[4898]: E0313 13:57:47.525250 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 13:57:55.525228095 +0000 UTC m=+110.526816334 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 13:57:47 crc kubenswrapper[4898]: E0313 13:57:47.525374 4898 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 13:57:47 crc kubenswrapper[4898]: E0313 13:57:47.525491 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-13 13:57:55.52545706 +0000 UTC m=+110.527045339 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 13:57:47 crc kubenswrapper[4898]: E0313 13:57:47.525514 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 13:57:47 crc kubenswrapper[4898]: E0313 13:57:47.525579 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 13:57:47 crc kubenswrapper[4898]: E0313 13:57:47.525602 4898 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 13:57:47 crc kubenswrapper[4898]: E0313 13:57:47.525663 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-13 13:57:55.525645524 +0000 UTC m=+110.527233843 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.530080 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.530112 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.530125 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.530141 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.530154 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:47Z","lastTransitionTime":"2026-03-13T13:57:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.632921 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.633298 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.633439 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.633604 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.633798 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:47Z","lastTransitionTime":"2026-03-13T13:57:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.737018 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.737083 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.737105 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.737130 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.737149 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:47Z","lastTransitionTime":"2026-03-13T13:57:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.738547 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:57:47 crc kubenswrapper[4898]: E0313 13:57:47.738702 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.738713 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:57:47 crc kubenswrapper[4898]: E0313 13:57:47.738818 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.738561 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:57:47 crc kubenswrapper[4898]: E0313 13:57:47.738949 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.840125 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.840175 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.840192 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.840218 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.840235 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:47Z","lastTransitionTime":"2026-03-13T13:57:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.942749 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.942829 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.942854 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.942884 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.942946 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:47Z","lastTransitionTime":"2026-03-13T13:57:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.046034 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.046084 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.046095 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.046120 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.046131 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:48Z","lastTransitionTime":"2026-03-13T13:57:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.148482 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.148521 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.148530 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.148543 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.148554 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:48Z","lastTransitionTime":"2026-03-13T13:57:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.251174 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.251223 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.251234 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.251253 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.251265 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:48Z","lastTransitionTime":"2026-03-13T13:57:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.354813 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.354883 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.354940 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.354973 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.354995 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:48Z","lastTransitionTime":"2026-03-13T13:57:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.417042 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-xpbhb"] Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.417522 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-xpbhb" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.420352 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.420779 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.421566 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.433153 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0427af73-3ee1-4f8b-aa31-915d8ff53e94-hosts-file\") pod \"node-resolver-xpbhb\" (UID: \"0427af73-3ee1-4f8b-aa31-915d8ff53e94\") " pod="openshift-dns/node-resolver-xpbhb" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.433227 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndhq7\" (UniqueName: \"kubernetes.io/projected/0427af73-3ee1-4f8b-aa31-915d8ff53e94-kube-api-access-ndhq7\") pod \"node-resolver-xpbhb\" (UID: \"0427af73-3ee1-4f8b-aa31-915d8ff53e94\") " pod="openshift-dns/node-resolver-xpbhb" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.452578 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc4f3def-518e-420d-95b2-e9d7f3caf3b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088f42f6f19b31579419481228562c142aa1480795b6ffa4bd1cba05cb35cb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2720007bcc3ac54674e20042cdbbe64be6cd24b9556c841e609d541065f883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d062f36316872a1ce1d211650b661771b3ef7ff25e420a9db22c3191160fa9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://157541b623c7dc0eeb157191f93fb07af61fb66891ac3231f90f14690179cb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffea82d1334de10deacc44ca75008d2dc4f59ca58355bd166f2034d24a0f5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:48Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.457667 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.457725 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.457738 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.457758 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.457773 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:48Z","lastTransitionTime":"2026-03-13T13:57:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.466691 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:48Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.483186 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T13:57:48Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.503246 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:48Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.515952 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpbhb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0427af73-3ee1-4f8b-aa31-915d8ff53e94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpbhb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:48Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.534423 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0427af73-3ee1-4f8b-aa31-915d8ff53e94-hosts-file\") pod \"node-resolver-xpbhb\" (UID: \"0427af73-3ee1-4f8b-aa31-915d8ff53e94\") " pod="openshift-dns/node-resolver-xpbhb" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.534486 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndhq7\" (UniqueName: 
\"kubernetes.io/projected/0427af73-3ee1-4f8b-aa31-915d8ff53e94-kube-api-access-ndhq7\") pod \"node-resolver-xpbhb\" (UID: \"0427af73-3ee1-4f8b-aa31-915d8ff53e94\") " pod="openshift-dns/node-resolver-xpbhb" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.534741 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0427af73-3ee1-4f8b-aa31-915d8ff53e94-hosts-file\") pod \"node-resolver-xpbhb\" (UID: \"0427af73-3ee1-4f8b-aa31-915d8ff53e94\") " pod="openshift-dns/node-resolver-xpbhb" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.535592 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\",\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, /tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c885
6446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:48Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.547465 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:48Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.552476 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndhq7\" (UniqueName: \"kubernetes.io/projected/0427af73-3ee1-4f8b-aa31-915d8ff53e94-kube-api-access-ndhq7\") pod \"node-resolver-xpbhb\" (UID: \"0427af73-3ee1-4f8b-aa31-915d8ff53e94\") " pod="openshift-dns/node-resolver-xpbhb" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.560334 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.560410 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.560423 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.560446 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" 
Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.560459 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:48Z","lastTransitionTime":"2026-03-13T13:57:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.563385 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMoun
ts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:48Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.577760 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:48Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.663034 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.663078 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.663093 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.663110 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.663123 4898 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:48Z","lastTransitionTime":"2026-03-13T13:57:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.741044 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-xpbhb" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.765210 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.765262 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.765278 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.765309 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.765325 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:48Z","lastTransitionTime":"2026-03-13T13:57:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.803420 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-6llfs"] Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.803961 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-6llfs" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.804142 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-8k6xj"] Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.804696 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-5qb65"] Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.805279 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.805715 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-5qb65" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.809016 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.809380 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.809585 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.810860 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.810971 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.811116 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.811141 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.811410 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.811438 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.811764 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.812107 4898 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.812774 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.832936 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6llfs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e521c857-9711-4f68-886f-38b233d7b05b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzgmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6llfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:48Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.837456 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-system-cni-dir\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.837492 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27j5r\" (UniqueName: \"kubernetes.io/projected/e527967a-003e-4dbe-aade-d9f882239cb0-kube-api-access-27j5r\") pod \"multus-additional-cni-plugins-5qb65\" (UID: \"e527967a-003e-4dbe-aade-d9f882239cb0\") " pod="openshift-multus/multus-additional-cni-plugins-5qb65" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.837554 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzgmp\" (UniqueName: \"kubernetes.io/projected/e521c857-9711-4f68-886f-38b233d7b05b-kube-api-access-mzgmp\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs" Mar 13 13:57:48 crc 
kubenswrapper[4898]: I0313 13:57:48.837604 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e521c857-9711-4f68-886f-38b233d7b05b-multus-daemon-config\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.837708 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-host-var-lib-cni-multus\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.837777 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-hostroot\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.837846 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-multus-cni-dir\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.838227 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-os-release\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.838313 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/767eecef-3bc9-4db4-a0cb-5d9c8554c62d-mcd-auth-proxy-config\") pod \"machine-config-daemon-8k6xj\" (UID: \"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\") " pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.838366 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcf5f\" (UniqueName: \"kubernetes.io/projected/767eecef-3bc9-4db4-a0cb-5d9c8554c62d-kube-api-access-kcf5f\") pod \"machine-config-daemon-8k6xj\" (UID: \"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\") " pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.838409 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e521c857-9711-4f68-886f-38b233d7b05b-cni-binary-copy\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.838452 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-host-var-lib-kubelet\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.838503 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e527967a-003e-4dbe-aade-d9f882239cb0-cnibin\") pod \"multus-additional-cni-plugins-5qb65\" (UID: \"e527967a-003e-4dbe-aade-d9f882239cb0\") " pod="openshift-multus/multus-additional-cni-plugins-5qb65" Mar 13 
13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.838557 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/767eecef-3bc9-4db4-a0cb-5d9c8554c62d-rootfs\") pod \"machine-config-daemon-8k6xj\" (UID: \"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\") " pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.838603 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-host-run-netns\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.838655 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-cnibin\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.838703 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-etc-kubernetes\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.838745 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/767eecef-3bc9-4db4-a0cb-5d9c8554c62d-proxy-tls\") pod \"machine-config-daemon-8k6xj\" (UID: \"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\") " pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" Mar 13 13:57:48 crc kubenswrapper[4898]: 
I0313 13:57:48.838818 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-multus-socket-dir-parent\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.838890 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-host-run-k8s-cni-cncf-io\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.839038 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-host-var-lib-cni-bin\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.839104 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e527967a-003e-4dbe-aade-d9f882239cb0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5qb65\" (UID: \"e527967a-003e-4dbe-aade-d9f882239cb0\") " pod="openshift-multus/multus-additional-cni-plugins-5qb65" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.839160 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-host-run-multus-certs\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs" Mar 13 
13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.839222 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e527967a-003e-4dbe-aade-d9f882239cb0-os-release\") pod \"multus-additional-cni-plugins-5qb65\" (UID: \"e527967a-003e-4dbe-aade-d9f882239cb0\") " pod="openshift-multus/multus-additional-cni-plugins-5qb65" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.839382 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-multus-conf-dir\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.839422 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e527967a-003e-4dbe-aade-d9f882239cb0-cni-binary-copy\") pod \"multus-additional-cni-plugins-5qb65\" (UID: \"e527967a-003e-4dbe-aade-d9f882239cb0\") " pod="openshift-multus/multus-additional-cni-plugins-5qb65" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.839487 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e527967a-003e-4dbe-aade-d9f882239cb0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5qb65\" (UID: \"e527967a-003e-4dbe-aade-d9f882239cb0\") " pod="openshift-multus/multus-additional-cni-plugins-5qb65" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.839545 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e527967a-003e-4dbe-aade-d9f882239cb0-system-cni-dir\") pod \"multus-additional-cni-plugins-5qb65\" (UID: 
\"e527967a-003e-4dbe-aade-d9f882239cb0\") " pod="openshift-multus/multus-additional-cni-plugins-5qb65" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.850863 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\",\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, /tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c885
6446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:48Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.863984 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:48Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.869369 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.869403 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.869414 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.869430 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.869444 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:48Z","lastTransitionTime":"2026-03-13T13:57:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.880005 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:48Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.893819 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:48Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.917702 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc4f3def-518e-420d-95b2-e9d7f3caf3b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088f42f6f19b31579419481228562c142aa1480795b6ffa4bd1cba05cb35cb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2720007bcc3ac54674e20042cdbbe64be6cd24b9556c841e609d541065f883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d062f36316872a1ce1d211650b661771b3ef7ff25e420a9db22c3191160fa9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://157541b623c7dc0eeb157191f93fb07af61fb66891ac3231f90f14690179cb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffea82d1334de10deacc44ca75008d2dc4f59ca58355bd166f2034d24a0f5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:48Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.934492 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:48Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.940036 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-os-release\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.940080 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcf5f\" (UniqueName: 
\"kubernetes.io/projected/767eecef-3bc9-4db4-a0cb-5d9c8554c62d-kube-api-access-kcf5f\") pod \"machine-config-daemon-8k6xj\" (UID: \"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\") " pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.940098 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e521c857-9711-4f68-886f-38b233d7b05b-cni-binary-copy\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.940117 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-host-var-lib-kubelet\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.940132 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e527967a-003e-4dbe-aade-d9f882239cb0-cnibin\") pod \"multus-additional-cni-plugins-5qb65\" (UID: \"e527967a-003e-4dbe-aade-d9f882239cb0\") " pod="openshift-multus/multus-additional-cni-plugins-5qb65" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.940147 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/767eecef-3bc9-4db4-a0cb-5d9c8554c62d-mcd-auth-proxy-config\") pod \"machine-config-daemon-8k6xj\" (UID: \"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\") " pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.940161 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: 
\"kubernetes.io/host-path/767eecef-3bc9-4db4-a0cb-5d9c8554c62d-rootfs\") pod \"machine-config-daemon-8k6xj\" (UID: \"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\") " pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.940177 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-host-run-netns\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.940191 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-cnibin\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.940206 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-etc-kubernetes\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.940221 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-multus-socket-dir-parent\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.940240 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-host-run-k8s-cni-cncf-io\") pod \"multus-6llfs\" (UID: 
\"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.940256 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-host-var-lib-cni-bin\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.940320 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e527967a-003e-4dbe-aade-d9f882239cb0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5qb65\" (UID: \"e527967a-003e-4dbe-aade-d9f882239cb0\") " pod="openshift-multus/multus-additional-cni-plugins-5qb65" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.940355 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/767eecef-3bc9-4db4-a0cb-5d9c8554c62d-proxy-tls\") pod \"machine-config-daemon-8k6xj\" (UID: \"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\") " pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.940376 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-host-run-multus-certs\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.940394 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e527967a-003e-4dbe-aade-d9f882239cb0-os-release\") pod \"multus-additional-cni-plugins-5qb65\" (UID: \"e527967a-003e-4dbe-aade-d9f882239cb0\") " 
pod="openshift-multus/multus-additional-cni-plugins-5qb65" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.940414 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-multus-conf-dir\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.940433 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e527967a-003e-4dbe-aade-d9f882239cb0-cni-binary-copy\") pod \"multus-additional-cni-plugins-5qb65\" (UID: \"e527967a-003e-4dbe-aade-d9f882239cb0\") " pod="openshift-multus/multus-additional-cni-plugins-5qb65" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.940465 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e527967a-003e-4dbe-aade-d9f882239cb0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5qb65\" (UID: \"e527967a-003e-4dbe-aade-d9f882239cb0\") " pod="openshift-multus/multus-additional-cni-plugins-5qb65" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.940501 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e527967a-003e-4dbe-aade-d9f882239cb0-system-cni-dir\") pod \"multus-additional-cni-plugins-5qb65\" (UID: \"e527967a-003e-4dbe-aade-d9f882239cb0\") " pod="openshift-multus/multus-additional-cni-plugins-5qb65" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.940526 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27j5r\" (UniqueName: \"kubernetes.io/projected/e527967a-003e-4dbe-aade-d9f882239cb0-kube-api-access-27j5r\") pod \"multus-additional-cni-plugins-5qb65\" (UID: 
\"e527967a-003e-4dbe-aade-d9f882239cb0\") " pod="openshift-multus/multus-additional-cni-plugins-5qb65" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.940544 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-system-cni-dir\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.940567 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzgmp\" (UniqueName: \"kubernetes.io/projected/e521c857-9711-4f68-886f-38b233d7b05b-kube-api-access-mzgmp\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.940590 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e521c857-9711-4f68-886f-38b233d7b05b-multus-daemon-config\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.940606 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-hostroot\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.940623 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-multus-cni-dir\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.940638 
4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-host-var-lib-cni-multus\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.940709 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-host-var-lib-cni-multus\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.940788 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-os-release\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.941038 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-multus-socket-dir-parent\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.941035 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-host-run-netns\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.941095 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-host-run-multus-certs\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.941111 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-cnibin\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.941143 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-etc-kubernetes\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.941169 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e527967a-003e-4dbe-aade-d9f882239cb0-os-release\") pod \"multus-additional-cni-plugins-5qb65\" (UID: \"e527967a-003e-4dbe-aade-d9f882239cb0\") " pod="openshift-multus/multus-additional-cni-plugins-5qb65" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.941170 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-host-var-lib-cni-bin\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.941190 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-host-run-k8s-cni-cncf-io\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " 
pod="openshift-multus/multus-6llfs" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.941218 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-multus-conf-dir\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.941630 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e521c857-9711-4f68-886f-38b233d7b05b-cni-binary-copy\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.941744 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-host-var-lib-kubelet\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.941776 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e527967a-003e-4dbe-aade-d9f882239cb0-cnibin\") pod \"multus-additional-cni-plugins-5qb65\" (UID: \"e527967a-003e-4dbe-aade-d9f882239cb0\") " pod="openshift-multus/multus-additional-cni-plugins-5qb65" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.941998 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/767eecef-3bc9-4db4-a0cb-5d9c8554c62d-rootfs\") pod \"machine-config-daemon-8k6xj\" (UID: \"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\") " pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.942023 4898 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e527967a-003e-4dbe-aade-d9f882239cb0-system-cni-dir\") pod \"multus-additional-cni-plugins-5qb65\" (UID: \"e527967a-003e-4dbe-aade-d9f882239cb0\") " pod="openshift-multus/multus-additional-cni-plugins-5qb65" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.942320 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-hostroot\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.942363 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e527967a-003e-4dbe-aade-d9f882239cb0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5qb65\" (UID: \"e527967a-003e-4dbe-aade-d9f882239cb0\") " pod="openshift-multus/multus-additional-cni-plugins-5qb65" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.942386 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-system-cni-dir\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.942375 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/767eecef-3bc9-4db4-a0cb-5d9c8554c62d-mcd-auth-proxy-config\") pod \"machine-config-daemon-8k6xj\" (UID: \"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\") " pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.942433 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" 
(UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-multus-cni-dir\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.942633 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e527967a-003e-4dbe-aade-d9f882239cb0-cni-binary-copy\") pod \"multus-additional-cni-plugins-5qb65\" (UID: \"e527967a-003e-4dbe-aade-d9f882239cb0\") " pod="openshift-multus/multus-additional-cni-plugins-5qb65" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.942646 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e521c857-9711-4f68-886f-38b233d7b05b-multus-daemon-config\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.946779 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/767eecef-3bc9-4db4-a0cb-5d9c8554c62d-proxy-tls\") pod \"machine-config-daemon-8k6xj\" (UID: \"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\") " pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.950165 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T13:57:48Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.954787 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e527967a-003e-4dbe-aade-d9f882239cb0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5qb65\" (UID: \"e527967a-003e-4dbe-aade-d9f882239cb0\") " pod="openshift-multus/multus-additional-cni-plugins-5qb65" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.959643 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcf5f\" (UniqueName: \"kubernetes.io/projected/767eecef-3bc9-4db4-a0cb-5d9c8554c62d-kube-api-access-kcf5f\") pod \"machine-config-daemon-8k6xj\" (UID: \"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\") " pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.960267 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27j5r\" (UniqueName: \"kubernetes.io/projected/e527967a-003e-4dbe-aade-d9f882239cb0-kube-api-access-27j5r\") pod \"multus-additional-cni-plugins-5qb65\" (UID: \"e527967a-003e-4dbe-aade-d9f882239cb0\") " pod="openshift-multus/multus-additional-cni-plugins-5qb65" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.964914 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzgmp\" (UniqueName: \"kubernetes.io/projected/e521c857-9711-4f68-886f-38b233d7b05b-kube-api-access-mzgmp\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.968230 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:48Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.971838 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.971924 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.971943 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.971966 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.971982 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:48Z","lastTransitionTime":"2026-03-13T13:57:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.981056 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpbhb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0427af73-3ee1-4f8b-aa31-915d8ff53e94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpbhb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:48Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.996497 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:48Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.012691 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e527967a-003e-4dbe-aade-d9f882239cb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni 
whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\
\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,
\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5qb65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.048009 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc4f3def-518e-420d-95b2-e9d7f3caf3b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088f42f6f19b31579419481228562c142aa1480795b6ffa4bd1cba05cb35cb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2720007bcc3ac54674e20042cdbbe64be6cd24b9556c841e609d541065f883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d062f36316872a1ce1d211650b661771b3ef7ff25e420a9db22c3191160fa9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://157541b623c7dc0eeb157191f93fb07af61fb66891ac3231f90f14690179cb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffea82d1334de10deacc44ca75008d2dc4f59ca58355bd166f2034d24a0f5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.058184 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.070240 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\",\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, /tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c885
6446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.074103 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.074144 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.074157 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.074175 4898 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.074186 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:49Z","lastTransitionTime":"2026-03-13T13:57:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.083102 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.094483 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.105591 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6llfs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e521c857-9711-4f68-886f-38b233d7b05b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzgmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6llfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.117404 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.128637 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T13:57:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.133384 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.140189 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpbhb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0427af73-3ee1-4f8b-aa31-915d8ff53e94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpbhb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:49 crc kubenswrapper[4898]: W0313 13:57:49.143486 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod767eecef_3bc9_4db4_a0cb_5d9c8554c62d.slice/crio-7a5d54afb2c298e1aa4a9af903c6e73f9527913eda210875caefada7de43e746 WatchSource:0}: Error finding container 7a5d54afb2c298e1aa4a9af903c6e73f9527913eda210875caefada7de43e746: Status 404 returned error can't find the container with id 7a5d54afb2c298e1aa4a9af903c6e73f9527913eda210875caefada7de43e746 Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 
13:57:49.146497 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-5qb65" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.153225 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8k6xj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.154279 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-6llfs" Mar 13 13:57:49 crc kubenswrapper[4898]: W0313 13:57:49.158918 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode527967a_003e_4dbe_aade_d9f882239cb0.slice/crio-b2e9795f336934fef1373b0978db654178a41d18411372f547a148cefc86f61f WatchSource:0}: Error finding container b2e9795f336934fef1373b0978db654178a41d18411372f547a148cefc86f61f: Status 404 returned error can't find the container with id b2e9795f336934fef1373b0978db654178a41d18411372f547a148cefc86f61f Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.176329 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.176392 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.176408 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.176432 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.176450 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:49Z","lastTransitionTime":"2026-03-13T13:57:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.184776 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-qqqs5"] Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.186162 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.188284 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.188462 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.188491 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.188714 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.188783 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.189152 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.189244 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.200981 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.212349 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T13:57:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.232034 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpbhb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0427af73-3ee1-4f8b-aa31-915d8ff53e94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpbhb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.244215 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-run-ovn\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.244293 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-env-overrides\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.244322 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-ovn-node-metrics-cert\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.244360 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-run-openvswitch\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.244381 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-var-lib-openvswitch\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.244415 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-systemd-units\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.244437 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-slash\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.244460 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-log-socket\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.244488 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-etc-openvswitch\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.244507 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-node-log\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.244530 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-cni-netd\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.244559 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-cni-bin\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.244578 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-ovnkube-script-lib\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.244598 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-run-systemd\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.244619 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-kubelet\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.244642 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-run-netns\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.244664 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-run-ovn-kubernetes\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.244684 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-ovnkube-config\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.244705 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tc944\" (UniqueName: \"kubernetes.io/projected/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-kube-api-access-tc944\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.244725 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.254396 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8k6xj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.281934 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.281973 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.281988 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.282006 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.282020 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:49Z","lastTransitionTime":"2026-03-13T13:57:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.290708 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.315734 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e527967a-003e-4dbe-aade-d9f882239cb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5qb65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.318378 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-xpbhb" event={"ID":"0427af73-3ee1-4f8b-aa31-915d8ff53e94","Type":"ContainerStarted","Data":"dd8b394eb4b9a2d5b8a55dfe9073e48789bb72fb2dab87f76f41c48c1714feab"} Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.318548 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-xpbhb" event={"ID":"0427af73-3ee1-4f8b-aa31-915d8ff53e94","Type":"ContainerStarted","Data":"e47906de971555f60254b72cc3296db77b315aa8afb69dc2bdc11926d7fe4f38"} Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.319763 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6llfs" event={"ID":"e521c857-9711-4f68-886f-38b233d7b05b","Type":"ContainerStarted","Data":"c012adc2c459677f7c64d2810bccb2824067ed9f0356d0d528ffe20e674f8d93"} Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.320730 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerStarted","Data":"7a5d54afb2c298e1aa4a9af903c6e73f9527913eda210875caefada7de43e746"} Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.321694 4898 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" event={"ID":"e527967a-003e-4dbe-aade-d9f882239cb0","Type":"ContainerStarted","Data":"b2e9795f336934fef1373b0978db654178a41d18411372f547a148cefc86f61f"} Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.335100 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc4f3def-518e-420d-95b2-e9d7f3caf3b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088f42f6f19b31579419481228562c142aa1480795b6ffa4bd1cba05cb35cb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"s
tatic-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2720007bcc3ac54674e20042cdbbe64be6cd24b9556c841e609d541065f883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d062f36316872a1ce1d211650b661771b3ef7ff25e420a9db22c3191160fa9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://157541b623c7dc0eeb157191f93fb07af61fb668
91ac3231f90f14690179cb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffea82d1334de10deacc44ca75008d2dc4f59ca58355bd166f2034d24a0f5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b5
4b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{
\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.345204 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-run-ovn\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.345251 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-env-overrides\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.345279 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" 
(UniqueName: \"kubernetes.io/secret/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-ovn-node-metrics-cert\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.345314 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-run-openvswitch\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.345327 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-run-ovn\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.345394 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-systemd-units\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.345347 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-systemd-units\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.345445 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-slash\") pod 
\"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.345447 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-run-openvswitch\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.345465 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-var-lib-openvswitch\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.345497 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-var-lib-openvswitch\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.345521 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-log-socket\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.345551 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-log-socket\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.345557 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-etc-openvswitch\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.345587 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-etc-openvswitch\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.345588 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-node-log\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.345622 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-cni-netd\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.345529 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-slash\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.345656 
4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-cni-bin\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.345658 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-cni-netd\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.345622 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-node-log\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.345703 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-cni-bin\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.345787 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-ovnkube-script-lib\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.345854 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-run-systemd\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.345879 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-kubelet\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.345954 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-run-systemd\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.345970 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-kubelet\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.345998 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-env-overrides\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.346030 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-run-netns\") pod \"ovnkube-node-qqqs5\" (UID: 
\"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.346066 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-run-netns\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.346092 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-run-ovn-kubernetes\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.346110 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-ovnkube-config\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.346124 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tc944\" (UniqueName: \"kubernetes.io/projected/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-kube-api-access-tc944\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.346144 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qqqs5\" 
(UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.346209 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-run-ovn-kubernetes\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.346317 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.346388 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-ovnkube-script-lib\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.347154 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-ovnkube-config\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.347963 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.351463 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-ovn-node-metrics-cert\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.362709 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tc944\" (UniqueName: \"kubernetes.io/projected/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-kube-api-access-tc944\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.363786 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\",\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, /tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c885
6446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.379905 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.386586 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.386629 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.386641 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.386659 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.386673 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:49Z","lastTransitionTime":"2026-03-13T13:57:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.393464 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.406623 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6llfs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e521c857-9711-4f68-886f-38b233d7b05b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzgmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6llfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.424112 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qqqs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.489531 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.489577 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.489586 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.489600 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.489610 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:49Z","lastTransitionTime":"2026-03-13T13:57:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.512872 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: W0313 13:57:49.553727 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7d6afc0_d9b5_41b2_a55f_57621c300cbb.slice/crio-064d66ce778a8d0d979727a052c6e1249a726f86c9609bd927debcbbf5923b70 WatchSource:0}: Error finding container 064d66ce778a8d0d979727a052c6e1249a726f86c9609bd927debcbbf5923b70: Status 404 returned error can't find the container with id 064d66ce778a8d0d979727a052c6e1249a726f86c9609bd927debcbbf5923b70 Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.593283 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.593326 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.593336 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.593373 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.593383 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:49Z","lastTransitionTime":"2026-03-13T13:57:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.695462 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.695502 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.695514 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.695535 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.695544 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:49Z","lastTransitionTime":"2026-03-13T13:57:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.739350 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:57:49 crc kubenswrapper[4898]: E0313 13:57:49.739537 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.739599 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:57:49 crc kubenswrapper[4898]: E0313 13:57:49.739805 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.740014 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:57:49 crc kubenswrapper[4898]: E0313 13:57:49.740787 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.798271 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.798312 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.798321 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.798343 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.798355 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:49Z","lastTransitionTime":"2026-03-13T13:57:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.900999 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.901494 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.901507 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.901523 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.901535 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:49Z","lastTransitionTime":"2026-03-13T13:57:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.006010 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.006053 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.006064 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.006082 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.006094 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:50Z","lastTransitionTime":"2026-03-13T13:57:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.108356 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.108566 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.108626 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.108686 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.108769 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:50Z","lastTransitionTime":"2026-03-13T13:57:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.211814 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.211857 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.211869 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.211888 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.211920 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:50Z","lastTransitionTime":"2026-03-13T13:57:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.314234 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.314275 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.314283 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.314299 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.314309 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:50Z","lastTransitionTime":"2026-03-13T13:57:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.327431 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6llfs" event={"ID":"e521c857-9711-4f68-886f-38b233d7b05b","Type":"ContainerStarted","Data":"de3eb6944d1978b4471d9968724d0312047079b10ae547df513246554e8a705f"} Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.329875 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerStarted","Data":"6a81ec4e1d92c8ebd60a1e31e39fc5a639e7561e63cdd9ec99ada92fe3896522"} Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.329962 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerStarted","Data":"8568b0d4122e606a851ae23a97395b29784dd29138349c86f59191196e7b0f56"} Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.331432 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" event={"ID":"e527967a-003e-4dbe-aade-d9f882239cb0","Type":"ContainerStarted","Data":"04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8"} Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.333347 4898 generic.go:334] "Generic (PLEG): container finished" podID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerID="dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786" exitCode=0 Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.333469 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" event={"ID":"e7d6afc0-d9b5-41b2-a55f-57621c300cbb","Type":"ContainerDied","Data":"dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786"} Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.333572 4898 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" event={"ID":"e7d6afc0-d9b5-41b2-a55f-57621c300cbb","Type":"ContainerStarted","Data":"064d66ce778a8d0d979727a052c6e1249a726f86c9609bd927debcbbf5923b70"} Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.349203 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:50Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.367029 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T13:57:50Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.383092 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpbhb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0427af73-3ee1-4f8b-aa31-915d8ff53e94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpbhb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:50Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.397837 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8k6xj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:50Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.410466 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:50Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.417213 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.417254 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.417265 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.417282 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.417294 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:50Z","lastTransitionTime":"2026-03-13T13:57:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.426455 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e527967a-003e-4dbe-aade-d9f882239cb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5qb65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:50Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.449081 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc4f3def-518e-420d-95b2-e9d7f3caf3b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088f42f6f19b31579419481228562c142aa1480795b6ffa4bd1cba05cb35cb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2720007bcc3ac54674e20042cdbbe64be6cd24b9556c841e609d541065f883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d062f36316872a1ce1d211650b661771b3ef7ff25e420a9db22c3191160fa9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://157541b623c7dc0eeb157191f93fb07af61fb66891ac3231f90f14690179cb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffea82d1334de10deacc44ca75008d2dc4f59ca58355bd166f2034d24a0f5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:50Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.466160 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:50Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.481535 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qqqs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:50Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.499156 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers 
with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"co
ntainerID\\\":\\\"cri-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\",\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, /tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 
builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f19
49572712b67b2286e35b88226587c8856446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:50Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.515729 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:50Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.520689 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.520734 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.520749 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.520772 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.520789 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:50Z","lastTransitionTime":"2026-03-13T13:57:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.531319 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:50Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.545840 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6llfs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e521c857-9711-4f68-886f-38b233d7b05b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de3eb6944d1978b4471d9968724d0312047079b10ae547df513246554e8a705f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzgmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6llfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:50Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.558780 4898 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:50Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.570767 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:50Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.586201 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6llfs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e521c857-9711-4f68-886f-38b233d7b05b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de3eb6944d1978b4471d9968724d0312047079b10ae547df513246554e8a705f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzgmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6llfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:50Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.608187 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qqqs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:50Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.623727 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.623766 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.623775 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.623791 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.623801 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:50Z","lastTransitionTime":"2026-03-13T13:57:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.627335 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\",\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, /tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c885
6446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:50Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.645206 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpbhb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0427af73-3ee1-4f8b-aa31-915d8ff53e94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd8b394eb4b9a2d5b8a55dfe9073e48789bb72fb2dab87f76f41c48c1714feab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpbhb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:50Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.657759 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a81ec4e1d92c8ebd60a1e31e39fc5a639e7561e63cdd9ec99ada92fe3896522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8568b0d4122e606a851ae23a97395b29784dd29138349c86f59191196e7b0f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8k6xj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:50Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.672177 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:50Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.688611 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T13:57:50Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.706107 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e527967a-003e-4dbe-aade-d9f882239cb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69
b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5qb65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:50Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.723529 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:50Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.726536 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.726607 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.726635 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.726665 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.726684 4898 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:50Z","lastTransitionTime":"2026-03-13T13:57:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.741689 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:50Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.761044 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc4f3def-518e-420d-95b2-e9d7f3caf3b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088f42f6f19b31579419481228562c142aa1480795b6ffa4bd1cba05cb35cb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9009227
2e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2720007bcc3ac54674e20042cdbbe64be6cd24b9556c841e609d541065f883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d062f36316872a1ce1d211650b661771b3ef7ff25e420a9db22c3191160fa9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\
"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://157541b623c7dc0eeb157191f93fb07af61fb66891ac3231f90f14690179cb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffea82d1334de10deacc44ca75008d2dc4f59ca58355bd166f2034d24a0f5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\"
,\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5
c4416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:50Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.829990 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.830031 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.830042 4898 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.830063 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.830075 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:50Z","lastTransitionTime":"2026-03-13T13:57:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.933379 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.933462 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.933489 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.933526 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.933552 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:50Z","lastTransitionTime":"2026-03-13T13:57:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.036333 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.036415 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.036438 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.036463 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.036482 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:51Z","lastTransitionTime":"2026-03-13T13:57:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.139696 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.139745 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.139757 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.139781 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.139794 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:51Z","lastTransitionTime":"2026-03-13T13:57:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.241967 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.242027 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.242043 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.242063 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.242078 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:51Z","lastTransitionTime":"2026-03-13T13:57:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.338197 4898 generic.go:334] "Generic (PLEG): container finished" podID="e527967a-003e-4dbe-aade-d9f882239cb0" containerID="04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8" exitCode=0 Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.338271 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" event={"ID":"e527967a-003e-4dbe-aade-d9f882239cb0","Type":"ContainerDied","Data":"04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8"} Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.343867 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.343953 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.343967 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.343984 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.343995 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:51Z","lastTransitionTime":"2026-03-13T13:57:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.345764 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" event={"ID":"e7d6afc0-d9b5-41b2-a55f-57621c300cbb","Type":"ContainerStarted","Data":"d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2"} Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.345867 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" event={"ID":"e7d6afc0-d9b5-41b2-a55f-57621c300cbb","Type":"ContainerStarted","Data":"d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23"} Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.345881 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" event={"ID":"e7d6afc0-d9b5-41b2-a55f-57621c300cbb","Type":"ContainerStarted","Data":"3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61"} Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.345912 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" event={"ID":"e7d6afc0-d9b5-41b2-a55f-57621c300cbb","Type":"ContainerStarted","Data":"0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb"} Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.345926 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" event={"ID":"e7d6afc0-d9b5-41b2-a55f-57621c300cbb","Type":"ContainerStarted","Data":"7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345"} Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.345937 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" event={"ID":"e7d6afc0-d9b5-41b2-a55f-57621c300cbb","Type":"ContainerStarted","Data":"14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453"} Mar 13 13:57:51 crc kubenswrapper[4898]: 
I0313 13:57:51.358573 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpbhb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0427af73-3ee1-4f8b-aa31-915d8ff53e94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd8b394eb4b9a2d5b8a55dfe9073e48789bb72fb2dab87f76f41c48c1714feab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":
\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpbhb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:51Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.372412 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a81ec4e1d92c8ebd60a1e31e39fc5a639e7561e63cdd9ec99ada92fe3896522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318b
deaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8568b0d4122e606a851ae23a97395b29784dd29138349c86f59191196e7b0f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8k6xj\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:51Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.387373 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:51Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.400538 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T13:57:51Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.423939 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e527967a-003e-4dbe-aade-d9f882239cb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5qb65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:51Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.440286 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:51Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.447548 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.447578 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.447587 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.447621 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.447632 4898 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:51Z","lastTransitionTime":"2026-03-13T13:57:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.452440 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:51Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.472763 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc4f3def-518e-420d-95b2-e9d7f3caf3b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088f42f6f19b31579419481228562c142aa1480795b6ffa4bd1cba05cb35cb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9009227
2e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2720007bcc3ac54674e20042cdbbe64be6cd24b9556c841e609d541065f883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d062f36316872a1ce1d211650b661771b3ef7ff25e420a9db22c3191160fa9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\
"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://157541b623c7dc0eeb157191f93fb07af61fb66891ac3231f90f14690179cb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffea82d1334de10deacc44ca75008d2dc4f59ca58355bd166f2034d24a0f5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\"
,\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5
c4416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:51Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.489525 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:51Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.501167 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:51Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.515623 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6llfs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e521c857-9711-4f68-886f-38b233d7b05b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de3eb6944d1978b4471d9968724d0312047079b10ae547df513246554e8a705f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzgmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6llfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:51Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.534206 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qqqs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:51Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.548423 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\",\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, /tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c885
6446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:51Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.550056 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.550091 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.550105 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.550163 4898 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.550181 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:51Z","lastTransitionTime":"2026-03-13T13:57:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.653444 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.653491 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.653501 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.653517 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.653528 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:51Z","lastTransitionTime":"2026-03-13T13:57:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.738814 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:57:51 crc kubenswrapper[4898]: E0313 13:57:51.739096 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.739695 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:57:51 crc kubenswrapper[4898]: E0313 13:57:51.739809 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.739984 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:57:51 crc kubenswrapper[4898]: E0313 13:57:51.740107 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.756931 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.756998 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.757023 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.757100 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.757130 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:51Z","lastTransitionTime":"2026-03-13T13:57:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.860262 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.860321 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.860340 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.860363 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.860380 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:51Z","lastTransitionTime":"2026-03-13T13:57:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.963338 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.963779 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.963795 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.963842 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.963856 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:51Z","lastTransitionTime":"2026-03-13T13:57:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.066730 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.066810 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.066832 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.066862 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.066880 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:52Z","lastTransitionTime":"2026-03-13T13:57:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.172997 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.173066 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.173087 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.173111 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.173127 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:52Z","lastTransitionTime":"2026-03-13T13:57:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.276222 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.276313 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.276333 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.276355 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.276370 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:52Z","lastTransitionTime":"2026-03-13T13:57:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.351608 4898 generic.go:334] "Generic (PLEG): container finished" podID="e527967a-003e-4dbe-aade-d9f882239cb0" containerID="52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc" exitCode=0 Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.351675 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" event={"ID":"e527967a-003e-4dbe-aade-d9f882239cb0","Type":"ContainerDied","Data":"52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc"} Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.373770 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\
\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:52Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.379055 4898 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.379097 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.379111 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.379133 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.379146 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:52Z","lastTransitionTime":"2026-03-13T13:57:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.398872 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e527967a-003e-4dbe-aade-d9f882239cb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5qb65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:52Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.423986 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc4f3def-518e-420d-95b2-e9d7f3caf3b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088f42f6f19b31579419481228562c142aa1480795b6ffa4bd1cba05cb35cb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2720007bcc3ac54674e20042cdbbe64be6cd24b9556c841e609d541065f883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d062f36316872a1ce1d211650b661771b3ef7ff25e420a9db22c3191160fa9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://157541b623c7dc0eeb157191f93fb07af61fb66891ac3231f90f14690179cb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffea82d1334de10deacc44ca75008d2dc4f59ca58355bd166f2034d24a0f5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:52Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.445980 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:52Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.463380 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\",\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, /tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c885
6446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:52Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.479705 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:52Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.482145 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.482221 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.482235 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.482258 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.482272 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:52Z","lastTransitionTime":"2026-03-13T13:57:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.495854 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:52Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.511700 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6llfs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e521c857-9711-4f68-886f-38b233d7b05b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de3eb6944d1978b4471d9968724d0312047079b10ae547df513246554e8a705f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzgmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6llfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:52Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.531507 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qqqs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:52Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.546402 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:52Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.562774 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T13:57:52Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.578568 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpbhb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0427af73-3ee1-4f8b-aa31-915d8ff53e94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd8b394eb4b9a2d5b8a55dfe9073e48789bb72fb2dab87f76f41c48c1714feab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-ndhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpbhb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:52Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.585212 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.585253 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.585264 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.585286 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.585298 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:52Z","lastTransitionTime":"2026-03-13T13:57:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.593524 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a81ec4e1d92c8ebd60a1e31e39fc5a639e7561e63cdd9ec99ada92fe3896522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8568b0d4122e606a851ae23a97395b29784dd29138349c86f59191196e7b0f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8k6xj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:52Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.688353 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.688703 4898 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.688725 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.688754 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.688770 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:52Z","lastTransitionTime":"2026-03-13T13:57:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.739859 4898 scope.go:117] "RemoveContainer" containerID="a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.792768 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.792821 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.792834 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.792859 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.792875 4898 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:52Z","lastTransitionTime":"2026-03-13T13:57:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.898224 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.898290 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.898309 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.898346 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.898366 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:52Z","lastTransitionTime":"2026-03-13T13:57:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.000741 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.000778 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.000790 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.000809 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.000822 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:53Z","lastTransitionTime":"2026-03-13T13:57:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.103822 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.103884 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.103931 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.103958 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.103978 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:53Z","lastTransitionTime":"2026-03-13T13:57:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.207747 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.207810 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.207822 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.207842 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.207854 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:53Z","lastTransitionTime":"2026-03-13T13:57:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.310790 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.310840 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.310853 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.310872 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.310884 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:53Z","lastTransitionTime":"2026-03-13T13:57:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.360699 4898 generic.go:334] "Generic (PLEG): container finished" podID="e527967a-003e-4dbe-aade-d9f882239cb0" containerID="f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4" exitCode=0 Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.360846 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" event={"ID":"e527967a-003e-4dbe-aade-d9f882239cb0","Type":"ContainerDied","Data":"f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4"} Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.364563 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.370280 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"58657703efd5d66740cb17c4bde23f3d5377f197fe751ded7d1b0674b2aac5a9"} Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.372747 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.384025 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:53Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.407424 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e527967a-003e-4dbe-aade-d9f882239cb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5qb65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:53Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.414645 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.414690 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.414702 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.414724 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.414737 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:53Z","lastTransitionTime":"2026-03-13T13:57:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.433832 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc4f3def-518e-420d-95b2-e9d7f3caf3b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088f42f6f19b31579419481228562c142aa1480795b6ffa4bd1cba05cb35cb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2720007bcc3ac54674e20042cdbbe64be6cd24b9556c841e609d541065f883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d062f36316872a1ce1d211650b661771b3ef7ff25e420a9db22c3191160fa9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://157541b623c7dc0eeb157191f93fb07af61fb66891ac3231f90f14690179cb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffea82d1334de10deacc44ca75008d2dc4f59ca58355bd166f2034d24a0f5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:53Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.452221 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:53Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.469363 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\",\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, /tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c885
6446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:53Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.484618 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:53Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.501559 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:53Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.517670 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6llfs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e521c857-9711-4f68-886f-38b233d7b05b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de3eb6944d1978b4471d9968724d0312047079b10ae547df513246554e8a705f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzgmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6llfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:53Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.519932 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:53 crc 
kubenswrapper[4898]: I0313 13:57:53.519984 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.519995 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.520016 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.520029 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:53Z","lastTransitionTime":"2026-03-13T13:57:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.537927 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\
\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qqqs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:53Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.551601 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:53Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.563055 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T13:57:53Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.574796 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpbhb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0427af73-3ee1-4f8b-aa31-915d8ff53e94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd8b394eb4b9a2d5b8a55dfe9073e48789bb72fb2dab87f76f41c48c1714feab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-ndhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpbhb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:53Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.586429 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a81ec4e1d92c8ebd60a1e31e39fc5a
639e7561e63cdd9ec99ada92fe3896522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8568b0d4122e606a851ae23a97395b29784dd29138349c86f59191196e7b0f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
6-03-13T13:57:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8k6xj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:53Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.609082 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mo
untPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:53Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.623323 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.623382 4898 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.623398 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.623424 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.623440 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:53Z","lastTransitionTime":"2026-03-13T13:57:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.627042 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e527967a-003e-4dbe-aade-d9f882239cb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5qb65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:53Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.648052 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc4f3def-518e-420d-95b2-e9d7f3caf3b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088f42f6f19b31579419481228562c142aa1480795b6ffa4bd1cba05cb35cb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2720007bcc3ac54674e20042cdbbe64be6cd24b9556c841e609d541065f883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d062f36316872a1ce1d211650b661771b3ef7ff25e420a9db22c3191160fa9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://157541b623c7dc0eeb157191f93fb07af61fb66891ac3231f90f14690179cb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffea82d1334de10deacc44ca75008d2dc4f59ca58355bd166f2034d24a0f5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses
\\\":[{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:53Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.664386 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:53Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.682765 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58657703efd5d66740cb17c4bde23f3d5377f197fe751ded7d1b0674b2aac5a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\",\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, /tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:53Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.698859 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:53Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.715667 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:53Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.726489 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.726534 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.726548 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.726569 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.726582 4898 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:53Z","lastTransitionTime":"2026-03-13T13:57:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.731031 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6llfs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e521c857-9711-4f68-886f-38b233d7b05b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de3eb6944d1978b4471d9968724d0312047079b10ae547df513246554e8a705f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzgmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:
57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6llfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:53Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.738533 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.738620 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:57:53 crc kubenswrapper[4898]: E0313 13:57:53.738661 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.738620 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:57:53 crc kubenswrapper[4898]: E0313 13:57:53.738801 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:57:53 crc kubenswrapper[4898]: E0313 13:57:53.738872 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.753021 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID
\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wai
ting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":
0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\
\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qqqs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:53Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.774058 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:53Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.789216 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T13:57:53Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.804495 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpbhb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0427af73-3ee1-4f8b-aa31-915d8ff53e94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd8b394eb4b9a2d5b8a55dfe9073e48789bb72fb2dab87f76f41c48c1714feab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-ndhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpbhb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:53Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.816891 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a81ec4e1d92c8ebd60a1e31e39fc5a
639e7561e63cdd9ec99ada92fe3896522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8568b0d4122e606a851ae23a97395b29784dd29138349c86f59191196e7b0f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
6-03-13T13:57:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8k6xj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:53Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.829122 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.829174 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.829187 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.829210 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.829224 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:53Z","lastTransitionTime":"2026-03-13T13:57:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.932707 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.932755 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.932765 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.932783 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.932792 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:53Z","lastTransitionTime":"2026-03-13T13:57:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.036558 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.036622 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.036643 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.036667 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.036689 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:54Z","lastTransitionTime":"2026-03-13T13:57:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.139022 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.139075 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.139089 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.139112 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.139129 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:54Z","lastTransitionTime":"2026-03-13T13:57:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.240094 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.240150 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.240164 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.240199 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.240215 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:54Z","lastTransitionTime":"2026-03-13T13:57:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:54 crc kubenswrapper[4898]: E0313 13:57:54.254314 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:54Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.258858 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.258911 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.258925 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.258945 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.258956 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:54Z","lastTransitionTime":"2026-03-13T13:57:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:54 crc kubenswrapper[4898]: E0313 13:57:54.274591 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:54Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.278585 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.278631 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.278640 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.278659 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.278670 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:54Z","lastTransitionTime":"2026-03-13T13:57:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:54 crc kubenswrapper[4898]: E0313 13:57:54.295078 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:54Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.299087 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.299139 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.299151 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.299172 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.299186 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:54Z","lastTransitionTime":"2026-03-13T13:57:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:54 crc kubenswrapper[4898]: E0313 13:57:54.313625 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:54Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.318741 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.318812 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.318826 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.318850 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.318869 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:54Z","lastTransitionTime":"2026-03-13T13:57:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:54 crc kubenswrapper[4898]: E0313 13:57:54.334047 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:54Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:54 crc kubenswrapper[4898]: E0313 13:57:54.334288 4898 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.336172 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.336216 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.336231 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.336253 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.336273 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:54Z","lastTransitionTime":"2026-03-13T13:57:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.377528 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" event={"ID":"e7d6afc0-d9b5-41b2-a55f-57621c300cbb","Type":"ContainerStarted","Data":"86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0"} Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.380293 4898 generic.go:334] "Generic (PLEG): container finished" podID="e527967a-003e-4dbe-aade-d9f882239cb0" containerID="ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c" exitCode=0 Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.380336 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" event={"ID":"e527967a-003e-4dbe-aade-d9f882239cb0","Type":"ContainerDied","Data":"ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c"} Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.408052 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:54Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.439416 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.439457 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.439469 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.439489 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.439522 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:54Z","lastTransitionTime":"2026-03-13T13:57:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.439742 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc4f3def-518e-420d-95b2-e9d7f3caf3b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088f42f6f19b31579419481228562c142aa1480795b6ffa4bd1cba05cb35cb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2720007bcc3ac54674e20042cdbbe64be6cd24b9556c841e609d541065f883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d062f36316872a1ce1d211650b661771b3ef7ff25e420a9db22c3191160fa9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://157541b623c7dc0eeb157191f93fb07af61fb66891ac3231f90f14690179cb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffea82d1334de10deacc44ca75008d2dc4f59ca58355bd166f2034d24a0f5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:54Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.455086 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:54Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.467308 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:54Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.481936 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6llfs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e521c857-9711-4f68-886f-38b233d7b05b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de3eb6944d1978b4471d9968724d0312047079b10ae547df513246554e8a705f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzgmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6llfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:54Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.500173 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qqqs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:54Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.519268 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58657703efd5d66740cb17c4bde23f3d5377f197fe751ded7d1b0674b2aac5a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\",\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, /tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:54Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.532714 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T13:57:54Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.543475 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.543545 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.543575 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.543642 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.543658 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:54Z","lastTransitionTime":"2026-03-13T13:57:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.544384 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpbhb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0427af73-3ee1-4f8b-aa31-915d8ff53e94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd8b394eb4b9a2d5b8a55dfe9073e48789bb72fb2dab87f76f41c48c1714feab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndhq7\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpbhb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:54Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.556543 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a81ec4e1d92c8ebd60a1e31e39fc5a639e7561e63cdd9ec99ada92fe3896522\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8568b0d4122e606a851ae23a97395b29784dd29138349c86f59191196e7b0f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-8k6xj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:54Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.587999 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:54Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.618208 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:54Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.638921 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e527967a-003e-4dbe-aade-d9f882239cb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5qb65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:54Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.646583 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.646619 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.646629 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.646645 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.646655 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:54Z","lastTransitionTime":"2026-03-13T13:57:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.750198 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.750248 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.750257 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.750276 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.750286 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:54Z","lastTransitionTime":"2026-03-13T13:57:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.853158 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.853215 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.853234 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.853260 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.853274 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:54Z","lastTransitionTime":"2026-03-13T13:57:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.956273 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.956346 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.956369 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.956398 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.956441 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:54Z","lastTransitionTime":"2026-03-13T13:57:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.059759 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.059855 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.059879 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.059961 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.059986 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:55Z","lastTransitionTime":"2026-03-13T13:57:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.163048 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.163547 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.163751 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.163991 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.164160 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:55Z","lastTransitionTime":"2026-03-13T13:57:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.205581 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-b46ld"] Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.206447 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-b46ld" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.209802 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.210329 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.211749 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.212129 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.224321 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b46ld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1f79182-c06d-47d7-bed8-109c0cc4784e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b46ld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.248060 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.267919 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.267975 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.267990 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.268013 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.268026 4898 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:55Z","lastTransitionTime":"2026-03-13T13:57:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.270449 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e527967a-003e-4dbe-aade-d9f882239cb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-5qb65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.306205 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc4f3def-518e-420d-95b2-e9d7f3caf3b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088f42f6f19b31579419481228562c142aa1480795b6ffa4bd1cba05cb35cb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
6-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2720007bcc3ac54674e20042cdbbe64be6cd24b9556c841e609d541065f883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d062f36316872a1ce1d211650b661771b3ef7ff25e420a9db22c3191160fa9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://157541b623c7dc0eeb157191f93fb07af61fb66891ac3231f90f14690179cb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffea82d1334de10deacc44ca75008d2dc4f59ca58355bd166f2034d24a0f5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-r
esources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.318311 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgr4p\" (UniqueName: \"kubernetes.io/projected/a1f79182-c06d-47d7-bed8-109c0cc4784e-kube-api-access-fgr4p\") pod \"node-ca-b46ld\" (UID: \"a1f79182-c06d-47d7-bed8-109c0cc4784e\") " pod="openshift-image-registry/node-ca-b46ld" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.318739 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a1f79182-c06d-47d7-bed8-109c0cc4784e-host\") pod \"node-ca-b46ld\" (UID: \"a1f79182-c06d-47d7-bed8-109c0cc4784e\") " pod="openshift-image-registry/node-ca-b46ld" Mar 13 13:57:55 crc 
kubenswrapper[4898]: I0313 13:57:55.318973 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a1f79182-c06d-47d7-bed8-109c0cc4784e-serviceca\") pod \"node-ca-b46ld\" (UID: \"a1f79182-c06d-47d7-bed8-109c0cc4784e\") " pod="openshift-image-registry/node-ca-b46ld" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.329770 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.348146 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.366156 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6llfs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e521c857-9711-4f68-886f-38b233d7b05b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de3eb6944d1978b4471d9968724d0312047079b10ae547df513246554e8a705f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzgmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6llfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.371730 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:55 crc 
kubenswrapper[4898]: I0313 13:57:55.371814 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.371841 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.371873 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.371895 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:55Z","lastTransitionTime":"2026-03-13T13:57:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.388581 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" event={"ID":"e527967a-003e-4dbe-aade-d9f882239cb0","Type":"ContainerStarted","Data":"54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5"} Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.399794 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qqqs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.420670 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgr4p\" (UniqueName: \"kubernetes.io/projected/a1f79182-c06d-47d7-bed8-109c0cc4784e-kube-api-access-fgr4p\") pod \"node-ca-b46ld\" (UID: \"a1f79182-c06d-47d7-bed8-109c0cc4784e\") " pod="openshift-image-registry/node-ca-b46ld" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.421341 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a1f79182-c06d-47d7-bed8-109c0cc4784e-host\") pod \"node-ca-b46ld\" (UID: \"a1f79182-c06d-47d7-bed8-109c0cc4784e\") " pod="openshift-image-registry/node-ca-b46ld" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.421414 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a1f79182-c06d-47d7-bed8-109c0cc4784e-host\") pod \"node-ca-b46ld\" (UID: \"a1f79182-c06d-47d7-bed8-109c0cc4784e\") " pod="openshift-image-registry/node-ca-b46ld" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.421679 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/a1f79182-c06d-47d7-bed8-109c0cc4784e-serviceca\") pod \"node-ca-b46ld\" (UID: \"a1f79182-c06d-47d7-bed8-109c0cc4784e\") " pod="openshift-image-registry/node-ca-b46ld" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.423208 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a1f79182-c06d-47d7-bed8-109c0cc4784e-serviceca\") pod \"node-ca-b46ld\" (UID: \"a1f79182-c06d-47d7-bed8-109c0cc4784e\") " pod="openshift-image-registry/node-ca-b46ld" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.424753 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58657703efd5d66740cb17c4bde23f3d5377f197fe751ded7d1b0674b2aac5a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\",\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, /tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.444851 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.447948 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgr4p\" (UniqueName: \"kubernetes.io/projected/a1f79182-c06d-47d7-bed8-109c0cc4784e-kube-api-access-fgr4p\") pod \"node-ca-b46ld\" (UID: \"a1f79182-c06d-47d7-bed8-109c0cc4784e\") " 
pod="openshift-image-registry/node-ca-b46ld" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.460251 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a81ec4e1d92c8ebd60a1e31e39fc5a639e7561e63cdd9ec99ada92fe3896522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-con
fig\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8568b0d4122e606a851ae23a97395b29784dd29138349c86f59191196e7b0f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8k6xj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.474320 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.474366 4898 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.474376 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.474391 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.474403 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:55Z","lastTransitionTime":"2026-03-13T13:57:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.479794 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.493133 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.505456 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpbhb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0427af73-3ee1-4f8b-aa31-915d8ff53e94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd8b394eb4b9a2d5b8a55dfe9073e48789bb72fb2dab87f76f41c48c1714feab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-ndhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpbhb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.522369 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:57:55 crc kubenswrapper[4898]: E0313 13:57:55.522536 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:58:11.522506026 +0000 UTC m=+126.524094315 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.524347 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-b46ld" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.538253 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc4f3def-518e-420d-95b2-e9d7f3caf3b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088f42f6f19b31579419481228562c142aa1480795b6ffa4bd1cba05cb35cb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2720007bcc3ac54674e20042cdbbe64be6cd24b9556c841e609d541065f883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d062f36316872a1ce1d211650b661771b3ef7ff25e420a9db22c3191160fa9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCo
unt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://157541b623c7dc0eeb157191f93fb07af61fb66891ac3231f90f14690179cb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffea82d1334de10deacc44ca75008d2dc4f59ca58355bd166f2034d24a0f5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\
\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0
7b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:55 crc kubenswrapper[4898]: W0313 13:57:55.539247 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1f79182_c06d_47d7_bed8_109c0cc4784e.slice/crio-3f25ce8be07a0a1ad63e2498c31080406671dc931ee116f64d6ba7df745afbc7 WatchSource:0}: Error finding container 3f25ce8be07a0a1ad63e2498c31080406671dc931ee116f64d6ba7df745afbc7: Status 404 returned error can't find the container with id 3f25ce8be07a0a1ad63e2498c31080406671dc931ee116f64d6ba7df745afbc7 Mar 13 13:57:55 
crc kubenswrapper[4898]: I0313 13:57:55.554983 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.571510 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58657703efd5d66740cb17c4bde23f3d5377f197fe751ded7d1b0674b2aac5a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\",\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, /tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.576780 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.576846 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.576856 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.576873 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.577255 4898 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:55Z","lastTransitionTime":"2026-03-13T13:57:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.585208 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.600984 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.615690 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6llfs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e521c857-9711-4f68-886f-38b233d7b05b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de3eb6944d1978b4471d9968724d0312047079b10ae547df513246554e8a705f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzgmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6llfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.623392 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.623461 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.623497 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.623526 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:57:55 crc kubenswrapper[4898]: E0313 13:57:55.623626 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 13:57:55 crc kubenswrapper[4898]: E0313 13:57:55.623658 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 13:57:55 crc kubenswrapper[4898]: E0313 13:57:55.623672 4898 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 13:57:55 crc kubenswrapper[4898]: E0313 13:57:55.623712 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 13:57:55 crc kubenswrapper[4898]: E0313 13:57:55.623730 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-13 13:58:11.623710038 +0000 UTC m=+126.625298447 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 13:57:55 crc kubenswrapper[4898]: E0313 13:57:55.623735 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 13:57:55 crc kubenswrapper[4898]: E0313 13:57:55.623756 4898 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 13:57:55 crc kubenswrapper[4898]: E0313 13:57:55.623790 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-13 13:58:11.6237792 +0000 UTC m=+126.625367649 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 13:57:55 crc kubenswrapper[4898]: E0313 13:57:55.623793 4898 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 13:57:55 crc kubenswrapper[4898]: E0313 13:57:55.623864 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 13:58:11.623841111 +0000 UTC m=+126.625429360 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 13:57:55 crc kubenswrapper[4898]: E0313 13:57:55.623938 4898 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 13:57:55 crc kubenswrapper[4898]: E0313 13:57:55.623977 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-13 13:58:11.623968684 +0000 UTC m=+126.625556943 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.637473 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qqqs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.652406 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.664777 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.678385 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpbhb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0427af73-3ee1-4f8b-aa31-915d8ff53e94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd8b394eb4b9a2d5b8a55dfe9073e48789bb72fb2dab87f76f41c48c1714feab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-ndhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpbhb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.680057 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.680104 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.680118 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.680141 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.680156 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:55Z","lastTransitionTime":"2026-03-13T13:57:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.690759 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a81ec4e1d92c8ebd60a1e31e39fc5a639e7561e63cdd9ec99ada92fe3896522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8568b0d4122e606a851ae23a97395b29784dd29138349c86f59191196e7b0f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8k6xj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.704364 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.719922 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e527967a-003e-4dbe-aade-d9f882239cb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc
84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5qb65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.734625 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b46ld" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1f79182-c06d-47d7-bed8-109c0cc4784e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b46ld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.738972 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.738976 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:57:55 crc kubenswrapper[4898]: E0313 13:57:55.739080 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.739187 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:57:55 crc kubenswrapper[4898]: E0313 13:57:55.739351 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:57:55 crc kubenswrapper[4898]: E0313 13:57:55.739522 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.754939 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://
f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.770627 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e527967a-003e-4dbe-aade-d9f882239cb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j
5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5qb65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z"
Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.783407 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.783444 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.783455 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.783475 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.783491 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:55Z","lastTransitionTime":"2026-03-13T13:57:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.784609 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b46ld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1f79182-c06d-47d7-bed8-109c0cc4784e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b46ld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.798206 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z"
Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.819077 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc4f3def-518e-420d-95b2-e9d7f3caf3b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088f42f6f19b31579419481228562c142aa1480795b6ffa4bd1cba05cb35cb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc
/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2720007bcc3ac54674e20042cdbbe64be6cd24b9556c841e609d541065f883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d062f36316872a1ce1d211650b661771b3ef7ff25e420a9db22c3191160fa9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://157541b623c7dc0eeb157191f93fb07af61fb66891ac3231f90f14690179cb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f
58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffea82d1334de10deacc44ca75008d2dc4f59ca58355bd166f2034d24a0f5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b
4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"20
26-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z"
Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.835229 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z"
Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.851073 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z"
Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.869408 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6llfs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e521c857-9711-4f68-886f-38b233d7b05b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de3eb6944d1978b4471d9968724d0312047079b10ae547df513246554e8a705f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzgmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6llfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z"
Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.886747 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.886789 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.886799 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.886819 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.886830 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:55Z","lastTransitionTime":"2026-03-13T13:57:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.888668 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\
\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qqqs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.903981 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58657703efd5d66740cb17c4bde23f3d5377f197fe751ded7d1b0674b2aac5a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\",\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, 
/tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\
\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.916191 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.926810 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpbhb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0427af73-3ee1-4f8b-aa31-915d8ff53e94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd8b394eb4b9a2d5b8a55dfe9073e48789bb72fb2dab87f76f41c48c1714feab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-ndhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpbhb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.939465 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a81ec4e1d92c8ebd60a1e31e39fc5a
639e7561e63cdd9ec99ada92fe3896522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8568b0d4122e606a851ae23a97395b29784dd29138349c86f59191196e7b0f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
6-03-13T13:57:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8k6xj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.958603 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.989472 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.989565 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.989582 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.989612 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.989625 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:55Z","lastTransitionTime":"2026-03-13T13:57:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.092263 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.092296 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.092304 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.092320 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.092330 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:56Z","lastTransitionTime":"2026-03-13T13:57:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.194564 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.194596 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.194605 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.194619 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.194629 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:56Z","lastTransitionTime":"2026-03-13T13:57:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.298016 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.298106 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.298118 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.298136 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.298146 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:56Z","lastTransitionTime":"2026-03-13T13:57:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.397622 4898 generic.go:334] "Generic (PLEG): container finished" podID="e527967a-003e-4dbe-aade-d9f882239cb0" containerID="54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5" exitCode=0 Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.397681 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" event={"ID":"e527967a-003e-4dbe-aade-d9f882239cb0","Type":"ContainerDied","Data":"54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5"} Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.400219 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.400246 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.400258 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.400275 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.400292 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:56Z","lastTransitionTime":"2026-03-13T13:57:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.401084 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-b46ld" event={"ID":"a1f79182-c06d-47d7-bed8-109c0cc4784e","Type":"ContainerStarted","Data":"cfa39cb1a5f792a575519b3616cff170f5e303a1bf05b207578725aa1711117b"} Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.401164 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-b46ld" event={"ID":"a1f79182-c06d-47d7-bed8-109c0cc4784e","Type":"ContainerStarted","Data":"3f25ce8be07a0a1ad63e2498c31080406671dc931ee116f64d6ba7df745afbc7"} Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.409632 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" event={"ID":"e7d6afc0-d9b5-41b2-a55f-57621c300cbb","Type":"ContainerStarted","Data":"119d07b211c82506e4dbdf9d44df46ea5c76a0b52e5a4c5effe24306c70dac7e"} Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.410029 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.410048 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.410064 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.438519 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc4f3def-518e-420d-95b2-e9d7f3caf3b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088f42f6f19b31579419481228562c142aa1480795b6ffa4bd1cba05cb35cb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2720007bcc3ac54674e20042cdbbe64be6cd24b9556c841e609d541065f883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d062f36316872a1ce1d211650b661771b3ef7ff25e420a9db22c3191160fa9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://157541b623c7dc0eeb157191f93fb07af61fb66891ac3231f90f14690179cb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffea82d1334de10deacc44ca75008d2dc4f59ca58355bd166f2034d24a0f5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:56Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.443593 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.443707 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.461088 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:56Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.488430 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:56Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.504724 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.504762 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.504773 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.504792 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.504805 4898 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:56Z","lastTransitionTime":"2026-03-13T13:57:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.511094 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6llfs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e521c857-9711-4f68-886f-38b233d7b05b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de3eb6944d1978b4471d9968724d0312047079b10ae547df513246554e8a705f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzgmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:
57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6llfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:56Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.541054 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qqqs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:56Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.558979 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58657703efd5d66740cb17c4bde23f3d5377f197fe751ded7d1b0674b2aac5a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\",\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, /tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:56Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.575465 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:56Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.591971 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a81ec4e1d92c8ebd60a1e31e39fc5a639e7561e63cdd9ec99ada92fe3896522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8568b0d4122e606a851ae23a97395b29784dd291
38349c86f59191196e7b0f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8k6xj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:56Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.615204 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:56Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.615685 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.615722 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.615730 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.615744 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.615753 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:56Z","lastTransitionTime":"2026-03-13T13:57:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.634726 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:56Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.649413 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpbhb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0427af73-3ee1-4f8b-aa31-915d8ff53e94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd8b394eb4b9a2d5b8a55dfe9073e48789bb72fb2dab87f76f41c48c1714feab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpbhb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:56Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.663731 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b46ld" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1f79182-c06d-47d7-bed8-109c0cc4784e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b46ld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:56Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.679422 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:56Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.696431 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e527967a-003e-4dbe-aade-d9f882239cb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5qb65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:56Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.712441 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:56Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.719152 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.719193 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.719205 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.719227 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.719242 4898 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:56Z","lastTransitionTime":"2026-03-13T13:57:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.731059 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e527967a-003e-4dbe-aade-d9f882239cb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5qb65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:56Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.742711 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b46ld" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1f79182-c06d-47d7-bed8-109c0cc4784e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa39cb1a5f792a575519b3616cff170f5e303a1bf05b207578725aa1711117b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b46ld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:56Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.764877 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc4f3def-518e-420d-95b2-e9d7f3caf3b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088f42f6f19b31579419481228562c142aa1480795b6ffa4bd1cba05cb35cb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2720007bcc3ac54674e20042cdbbe64be6cd24b9556c841e609d541065f883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d062f36316872a1ce1d211650b661771b3ef7ff25e420a9db22c3191160fa9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13
:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://157541b623c7dc0eeb157191f93fb07af61fb66891ac3231f90f14690179cb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffea82d1334de10deacc44ca75008d2dc4f59ca58355bd166f2034d24a0f5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:56Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.779478 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:56Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.799691 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://119d07b211c82506e4dbdf9d44df46ea5c76a0b52e5a4c5effe24306c70dac7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qqqs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:56Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.813448 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58657703efd5d66740cb17c4bde23f3d5377f197fe751ded7d1b0674b2aac5a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\",\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, /tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:56Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.822096 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.822139 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.822152 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.822170 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.822185 4898 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:56Z","lastTransitionTime":"2026-03-13T13:57:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.834146 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:56Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.853761 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:56Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.873729 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6llfs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e521c857-9711-4f68-886f-38b233d7b05b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de3eb6944d1978b4471d9968724d0312047079b10ae547df513246554e8a705f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzgmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6llfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:56Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.893403 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:56Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.910552 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T13:57:56Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.923072 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpbhb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0427af73-3ee1-4f8b-aa31-915d8ff53e94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd8b394eb4b9a2d5b8a55dfe9073e48789bb72fb2dab87f76f41c48c1714feab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-ndhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpbhb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:56Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.924840 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.924876 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.924888 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.924923 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.924936 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:56Z","lastTransitionTime":"2026-03-13T13:57:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.937578 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a81ec4e1d92c8ebd60a1e31e39fc5a639e7561e63cdd9ec99ada92fe3896522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8568b0d4122e606a851ae23a97395b29784dd29138349c86f59191196e7b0f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8k6xj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:56Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.028595 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.028671 4898 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.028687 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.028720 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.028737 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:57Z","lastTransitionTime":"2026-03-13T13:57:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.131771 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.131814 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.131827 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.131844 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.131856 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:57Z","lastTransitionTime":"2026-03-13T13:57:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.235160 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.235267 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.235289 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.235316 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.235335 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:57Z","lastTransitionTime":"2026-03-13T13:57:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.338418 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.338476 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.338492 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.338517 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.338533 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:57Z","lastTransitionTime":"2026-03-13T13:57:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.418980 4898 generic.go:334] "Generic (PLEG): container finished" podID="e527967a-003e-4dbe-aade-d9f882239cb0" containerID="dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd" exitCode=0 Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.419095 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" event={"ID":"e527967a-003e-4dbe-aade-d9f882239cb0","Type":"ContainerDied","Data":"dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd"} Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.441767 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.441824 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.441836 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.441852 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.441862 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:57Z","lastTransitionTime":"2026-03-13T13:57:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.448800 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc4f3def-518e-420d-95b2-e9d7f3caf3b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088f42f6f19b31579419481228562c142aa1480795b6ffa4bd1cba05cb35cb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2720007bcc3ac54674e20042cdbbe64be6cd24b9556c841e609d541065f883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d062f36316872a1ce1d211650b661771b3ef7ff25e420a9db22c3191160fa9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://157541b623c7dc0eeb157191f93fb07af61fb66891ac3231f90f14690179cb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffea82d1334de10deacc44ca75008d2dc4f59ca58355bd166f2034d24a0f5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:57Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.468785 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:57Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.488660 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58657703efd5d66740cb17c4bde23f3d5377f197fe751ded7d1b0674b2aac5a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\",\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, /tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:57Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.506299 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:57Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.529876 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:57Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.545232 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.545281 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.545299 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.545324 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.545344 4898 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:57Z","lastTransitionTime":"2026-03-13T13:57:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.553450 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6llfs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e521c857-9711-4f68-886f-38b233d7b05b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de3eb6944d1978b4471d9968724d0312047079b10ae547df513246554e8a705f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzgmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:
57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6llfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:57Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.581320 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://119d07b211c82506e4dbdf9d44df46ea5c76a0b52e5a4c5effe24306c70dac7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qqqs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:57Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.600313 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:57Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.621029 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T13:57:57Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.638267 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpbhb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0427af73-3ee1-4f8b-aa31-915d8ff53e94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd8b394eb4b9a2d5b8a55dfe9073e48789bb72fb2dab87f76f41c48c1714feab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-ndhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpbhb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:57Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.650568 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.650863 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.651029 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.651153 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.651272 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:57Z","lastTransitionTime":"2026-03-13T13:57:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.661482 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a81ec4e1d92c8ebd60a1e31e39fc5a639e7561e63cdd9ec99ada92fe3896522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8568b0d4122e606a851ae23a97395b29784dd29138349c86f59191196e7b0f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8k6xj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:57Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.684871 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:57Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.702575 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e527967a-003e-4dbe-aade-d9f882239cb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5qb65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:57Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.721220 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b46ld" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1f79182-c06d-47d7-bed8-109c0cc4784e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa39cb1a5f792a575519b3616cff170f5e303a1bf05b207578725aa1711117b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b46ld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:57Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.739519 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.739564 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.739609 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:57:57 crc kubenswrapper[4898]: E0313 13:57:57.739698 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:57:57 crc kubenswrapper[4898]: E0313 13:57:57.739890 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:57:57 crc kubenswrapper[4898]: E0313 13:57:57.740002 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.755209 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.755264 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.755273 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.755295 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.755306 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:57Z","lastTransitionTime":"2026-03-13T13:57:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.858140 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.858601 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.858611 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.858628 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.858639 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:57Z","lastTransitionTime":"2026-03-13T13:57:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.972350 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.972395 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.972407 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.972426 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.972436 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:57Z","lastTransitionTime":"2026-03-13T13:57:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.075915 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.075966 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.075976 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.075994 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.076006 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:58Z","lastTransitionTime":"2026-03-13T13:57:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.178521 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.178558 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.178570 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.178588 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.178598 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:58Z","lastTransitionTime":"2026-03-13T13:57:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.282492 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.282544 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.282560 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.282581 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.282594 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:58Z","lastTransitionTime":"2026-03-13T13:57:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.384964 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.385008 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.385017 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.385032 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.385043 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:58Z","lastTransitionTime":"2026-03-13T13:57:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.424591 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" event={"ID":"e527967a-003e-4dbe-aade-d9f882239cb0","Type":"ContainerStarted","Data":"b6e0093b9e7670d289e34bc225cf1650906fdaf57e7d3c83ab8897fd0eed7204"} Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.438286 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:58Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.458118 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc4f3def-518e-420d-95b2-e9d7f3caf3b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088f42f6f19b31579419481228562c142aa1480795b6ffa4bd1cba05cb35cb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a673147
31ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2720007bcc3ac54674e20042cdbbe64be6cd24b9556c841e609d541065f883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d062f36316872a1ce1d211650b661771b3ef7ff25e420a9db22c3191160fa9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://157541b623c7dc0eeb157191f93fb07af61fb66891ac3231f90f14690179cb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffea82d1334de10deacc44ca75008d2dc4f59ca58355bd166f2034d24a0f5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"image
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:58Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.470415 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:58Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.482634 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:58Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.487754 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.487786 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.487797 4898 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.487816 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.487830 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:58Z","lastTransitionTime":"2026-03-13T13:57:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.498853 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6llfs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e521c857-9711-4f68-886f-38b233d7b05b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de3eb6944d1978b4471d9968724d0312047079b10ae
547df513246554e8a705f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernet
es.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzgmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6llfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:58Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.524060 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with 
unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-nod
e-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ov
n-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://119d07b211c82506e4dbdf9d44df46ea5c76a0b52e5a4c5effe24306c70dac7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d77
3257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"p
hase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qqqs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:58Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.549738 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58657703efd5d66740cb17c4bde23f3d5377f197fe751ded7d1b0674b2aac5a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\",\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, /tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:58Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.561631 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T13:57:58Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.571759 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpbhb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0427af73-3ee1-4f8b-aa31-915d8ff53e94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd8b394eb4b9a2d5b8a55dfe9073e48789bb72fb2dab87f76f41c48c1714feab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-ndhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpbhb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:58Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.584612 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a81ec4e1d92c8ebd60a1e31e39fc5a
639e7561e63cdd9ec99ada92fe3896522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8568b0d4122e606a851ae23a97395b29784dd29138349c86f59191196e7b0f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
6-03-13T13:57:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8k6xj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:58Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.591156 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.591198 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.591212 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.591234 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.591247 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:58Z","lastTransitionTime":"2026-03-13T13:57:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.596414 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:58Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.609192 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:58Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.627045 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e527967a-003e-4dbe-aade-d9f882239cb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e0093b9e7670d289e34bc225cf1650906fdaf57e7d3c83ab8897fd0eed7204\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae3d8
e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5qb65\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:58Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.639455 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b46ld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1f79182-c06d-47d7-bed8-109c0cc4784e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa39cb1a5f792a575519b3616cff170f5e303a1bf05b207578725aa1711117b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-13T13:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b46ld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:58Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.693633 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.693683 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.693692 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.693713 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.693723 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:58Z","lastTransitionTime":"2026-03-13T13:57:58Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.796960 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.797004 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.797014 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.797029 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.797041 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:58Z","lastTransitionTime":"2026-03-13T13:57:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.900495 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.900572 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.900597 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.900627 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.900650 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:58Z","lastTransitionTime":"2026-03-13T13:57:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.003215 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.003253 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.003266 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.003283 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.003294 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:59Z","lastTransitionTime":"2026-03-13T13:57:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.107041 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.107155 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.107179 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.107209 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.107231 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:59Z","lastTransitionTime":"2026-03-13T13:57:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.210787 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.210842 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.210857 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.210883 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.210918 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:59Z","lastTransitionTime":"2026-03-13T13:57:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.313189 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.313233 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.313250 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.313273 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.313290 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:59Z","lastTransitionTime":"2026-03-13T13:57:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.416079 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.416122 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.416137 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.416160 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.416179 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:59Z","lastTransitionTime":"2026-03-13T13:57:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.430827 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qqqs5_e7d6afc0-d9b5-41b2-a55f-57621c300cbb/ovnkube-controller/0.log" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.434786 4898 generic.go:334] "Generic (PLEG): container finished" podID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerID="119d07b211c82506e4dbdf9d44df46ea5c76a0b52e5a4c5effe24306c70dac7e" exitCode=1 Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.434991 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" event={"ID":"e7d6afc0-d9b5-41b2-a55f-57621c300cbb","Type":"ContainerDied","Data":"119d07b211c82506e4dbdf9d44df46ea5c76a0b52e5a4c5effe24306c70dac7e"} Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.436236 4898 scope.go:117] "RemoveContainer" containerID="119d07b211c82506e4dbdf9d44df46ea5c76a0b52e5a4c5effe24306c70dac7e" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.453455 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:59Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.474273 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e527967a-003e-4dbe-aade-d9f882239cb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e0093b9e7670d289e34bc225cf1650906fdaf57e7d3c83ab8897fd0eed7204\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae3d8
e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5qb65\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:59Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.487695 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b46ld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1f79182-c06d-47d7-bed8-109c0cc4784e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa39cb1a5f792a575519b3616cff170f5e303a1bf05b207578725aa1711117b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-13T13:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b46ld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:59Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.503463 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:59Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.518979 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.519065 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.519078 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.519102 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.519121 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:59Z","lastTransitionTime":"2026-03-13T13:57:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.522858 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc4f3def-518e-420d-95b2-e9d7f3caf3b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088f42f6f19b31579419481228562c142aa1480795b6ffa4bd1cba05cb35cb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2720007bcc3ac54674e20042cdbbe64be6cd24b9556c841e609d541065f883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d062f36316872a1ce1d211650b661771b3ef7ff25e420a9db22c3191160fa9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://157541b623c7dc0eeb157191f93fb07af61fb66891ac3231f90f14690179cb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffea82d1334de10deacc44ca75008d2dc4f59ca58355bd166f2034d24a0f5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:59Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.539817 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:59Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.594055 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:59Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.613177 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6llfs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e521c857-9711-4f68-886f-38b233d7b05b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de3eb6944d1978b4471d9968724d0312047079b10ae547df513246554e8a705f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzgmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6llfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:59Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.621546 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:59 crc 
kubenswrapper[4898]: I0313 13:57:59.621726 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.621862 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.622026 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.622150 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:59Z","lastTransitionTime":"2026-03-13T13:57:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.635757 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://119d07b211c82506e4dbdf9d44df46ea5c76a0b52e5a4c5effe24306c70dac7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://119d07b211c82506e4dbdf9d44df46ea5c76a0b52e5a4c5effe24306c70dac7e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T13
:57:59Z\\\",\\\"message\\\":\\\"tes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0313 13:57:59.029739 6719 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0313 13:57:59.030788 6719 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0313 13:57:59.030840 6719 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0313 13:57:59.030855 6719 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0313 13:57:59.030975 6719 factory.go:656] Stopping watch factory\\\\nI0313 13:57:59.031099 6719 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0313 13:57:59.031118 6719 handler.go:208] Removed *v1.Node event handler 7\\\\nI0313 13:57:59.031129 6719 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0313 13:57:59.031126 6719 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0313 13:57:59.031137 6719 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 13:57:59.031251 6719 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 13:57:59.031254 6719 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"
/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o
://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qqqs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:59Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.655628 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58657703efd5d66740cb17c4bde23f3d5377f197fe751ded7d1b0674b2aac5a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\",\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, 
/tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\
\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:59Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.671635 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T13:57:59Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.689113 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpbhb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0427af73-3ee1-4f8b-aa31-915d8ff53e94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd8b394eb4b9a2d5b8a55dfe9073e48789bb72fb2dab87f76f41c48c1714feab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-ndhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpbhb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:59Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.702261 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a81ec4e1d92c8ebd60a1e31e39fc5a
639e7561e63cdd9ec99ada92fe3896522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8568b0d4122e606a851ae23a97395b29784dd29138349c86f59191196e7b0f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
6-03-13T13:57:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8k6xj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:59Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.716959 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:59Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.724776 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.724841 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.724853 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.724873 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.724886 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:59Z","lastTransitionTime":"2026-03-13T13:57:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.738997 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.739045 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.739047 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 13 13:57:59 crc kubenswrapper[4898]: E0313 13:57:59.739141 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 13 13:57:59 crc kubenswrapper[4898]: E0313 13:57:59.739333 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 13 13:57:59 crc kubenswrapper[4898]: E0313 13:57:59.739553 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.827984 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.828031 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.828043 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.828060 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.828074 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:59Z","lastTransitionTime":"2026-03-13T13:57:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.930605 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.930646 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.930655 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.930671 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.930681 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:59Z","lastTransitionTime":"2026-03-13T13:57:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.033605 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.033650 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.033661 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.033680 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.033691 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:00Z","lastTransitionTime":"2026-03-13T13:58:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.136554 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.136655 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.136673 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.136702 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.137085 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:00Z","lastTransitionTime":"2026-03-13T13:58:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.241195 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.241273 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.241296 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.241352 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.241373 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:00Z","lastTransitionTime":"2026-03-13T13:58:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.344657 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.344735 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.344762 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.344792 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.344813 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:00Z","lastTransitionTime":"2026-03-13T13:58:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.442048 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qqqs5_e7d6afc0-d9b5-41b2-a55f-57621c300cbb/ovnkube-controller/1.log"
Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.442778 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qqqs5_e7d6afc0-d9b5-41b2-a55f-57621c300cbb/ovnkube-controller/0.log"
Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.446637 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.446683 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.446695 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.446715 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.446729 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:00Z","lastTransitionTime":"2026-03-13T13:58:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.448067 4898 generic.go:334] "Generic (PLEG): container finished" podID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerID="14175f8dff64bc803d0edb3738c3c781867aa64407c7b7b8be708ab08e220d4f" exitCode=1
Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.448137 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" event={"ID":"e7d6afc0-d9b5-41b2-a55f-57621c300cbb","Type":"ContainerDied","Data":"14175f8dff64bc803d0edb3738c3c781867aa64407c7b7b8be708ab08e220d4f"}
Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.448253 4898 scope.go:117] "RemoveContainer" containerID="119d07b211c82506e4dbdf9d44df46ea5c76a0b52e5a4c5effe24306c70dac7e"
Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.449560 4898 scope.go:117] "RemoveContainer" containerID="14175f8dff64bc803d0edb3738c3c781867aa64407c7b7b8be708ab08e220d4f"
Mar 13 13:58:00 crc kubenswrapper[4898]: E0313 13:58:00.450028 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-qqqs5_openshift-ovn-kubernetes(e7d6afc0-d9b5-41b2-a55f-57621c300cbb)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb"
Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.487714 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc4f3def-518e-420d-95b2-e9d7f3caf3b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088f42f6f19b31579419481228562c142aa1480795b6ffa4bd1cba05cb35cb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2720007bcc3ac54674e20042cdbbe64be6cd24b9556c841e609d541065f883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d062f36316872a1ce1d211650b661771b3ef7ff25e420a9db22c3191160fa9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://157541b623c7dc0eeb157191f93fb07af61fb66891ac3231f90f14690179cb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffea82d1334de10deacc44ca75008d2dc4f59ca58355bd166f2034d24a0f5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:00Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.523262 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:00Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.550035 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.550091 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.550105 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.550130 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.550147 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:00Z","lastTransitionTime":"2026-03-13T13:58:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.557183 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6llfs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e521c857-9711-4f68-886f-38b233d7b05b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de3eb6944d1978b4471d9968724d0312047079b10ae547df513246554e8a705f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\
\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzgmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6llfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:00Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.576536 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14175f8dff64bc803d0edb3738c3c781867aa64407c7b7b8be708ab08e220d4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://119d07b211c82506e4dbdf9d44df46ea5c76a0b52e5a4c5effe24306c70dac7e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T13:57:59Z\\\",\\\"message\\\":\\\"tes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0313 13:57:59.029739 6719 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0313 13:57:59.030788 6719 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0313 13:57:59.030840 6719 handler.go:190] Sending *v1.Node event 
handler 2 for removal\\\\nI0313 13:57:59.030855 6719 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0313 13:57:59.030975 6719 factory.go:656] Stopping watch factory\\\\nI0313 13:57:59.031099 6719 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0313 13:57:59.031118 6719 handler.go:208] Removed *v1.Node event handler 7\\\\nI0313 13:57:59.031129 6719 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0313 13:57:59.031126 6719 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0313 13:57:59.031137 6719 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 13:57:59.031251 6719 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 13:57:59.031254 6719 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14175f8dff64bc803d0edb3738c3c781867aa64407c7b7b8be708ab08e220d4f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T13:58:00Z\\\",\\\"message\\\":\\\"ection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.182:9099:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61d39e4d-21a9-4387-9a2b-fa4ad14792e2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0313 13:58:00.362727 6919 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-image-registry/image-registry\\\\\\\"}\\\\nI0313 
13:58:00.362778 6919 services_controller.go:360] Finished syncing service image-registry on namespace openshift-image-registry for network=default : 2.065295ms\\\\nI0313 13:58:00.362859 6919 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-cluster-version/cluster-version-operator\\\\\\\"}\\\\nI0313 13:58:00.362895 6919 services_controller.go:360] Finished syncing service cluster-version-operator on namespace openshift-cluster-version for network=default : 2.427204ms\\\\nI0313 13:58:00.362944 6919 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0313 13:58:00.363141 6919 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0313 13:58:00.363206 6919 ovnkube.go:599] Stopped ovnkube\\\\nI0313 13:58:00.363241 6919 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0313 13:58:00.363329 6919 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"
mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountP
ath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qqqs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:00Z is after 2025-08-24T17:21:41Z" Mar 13 
13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.591160 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58657703efd5d66740cb17c4bde23f3d5377f197fe751ded7d1b0674b2aac5a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\",\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, /tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:00Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.605484 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:00Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.621528 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:00Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.640079 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:00Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.653485 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.653565 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.653596 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.653626 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.653646 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:00Z","lastTransitionTime":"2026-03-13T13:58:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.656243 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T13:58:00Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.667357 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpbhb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0427af73-3ee1-4f8b-aa31-915d8ff53e94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd8b394eb4b9a2d5b8a55dfe9073e48789bb72fb2dab87f76f41c48c1714feab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-ndhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpbhb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:00Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.681836 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a81ec4e1d92c8ebd60a1e31e39fc5a
639e7561e63cdd9ec99ada92fe3896522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8568b0d4122e606a851ae23a97395b29784dd29138349c86f59191196e7b0f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
6-03-13T13:57:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8k6xj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:00Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.695423 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mo
untPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:00Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.709654 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e527967a-003e-4dbe-aade-d9f882239cb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e0093b9e7670d289e34bc225cf1650906fdaf57e7d3c83ab8897fd0eed7204\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae3d8
e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5qb65\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:00Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.723384 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b46ld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1f79182-c06d-47d7-bed8-109c0cc4784e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa39cb1a5f792a575519b3616cff170f5e303a1bf05b207578725aa1711117b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-13T13:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b46ld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:00Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.755604 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.755635 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.755646 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.755662 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.755671 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:00Z","lastTransitionTime":"2026-03-13T13:58:00Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.858662 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.858745 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.858772 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.858800 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.858821 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:00Z","lastTransitionTime":"2026-03-13T13:58:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.962190 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.962249 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.962265 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.962289 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.962308 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:00Z","lastTransitionTime":"2026-03-13T13:58:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.065530 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.065579 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.065593 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.065611 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.065622 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:01Z","lastTransitionTime":"2026-03-13T13:58:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.168817 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.168855 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.168866 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.168883 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.168915 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:01Z","lastTransitionTime":"2026-03-13T13:58:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.226262 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wh2lt"] Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.227127 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wh2lt" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.230769 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.231801 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.249260 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e527967a-003e-4dbe-aade-d9f882239cb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e0093b9e7670d289e34bc225cf1650906fdaf57e7d3c83ab8897fd0eed7204\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819
eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8
df8bdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b218
3f50f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54e651
84749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"reason\\\":\\\"Compl
eted\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5qb65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:01Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.262891 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b46ld" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1f79182-c06d-47d7-bed8-109c0cc4784e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa39cb1a5f792a575519b3616cff170f5e303a1bf05b207578725aa1711117b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b46ld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:01Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.271518 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.271564 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.271589 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.271619 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.271638 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:01Z","lastTransitionTime":"2026-03-13T13:58:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.276782 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wh2lt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjttd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjttd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:58:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wh2lt\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:01Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.290740 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:01Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.296308 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-wh2lt\" (UID: \"33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wh2lt" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.296362 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-wh2lt\" (UID: \"33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wh2lt" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.296401 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3-env-overrides\") pod \"ovnkube-control-plane-749d76644c-wh2lt\" (UID: \"33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wh2lt" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.296435 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjttd\" (UniqueName: \"kubernetes.io/projected/33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3-kube-api-access-gjttd\") pod \"ovnkube-control-plane-749d76644c-wh2lt\" (UID: \"33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wh2lt" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.305492 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:01Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.330747 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc4f3def-518e-420d-95b2-e9d7f3caf3b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088f42f6f19b31579419481228562c142aa1480795b6ffa4bd1cba05cb35cb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc
/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2720007bcc3ac54674e20042cdbbe64be6cd24b9556c841e609d541065f883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d062f36316872a1ce1d211650b661771b3ef7ff25e420a9db22c3191160fa9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://157541b623c7dc0eeb157191f93fb07af61fb66891ac3231f90f14690179cb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f
58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffea82d1334de10deacc44ca75008d2dc4f59ca58355bd166f2034d24a0f5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b
4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"20
26-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:01Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.344610 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:01Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.365234 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:01Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.374677 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.374768 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.374784 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.374805 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.374823 4898 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:01Z","lastTransitionTime":"2026-03-13T13:58:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.380510 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6llfs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e521c857-9711-4f68-886f-38b233d7b05b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de3eb6944d1978b4471d9968724d0312047079b10ae547df513246554e8a705f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzgmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:
57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6llfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:01Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.397554 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-wh2lt\" (UID: \"33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wh2lt" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.397602 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-wh2lt\" (UID: \"33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wh2lt" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.397657 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3-env-overrides\") pod \"ovnkube-control-plane-749d76644c-wh2lt\" (UID: \"33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wh2lt" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.397691 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjttd\" (UniqueName: \"kubernetes.io/projected/33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3-kube-api-access-gjttd\") pod \"ovnkube-control-plane-749d76644c-wh2lt\" 
(UID: \"33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wh2lt" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.398300 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3-env-overrides\") pod \"ovnkube-control-plane-749d76644c-wh2lt\" (UID: \"33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wh2lt" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.398418 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-wh2lt\" (UID: \"33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wh2lt" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.404111 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14175f8dff64bc803d0edb3738c3c781867aa64407c7b7b8be708ab08e220d4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://119d07b211c82506e4dbdf9d44df46ea5c76a0b52e5a4c5effe24306c70dac7e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T13:57:59Z\\\",\\\"message\\\":\\\"tes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI
0313 13:57:59.029739 6719 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0313 13:57:59.030788 6719 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0313 13:57:59.030840 6719 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0313 13:57:59.030855 6719 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0313 13:57:59.030975 6719 factory.go:656] Stopping watch factory\\\\nI0313 13:57:59.031099 6719 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0313 13:57:59.031118 6719 handler.go:208] Removed *v1.Node event handler 7\\\\nI0313 13:57:59.031129 6719 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0313 13:57:59.031126 6719 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0313 13:57:59.031137 6719 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 13:57:59.031251 6719 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 13:57:59.031254 6719 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14175f8dff64bc803d0edb3738c3c781867aa64407c7b7b8be708ab08e220d4f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T13:58:00Z\\\",\\\"message\\\":\\\"ection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.182:9099:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61d39e4d-21a9-4387-9a2b-fa4ad14792e2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: 
UUIDName:}]\\\\nI0313 13:58:00.362727 6919 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-image-registry/image-registry\\\\\\\"}\\\\nI0313 13:58:00.362778 6919 services_controller.go:360] Finished syncing service image-registry on namespace openshift-image-registry for network=default : 2.065295ms\\\\nI0313 13:58:00.362859 6919 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-cluster-version/cluster-version-operator\\\\\\\"}\\\\nI0313 13:58:00.362895 6919 services_controller.go:360] Finished syncing service cluster-version-operator on namespace openshift-cluster-version for network=default : 2.427204ms\\\\nI0313 13:58:00.362944 6919 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0313 13:58:00.363141 6919 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0313 13:58:00.363206 6919 ovnkube.go:599] Stopped ovnkube\\\\nI0313 13:58:00.363241 6919 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0313 13:58:00.363329 6919 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\
\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779
990d9786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qqqs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:01Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.405159 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-wh2lt\" (UID: \"33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wh2lt" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.413748 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjttd\" (UniqueName: \"kubernetes.io/projected/33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3-kube-api-access-gjttd\") pod \"ovnkube-control-plane-749d76644c-wh2lt\" (UID: \"33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wh2lt" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.420869 4898 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"
name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58657703efd5d66740cb17c4bde23f3d5377f197fe751ded7d1b0674b2aac5a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"qua
y.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\",\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, /tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\"
,\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:01Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.434456 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpbhb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0427af73-3ee1-4f8b-aa31-915d8ff53e94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd8b394eb4b9a2d5b8a55dfe9073e48789bb72fb2dab87f76f41c48c1714feab\\\",\\\"image\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpbhb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:01Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.449254 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a81ec4e1d92c8ebd60a1e31e39fc5a639e7561e63cdd9ec99ada92fe3896522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8568b0d4122e606a851ae23a97395b29784dd291
38349c86f59191196e7b0f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8k6xj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:01Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.453478 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qqqs5_e7d6afc0-d9b5-41b2-a55f-57621c300cbb/ovnkube-controller/1.log" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.466972 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:01Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.477925 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.477992 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.478002 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.478016 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.478027 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:01Z","lastTransitionTime":"2026-03-13T13:58:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.480242 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:01Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.545836 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wh2lt" Mar 13 13:58:01 crc kubenswrapper[4898]: W0313 13:58:01.570452 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33fb2f7c_9abf_45e0_af55_e8f7c09c2dc3.slice/crio-54ed36a1cbac66f502e27d143ae1755d96901a5e3094426f4f3095a6102fc60d WatchSource:0}: Error finding container 54ed36a1cbac66f502e27d143ae1755d96901a5e3094426f4f3095a6102fc60d: Status 404 returned error can't find the container with id 54ed36a1cbac66f502e27d143ae1755d96901a5e3094426f4f3095a6102fc60d Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.584240 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.584299 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.584663 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.584719 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.584746 4898 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:01Z","lastTransitionTime":"2026-03-13T13:58:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.688341 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.688391 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.688405 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.688432 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.688449 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:01Z","lastTransitionTime":"2026-03-13T13:58:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.739595 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.739609 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:58:01 crc kubenswrapper[4898]: E0313 13:58:01.739795 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.739745 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:58:01 crc kubenswrapper[4898]: E0313 13:58:01.739947 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:58:01 crc kubenswrapper[4898]: E0313 13:58:01.740145 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.791290 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.791344 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.791363 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.791386 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.791403 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:01Z","lastTransitionTime":"2026-03-13T13:58:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.895130 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.895180 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.895196 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.895219 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.895236 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:01Z","lastTransitionTime":"2026-03-13T13:58:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.993160 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-fwrwc"] Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.993706 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:01 crc kubenswrapper[4898]: E0313 13:58:01.993767 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.998801 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.998831 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.998841 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.998878 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.998911 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:01Z","lastTransitionTime":"2026-03-13T13:58:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.004463 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9k7q\" (UniqueName: \"kubernetes.io/projected/9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869-kube-api-access-l9k7q\") pod \"network-metrics-daemon-fwrwc\" (UID: \"9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869\") " pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.004508 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869-metrics-certs\") pod \"network-metrics-daemon-fwrwc\" (UID: \"9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869\") " pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.013825 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:02Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.028350 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T13:58:02Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.041187 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpbhb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0427af73-3ee1-4f8b-aa31-915d8ff53e94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd8b394eb4b9a2d5b8a55dfe9073e48789bb72fb2dab87f76f41c48c1714feab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-ndhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpbhb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:02Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.054365 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a81ec4e1d92c8ebd60a1e31e39fc5a
639e7561e63cdd9ec99ada92fe3896522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8568b0d4122e606a851ae23a97395b29784dd29138349c86f59191196e7b0f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
6-03-13T13:57:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8k6xj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:02Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.071433 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wh2lt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjttd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjttd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:58:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wh2lt\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:02Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.094683 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fwrwc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9k7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9k7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fwrwc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:02Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:02 crc 
kubenswrapper[4898]: I0313 13:58:02.101354 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.101491 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.101741 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.102014 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.102211 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:02Z","lastTransitionTime":"2026-03-13T13:58:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.105875 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9k7q\" (UniqueName: \"kubernetes.io/projected/9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869-kube-api-access-l9k7q\") pod \"network-metrics-daemon-fwrwc\" (UID: \"9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869\") " pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.105934 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869-metrics-certs\") pod \"network-metrics-daemon-fwrwc\" (UID: \"9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869\") " pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:02 crc kubenswrapper[4898]: E0313 13:58:02.106096 4898 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 13:58:02 crc kubenswrapper[4898]: E0313 13:58:02.106156 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869-metrics-certs podName:9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869 nodeName:}" failed. No retries permitted until 2026-03-13 13:58:02.606141123 +0000 UTC m=+117.607729362 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869-metrics-certs") pod "network-metrics-daemon-fwrwc" (UID: "9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.113645 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run
/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:02Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.128454 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9k7q\" (UniqueName: \"kubernetes.io/projected/9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869-kube-api-access-l9k7q\") pod \"network-metrics-daemon-fwrwc\" (UID: \"9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869\") " pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 
13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.129740 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e527967a-003e-4dbe-aade-d9f882239cb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e0093b9e7670d289e34bc225cf1650906fdaf57e7d3c83ab8897fd0eed7204\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\
\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a
8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\
":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"
terminated\\\":{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"po
dIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5qb65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:02Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.140919 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b46ld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1f79182-c06d-47d7-bed8-109c0cc4784e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa39cb1a5f792a575519b3616cff170f5e303a1bf05b207578725aa1711117b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235
da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b46ld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:02Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.165282 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc4f3def-518e-420d-95b2-e9d7f3caf3b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088f42f6f19b31579419481228562c142aa1480795b6ffa4bd1cba05cb35cb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2720007bcc3ac54674e20042cdbbe64be6cd24b9556c841e609d541065f883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d062f36316872a1ce1d211650b661771b3ef7ff25e420a9db22c3191160fa9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://157541b623c7dc0eeb157191f93fb07af61fb66891ac3231f90f14690179cb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffea82d1334de10deacc44ca75008d2dc4f59ca58355bd166f2034d24a0f5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:02Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.180115 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:02Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.193984 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6llfs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e521c857-9711-4f68-886f-38b233d7b05b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de3eb6944d1978b4471d9968724d0312047079b10ae547df513246554e8a705f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzgmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6llfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:02Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.205172 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:02 crc 
kubenswrapper[4898]: I0313 13:58:02.205242 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.205272 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.205673 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.205873 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:02Z","lastTransitionTime":"2026-03-13T13:58:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.222398 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14175f8dff64bc803d0edb3738c3c781867aa64407c7b7b8be708ab08e220d4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://119d07b211c82506e4dbdf9d44df46ea5c76a0b52e5a4c5effe24306c70dac7e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T13:57:59Z\\\",\\\"message\\\":\\\"tes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI
0313 13:57:59.029739 6719 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0313 13:57:59.030788 6719 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0313 13:57:59.030840 6719 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0313 13:57:59.030855 6719 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0313 13:57:59.030975 6719 factory.go:656] Stopping watch factory\\\\nI0313 13:57:59.031099 6719 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0313 13:57:59.031118 6719 handler.go:208] Removed *v1.Node event handler 7\\\\nI0313 13:57:59.031129 6719 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0313 13:57:59.031126 6719 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0313 13:57:59.031137 6719 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 13:57:59.031251 6719 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 13:57:59.031254 6719 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14175f8dff64bc803d0edb3738c3c781867aa64407c7b7b8be708ab08e220d4f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T13:58:00Z\\\",\\\"message\\\":\\\"ection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.182:9099:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61d39e4d-21a9-4387-9a2b-fa4ad14792e2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: 
UUIDName:}]\\\\nI0313 13:58:00.362727 6919 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-image-registry/image-registry\\\\\\\"}\\\\nI0313 13:58:00.362778 6919 services_controller.go:360] Finished syncing service image-registry on namespace openshift-image-registry for network=default : 2.065295ms\\\\nI0313 13:58:00.362859 6919 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-cluster-version/cluster-version-operator\\\\\\\"}\\\\nI0313 13:58:00.362895 6919 services_controller.go:360] Finished syncing service cluster-version-operator on namespace openshift-cluster-version for network=default : 2.427204ms\\\\nI0313 13:58:00.362944 6919 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0313 13:58:00.363141 6919 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0313 13:58:00.363206 6919 ovnkube.go:599] Stopped ovnkube\\\\nI0313 13:58:00.363241 6919 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0313 13:58:00.363329 6919 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\
\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779
990d9786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qqqs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:02Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.240088 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58657703efd5d66740cb17c4bde23f3d5377f197fe751ded7d1b0674b2aac5a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\",\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, 
/tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\
\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:02Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.254119 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:02Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.269478 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:02Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.309486 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.309724 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.309845 4898 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.309975 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.310168 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:02Z","lastTransitionTime":"2026-03-13T13:58:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.414003 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.414070 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.414094 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.414125 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.414148 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:02Z","lastTransitionTime":"2026-03-13T13:58:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.465580 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wh2lt" event={"ID":"33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3","Type":"ContainerStarted","Data":"421d50fcd0a69c2b53067ae09bbea100b532174c1d76f641c79c58a5fa3f9a3f"} Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.465953 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wh2lt" event={"ID":"33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3","Type":"ContainerStarted","Data":"73b8846971b589f5619b239746bd2f5953d12af7f3fa6543042da89561930dae"} Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.466044 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wh2lt" event={"ID":"33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3","Type":"ContainerStarted","Data":"54ed36a1cbac66f502e27d143ae1755d96901a5e3094426f4f3095a6102fc60d"} Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.486286 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:02Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.506609 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6llfs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e521c857-9711-4f68-886f-38b233d7b05b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de3eb6944d1978b4471d9968724d0312047079b10ae547df513246554e8a705f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzgmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6llfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:02Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.518036 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:02 crc 
kubenswrapper[4898]: I0313 13:58:02.518112 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.518137 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.518166 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.518188 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:02Z","lastTransitionTime":"2026-03-13T13:58:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.533958 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14175f8dff64bc803d0edb3738c3c781867aa64407c7b7b8be708ab08e220d4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://119d07b211c82506e4dbdf9d44df46ea5c76a0b52e5a4c5effe24306c70dac7e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T13:57:59Z\\\",\\\"message\\\":\\\"tes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI
0313 13:57:59.029739 6719 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0313 13:57:59.030788 6719 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0313 13:57:59.030840 6719 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0313 13:57:59.030855 6719 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0313 13:57:59.030975 6719 factory.go:656] Stopping watch factory\\\\nI0313 13:57:59.031099 6719 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0313 13:57:59.031118 6719 handler.go:208] Removed *v1.Node event handler 7\\\\nI0313 13:57:59.031129 6719 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0313 13:57:59.031126 6719 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0313 13:57:59.031137 6719 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 13:57:59.031251 6719 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 13:57:59.031254 6719 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14175f8dff64bc803d0edb3738c3c781867aa64407c7b7b8be708ab08e220d4f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T13:58:00Z\\\",\\\"message\\\":\\\"ection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.182:9099:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61d39e4d-21a9-4387-9a2b-fa4ad14792e2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: 
UUIDName:}]\\\\nI0313 13:58:00.362727 6919 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-image-registry/image-registry\\\\\\\"}\\\\nI0313 13:58:00.362778 6919 services_controller.go:360] Finished syncing service image-registry on namespace openshift-image-registry for network=default : 2.065295ms\\\\nI0313 13:58:00.362859 6919 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-cluster-version/cluster-version-operator\\\\\\\"}\\\\nI0313 13:58:00.362895 6919 services_controller.go:360] Finished syncing service cluster-version-operator on namespace openshift-cluster-version for network=default : 2.427204ms\\\\nI0313 13:58:00.362944 6919 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0313 13:58:00.363141 6919 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0313 13:58:00.363206 6919 ovnkube.go:599] Stopped ovnkube\\\\nI0313 13:58:00.363241 6919 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0313 13:58:00.363329 6919 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\
\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779
990d9786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qqqs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:02Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.549477 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58657703efd5d66740cb17c4bde23f3d5377f197fe751ded7d1b0674b2aac5a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\",\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, 
/tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\
\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:02Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.566718 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:02Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.578630 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a81ec4e1d92c8ebd60a1e31e39fc5a639e7561e63cdd9ec99ada92fe3896522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8568b0d4122e606a851ae23a97395b29784dd291
38349c86f59191196e7b0f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8k6xj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:02Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.591220 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:02Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.602712 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T13:58:02Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.611621 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869-metrics-certs\") pod \"network-metrics-daemon-fwrwc\" (UID: \"9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869\") " pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:02 crc kubenswrapper[4898]: E0313 13:58:02.611877 4898 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 13:58:02 crc kubenswrapper[4898]: E0313 13:58:02.612017 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869-metrics-certs podName:9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869 nodeName:}" failed. No retries permitted until 2026-03-13 13:58:03.611999929 +0000 UTC m=+118.613588168 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869-metrics-certs") pod "network-metrics-daemon-fwrwc" (UID: "9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.616103 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpbhb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0427af73-3ee1-4f8b-aa31-915d8ff53e94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd8b394eb4b9a2d5b8a55dfe9073e48789bb72fb2dab87f76f41c48c1714feab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"st
artedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpbhb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:02Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.620087 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.620149 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.620168 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.620193 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.620212 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:02Z","lastTransitionTime":"2026-03-13T13:58:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.625970 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b46ld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1f79182-c06d-47d7-bed8-109c0cc4784e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa39cb1a5f792a575519b3616cff170f5e303a1bf05b207578725aa1711117b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},
{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b46ld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:02Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.637235 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wh2lt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73b8846971b589f5619b239746bd2f5953d12af7f3fa6543042da89561930dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjttd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://421d50fcd0a69c2b53067ae09bbea100b5321
74c1d76f641c79c58a5fa3f9a3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjttd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:58:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wh2lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:02Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.651813 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fwrwc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9k7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9k7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fwrwc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:02Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:02 crc 
kubenswrapper[4898]: I0313 13:58:02.664695 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:02Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.678883 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e527967a-003e-4dbe-aade-d9f882239cb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e0093b9e7670d289e34bc225cf1650906fdaf57e7d3c83ab8897fd0eed7204\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae3d8
e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5qb65\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:02Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.706565 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc4f3def-518e-420d-95b2-e9d7f3caf3b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088f42f6f19b31579419481228562c142aa1480795b6ffa4bd1cba05cb35cb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2720007bcc3ac54674e20042cdbbe64be6cd24b9556c841e609d541065f883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d062f36316872a1ce1d211650b661771b3ef7ff25e420a9db22c3191160fa9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://157541b623c7dc0ee
b157191f93fb07af61fb66891ac3231f90f14690179cb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffea82d1334de10deacc44ca75008d2dc4f59ca58355bd166f2034d24a0f5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:02Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.723169 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.723211 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.723222 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.723240 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.723257 4898 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:02Z","lastTransitionTime":"2026-03-13T13:58:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.727299 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:02Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.825511 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.825570 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.825590 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.825616 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.825635 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:02Z","lastTransitionTime":"2026-03-13T13:58:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.928692 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.928755 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.928774 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.928797 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.928811 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:02Z","lastTransitionTime":"2026-03-13T13:58:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.031876 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.031979 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.032004 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.032035 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.032058 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:03Z","lastTransitionTime":"2026-03-13T13:58:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.134452 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.134516 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.134533 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.134556 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.134575 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:03Z","lastTransitionTime":"2026-03-13T13:58:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.237660 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.237718 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.237730 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.237750 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.237763 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:03Z","lastTransitionTime":"2026-03-13T13:58:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.340143 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.340183 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.340192 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.340205 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.340214 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:03Z","lastTransitionTime":"2026-03-13T13:58:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.443072 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.443136 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.443147 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.443164 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.443176 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:03Z","lastTransitionTime":"2026-03-13T13:58:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.545791 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.545823 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.545832 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.545847 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.545856 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:03Z","lastTransitionTime":"2026-03-13T13:58:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.630421 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869-metrics-certs\") pod \"network-metrics-daemon-fwrwc\" (UID: \"9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869\") " pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:03 crc kubenswrapper[4898]: E0313 13:58:03.630697 4898 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 13:58:03 crc kubenswrapper[4898]: E0313 13:58:03.630822 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869-metrics-certs podName:9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869 nodeName:}" failed. No retries permitted until 2026-03-13 13:58:05.630756901 +0000 UTC m=+120.632345170 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869-metrics-certs") pod "network-metrics-daemon-fwrwc" (UID: "9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.649176 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.649267 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.649290 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.649324 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.649348 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:03Z","lastTransitionTime":"2026-03-13T13:58:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.738837 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:58:03 crc kubenswrapper[4898]: E0313 13:58:03.739116 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.739176 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.739263 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.739262 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:58:03 crc kubenswrapper[4898]: E0313 13:58:03.741203 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:58:03 crc kubenswrapper[4898]: E0313 13:58:03.741290 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:58:03 crc kubenswrapper[4898]: E0313 13:58:03.741787 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.753277 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.753327 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.753344 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.753368 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.753385 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:03Z","lastTransitionTime":"2026-03-13T13:58:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.754286 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.856651 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.857026 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.857142 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.857229 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.857300 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:03Z","lastTransitionTime":"2026-03-13T13:58:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.960737 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.961098 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.961231 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.961404 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.961602 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:03Z","lastTransitionTime":"2026-03-13T13:58:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.064889 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.065218 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.065306 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.065398 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.065476 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:04Z","lastTransitionTime":"2026-03-13T13:58:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.169099 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.169165 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.169179 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.169206 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.169227 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:04Z","lastTransitionTime":"2026-03-13T13:58:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.272130 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.272167 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.272183 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.272204 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.272219 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:04Z","lastTransitionTime":"2026-03-13T13:58:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.376725 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.376757 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.376765 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.376781 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.376791 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:04Z","lastTransitionTime":"2026-03-13T13:58:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.479684 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.480040 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.480207 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.480302 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.480381 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:04Z","lastTransitionTime":"2026-03-13T13:58:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.584214 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.584552 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.584573 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.584599 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.584617 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:04Z","lastTransitionTime":"2026-03-13T13:58:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.646002 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.646068 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.646085 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.646110 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.646127 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:04Z","lastTransitionTime":"2026-03-13T13:58:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:04 crc kubenswrapper[4898]: E0313 13:58:04.667867 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:04Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.673656 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.673726 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.673739 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.673772 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.673827 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:04Z","lastTransitionTime":"2026-03-13T13:58:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:04 crc kubenswrapper[4898]: E0313 13:58:04.694138 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:04Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.699910 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.699972 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.699984 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.700005 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.700018 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:04Z","lastTransitionTime":"2026-03-13T13:58:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:04 crc kubenswrapper[4898]: E0313 13:58:04.720979 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:04Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.727293 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.727343 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.727357 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.727377 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.727394 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:04Z","lastTransitionTime":"2026-03-13T13:58:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:04 crc kubenswrapper[4898]: E0313 13:58:04.745063 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:04Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.749914 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.749969 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.749984 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.750007 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.750020 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:04Z","lastTransitionTime":"2026-03-13T13:58:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:04 crc kubenswrapper[4898]: E0313 13:58:04.765793 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:04Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:04 crc kubenswrapper[4898]: E0313 13:58:04.766053 4898 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.768675 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.768741 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.768757 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.768784 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.768804 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:04Z","lastTransitionTime":"2026-03-13T13:58:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.871555 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.871621 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.871633 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.871653 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.871666 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:04Z","lastTransitionTime":"2026-03-13T13:58:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.977989 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.978055 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.978085 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.978115 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.978136 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:04Z","lastTransitionTime":"2026-03-13T13:58:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.081375 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.081477 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.081501 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.081534 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.081559 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:05Z","lastTransitionTime":"2026-03-13T13:58:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.184330 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.184420 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.184439 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.184465 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.184484 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:05Z","lastTransitionTime":"2026-03-13T13:58:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.288328 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.288390 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.288403 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.288422 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.288435 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:05Z","lastTransitionTime":"2026-03-13T13:58:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.391879 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.391968 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.391989 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.392016 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.392034 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:05Z","lastTransitionTime":"2026-03-13T13:58:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.494429 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.494456 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.494466 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.494483 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.494496 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:05Z","lastTransitionTime":"2026-03-13T13:58:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.597554 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.597618 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.597637 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.597665 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.597684 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:05Z","lastTransitionTime":"2026-03-13T13:58:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.654474 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869-metrics-certs\") pod \"network-metrics-daemon-fwrwc\" (UID: \"9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869\") " pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:05 crc kubenswrapper[4898]: E0313 13:58:05.654675 4898 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 13:58:05 crc kubenswrapper[4898]: E0313 13:58:05.654770 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869-metrics-certs podName:9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869 nodeName:}" failed. No retries permitted until 2026-03-13 13:58:09.654752497 +0000 UTC m=+124.656340746 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869-metrics-certs") pod "network-metrics-daemon-fwrwc" (UID: "9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 13:58:05 crc kubenswrapper[4898]: E0313 13:58:05.698129 4898 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.739332 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.739449 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.739800 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:58:05 crc kubenswrapper[4898]: E0313 13:58:05.739952 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.740083 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:58:05 crc kubenswrapper[4898]: E0313 13:58:05.740261 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:58:05 crc kubenswrapper[4898]: E0313 13:58:05.740462 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:58:05 crc kubenswrapper[4898]: E0313 13:58:05.740984 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.759857 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:05Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.775887 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T13:58:05Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.791613 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpbhb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0427af73-3ee1-4f8b-aa31-915d8ff53e94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd8b394eb4b9a2d5b8a55dfe9073e48789bb72fb2dab87f76f41c48c1714feab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-ndhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpbhb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:05Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.805805 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a81ec4e1d92c8ebd60a1e31e39fc5a
639e7561e63cdd9ec99ada92fe3896522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8568b0d4122e606a851ae23a97395b29784dd29138349c86f59191196e7b0f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
6-03-13T13:57:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8k6xj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:05Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.827687 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mo
untPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:05Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.853608 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e527967a-003e-4dbe-aade-d9f882239cb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e0093b9e7670d289e34bc225cf1650906fdaf57e7d3c83ab8897fd0eed7204\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae3d8
e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5qb65\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:05Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:05 crc kubenswrapper[4898]: E0313 13:58:05.867793 4898 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.874451 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b46ld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1f79182-c06d-47d7-bed8-109c0cc4784e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa39cb1a5f792a575519b3616cff170f5e303a1bf05b207578725aa1711117b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63
f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b46ld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:05Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.895565 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wh2lt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73b8846971b589f5619b239746bd2f5953d12af7f3fa6543042da89561930dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjttd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://421d50fcd0a69c2b53067ae09bbea100b5321
74c1d76f641c79c58a5fa3f9a3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjttd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:58:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wh2lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:05Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.917719 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fwrwc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9k7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9k7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fwrwc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:05Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:05 crc 
kubenswrapper[4898]: I0313 13:58:05.943771 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc4f3def-518e-420d-95b2-e9d7f3caf3b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088f42f6f19b31579419481228562c142aa1480795b6ffa4bd1cba05cb35cb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://ad2720007bcc3ac54674e20042cdbbe64be6cd24b9556c841e609d541065f883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d062f36316872a1ce1d211650b661771b3ef7ff25e420a9db22c3191160fa9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://157541b623c7dc0eeb157191f93fb07af61fb66891ac3231f90f14690179cb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffea82d1334de10deacc44ca75008d2dc4f59ca58355bd166f2034d24a0f5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:05Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.957210 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3a9fb7c-9705-43b2-a2d8-663d19d50cda\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0be461eabdbb6c2414f8a9805c9537cea8595a509f241a72015802159baaa9b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://612d879ff268a2130e16ebb42a2e402a0e2d7ef04248322761b873bc6fe026c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f7a5f6c4c1f3ccd5b10a77ca72055844d63966f39e14fbba46d8b6074f4d7da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8772cfcf785db35cec05c4acb35d750d63feee0a1b192a79fd2f9d3f9b33c433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://8772cfcf785db35cec05c4acb35d750d63feee0a1b192a79fd2f9d3f9b33c433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:05Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.973648 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:05Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.988169 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58657703efd5d66740cb17c4bde23f3d5377f197fe751ded7d1b0674b2aac5a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\",\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, /tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:05Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:06 crc kubenswrapper[4898]: I0313 13:58:06.003497 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:06Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:06 crc kubenswrapper[4898]: I0313 13:58:06.018160 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:06Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:06 crc kubenswrapper[4898]: I0313 13:58:06.033385 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6llfs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e521c857-9711-4f68-886f-38b233d7b05b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de3eb6944d1978b4471d9968724d0312047079b10ae547df513246554e8a705f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzgmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6llfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:06Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:06 crc kubenswrapper[4898]: I0313 13:58:06.054256 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14175f8dff64bc803d0edb3738c3c781867aa64407c7b7b8be708ab08e220d4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://119d07b211c82506e4dbdf9d44df46ea5c76a0b52e5a4c5effe24306c70dac7e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T13:57:59Z\\\",\\\"message\\\":\\\"tes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0313 13:57:59.029739 6719 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0313 13:57:59.030788 6719 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0313 13:57:59.030840 6719 handler.go:190] Sending *v1.Node event 
handler 2 for removal\\\\nI0313 13:57:59.030855 6719 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0313 13:57:59.030975 6719 factory.go:656] Stopping watch factory\\\\nI0313 13:57:59.031099 6719 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0313 13:57:59.031118 6719 handler.go:208] Removed *v1.Node event handler 7\\\\nI0313 13:57:59.031129 6719 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0313 13:57:59.031126 6719 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0313 13:57:59.031137 6719 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 13:57:59.031251 6719 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 13:57:59.031254 6719 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14175f8dff64bc803d0edb3738c3c781867aa64407c7b7b8be708ab08e220d4f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T13:58:00Z\\\",\\\"message\\\":\\\"ection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.182:9099:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61d39e4d-21a9-4387-9a2b-fa4ad14792e2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0313 13:58:00.362727 6919 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-image-registry/image-registry\\\\\\\"}\\\\nI0313 
13:58:00.362778 6919 services_controller.go:360] Finished syncing service image-registry on namespace openshift-image-registry for network=default : 2.065295ms\\\\nI0313 13:58:00.362859 6919 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-cluster-version/cluster-version-operator\\\\\\\"}\\\\nI0313 13:58:00.362895 6919 services_controller.go:360] Finished syncing service cluster-version-operator on namespace openshift-cluster-version for network=default : 2.427204ms\\\\nI0313 13:58:00.362944 6919 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0313 13:58:00.363141 6919 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0313 13:58:00.363206 6919 ovnkube.go:599] Stopped ovnkube\\\\nI0313 13:58:00.363241 6919 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0313 13:58:00.363329 6919 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"
mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountP
ath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qqqs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:06Z is after 2025-08-24T17:21:41Z" Mar 13 
13:58:06 crc kubenswrapper[4898]: I0313 13:58:06.401848 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 13:58:06 crc kubenswrapper[4898]: I0313 13:58:06.411200 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b46ld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1f79182-c06d-47d7-bed8-109c0cc4784e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa39cb1a5f792a575519b3616cff170f5e303a1bf05b207578725aa1711117b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":
\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b46ld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:06Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:06 crc kubenswrapper[4898]: I0313 13:58:06.420202 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wh2lt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73b8846971b589f5619b239746bd2f5953d12af7f3fa6543042da89561930dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjttd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://421d50fcd0a69c2b53067ae09bbea100b5321
74c1d76f641c79c58a5fa3f9a3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjttd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:58:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wh2lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:06Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:06 crc kubenswrapper[4898]: I0313 13:58:06.429028 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fwrwc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9k7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9k7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fwrwc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:06Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:06 crc 
kubenswrapper[4898]: I0313 13:58:06.439543 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:06Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:06 crc kubenswrapper[4898]: I0313 13:58:06.452623 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e527967a-003e-4dbe-aade-d9f882239cb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e0093b9e7670d289e34bc225cf1650906fdaf57e7d3c83ab8897fd0eed7204\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae3d8
e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5qb65\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:06Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:06 crc kubenswrapper[4898]: I0313 13:58:06.484298 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc4f3def-518e-420d-95b2-e9d7f3caf3b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088f42f6f19b31579419481228562c142aa1480795b6ffa4bd1cba05cb35cb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2720007bcc3ac54674e20042cdbbe64be6cd24b9556c841e609d541065f883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d062f36316872a1ce1d211650b661771b3ef7ff25e420a9db22c3191160fa9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://157541b623c7dc0ee
b157191f93fb07af61fb66891ac3231f90f14690179cb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffea82d1334de10deacc44ca75008d2dc4f59ca58355bd166f2034d24a0f5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:06Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:06 crc kubenswrapper[4898]: I0313 13:58:06.496724 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3a9fb7c-9705-43b2-a2d8-663d19d50cda\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0be461eabdbb6c2414f8a9805c9537cea8595a509f241a72015802159baaa9b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://612d879ff268a2130e16ebb42a2e402a0e2d7ef04248322761b873bc6fe026c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f7a5f6c4c1f3ccd5b10a77ca72055844d63966f39e14fbba46d8b6074f4d7da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8772cfcf785db35cec05c4acb35d750d63feee0a1b192a79fd2f9d3f9b33c433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://8772cfcf785db35cec05c4acb35d750d63feee0a1b192a79fd2f9d3f9b33c433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:06Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:06 crc kubenswrapper[4898]: I0313 13:58:06.512620 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:06Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:06 crc kubenswrapper[4898]: I0313 13:58:06.523111 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:06Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:06 crc kubenswrapper[4898]: I0313 13:58:06.536926 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6llfs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e521c857-9711-4f68-886f-38b233d7b05b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de3eb6944d1978b4471d9968724d0312047079b10ae547df513246554e8a705f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzgmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6llfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:06Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:06 crc kubenswrapper[4898]: I0313 13:58:06.558093 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14175f8dff64bc803d0edb3738c3c781867aa64407c7b7b8be708ab08e220d4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://119d07b211c82506e4dbdf9d44df46ea5c76a0b52e5a4c5effe24306c70dac7e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T13:57:59Z\\\",\\\"message\\\":\\\"tes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0313 13:57:59.029739 6719 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0313 13:57:59.030788 6719 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0313 13:57:59.030840 6719 handler.go:190] Sending *v1.Node event 
handler 2 for removal\\\\nI0313 13:57:59.030855 6719 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0313 13:57:59.030975 6719 factory.go:656] Stopping watch factory\\\\nI0313 13:57:59.031099 6719 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0313 13:57:59.031118 6719 handler.go:208] Removed *v1.Node event handler 7\\\\nI0313 13:57:59.031129 6719 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0313 13:57:59.031126 6719 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0313 13:57:59.031137 6719 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 13:57:59.031251 6719 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 13:57:59.031254 6719 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14175f8dff64bc803d0edb3738c3c781867aa64407c7b7b8be708ab08e220d4f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T13:58:00Z\\\",\\\"message\\\":\\\"ection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.182:9099:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61d39e4d-21a9-4387-9a2b-fa4ad14792e2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0313 13:58:00.362727 6919 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-image-registry/image-registry\\\\\\\"}\\\\nI0313 
13:58:00.362778 6919 services_controller.go:360] Finished syncing service image-registry on namespace openshift-image-registry for network=default : 2.065295ms\\\\nI0313 13:58:00.362859 6919 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-cluster-version/cluster-version-operator\\\\\\\"}\\\\nI0313 13:58:00.362895 6919 services_controller.go:360] Finished syncing service cluster-version-operator on namespace openshift-cluster-version for network=default : 2.427204ms\\\\nI0313 13:58:00.362944 6919 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0313 13:58:00.363141 6919 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0313 13:58:00.363206 6919 ovnkube.go:599] Stopped ovnkube\\\\nI0313 13:58:00.363241 6919 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0313 13:58:00.363329 6919 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"
mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountP
ath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qqqs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:06Z is after 2025-08-24T17:21:41Z" Mar 13 
13:58:06 crc kubenswrapper[4898]: I0313 13:58:06.571388 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4
8e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58657703efd5d66740cb17c4bde23f3d5377f197fe751ded7d1b0674b2aac5a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastStat
e\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\",\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, /tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:06Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:06 crc kubenswrapper[4898]: I0313 13:58:06.584072 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:06Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:06 crc kubenswrapper[4898]: I0313 13:58:06.595087 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a81ec4e1d92c8ebd60a1e31e39fc5a639e7561e63cdd9ec99ada92fe3896522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8568b0d4122e606a851ae23a97395b29784dd291
38349c86f59191196e7b0f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8k6xj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:06Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:06 crc kubenswrapper[4898]: I0313 13:58:06.606911 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:06Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:06 crc kubenswrapper[4898]: I0313 13:58:06.618441 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T13:58:06Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:06 crc kubenswrapper[4898]: I0313 13:58:06.629252 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpbhb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0427af73-3ee1-4f8b-aa31-915d8ff53e94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd8b394eb4b9a2d5b8a55dfe9073e48789bb72fb2dab87f76f41c48c1714feab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-ndhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpbhb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:06Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:07 crc kubenswrapper[4898]: I0313 13:58:07.739098 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:58:07 crc kubenswrapper[4898]: I0313 13:58:07.739154 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:58:07 crc kubenswrapper[4898]: I0313 13:58:07.739209 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:58:07 crc kubenswrapper[4898]: E0313 13:58:07.739375 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:58:07 crc kubenswrapper[4898]: I0313 13:58:07.739417 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:07 crc kubenswrapper[4898]: E0313 13:58:07.739546 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:58:07 crc kubenswrapper[4898]: E0313 13:58:07.739637 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:58:07 crc kubenswrapper[4898]: E0313 13:58:07.739961 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:58:09 crc kubenswrapper[4898]: I0313 13:58:09.703152 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869-metrics-certs\") pod \"network-metrics-daemon-fwrwc\" (UID: \"9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869\") " pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:09 crc kubenswrapper[4898]: E0313 13:58:09.703391 4898 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 13:58:09 crc kubenswrapper[4898]: E0313 13:58:09.703495 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869-metrics-certs podName:9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869 nodeName:}" failed. No retries permitted until 2026-03-13 13:58:17.703469547 +0000 UTC m=+132.705057826 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869-metrics-certs") pod "network-metrics-daemon-fwrwc" (UID: "9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 13:58:09 crc kubenswrapper[4898]: I0313 13:58:09.739092 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:09 crc kubenswrapper[4898]: I0313 13:58:09.739167 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:58:09 crc kubenswrapper[4898]: I0313 13:58:09.739175 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:58:09 crc kubenswrapper[4898]: I0313 13:58:09.739123 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:58:09 crc kubenswrapper[4898]: E0313 13:58:09.739338 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:58:09 crc kubenswrapper[4898]: E0313 13:58:09.739449 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:58:09 crc kubenswrapper[4898]: E0313 13:58:09.739736 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:58:09 crc kubenswrapper[4898]: E0313 13:58:09.739837 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:58:10 crc kubenswrapper[4898]: E0313 13:58:10.869944 4898 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 13:58:11 crc kubenswrapper[4898]: I0313 13:58:11.622638 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:58:11 crc kubenswrapper[4898]: E0313 13:58:11.622952 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:58:43.622872765 +0000 UTC m=+158.624461044 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:58:11 crc kubenswrapper[4898]: I0313 13:58:11.723695 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:58:11 crc kubenswrapper[4898]: I0313 13:58:11.723768 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:58:11 crc kubenswrapper[4898]: I0313 13:58:11.723813 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:58:11 crc kubenswrapper[4898]: I0313 13:58:11.723855 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:58:11 crc kubenswrapper[4898]: E0313 13:58:11.723984 4898 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 13:58:11 crc kubenswrapper[4898]: E0313 13:58:11.724082 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 13:58:43.724062076 +0000 UTC m=+158.725650325 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 13:58:11 crc kubenswrapper[4898]: E0313 13:58:11.723983 4898 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 13:58:11 crc kubenswrapper[4898]: E0313 13:58:11.724119 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 13:58:11 crc kubenswrapper[4898]: E0313 13:58:11.724157 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 13:58:11 crc kubenswrapper[4898]: E0313 13:58:11.724176 4898 projected.go:194] 
Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 13:58:11 crc kubenswrapper[4898]: E0313 13:58:11.724130 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 13:58:43.724122128 +0000 UTC m=+158.725710377 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 13:58:11 crc kubenswrapper[4898]: E0313 13:58:11.723997 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 13:58:11 crc kubenswrapper[4898]: E0313 13:58:11.724293 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-13 13:58:43.724270221 +0000 UTC m=+158.725858490 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 13:58:11 crc kubenswrapper[4898]: E0313 13:58:11.724320 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 13:58:11 crc kubenswrapper[4898]: E0313 13:58:11.724340 4898 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 13:58:11 crc kubenswrapper[4898]: E0313 13:58:11.724395 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-13 13:58:43.724381314 +0000 UTC m=+158.725969653 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 13:58:11 crc kubenswrapper[4898]: I0313 13:58:11.739575 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:11 crc kubenswrapper[4898]: I0313 13:58:11.739614 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:58:11 crc kubenswrapper[4898]: I0313 13:58:11.739575 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:58:11 crc kubenswrapper[4898]: E0313 13:58:11.739739 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:58:11 crc kubenswrapper[4898]: I0313 13:58:11.739759 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:58:11 crc kubenswrapper[4898]: E0313 13:58:11.739945 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:58:11 crc kubenswrapper[4898]: E0313 13:58:11.740074 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:58:11 crc kubenswrapper[4898]: E0313 13:58:11.740140 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:58:13 crc kubenswrapper[4898]: I0313 13:58:13.739224 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:58:13 crc kubenswrapper[4898]: I0313 13:58:13.739225 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:13 crc kubenswrapper[4898]: E0313 13:58:13.740222 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:58:13 crc kubenswrapper[4898]: I0313 13:58:13.739518 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:58:13 crc kubenswrapper[4898]: I0313 13:58:13.739298 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:58:13 crc kubenswrapper[4898]: E0313 13:58:13.740370 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:58:13 crc kubenswrapper[4898]: E0313 13:58:13.740306 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:58:13 crc kubenswrapper[4898]: E0313 13:58:13.740561 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:58:13 crc kubenswrapper[4898]: I0313 13:58:13.742708 4898 scope.go:117] "RemoveContainer" containerID="14175f8dff64bc803d0edb3738c3c781867aa64407c7b7b8be708ab08e220d4f" Mar 13 13:58:13 crc kubenswrapper[4898]: I0313 13:58:13.777219 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc4f3def-518e-420d-95b2-e9d7f3caf3b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088f42f6f19b31579419481228562c142aa1480795b6ffa4bd1cba05cb35cb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kuber
netes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2720007bcc3ac54674e20042cdbbe64be6cd24b9556c841e609d541065f883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d062f36316872a1ce1d211650b661771b3ef7ff25e420a9db22c3191160fa9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15
7541b623c7dc0eeb157191f93fb07af61fb66891ac3231f90f14690179cb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffea82d1334de10deacc44ca75008d2dc4f59ca58355bd166f2034d24a0f5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441
ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fal
se,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:13Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:13 crc kubenswrapper[4898]: I0313 13:58:13.795783 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3a9fb7c-9705-43b2-a2d8-663d19d50cda\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0be461eabdbb6c2414f8a9805c9537cea8595a509f241a72015802159baaa9b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://612d879ff268a2130e16ebb42a2e402a0e2d7ef04248322761b873bc6fe026c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f7a5f6c4c1f3ccd5b10a77ca72055844d63966f39e14fbba46d8b6074f4d7da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8772cfcf785db35cec05c4acb35d750d63feee0a1b192a79fd2f9d3f9b33c433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://8772cfcf785db35cec05c4acb35d750d63feee0a1b192a79fd2f9d3f9b33c433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:13Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:13 crc kubenswrapper[4898]: I0313 13:58:13.817400 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:13Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:13 crc kubenswrapper[4898]: I0313 13:58:13.831424 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:13Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:13 crc kubenswrapper[4898]: I0313 13:58:13.846943 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6llfs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e521c857-9711-4f68-886f-38b233d7b05b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de3eb6944d1978b4471d9968724d0312047079b10ae547df513246554e8a705f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzgmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6llfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:13Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:13 crc kubenswrapper[4898]: I0313 13:58:13.881343 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14175f8dff64bc803d0edb3738c3c781867aa64407c7b7b8be708ab08e220d4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14175f8dff64bc803d0edb3738c3c781867aa64407c7b7b8be708ab08e220d4f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T13:58:00Z\\\",\\\"message\\\":\\\"ection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.182:9099:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61d39e4d-21a9-4387-9a2b-fa4ad14792e2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0313 13:58:00.362727 6919 
loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-image-registry/image-registry\\\\\\\"}\\\\nI0313 13:58:00.362778 6919 services_controller.go:360] Finished syncing service image-registry on namespace openshift-image-registry for network=default : 2.065295ms\\\\nI0313 13:58:00.362859 6919 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-cluster-version/cluster-version-operator\\\\\\\"}\\\\nI0313 13:58:00.362895 6919 services_controller.go:360] Finished syncing service cluster-version-operator on namespace openshift-cluster-version for network=default : 2.427204ms\\\\nI0313 13:58:00.362944 6919 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0313 13:58:00.363141 6919 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0313 13:58:00.363206 6919 ovnkube.go:599] Stopped ovnkube\\\\nI0313 13:58:00.363241 6919 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0313 13:58:00.363329 6919 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qqqs5_openshift-ovn-kubernetes(e7d6afc0-d9b5-41b2-a55f-57621c300cbb)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c
0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qqqs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:13Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:13 crc kubenswrapper[4898]: I0313 13:58:13.904981 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58657703efd5d66740cb17c4bde23f3d5377f197fe751ded7d1b0674b2aac5a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\"
,\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, /tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:13Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:13 crc kubenswrapper[4898]: I0313 13:58:13.919605 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:13Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:13 crc kubenswrapper[4898]: I0313 13:58:13.933709 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a81ec4e1d92c8ebd60a1e31e39fc5a639e7561e63cdd9ec99ada92fe3896522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8568b0d4122e606a851ae23a97395b29784dd291
38349c86f59191196e7b0f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8k6xj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:13Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:13 crc kubenswrapper[4898]: I0313 13:58:13.951698 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:13Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:13 crc kubenswrapper[4898]: I0313 13:58:13.970189 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T13:58:13Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:13 crc kubenswrapper[4898]: I0313 13:58:13.985634 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpbhb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0427af73-3ee1-4f8b-aa31-915d8ff53e94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd8b394eb4b9a2d5b8a55dfe9073e48789bb72fb2dab87f76f41c48c1714feab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-ndhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpbhb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:13Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:13 crc kubenswrapper[4898]: I0313 13:58:13.997511 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b46ld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1f79182-c06d-47d7-bed8-109c0cc4784e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa39cb1a5f792a575519b3616cff170f5e303a1bf05b207578725
aa1711117b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b46ld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:13Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.008640 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wh2lt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73b8846971b589f5619b239746bd2f5953d12af7f3fa6543042da89561930dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjttd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://421d50fcd0a69c2b53067ae09bbea100b5321
74c1d76f641c79c58a5fa3f9a3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjttd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:58:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wh2lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:14Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.019378 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fwrwc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9k7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9k7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fwrwc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:14Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:14 crc 
kubenswrapper[4898]: I0313 13:58:14.032006 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:14Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.045828 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e527967a-003e-4dbe-aade-d9f882239cb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e0093b9e7670d289e34bc225cf1650906fdaf57e7d3c83ab8897fd0eed7204\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae3d8
e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5qb65\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:14Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.515748 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qqqs5_e7d6afc0-d9b5-41b2-a55f-57621c300cbb/ovnkube-controller/1.log" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.518973 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" event={"ID":"e7d6afc0-d9b5-41b2-a55f-57621c300cbb","Type":"ContainerStarted","Data":"c8122355233f3be13689264ea41f8ede2b7e4d07e88bd6a2a2d5d6fd1a166bcc"} Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.519507 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.537588 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:14Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.552418 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T13:58:14Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.565243 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpbhb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0427af73-3ee1-4f8b-aa31-915d8ff53e94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd8b394eb4b9a2d5b8a55dfe9073e48789bb72fb2dab87f76f41c48c1714feab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-ndhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpbhb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:14Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.577456 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a81ec4e1d92c8ebd60a1e31e39fc5a
639e7561e63cdd9ec99ada92fe3896522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8568b0d4122e606a851ae23a97395b29784dd29138349c86f59191196e7b0f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
6-03-13T13:57:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8k6xj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:14Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.592482 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mo
untPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:14Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.612026 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e527967a-003e-4dbe-aade-d9f882239cb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e0093b9e7670d289e34bc225cf1650906fdaf57e7d3c83ab8897fd0eed7204\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae3d8
e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5qb65\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:14Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.625351 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b46ld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1f79182-c06d-47d7-bed8-109c0cc4784e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa39cb1a5f792a575519b3616cff170f5e303a1bf05b207578725aa1711117b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-13T13:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b46ld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:14Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.640388 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wh2lt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73b8846971b589f5619b239746bd2f5953d12af7f3fa6543042da89561930dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjttd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://421d50fcd0a69c2b53067ae09bbea100b5321
74c1d76f641c79c58a5fa3f9a3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjttd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:58:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wh2lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:14Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.655560 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fwrwc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9k7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9k7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fwrwc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:14Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:14 crc 
kubenswrapper[4898]: I0313 13:58:14.681297 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc4f3def-518e-420d-95b2-e9d7f3caf3b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088f42f6f19b31579419481228562c142aa1480795b6ffa4bd1cba05cb35cb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://ad2720007bcc3ac54674e20042cdbbe64be6cd24b9556c841e609d541065f883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d062f36316872a1ce1d211650b661771b3ef7ff25e420a9db22c3191160fa9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://157541b623c7dc0eeb157191f93fb07af61fb66891ac3231f90f14690179cb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffea82d1334de10deacc44ca75008d2dc4f59ca58355bd166f2034d24a0f5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:14Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.696083 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3a9fb7c-9705-43b2-a2d8-663d19d50cda\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0be461eabdbb6c2414f8a9805c9537cea8595a509f241a72015802159baaa9b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://612d879ff268a2130e16ebb42a2e402a0e2d7ef04248322761b873bc6fe026c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f7a5f6c4c1f3ccd5b10a77ca72055844d63966f39e14fbba46d8b6074f4d7da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8772cfcf785db35cec05c4acb35d750d63feee0a1b192a79fd2f9d3f9b33c433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://8772cfcf785db35cec05c4acb35d750d63feee0a1b192a79fd2f9d3f9b33c433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:14Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.709542 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:14Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.726944 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\
\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58657703efd5d66740cb17c4bde23f3d5377f197fe751ded7d1b0674b2aac5a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-c
rc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\",\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, /tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:14Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.741792 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:14Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.760128 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:14Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.782193 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6llfs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e521c857-9711-4f68-886f-38b233d7b05b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de3eb6944d1978b4471d9968724d0312047079b10ae547df513246554e8a705f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzgmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6llfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:14Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.800889 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8122355233f3be13689264ea41f8ede2b7e4d07e88bd6a2a2d5d6fd1a166bcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14175f8dff64bc803d0edb3738c3c781867aa64407c7b7b8be708ab08e220d4f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T13:58:00Z\\\",\\\"message\\\":\\\"ection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.182:9099:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61d39e4d-21a9-4387-9a2b-fa4ad14792e2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0313 13:58:00.362727 6919 
loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-image-registry/image-registry\\\\\\\"}\\\\nI0313 13:58:00.362778 6919 services_controller.go:360] Finished syncing service image-registry on namespace openshift-image-registry for network=default : 2.065295ms\\\\nI0313 13:58:00.362859 6919 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-cluster-version/cluster-version-operator\\\\\\\"}\\\\nI0313 13:58:00.362895 6919 services_controller.go:360] Finished syncing service cluster-version-operator on namespace openshift-cluster-version for network=default : 2.427204ms\\\\nI0313 13:58:00.362944 6919 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0313 13:58:00.363141 6919 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0313 13:58:00.363206 6919 ovnkube.go:599] Stopped ovnkube\\\\nI0313 13:58:00.363241 6919 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0313 13:58:00.363329 6919 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\
\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qqqs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:14Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.879039 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.879097 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.879113 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.879133 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.879147 4898 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:14Z","lastTransitionTime":"2026-03-13T13:58:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:58:14 crc kubenswrapper[4898]: E0313 13:58:14.893054 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:14Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.898689 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.898739 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.898758 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.898782 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.898800 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:14Z","lastTransitionTime":"2026-03-13T13:58:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:14 crc kubenswrapper[4898]: E0313 13:58:14.919759 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:14Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.925006 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.925058 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.925072 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.925092 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.925106 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:14Z","lastTransitionTime":"2026-03-13T13:58:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:14 crc kubenswrapper[4898]: E0313 13:58:14.939423 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:14Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.943191 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.943235 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.943247 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.943264 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.943278 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:14Z","lastTransitionTime":"2026-03-13T13:58:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:14 crc kubenswrapper[4898]: E0313 13:58:14.957210 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:14Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.961613 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.961646 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.961655 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.961671 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.961682 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:14Z","lastTransitionTime":"2026-03-13T13:58:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:14 crc kubenswrapper[4898]: E0313 13:58:14.975607 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:14Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:14 crc kubenswrapper[4898]: E0313 13:58:14.975834 4898 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 13:58:15 crc kubenswrapper[4898]: I0313 13:58:15.525318 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qqqs5_e7d6afc0-d9b5-41b2-a55f-57621c300cbb/ovnkube-controller/2.log" Mar 13 13:58:15 crc kubenswrapper[4898]: I0313 13:58:15.527092 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qqqs5_e7d6afc0-d9b5-41b2-a55f-57621c300cbb/ovnkube-controller/1.log" Mar 13 13:58:15 crc kubenswrapper[4898]: I0313 13:58:15.531583 4898 generic.go:334] "Generic (PLEG): container finished" podID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerID="c8122355233f3be13689264ea41f8ede2b7e4d07e88bd6a2a2d5d6fd1a166bcc" exitCode=1 Mar 13 13:58:15 crc kubenswrapper[4898]: I0313 13:58:15.531631 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" event={"ID":"e7d6afc0-d9b5-41b2-a55f-57621c300cbb","Type":"ContainerDied","Data":"c8122355233f3be13689264ea41f8ede2b7e4d07e88bd6a2a2d5d6fd1a166bcc"} Mar 13 13:58:15 crc kubenswrapper[4898]: I0313 13:58:15.531670 4898 scope.go:117] "RemoveContainer" containerID="14175f8dff64bc803d0edb3738c3c781867aa64407c7b7b8be708ab08e220d4f" Mar 13 13:58:15 crc kubenswrapper[4898]: I0313 13:58:15.532645 4898 scope.go:117] "RemoveContainer" containerID="c8122355233f3be13689264ea41f8ede2b7e4d07e88bd6a2a2d5d6fd1a166bcc" Mar 13 13:58:15 crc kubenswrapper[4898]: E0313 13:58:15.532940 4898 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-qqqs5_openshift-ovn-kubernetes(e7d6afc0-d9b5-41b2-a55f-57621c300cbb)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" Mar 13 13:58:15 crc kubenswrapper[4898]: I0313 13:58:15.552221 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\
\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58657703efd5d66740cb17c4bde23f3d
5377f197fe751ded7d1b0674b2aac5a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\",\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, /tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:15Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:15 crc kubenswrapper[4898]: I0313 13:58:15.566840 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:15Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:15 crc kubenswrapper[4898]: I0313 13:58:15.581507 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:15Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:15 crc kubenswrapper[4898]: I0313 13:58:15.595461 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6llfs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e521c857-9711-4f68-886f-38b233d7b05b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de3eb6944d1978b4471d9968724d0312047079b10ae547df513246554e8a705f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzgmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6llfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:15Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:15 crc kubenswrapper[4898]: I0313 13:58:15.616333 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8122355233f3be13689264ea41f8ede2b7e4d07e88bd6a2a2d5d6fd1a166bcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14175f8dff64bc803d0edb3738c3c781867aa64407c7b7b8be708ab08e220d4f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T13:58:00Z\\\",\\\"message\\\":\\\"ection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.182:9099:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61d39e4d-21a9-4387-9a2b-fa4ad14792e2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0313 13:58:00.362727 6919 
loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-image-registry/image-registry\\\\\\\"}\\\\nI0313 13:58:00.362778 6919 services_controller.go:360] Finished syncing service image-registry on namespace openshift-image-registry for network=default : 2.065295ms\\\\nI0313 13:58:00.362859 6919 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-cluster-version/cluster-version-operator\\\\\\\"}\\\\nI0313 13:58:00.362895 6919 services_controller.go:360] Finished syncing service cluster-version-operator on namespace openshift-cluster-version for network=default : 2.427204ms\\\\nI0313 13:58:00.362944 6919 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0313 13:58:00.363141 6919 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0313 13:58:00.363206 6919 ovnkube.go:599] Stopped ovnkube\\\\nI0313 13:58:00.363241 6919 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0313 13:58:00.363329 6919 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8122355233f3be13689264ea41f8ede2b7e4d07e88bd6a2a2d5d6fd1a166bcc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"message\\\":\\\"Template:(*services.Template)(nil)}}}}, Templates:services.TemplateMap(nil), Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string(nil), Groups:[]string(nil)}}\\\\nI0313 13:58:14.697412 7170 services_controller.go:452] Built service openshift-console-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0313 13:58:14.697420 7170 
services_controller.go:453] Built service default/kubernetes template LB for network=default: []services.LB{}\\\\nI0313 13:58:14.697429 7170 services_controller.go:454] Service default/kubernetes for network=default has 0 cluster-wide, 1 per-node configs, 0 template configs, making 0 (cluster) 2 (per node) and 0 (template) load balancers\\\\nI0313 13:58:14.697434 7170 services_controller.go:453] Built service openshift-console-operator/metrics template LB for network=default: []services.LB{}\\\\nI0313 13:58:14.697450 7170 lb_config.go:1031] Cluster endpoints for openshift-authentication/oauth-openshift for network=default are: map[]\\\\nI0313 13:58:14.697454 7170 services_controller.go:454] Service openshift-console-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0313 13:58:14.697478 7170 services_controller.go:443] Built service openshift-authentication/oauth-openshift LB cluster-wide configs for 
netw\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\
"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d978
6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qqqs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:15Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:15 crc kubenswrapper[4898]: I0313 13:58:15.629212 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:15Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:15 crc kubenswrapper[4898]: I0313 13:58:15.642410 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T13:58:15Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:15 crc kubenswrapper[4898]: I0313 13:58:15.652328 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpbhb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0427af73-3ee1-4f8b-aa31-915d8ff53e94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd8b394eb4b9a2d5b8a55dfe9073e48789bb72fb2dab87f76f41c48c1714feab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-ndhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpbhb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:15Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:15 crc kubenswrapper[4898]: I0313 13:58:15.664122 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a81ec4e1d92c8ebd60a1e31e39fc5a
639e7561e63cdd9ec99ada92fe3896522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8568b0d4122e606a851ae23a97395b29784dd29138349c86f59191196e7b0f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
6-03-13T13:57:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8k6xj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:15Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:15 crc kubenswrapper[4898]: I0313 13:58:15.678283 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mo
untPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:15Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:15 crc kubenswrapper[4898]: I0313 13:58:15.696599 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e527967a-003e-4dbe-aade-d9f882239cb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e0093b9e7670d289e34bc225cf1650906fdaf57e7d3c83ab8897fd0eed7204\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae3d8
e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5qb65\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:15Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:15 crc kubenswrapper[4898]: I0313 13:58:15.707067 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b46ld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1f79182-c06d-47d7-bed8-109c0cc4784e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa39cb1a5f792a575519b3616cff170f5e303a1bf05b207578725aa1711117b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-13T13:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b46ld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:15Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:15 crc kubenswrapper[4898]: I0313 13:58:15.720188 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wh2lt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73b8846971b589f5619b239746bd2f5953d12af7f3fa6543042da89561930dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjttd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://421d50fcd0a69c2b53067ae09bbea100b5321
74c1d76f641c79c58a5fa3f9a3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjttd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:58:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wh2lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:15Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:15 crc kubenswrapper[4898]: I0313 13:58:15.734743 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fwrwc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9k7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9k7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fwrwc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:15Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:15 crc 
kubenswrapper[4898]: I0313 13:58:15.738472 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:58:15 crc kubenswrapper[4898]: I0313 13:58:15.738496 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:58:15 crc kubenswrapper[4898]: I0313 13:58:15.738548 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:58:15 crc kubenswrapper[4898]: I0313 13:58:15.738549 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:15 crc kubenswrapper[4898]: E0313 13:58:15.738595 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:58:15 crc kubenswrapper[4898]: E0313 13:58:15.738721 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:58:15 crc kubenswrapper[4898]: E0313 13:58:15.738811 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:58:15 crc kubenswrapper[4898]: E0313 13:58:15.738997 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:58:15 crc kubenswrapper[4898]: I0313 13:58:15.769724 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc4f3def-518e-420d-95b2-e9d7f3caf3b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088f42f6f19b31579419481228562c142aa1480795b6ffa4bd1cba05cb35cb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2720007bcc3ac54674e20042cdbbe64be6cd24b9556c841e609d541065f883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d062f36316872a1ce1d211650b661771b3ef7ff25e420a9db22c3191160fa9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://157541b623c7dc0eeb157191f93fb07af61fb66891ac3231f90f14690179cb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffea82d1334de10deacc44ca75008d2dc4f59ca58355bd166f2034d24a0f5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:15Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:15 crc kubenswrapper[4898]: I0313 13:58:15.781356 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3a9fb7c-9705-43b2-a2d8-663d19d50cda\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0be461eabdbb6c2414f8a9805c9537cea8595a509f241a72015802159baaa9b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://612d879ff268a2130e16ebb42a2e402a0e2d7ef04248322761b873bc6fe026c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f7a5f6c4c1f3ccd5b10a77ca72055844d63966f39e14fbba46d8b6074f4d7da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8772cfcf785db35cec05c4acb35d750d63feee0a1b192a79fd2f9d3f9b33c433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8772cfcf785db35cec05c4acb35d750d63feee0a1b192a79fd2f9d3f9b33c433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:15Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:15 crc kubenswrapper[4898]: I0313 13:58:15.794345 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:15Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:15 crc kubenswrapper[4898]: I0313 13:58:15.806396 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b46ld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1f79182-c06d-47d7-bed8-109c0cc4784e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa39cb1a5f792a575519b3616cff170f5e303a1bf05b207578725aa1711117b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\
",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b46ld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:15Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:15 crc kubenswrapper[4898]: I0313 13:58:15.816703 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wh2lt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73b8846971b589f5619b239746bd2f5953d12af7f3fa6543042da89561930dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjttd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://421d50fcd0a69c2b53067ae09bbea100b5321
74c1d76f641c79c58a5fa3f9a3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjttd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:58:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wh2lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:15Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:15 crc kubenswrapper[4898]: I0313 13:58:15.828552 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fwrwc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9k7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9k7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fwrwc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:15Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:15 crc 
kubenswrapper[4898]: I0313 13:58:15.841148 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:15Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:15 crc kubenswrapper[4898]: I0313 13:58:15.866393 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e527967a-003e-4dbe-aade-d9f882239cb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e0093b9e7670d289e34bc225cf1650906fdaf57e7d3c83ab8897fd0eed7204\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae3d8
e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5qb65\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:15Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:15 crc kubenswrapper[4898]: E0313 13:58:15.872448 4898 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 13:58:15 crc kubenswrapper[4898]: I0313 13:58:15.920325 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc4f3def-518e-420d-95b2-e9d7f3caf3b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088f42f6f19b31579419481228562c142aa1480795b6ffa4bd1cba05cb35cb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ec
d6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2720007bcc3ac54674e20042cdbbe64be6cd24b9556c841e609d541065f883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d062f36316872a1ce1d211650b661771b3ef7ff25e420a9db22c3191160fa9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\
\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://157541b623c7dc0eeb157191f93fb07af61fb66891ac3231f90f14690179cb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffea82d1334de10deacc44ca75008d2dc4f59ca58355bd166f2034d24a0f5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"host
IP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b
90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:15Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:15 crc kubenswrapper[4898]: I0313 13:58:15.936552 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3a9fb7c-9705-43b2-a2d8-663d19d50cda\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0be461eabdbb6c2414f8a9805c9537cea8595a509f241a72015802159baaa9b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://612d879ff268a2130e16ebb42a2e402a0e2d7ef04248322761b873bc6fe026c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f7a5f6c4c1f3ccd5b10a77ca72055844d63966f39e14fbba46d8b6074f4d7da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8772cfcf785db35cec05c4acb35d750d63feee0a1b192a79fd2f9d3f9b33c433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://8772cfcf785db35cec05c4acb35d750d63feee0a1b192a79fd2f9d3f9b33c433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:15Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:15 crc kubenswrapper[4898]: I0313 13:58:15.948971 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:15Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:15 crc kubenswrapper[4898]: I0313 13:58:15.962212 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:15Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:15 crc kubenswrapper[4898]: I0313 13:58:15.975007 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6llfs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e521c857-9711-4f68-886f-38b233d7b05b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de3eb6944d1978b4471d9968724d0312047079b10ae547df513246554e8a705f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzgmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6llfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:15Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:16 crc kubenswrapper[4898]: I0313 13:58:16.003263 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8122355233f3be13689264ea41f8ede2b7e4d07e88bd6a2a2d5d6fd1a166bcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14175f8dff64bc803d0edb3738c3c781867aa64407c7b7b8be708ab08e220d4f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T13:58:00Z\\\",\\\"message\\\":\\\"ection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.182:9099:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61d39e4d-21a9-4387-9a2b-fa4ad14792e2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0313 13:58:00.362727 6919 
loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-image-registry/image-registry\\\\\\\"}\\\\nI0313 13:58:00.362778 6919 services_controller.go:360] Finished syncing service image-registry on namespace openshift-image-registry for network=default : 2.065295ms\\\\nI0313 13:58:00.362859 6919 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-cluster-version/cluster-version-operator\\\\\\\"}\\\\nI0313 13:58:00.362895 6919 services_controller.go:360] Finished syncing service cluster-version-operator on namespace openshift-cluster-version for network=default : 2.427204ms\\\\nI0313 13:58:00.362944 6919 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0313 13:58:00.363141 6919 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0313 13:58:00.363206 6919 ovnkube.go:599] Stopped ovnkube\\\\nI0313 13:58:00.363241 6919 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0313 13:58:00.363329 6919 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8122355233f3be13689264ea41f8ede2b7e4d07e88bd6a2a2d5d6fd1a166bcc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"message\\\":\\\"Template:(*services.Template)(nil)}}}}, Templates:services.TemplateMap(nil), Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string(nil), Groups:[]string(nil)}}\\\\nI0313 13:58:14.697412 7170 services_controller.go:452] Built service openshift-console-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0313 13:58:14.697420 7170 
services_controller.go:453] Built service default/kubernetes template LB for network=default: []services.LB{}\\\\nI0313 13:58:14.697429 7170 services_controller.go:454] Service default/kubernetes for network=default has 0 cluster-wide, 1 per-node configs, 0 template configs, making 0 (cluster) 2 (per node) and 0 (template) load balancers\\\\nI0313 13:58:14.697434 7170 services_controller.go:453] Built service openshift-console-operator/metrics template LB for network=default: []services.LB{}\\\\nI0313 13:58:14.697450 7170 lb_config.go:1031] Cluster endpoints for openshift-authentication/oauth-openshift for network=default are: map[]\\\\nI0313 13:58:14.697454 7170 services_controller.go:454] Service openshift-console-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0313 13:58:14.697478 7170 services_controller.go:443] Built service openshift-authentication/oauth-openshift LB cluster-wide configs for 
netw\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\
"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d978
6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qqqs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:16Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:16 crc kubenswrapper[4898]: I0313 13:58:16.021719 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58657703efd5d66740cb17c4bde23f3d5377f197fe751ded7d1b0674b2aac5a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\"
,\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, /tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:16Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:16 crc kubenswrapper[4898]: I0313 13:58:16.036684 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:16Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:16 crc kubenswrapper[4898]: I0313 13:58:16.051528 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a81ec4e1d92c8ebd60a1e31e39fc5a639e7561e63cdd9ec99ada92fe3896522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8568b0d4122e606a851ae23a97395b29784dd291
38349c86f59191196e7b0f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8k6xj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:16Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:16 crc kubenswrapper[4898]: I0313 13:58:16.063794 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:16Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:16 crc kubenswrapper[4898]: I0313 13:58:16.075003 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T13:58:16Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:16 crc kubenswrapper[4898]: I0313 13:58:16.084813 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpbhb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0427af73-3ee1-4f8b-aa31-915d8ff53e94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd8b394eb4b9a2d5b8a55dfe9073e48789bb72fb2dab87f76f41c48c1714feab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-ndhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpbhb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:16Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:16 crc kubenswrapper[4898]: I0313 13:58:16.537455 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qqqs5_e7d6afc0-d9b5-41b2-a55f-57621c300cbb/ovnkube-controller/2.log" Mar 13 13:58:16 crc kubenswrapper[4898]: I0313 13:58:16.541449 4898 scope.go:117] "RemoveContainer" containerID="c8122355233f3be13689264ea41f8ede2b7e4d07e88bd6a2a2d5d6fd1a166bcc" Mar 13 13:58:16 crc kubenswrapper[4898]: E0313 13:58:16.541691 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-qqqs5_openshift-ovn-kubernetes(e7d6afc0-d9b5-41b2-a55f-57621c300cbb)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" Mar 13 13:58:16 crc kubenswrapper[4898]: I0313 13:58:16.564362 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc4f3def-518e-420d-95b2-e9d7f3caf3b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088f42f6f19b31579419481228562c142aa1480795b6ffa4bd1cba05cb35cb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2720007bcc3ac54674e20042cdbbe64be6cd24b9556c841e609d541065f883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d062f36316872a1ce1d211650b661771b3ef7ff25e420a9db22c3191160fa9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://157541b623c7dc0eeb157191f93fb07af61fb66891ac3231f90f14690179cb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffea82d1334de10deacc44ca75008d2dc4f59ca58355bd166f2034d24a0f5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:16Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:16 crc kubenswrapper[4898]: I0313 13:58:16.579737 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3a9fb7c-9705-43b2-a2d8-663d19d50cda\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0be461eabdbb6c2414f8a9805c9537cea8595a509f241a72015802159baaa9b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://612d879ff268a2130e16ebb42a2e402a0e2d7ef04248322761b873bc6fe026c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f7a5f6c4c1f3ccd5b10a77ca72055844d63966f39e14fbba46d8b6074f4d7da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8772cfcf785db35cec05c4acb35d750d63feee0a1b192a79fd2f9d3f9b33c433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8772cfcf785db35cec05c4acb35d750d63feee0a1b192a79fd2f9d3f9b33c433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:16Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:16 crc kubenswrapper[4898]: I0313 13:58:16.597745 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:16Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:16 crc kubenswrapper[4898]: I0313 13:58:16.611996 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:16Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:16 crc kubenswrapper[4898]: I0313 13:58:16.626215 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6llfs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e521c857-9711-4f68-886f-38b233d7b05b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de3eb6944d1978b4471d9968724d0312047079b10ae547df513246554e8a705f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzgmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6llfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:16Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:16 crc kubenswrapper[4898]: I0313 13:58:16.657504 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8122355233f3be13689264ea41f8ede2b7e4d07e88bd6a2a2d5d6fd1a166bcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8122355233f3be13689264ea41f8ede2b7e4d07e88bd6a2a2d5d6fd1a166bcc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"message\\\":\\\"Template:(*services.Template)(nil)}}}}, Templates:services.TemplateMap(nil), Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string(nil), Groups:[]string(nil)}}\\\\nI0313 13:58:14.697412 7170 services_controller.go:452] Built service openshift-console-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0313 13:58:14.697420 
7170 services_controller.go:453] Built service default/kubernetes template LB for network=default: []services.LB{}\\\\nI0313 13:58:14.697429 7170 services_controller.go:454] Service default/kubernetes for network=default has 0 cluster-wide, 1 per-node configs, 0 template configs, making 0 (cluster) 2 (per node) and 0 (template) load balancers\\\\nI0313 13:58:14.697434 7170 services_controller.go:453] Built service openshift-console-operator/metrics template LB for network=default: []services.LB{}\\\\nI0313 13:58:14.697450 7170 lb_config.go:1031] Cluster endpoints for openshift-authentication/oauth-openshift for network=default are: map[]\\\\nI0313 13:58:14.697454 7170 services_controller.go:454] Service openshift-console-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0313 13:58:14.697478 7170 services_controller.go:443] Built service openshift-authentication/oauth-openshift LB cluster-wide configs for netw\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:58:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qqqs5_openshift-ovn-kubernetes(e7d6afc0-d9b5-41b2-a55f-57621c300cbb)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c
0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qqqs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:16Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:16 crc kubenswrapper[4898]: I0313 13:58:16.679018 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58657703efd5d66740cb17c4bde23f3d5377f197fe751ded7d1b0674b2aac5a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\"
,\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, /tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:16Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:16 crc kubenswrapper[4898]: I0313 13:58:16.698935 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:16Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:16 crc kubenswrapper[4898]: I0313 13:58:16.719962 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a81ec4e1d92c8ebd60a1e31e39fc5a639e7561e63cdd9ec99ada92fe3896522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8568b0d4122e606a851ae23a97395b29784dd291
38349c86f59191196e7b0f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8k6xj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:16Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:16 crc kubenswrapper[4898]: I0313 13:58:16.735803 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:16Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:16 crc kubenswrapper[4898]: I0313 13:58:16.750191 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T13:58:16Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:16 crc kubenswrapper[4898]: I0313 13:58:16.763138 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpbhb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0427af73-3ee1-4f8b-aa31-915d8ff53e94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd8b394eb4b9a2d5b8a55dfe9073e48789bb72fb2dab87f76f41c48c1714feab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-ndhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpbhb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:16Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:16 crc kubenswrapper[4898]: I0313 13:58:16.779554 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b46ld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1f79182-c06d-47d7-bed8-109c0cc4784e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa39cb1a5f792a575519b3616cff170f5e303a1bf05b207578725
aa1711117b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b46ld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:16Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:16 crc kubenswrapper[4898]: I0313 13:58:16.796471 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wh2lt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73b8846971b589f5619b239746bd2f5953d12af7f3fa6543042da89561930dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjttd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://421d50fcd0a69c2b53067ae09bbea100b5321
74c1d76f641c79c58a5fa3f9a3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjttd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:58:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wh2lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:16Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:16 crc kubenswrapper[4898]: I0313 13:58:16.812225 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fwrwc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9k7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9k7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fwrwc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:16Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:16 crc 
kubenswrapper[4898]: I0313 13:58:16.834011 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:16Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:16 crc kubenswrapper[4898]: I0313 13:58:16.858558 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e527967a-003e-4dbe-aade-d9f882239cb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e0093b9e7670d289e34bc225cf1650906fdaf57e7d3c83ab8897fd0eed7204\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae3d8
e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5qb65\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:16Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:17 crc kubenswrapper[4898]: I0313 13:58:17.708184 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869-metrics-certs\") pod \"network-metrics-daemon-fwrwc\" (UID: \"9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869\") " pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:17 crc kubenswrapper[4898]: E0313 13:58:17.708393 4898 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 13:58:17 crc kubenswrapper[4898]: E0313 13:58:17.708523 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869-metrics-certs podName:9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869 nodeName:}" failed. No retries permitted until 2026-03-13 13:58:33.708491293 +0000 UTC m=+148.710079572 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869-metrics-certs") pod "network-metrics-daemon-fwrwc" (UID: "9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 13:58:17 crc kubenswrapper[4898]: I0313 13:58:17.739228 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:58:17 crc kubenswrapper[4898]: I0313 13:58:17.739277 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:58:17 crc kubenswrapper[4898]: I0313 13:58:17.739316 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:58:17 crc kubenswrapper[4898]: I0313 13:58:17.739253 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:17 crc kubenswrapper[4898]: E0313 13:58:17.739369 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:58:17 crc kubenswrapper[4898]: E0313 13:58:17.739462 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:58:17 crc kubenswrapper[4898]: E0313 13:58:17.739714 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:58:17 crc kubenswrapper[4898]: E0313 13:58:17.739808 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:58:19 crc kubenswrapper[4898]: I0313 13:58:19.739338 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:58:19 crc kubenswrapper[4898]: I0313 13:58:19.739416 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:58:19 crc kubenswrapper[4898]: I0313 13:58:19.739348 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:19 crc kubenswrapper[4898]: E0313 13:58:19.739529 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:58:19 crc kubenswrapper[4898]: I0313 13:58:19.739588 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:58:19 crc kubenswrapper[4898]: E0313 13:58:19.739724 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:58:19 crc kubenswrapper[4898]: E0313 13:58:19.739828 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:58:19 crc kubenswrapper[4898]: E0313 13:58:19.740004 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:58:20 crc kubenswrapper[4898]: E0313 13:58:20.874013 4898 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 13:58:21 crc kubenswrapper[4898]: I0313 13:58:21.739160 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:58:21 crc kubenswrapper[4898]: I0313 13:58:21.739229 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:21 crc kubenswrapper[4898]: I0313 13:58:21.739229 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:58:21 crc kubenswrapper[4898]: E0313 13:58:21.739364 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:58:21 crc kubenswrapper[4898]: I0313 13:58:21.739419 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:58:21 crc kubenswrapper[4898]: E0313 13:58:21.739504 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:58:21 crc kubenswrapper[4898]: E0313 13:58:21.739560 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:58:21 crc kubenswrapper[4898]: E0313 13:58:21.739589 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:58:23 crc kubenswrapper[4898]: I0313 13:58:23.739452 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:58:23 crc kubenswrapper[4898]: I0313 13:58:23.739504 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:58:23 crc kubenswrapper[4898]: I0313 13:58:23.739585 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:23 crc kubenswrapper[4898]: E0313 13:58:23.739791 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:58:23 crc kubenswrapper[4898]: I0313 13:58:23.739832 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:58:23 crc kubenswrapper[4898]: E0313 13:58:23.740005 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:58:23 crc kubenswrapper[4898]: E0313 13:58:23.740161 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:58:23 crc kubenswrapper[4898]: E0313 13:58:23.740316 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.111526 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.111666 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.111691 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.111719 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.111742 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:25Z","lastTransitionTime":"2026-03-13T13:58:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:25 crc kubenswrapper[4898]: E0313 13:58:25.132681 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:25Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.137853 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.137942 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.137982 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.138008 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.138026 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:25Z","lastTransitionTime":"2026-03-13T13:58:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:25 crc kubenswrapper[4898]: E0313 13:58:25.156231 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:25Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.160557 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.160632 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.160649 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.160671 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.160687 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:25Z","lastTransitionTime":"2026-03-13T13:58:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:25 crc kubenswrapper[4898]: E0313 13:58:25.173742 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:25Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.177752 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.177810 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.177821 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.177839 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.177853 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:25Z","lastTransitionTime":"2026-03-13T13:58:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:25 crc kubenswrapper[4898]: E0313 13:58:25.196877 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:25Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.201654 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.201703 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.201720 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.201740 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.201754 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:25Z","lastTransitionTime":"2026-03-13T13:58:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:25 crc kubenswrapper[4898]: E0313 13:58:25.220692 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:25Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:25 crc kubenswrapper[4898]: E0313 13:58:25.221018 4898 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.738988 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.739038 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.739175 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:58:25 crc kubenswrapper[4898]: E0313 13:58:25.739296 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.739550 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:58:25 crc kubenswrapper[4898]: E0313 13:58:25.739724 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:58:25 crc kubenswrapper[4898]: E0313 13:58:25.739806 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:58:25 crc kubenswrapper[4898]: E0313 13:58:25.739891 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.765309 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a81ec4e1d92c8ebd60a1e31e39fc5a639e7561e63cdd9ec99ada92fe3896522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":
\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8568b0d4122e606a851ae23a97395b29784dd29138349c86f59191196e7b0f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8k6xj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:25Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.782690 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:25Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.798142 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T13:58:25Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.815648 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpbhb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0427af73-3ee1-4f8b-aa31-915d8ff53e94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd8b394eb4b9a2d5b8a55dfe9073e48789bb72fb2dab87f76f41c48c1714feab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-ndhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpbhb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:25Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.829919 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b46ld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1f79182-c06d-47d7-bed8-109c0cc4784e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa39cb1a5f792a575519b3616cff170f5e303a1bf05b207578725
aa1711117b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b46ld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:25Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.843238 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wh2lt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73b8846971b589f5619b239746bd2f5953d12af7f3fa6543042da89561930dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjttd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://421d50fcd0a69c2b53067ae09bbea100b5321
74c1d76f641c79c58a5fa3f9a3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjttd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:58:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wh2lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:25Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.859371 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fwrwc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9k7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9k7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fwrwc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:25Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:25 crc 
kubenswrapper[4898]: E0313 13:58:25.874458 4898 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.878232 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:25Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.901551 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e527967a-003e-4dbe-aade-d9f882239cb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e0093b9e7670d289e34bc225cf1650906fdaf57e7d3c83ab8897fd0eed7204\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae3d8
e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5qb65\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:25Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.935828 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc4f3def-518e-420d-95b2-e9d7f3caf3b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088f42f6f19b31579419481228562c142aa1480795b6ffa4bd1cba05cb35cb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2720007bcc3ac54674e20042cdbbe64be6cd24b9556c841e609d541065f883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d062f36316872a1ce1d211650b661771b3ef7ff25e420a9db22c3191160fa9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://157541b623c7dc0ee
b157191f93fb07af61fb66891ac3231f90f14690179cb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffea82d1334de10deacc44ca75008d2dc4f59ca58355bd166f2034d24a0f5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:25Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.954043 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3a9fb7c-9705-43b2-a2d8-663d19d50cda\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0be461eabdbb6c2414f8a9805c9537cea8595a509f241a72015802159baaa9b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://612d879ff268a2130e16ebb42a2e402a0e2d7ef04248322761b873bc6fe026c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f7a5f6c4c1f3ccd5b10a77ca72055844d63966f39e14fbba46d8b6074f4d7da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8772cfcf785db35cec05c4acb35d750d63feee0a1b192a79fd2f9d3f9b33c433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://8772cfcf785db35cec05c4acb35d750d63feee0a1b192a79fd2f9d3f9b33c433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:25Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.974242 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:25Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.993091 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:25Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:26 crc kubenswrapper[4898]: I0313 13:58:26.013010 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6llfs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e521c857-9711-4f68-886f-38b233d7b05b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de3eb6944d1978b4471d9968724d0312047079b10ae547df513246554e8a705f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzgmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6llfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:26Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:26 crc kubenswrapper[4898]: I0313 13:58:26.044383 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8122355233f3be13689264ea41f8ede2b7e4d07e88bd6a2a2d5d6fd1a166bcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8122355233f3be13689264ea41f8ede2b7e4d07e88bd6a2a2d5d6fd1a166bcc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"message\\\":\\\"Template:(*services.Template)(nil)}}}}, Templates:services.TemplateMap(nil), Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string(nil), Groups:[]string(nil)}}\\\\nI0313 13:58:14.697412 7170 services_controller.go:452] Built service openshift-console-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0313 13:58:14.697420 
7170 services_controller.go:453] Built service default/kubernetes template LB for network=default: []services.LB{}\\\\nI0313 13:58:14.697429 7170 services_controller.go:454] Service default/kubernetes for network=default has 0 cluster-wide, 1 per-node configs, 0 template configs, making 0 (cluster) 2 (per node) and 0 (template) load balancers\\\\nI0313 13:58:14.697434 7170 services_controller.go:453] Built service openshift-console-operator/metrics template LB for network=default: []services.LB{}\\\\nI0313 13:58:14.697450 7170 lb_config.go:1031] Cluster endpoints for openshift-authentication/oauth-openshift for network=default are: map[]\\\\nI0313 13:58:14.697454 7170 services_controller.go:454] Service openshift-console-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0313 13:58:14.697478 7170 services_controller.go:443] Built service openshift-authentication/oauth-openshift LB cluster-wide configs for netw\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:58:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qqqs5_openshift-ovn-kubernetes(e7d6afc0-d9b5-41b2-a55f-57621c300cbb)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c
0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qqqs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:26Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:26 crc kubenswrapper[4898]: I0313 13:58:26.065076 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58657703efd5d66740cb17c4bde23f3d5377f197fe751ded7d1b0674b2aac5a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\"
,\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, /tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:26Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:26 crc kubenswrapper[4898]: I0313 13:58:26.084798 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:26Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:26 crc kubenswrapper[4898]: I0313 13:58:26.754463 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 13 13:58:27 crc kubenswrapper[4898]: I0313 13:58:27.739434 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:58:27 crc kubenswrapper[4898]: I0313 13:58:27.739487 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:58:27 crc kubenswrapper[4898]: I0313 13:58:27.739434 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:27 crc kubenswrapper[4898]: I0313 13:58:27.739619 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:58:27 crc kubenswrapper[4898]: E0313 13:58:27.739871 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:58:27 crc kubenswrapper[4898]: E0313 13:58:27.740028 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:58:27 crc kubenswrapper[4898]: E0313 13:58:27.740161 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:58:27 crc kubenswrapper[4898]: E0313 13:58:27.740343 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:58:28 crc kubenswrapper[4898]: I0313 13:58:28.741071 4898 scope.go:117] "RemoveContainer" containerID="c8122355233f3be13689264ea41f8ede2b7e4d07e88bd6a2a2d5d6fd1a166bcc" Mar 13 13:58:28 crc kubenswrapper[4898]: E0313 13:58:28.741417 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-qqqs5_openshift-ovn-kubernetes(e7d6afc0-d9b5-41b2-a55f-57621c300cbb)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" Mar 13 13:58:29 crc kubenswrapper[4898]: I0313 13:58:29.739204 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:58:29 crc kubenswrapper[4898]: I0313 13:58:29.739374 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:58:29 crc kubenswrapper[4898]: I0313 13:58:29.739449 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:29 crc kubenswrapper[4898]: I0313 13:58:29.739506 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:58:29 crc kubenswrapper[4898]: E0313 13:58:29.740000 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:58:29 crc kubenswrapper[4898]: E0313 13:58:29.740398 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:58:29 crc kubenswrapper[4898]: E0313 13:58:29.740232 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:58:29 crc kubenswrapper[4898]: E0313 13:58:29.740569 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:58:29 crc kubenswrapper[4898]: I0313 13:58:29.756805 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 13 13:58:30 crc kubenswrapper[4898]: E0313 13:58:30.876123 4898 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 13:58:31 crc kubenswrapper[4898]: I0313 13:58:31.739231 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:31 crc kubenswrapper[4898]: I0313 13:58:31.739280 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:58:31 crc kubenswrapper[4898]: I0313 13:58:31.739273 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:58:31 crc kubenswrapper[4898]: E0313 13:58:31.739401 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:58:31 crc kubenswrapper[4898]: I0313 13:58:31.739523 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:58:31 crc kubenswrapper[4898]: E0313 13:58:31.739518 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:58:31 crc kubenswrapper[4898]: E0313 13:58:31.739565 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:58:31 crc kubenswrapper[4898]: E0313 13:58:31.739619 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:58:33 crc kubenswrapper[4898]: I0313 13:58:33.739148 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:58:33 crc kubenswrapper[4898]: I0313 13:58:33.739256 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:58:33 crc kubenswrapper[4898]: I0313 13:58:33.739188 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:58:33 crc kubenswrapper[4898]: E0313 13:58:33.739332 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:58:33 crc kubenswrapper[4898]: E0313 13:58:33.739387 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:58:33 crc kubenswrapper[4898]: I0313 13:58:33.739449 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:33 crc kubenswrapper[4898]: E0313 13:58:33.739674 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:58:33 crc kubenswrapper[4898]: E0313 13:58:33.739740 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:58:33 crc kubenswrapper[4898]: I0313 13:58:33.783187 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869-metrics-certs\") pod \"network-metrics-daemon-fwrwc\" (UID: \"9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869\") " pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:33 crc kubenswrapper[4898]: E0313 13:58:33.783407 4898 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 13:58:33 crc kubenswrapper[4898]: E0313 13:58:33.783520 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869-metrics-certs podName:9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869 nodeName:}" failed. No retries permitted until 2026-03-13 13:59:05.783495481 +0000 UTC m=+180.785083730 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869-metrics-certs") pod "network-metrics-daemon-fwrwc" (UID: "9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.266087 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.266148 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.266167 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.266191 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.266208 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:35Z","lastTransitionTime":"2026-03-13T13:58:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:35 crc kubenswrapper[4898]: E0313 13:58:35.286991 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:35Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.290775 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.290837 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.290847 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.290860 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.290869 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:35Z","lastTransitionTime":"2026-03-13T13:58:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:35 crc kubenswrapper[4898]: E0313 13:58:35.303052 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:35Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.306475 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.306532 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.306546 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.306564 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.306597 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:35Z","lastTransitionTime":"2026-03-13T13:58:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:35 crc kubenswrapper[4898]: E0313 13:58:35.318969 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:35Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.322503 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.322571 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.322585 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.322603 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.322615 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:35Z","lastTransitionTime":"2026-03-13T13:58:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:35 crc kubenswrapper[4898]: E0313 13:58:35.335508 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:35Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.339252 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.339279 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.339288 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.339301 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.339311 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:35Z","lastTransitionTime":"2026-03-13T13:58:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:35 crc kubenswrapper[4898]: E0313 13:58:35.351609 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:35Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:35 crc kubenswrapper[4898]: E0313 13:58:35.351717 4898 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.739034 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.739098 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:58:35 crc kubenswrapper[4898]: E0313 13:58:35.739201 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.739227 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.739261 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:58:35 crc kubenswrapper[4898]: E0313 13:58:35.739353 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:58:35 crc kubenswrapper[4898]: E0313 13:58:35.739467 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:58:35 crc kubenswrapper[4898]: E0313 13:58:35.739550 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.755384 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wh2lt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73b8846971b589f5619b239746bd2f5953d12af7f3fa6543042da89561930dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metri
cs-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjttd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://421d50fcd0a69c2b53067ae09bbea100b532174c1d76f641c79c58a5fa3f9a3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjttd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:58:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wh2lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:35Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.772524 4898 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-multus/network-metrics-daemon-fwrwc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9k7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9k7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fwrwc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:35Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:35 crc 
kubenswrapper[4898]: I0313 13:58:35.790508 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5bd6119-d7a3-4363-add5-7eb62180e1ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00526d41ebb304bac57e24a91007919427bb623e8cbce6cc25d7b1a5195871a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac7ea5df99bb3004d7582910739297c1ee1913de5399a5be9391bbf4c96be32f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:04Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0313 13:56:35.870763 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0313 13:56:35.872344 1 observer_polling.go:159] Starting file observer\\\\nI0313 13:56:35.873967 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0313 13:56:35.875073 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0313 13:57:02.288879 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0313 13:57:04.933123 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0313 13:57:04.933351 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:35Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://172c20ea4299e1b913b2675ac14f7ec3a4c80a0f6f2428af0d030aff64eb89ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2524f0ae02e7d3c2569744846e2d99e8c808c9e14afeab6415910ee731794609\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://511cb3e4e8a16bcdfdf5840917a0a9a5a2e2a81419278449f613c9a41d164c1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:35Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.808784 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:35Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.831340 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e527967a-003e-4dbe-aade-d9f882239cb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e0093b9e7670d289e34bc225cf1650906fdaf57e7d3c83ab8897fd0eed7204\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae3d8
e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5qb65\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:35Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.847500 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b46ld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1f79182-c06d-47d7-bed8-109c0cc4784e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa39cb1a5f792a575519b3616cff170f5e303a1bf05b207578725aa1711117b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-13T13:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b46ld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:35Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:35 crc kubenswrapper[4898]: E0313 13:58:35.876647 4898 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.883287 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc4f3def-518e-420d-95b2-e9d7f3caf3b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088f42f6f19b31579419481228562c142aa1480795b6ffa4bd1cba05cb35cb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\
\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2720007bcc3ac54674e20042cdbbe64be6cd24b9556c841e609d541065f883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d062f36316872a1ce1d211650b661771b3ef7ff25e420a9db22c3191160fa9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://157541b623c7dc0eeb157191f93fb07af61fb66891ac3231f90f14690179cb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee
1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffea82d1334de10deacc44ca75008d2dc4f59ca58355bd166f2034d24a0f5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:35Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.903252 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3a9fb7c-9705-43b2-a2d8-663d19d50cda\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0be461eabdbb6c2414f8a9805c9537cea8595a509f241a72015802159baaa9b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://612d879ff268a2130e16ebb42a2e402a0e2d7ef04248322761b873bc6fe026c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f7a5f6c4c1f3ccd5b10a77ca72055844d63966f39e14fbba46d8b6074f4d7da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8772cfcf785db35cec05c4acb35d750d63feee0a1b192a79fd2f9d3f9b33c433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://8772cfcf785db35cec05c4acb35d750d63feee0a1b192a79fd2f9d3f9b33c433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:35Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.919629 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:35Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.934607 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6llfs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e521c857-9711-4f68-886f-38b233d7b05b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de3eb6944d1978b4471d9968724d0312047079b10ae547df513246554e8a705f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzgmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6llfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-13T13:58:35Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.965333 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8122355233f3be13689264ea41f8ede2b7e4d07e88bd6a2a2d5d6fd1a166bcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8122355233f3be13689264ea41f8ede2b7e4d07e88bd6a2a2d5d6fd1a166bcc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"message\\\":\\\"Template:(*services.Template)(nil)}}}}, Templates:services.TemplateMap(nil), Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string(nil), Groups:[]string(nil)}}\\\\nI0313 13:58:14.697412 7170 services_controller.go:452] Built service openshift-console-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0313 13:58:14.697420 
7170 services_controller.go:453] Built service default/kubernetes template LB for network=default: []services.LB{}\\\\nI0313 13:58:14.697429 7170 services_controller.go:454] Service default/kubernetes for network=default has 0 cluster-wide, 1 per-node configs, 0 template configs, making 0 (cluster) 2 (per node) and 0 (template) load balancers\\\\nI0313 13:58:14.697434 7170 services_controller.go:453] Built service openshift-console-operator/metrics template LB for network=default: []services.LB{}\\\\nI0313 13:58:14.697450 7170 lb_config.go:1031] Cluster endpoints for openshift-authentication/oauth-openshift for network=default are: map[]\\\\nI0313 13:58:14.697454 7170 services_controller.go:454] Service openshift-console-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0313 13:58:14.697478 7170 services_controller.go:443] Built service openshift-authentication/oauth-openshift LB cluster-wide configs for netw\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:58:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qqqs5_openshift-ovn-kubernetes(e7d6afc0-d9b5-41b2-a55f-57621c300cbb)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c
0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qqqs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:35Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.993830 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58657703efd5d66740cb17c4bde23f3d5377f197fe751ded7d1b0674b2aac5a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\"
,\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, /tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:35Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:36 crc kubenswrapper[4898]: I0313 13:58:36.008626 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71c55972-53a1-4a22-85ab-c38bffbaf629\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cc01a31ae51aefc4cc7ea6a77eea7d0885ebf0f9fdcbdfbeebfcc7e69a33755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745
f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4971276739c6cece5f7d6ad57da8c9e6d67d9ffe7c85e22d1d78344045c08d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4971276739c6cece5f7d6ad57da8c9e6d67d9ffe7c85e22d1d78344045c08d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": 
tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:36Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:36 crc kubenswrapper[4898]: I0313 13:58:36.025187 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:36Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:36 crc kubenswrapper[4898]: I0313 13:58:36.045197 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:36Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:36 crc kubenswrapper[4898]: I0313 13:58:36.061161 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:36Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:36 crc kubenswrapper[4898]: I0313 13:58:36.078953 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T13:58:36Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:36 crc kubenswrapper[4898]: I0313 13:58:36.095562 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpbhb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0427af73-3ee1-4f8b-aa31-915d8ff53e94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd8b394eb4b9a2d5b8a55dfe9073e48789bb72fb2dab87f76f41c48c1714feab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-ndhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpbhb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:36Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:36 crc kubenswrapper[4898]: I0313 13:58:36.110794 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a81ec4e1d92c8ebd60a1e31e39fc5a
639e7561e63cdd9ec99ada92fe3896522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8568b0d4122e606a851ae23a97395b29784dd29138349c86f59191196e7b0f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
6-03-13T13:57:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8k6xj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:36Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:37 crc kubenswrapper[4898]: I0313 13:58:37.618107 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6llfs_e521c857-9711-4f68-886f-38b233d7b05b/kube-multus/0.log" Mar 13 13:58:37 crc kubenswrapper[4898]: I0313 13:58:37.618183 4898 generic.go:334] "Generic (PLEG): container finished" podID="e521c857-9711-4f68-886f-38b233d7b05b" containerID="de3eb6944d1978b4471d9968724d0312047079b10ae547df513246554e8a705f" exitCode=1 Mar 13 13:58:37 crc kubenswrapper[4898]: I0313 13:58:37.618217 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6llfs" event={"ID":"e521c857-9711-4f68-886f-38b233d7b05b","Type":"ContainerDied","Data":"de3eb6944d1978b4471d9968724d0312047079b10ae547df513246554e8a705f"} Mar 13 13:58:37 crc kubenswrapper[4898]: I0313 13:58:37.618680 4898 scope.go:117] "RemoveContainer" containerID="de3eb6944d1978b4471d9968724d0312047079b10ae547df513246554e8a705f" Mar 13 13:58:37 crc kubenswrapper[4898]: I0313 13:58:37.636649 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpbhb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0427af73-3ee1-4f8b-aa31-915d8ff53e94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd8b394eb4b9a2d5b8a55dfe9073e48789bb72fb2dab87f76f41c48c1714feab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpbhb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:37Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:37 crc kubenswrapper[4898]: I0313 13:58:37.655601 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a81ec4e1d92c8ebd60a1e31e39fc5a639e7561e63cdd9ec99ada92fe3896522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8568b0d4122e606a851ae23a97395b29784dd29138349c86f59191196e7b0f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8k6xj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:37Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:37 crc kubenswrapper[4898]: I0313 13:58:37.677979 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:37Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:37 crc kubenswrapper[4898]: I0313 13:58:37.695737 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T13:58:37Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:37 crc kubenswrapper[4898]: I0313 13:58:37.710263 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e527967a-003e-4dbe-aade-d9f882239cb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e0093b9e7670d289e34bc225cf1650906fdaf57e7d3c83ab8897fd0eed7204\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5qb65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:37Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:37 crc kubenswrapper[4898]: I0313 13:58:37.719986 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b46ld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1f79182-c06d-47d7-bed8-109c0cc4784e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa39cb1a5f792a575519b3616cff170f5e303a1bf05b207578725aa1711117b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b46ld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:37Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:37 crc kubenswrapper[4898]: I0313 13:58:37.734034 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wh2lt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73b8846971b589f5619b239746bd2f5953d12af7f3fa6543042da89561930dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjttd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://421d50fcd0a69c2b53067ae09bbea100b5321
74c1d76f641c79c58a5fa3f9a3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjttd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:58:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wh2lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:37Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:37 crc kubenswrapper[4898]: I0313 13:58:37.739684 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:58:37 crc kubenswrapper[4898]: I0313 13:58:37.739707 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:58:37 crc kubenswrapper[4898]: I0313 13:58:37.739780 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:37 crc kubenswrapper[4898]: I0313 13:58:37.740041 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:58:37 crc kubenswrapper[4898]: E0313 13:58:37.740267 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:58:37 crc kubenswrapper[4898]: E0313 13:58:37.740515 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:58:37 crc kubenswrapper[4898]: E0313 13:58:37.740628 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:58:37 crc kubenswrapper[4898]: E0313 13:58:37.740772 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:58:37 crc kubenswrapper[4898]: I0313 13:58:37.749762 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fwrwc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9k7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9k7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fwrwc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:37Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:37 crc 
kubenswrapper[4898]: I0313 13:58:37.763039 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5bd6119-d7a3-4363-add5-7eb62180e1ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00526d41ebb304bac57e24a91007919427bb623e8cbce6cc25d7b1a5195871a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac7ea5df99bb3004d7582910739297c1ee1913de5399a5be9391bbf4c96be32f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:04Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0313 13:56:35.870763 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0313 13:56:35.872344 1 observer_polling.go:159] Starting file observer\\\\nI0313 13:56:35.873967 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0313 13:56:35.875073 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0313 13:57:02.288879 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0313 13:57:04.933123 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0313 13:57:04.933351 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:35Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://172c20ea4299e1b913b2675ac14f7ec3a4c80a0f6f2428af0d030aff64eb89ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2524f0ae02e7d3c2569744846e2d99e8c808c9e14afeab6415910ee731794609\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://511cb3e4e8a16bcdfdf5840917a0a9a5a2e2a81419278449f613c9a41d164c1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:37Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:37 crc kubenswrapper[4898]: I0313 13:58:37.776377 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:37Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:37 crc kubenswrapper[4898]: I0313 13:58:37.790873 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:37Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:37 crc kubenswrapper[4898]: I0313 13:58:37.822381 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc4f3def-518e-420d-95b2-e9d7f3caf3b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088f42f6f19b31579419481228562c142aa1480795b6ffa4bd1cba05cb35cb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc
/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2720007bcc3ac54674e20042cdbbe64be6cd24b9556c841e609d541065f883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d062f36316872a1ce1d211650b661771b3ef7ff25e420a9db22c3191160fa9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://157541b623c7dc0eeb157191f93fb07af61fb66891ac3231f90f14690179cb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f
58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffea82d1334de10deacc44ca75008d2dc4f59ca58355bd166f2034d24a0f5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b
4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"20
26-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:37Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:37 crc kubenswrapper[4898]: I0313 13:58:37.887943 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3a9fb7c-9705-43b2-a2d8-663d19d50cda\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0be461eabdbb6c2414f8a9805c9537cea8595a509f241a72015802159baaa9b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://612d879ff268a2130e16ebb42a2e402a0e2d7ef04248322761b873bc6fe026c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f7a5f6c4c1f3ccd5b10a77ca72055844d63966f39e14fbba46d8b6074f4d7da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8772cfcf785db35cec05c4acb35d750d63feee0a1b192a79fd2f9d3f9b33c433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://8772cfcf785db35cec05c4acb35d750d63feee0a1b192a79fd2f9d3f9b33c433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:37Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:37 crc kubenswrapper[4898]: I0313 13:58:37.908327 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:37Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:37 crc kubenswrapper[4898]: I0313 13:58:37.925182 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:37Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:37 crc kubenswrapper[4898]: I0313 13:58:37.947541 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6llfs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e521c857-9711-4f68-886f-38b233d7b05b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de3eb6944d1978b4471d9968724d0312047079b10ae547df513246554e8a705f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de3eb6944d1978b4471d9968724d0312047079b10ae547df513246554e8a705f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T13:58:36Z\\\",\\\"message\\\":\\\"2026-03-13T13:57:51+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c0887954-cd2c-445a-b95c-6f7339f19f61\\\\n2026-03-13T13:57:51+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c0887954-cd2c-445a-b95c-6f7339f19f61 to /host/opt/cni/bin/\\\\n2026-03-13T13:57:51Z [verbose] multus-daemon started\\\\n2026-03-13T13:57:51Z [verbose] Readiness Indicator file check\\\\n2026-03-13T13:58:36Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzgmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6llfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:37Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:37 crc kubenswrapper[4898]: I0313 13:58:37.972708 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8122355233f3be13689264ea41f8ede2b7e4d07e88bd6a2a2d5d6fd1a166bcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8122355233f3be13689264ea41f8ede2b7e4d07e88bd6a2a2d5d6fd1a166bcc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"message\\\":\\\"Template:(*services.Template)(nil)}}}}, Templates:services.TemplateMap(nil), 
Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string(nil), Groups:[]string(nil)}}\\\\nI0313 13:58:14.697412 7170 services_controller.go:452] Built service openshift-console-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0313 13:58:14.697420 7170 services_controller.go:453] Built service default/kubernetes template LB for network=default: []services.LB{}\\\\nI0313 13:58:14.697429 7170 services_controller.go:454] Service default/kubernetes for network=default has 0 cluster-wide, 1 per-node configs, 0 template configs, making 0 (cluster) 2 (per node) and 0 (template) load balancers\\\\nI0313 13:58:14.697434 7170 services_controller.go:453] Built service openshift-console-operator/metrics template LB for network=default: []services.LB{}\\\\nI0313 13:58:14.697450 7170 lb_config.go:1031] Cluster endpoints for openshift-authentication/oauth-openshift for network=default are: map[]\\\\nI0313 13:58:14.697454 7170 services_controller.go:454] Service openshift-console-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0313 13:58:14.697478 7170 services_controller.go:443] Built service openshift-authentication/oauth-openshift LB cluster-wide configs for netw\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:58:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qqqs5_openshift-ovn-kubernetes(e7d6afc0-d9b5-41b2-a55f-57621c300cbb)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c
0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qqqs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:37Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:37 crc kubenswrapper[4898]: I0313 13:58:37.997329 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58657703efd5d66740cb17c4bde23f3d5377f197fe751ded7d1b0674b2aac5a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\"
,\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, /tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:37Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:38 crc kubenswrapper[4898]: I0313 13:58:38.015691 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71c55972-53a1-4a22-85ab-c38bffbaf629\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cc01a31ae51aefc4cc7ea6a77eea7d0885ebf0f9fdcbdfbeebfcc7e69a33755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745
f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4971276739c6cece5f7d6ad57da8c9e6d67d9ffe7c85e22d1d78344045c08d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4971276739c6cece5f7d6ad57da8c9e6d67d9ffe7c85e22d1d78344045c08d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": 
tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:38Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:38 crc kubenswrapper[4898]: I0313 13:58:38.622816 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6llfs_e521c857-9711-4f68-886f-38b233d7b05b/kube-multus/0.log" Mar 13 13:58:38 crc kubenswrapper[4898]: I0313 13:58:38.622933 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6llfs" event={"ID":"e521c857-9711-4f68-886f-38b233d7b05b","Type":"ContainerStarted","Data":"04f48cdfeeb82223cb0cab3fb50d3338225f39b1d78eadc3c18a46350ae28770"} Mar 13 13:58:38 crc kubenswrapper[4898]: I0313 13:58:38.639468 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpbhb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0427af73-3ee1-4f8b-aa31-915d8ff53e94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd8b394eb4b9a2d5b8a55dfe9073e48789bb72fb2dab87f76f41c48c1714feab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpbhb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:38Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:38 crc kubenswrapper[4898]: I0313 13:58:38.656723 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a81ec4e1d92c8ebd60a1e31e39fc5a639e7561e63cdd9ec99ada92fe3896522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8568b0d4122e606a851ae23a97395b29784dd291
38349c86f59191196e7b0f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8k6xj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:38Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:38 crc kubenswrapper[4898]: I0313 13:58:38.682878 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:38Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:38 crc kubenswrapper[4898]: I0313 13:58:38.699043 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T13:58:38Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:38 crc kubenswrapper[4898]: I0313 13:58:38.722250 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e527967a-003e-4dbe-aade-d9f882239cb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e0093b9e7670d289e34bc225cf1650906fdaf57e7d3c83ab8897fd0eed7204\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5qb65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:38Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:38 crc kubenswrapper[4898]: I0313 13:58:38.737937 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b46ld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1f79182-c06d-47d7-bed8-109c0cc4784e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa39cb1a5f792a575519b3616cff170f5e303a1bf05b207578725aa1711117b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b46ld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:38Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:38 crc kubenswrapper[4898]: I0313 13:58:38.754076 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wh2lt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73b8846971b589f5619b239746bd2f5953d12af7f3fa6543042da89561930dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjttd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://421d50fcd0a69c2b53067ae09bbea100b5321
74c1d76f641c79c58a5fa3f9a3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjttd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:58:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wh2lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:38Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:38 crc kubenswrapper[4898]: I0313 13:58:38.773741 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fwrwc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9k7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9k7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fwrwc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:38Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:38 crc 
kubenswrapper[4898]: I0313 13:58:38.794079 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5bd6119-d7a3-4363-add5-7eb62180e1ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00526d41ebb304bac57e24a91007919427bb623e8cbce6cc25d7b1a5195871a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac7ea5df99bb3004d7582910739297c1ee1913de5399a5be9391bbf4c96be32f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:04Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0313 13:56:35.870763 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0313 13:56:35.872344 1 observer_polling.go:159] Starting file observer\\\\nI0313 13:56:35.873967 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0313 13:56:35.875073 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0313 13:57:02.288879 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0313 13:57:04.933123 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0313 13:57:04.933351 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:35Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://172c20ea4299e1b913b2675ac14f7ec3a4c80a0f6f2428af0d030aff64eb89ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2524f0ae02e7d3c2569744846e2d99e8c808c9e14afeab6415910ee731794609\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://511cb3e4e8a16bcdfdf5840917a0a9a5a2e2a81419278449f613c9a41d164c1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:38Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:38 crc kubenswrapper[4898]: I0313 13:58:38.815739 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:38Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:38 crc kubenswrapper[4898]: I0313 13:58:38.839344 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:38Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:38 crc kubenswrapper[4898]: I0313 13:58:38.871445 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc4f3def-518e-420d-95b2-e9d7f3caf3b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088f42f6f19b31579419481228562c142aa1480795b6ffa4bd1cba05cb35cb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc
/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2720007bcc3ac54674e20042cdbbe64be6cd24b9556c841e609d541065f883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d062f36316872a1ce1d211650b661771b3ef7ff25e420a9db22c3191160fa9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://157541b623c7dc0eeb157191f93fb07af61fb66891ac3231f90f14690179cb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f
58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffea82d1334de10deacc44ca75008d2dc4f59ca58355bd166f2034d24a0f5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b
4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"20
26-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:38Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:38 crc kubenswrapper[4898]: I0313 13:58:38.890873 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3a9fb7c-9705-43b2-a2d8-663d19d50cda\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0be461eabdbb6c2414f8a9805c9537cea8595a509f241a72015802159baaa9b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://612d879ff268a2130e16ebb42a2e402a0e2d7ef04248322761b873bc6fe026c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f7a5f6c4c1f3ccd5b10a77ca72055844d63966f39e14fbba46d8b6074f4d7da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8772cfcf785db35cec05c4acb35d750d63feee0a1b192a79fd2f9d3f9b33c433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://8772cfcf785db35cec05c4acb35d750d63feee0a1b192a79fd2f9d3f9b33c433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:38Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:38 crc kubenswrapper[4898]: I0313 13:58:38.910237 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:38Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:38 crc kubenswrapper[4898]: I0313 13:58:38.928757 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:38Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:38 crc kubenswrapper[4898]: I0313 13:58:38.943128 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6llfs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e521c857-9711-4f68-886f-38b233d7b05b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04f48cdfeeb82223cb0cab3fb50d3338225f39b1d78eadc3c18a46350ae28770\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de3eb6944d1978b4471d9968724d0312047079b10ae547df513246554e8a705f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T13:58:36Z\\\",\\\"message\\\":\\\"2026-03-13T13:57:51+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c0887954-cd2c-445a-b95c-6f7339f19f61\\\\n2026-03-13T13:57:51+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c0887954-cd2c-445a-b95c-6f7339f19f61 to /host/opt/cni/bin/\\\\n2026-03-13T13:57:51Z [verbose] multus-daemon started\\\\n2026-03-13T13:57:51Z [verbose] 
Readiness Indicator file check\\\\n2026-03-13T13:58:36Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:58:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzgmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6llfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:38Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:38 crc kubenswrapper[4898]: I0313 13:58:38.967296 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8122355233f3be13689264ea41f8ede2b7e4d07e88bd6a2a2d5d6fd1a166bcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8122355233f3be13689264ea41f8ede2b7e4d07e88bd6a2a2d5d6fd1a166bcc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"message\\\":\\\"Template:(*services.Template)(nil)}}}}, Templates:services.TemplateMap(nil), 
Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string(nil), Groups:[]string(nil)}}\\\\nI0313 13:58:14.697412 7170 services_controller.go:452] Built service openshift-console-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0313 13:58:14.697420 7170 services_controller.go:453] Built service default/kubernetes template LB for network=default: []services.LB{}\\\\nI0313 13:58:14.697429 7170 services_controller.go:454] Service default/kubernetes for network=default has 0 cluster-wide, 1 per-node configs, 0 template configs, making 0 (cluster) 2 (per node) and 0 (template) load balancers\\\\nI0313 13:58:14.697434 7170 services_controller.go:453] Built service openshift-console-operator/metrics template LB for network=default: []services.LB{}\\\\nI0313 13:58:14.697450 7170 lb_config.go:1031] Cluster endpoints for openshift-authentication/oauth-openshift for network=default are: map[]\\\\nI0313 13:58:14.697454 7170 services_controller.go:454] Service openshift-console-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0313 13:58:14.697478 7170 services_controller.go:443] Built service openshift-authentication/oauth-openshift LB cluster-wide configs for netw\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:58:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qqqs5_openshift-ovn-kubernetes(e7d6afc0-d9b5-41b2-a55f-57621c300cbb)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c
0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qqqs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:38Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:38 crc kubenswrapper[4898]: I0313 13:58:38.981551 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58657703efd5d66740cb17c4bde23f3d5377f197fe751ded7d1b0674b2aac5a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\"
,\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, /tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:38Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:38 crc kubenswrapper[4898]: I0313 13:58:38.994213 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71c55972-53a1-4a22-85ab-c38bffbaf629\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cc01a31ae51aefc4cc7ea6a77eea7d0885ebf0f9fdcbdfbeebfcc7e69a33755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745
f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4971276739c6cece5f7d6ad57da8c9e6d67d9ffe7c85e22d1d78344045c08d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4971276739c6cece5f7d6ad57da8c9e6d67d9ffe7c85e22d1d78344045c08d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": 
tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:38Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:39 crc kubenswrapper[4898]: I0313 13:58:39.739257 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:58:39 crc kubenswrapper[4898]: I0313 13:58:39.739309 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:39 crc kubenswrapper[4898]: I0313 13:58:39.739369 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:58:39 crc kubenswrapper[4898]: E0313 13:58:39.739446 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:58:39 crc kubenswrapper[4898]: I0313 13:58:39.739459 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:58:39 crc kubenswrapper[4898]: E0313 13:58:39.739595 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:58:39 crc kubenswrapper[4898]: E0313 13:58:39.739691 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:58:39 crc kubenswrapper[4898]: E0313 13:58:39.739775 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:58:40 crc kubenswrapper[4898]: E0313 13:58:40.877831 4898 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 13:58:41 crc kubenswrapper[4898]: I0313 13:58:41.739009 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:58:41 crc kubenswrapper[4898]: I0313 13:58:41.739075 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:58:41 crc kubenswrapper[4898]: E0313 13:58:41.739179 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:58:41 crc kubenswrapper[4898]: I0313 13:58:41.739282 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:41 crc kubenswrapper[4898]: E0313 13:58:41.739522 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:58:41 crc kubenswrapper[4898]: I0313 13:58:41.739569 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:58:41 crc kubenswrapper[4898]: E0313 13:58:41.739717 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:58:41 crc kubenswrapper[4898]: E0313 13:58:41.739869 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:58:43 crc kubenswrapper[4898]: I0313 13:58:43.693058 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:58:43 crc kubenswrapper[4898]: E0313 13:58:43.693246 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:59:47.693212076 +0000 UTC m=+222.694800355 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:58:43 crc kubenswrapper[4898]: I0313 13:58:43.739379 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:58:43 crc kubenswrapper[4898]: I0313 13:58:43.739500 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:43 crc kubenswrapper[4898]: I0313 13:58:43.739543 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:58:43 crc kubenswrapper[4898]: E0313 13:58:43.739703 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:58:43 crc kubenswrapper[4898]: I0313 13:58:43.739817 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:58:43 crc kubenswrapper[4898]: E0313 13:58:43.739845 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:58:43 crc kubenswrapper[4898]: E0313 13:58:43.740285 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:58:43 crc kubenswrapper[4898]: E0313 13:58:43.740448 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:58:43 crc kubenswrapper[4898]: I0313 13:58:43.741074 4898 scope.go:117] "RemoveContainer" containerID="c8122355233f3be13689264ea41f8ede2b7e4d07e88bd6a2a2d5d6fd1a166bcc" Mar 13 13:58:43 crc kubenswrapper[4898]: I0313 13:58:43.795141 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:58:43 crc kubenswrapper[4898]: I0313 13:58:43.795284 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:58:43 crc kubenswrapper[4898]: I0313 13:58:43.795345 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:58:43 crc kubenswrapper[4898]: E0313 13:58:43.795374 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 13:58:43 crc kubenswrapper[4898]: E0313 13:58:43.795407 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 13:58:43 crc kubenswrapper[4898]: E0313 13:58:43.795422 4898 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 13:58:43 crc kubenswrapper[4898]: E0313 13:58:43.795478 4898 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 13:58:43 crc kubenswrapper[4898]: E0313 13:58:43.795491 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-13 13:59:47.795474156 +0000 UTC m=+222.797062405 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 13:58:43 crc kubenswrapper[4898]: E0313 13:58:43.795495 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 13:58:43 crc kubenswrapper[4898]: E0313 13:58:43.795542 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 13:59:47.795523107 +0000 UTC m=+222.797111386 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 13:58:43 crc kubenswrapper[4898]: E0313 13:58:43.795518 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 13:58:43 crc kubenswrapper[4898]: E0313 13:58:43.795574 4898 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 13:58:43 crc kubenswrapper[4898]: E0313 13:58:43.795622 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-13 13:59:47.795606429 +0000 UTC m=+222.797194708 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 13:58:43 crc kubenswrapper[4898]: I0313 13:58:43.795398 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:58:43 crc kubenswrapper[4898]: E0313 13:58:43.795678 4898 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 13:58:43 crc kubenswrapper[4898]: E0313 13:58:43.795846 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 13:59:47.795803544 +0000 UTC m=+222.797391893 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 13:58:44 crc kubenswrapper[4898]: I0313 13:58:44.644431 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qqqs5_e7d6afc0-d9b5-41b2-a55f-57621c300cbb/ovnkube-controller/3.log" Mar 13 13:58:44 crc kubenswrapper[4898]: I0313 13:58:44.645357 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qqqs5_e7d6afc0-d9b5-41b2-a55f-57621c300cbb/ovnkube-controller/2.log" Mar 13 13:58:44 crc kubenswrapper[4898]: I0313 13:58:44.649195 4898 generic.go:334] "Generic (PLEG): container finished" podID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerID="5ca8f8a8a536aca56f73dd6928361e5dd5f98f66d3bc35762461d5d87c0c3022" exitCode=1 Mar 13 13:58:44 crc kubenswrapper[4898]: I0313 13:58:44.649237 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" event={"ID":"e7d6afc0-d9b5-41b2-a55f-57621c300cbb","Type":"ContainerDied","Data":"5ca8f8a8a536aca56f73dd6928361e5dd5f98f66d3bc35762461d5d87c0c3022"} Mar 13 13:58:44 crc kubenswrapper[4898]: I0313 13:58:44.649267 4898 scope.go:117] "RemoveContainer" containerID="c8122355233f3be13689264ea41f8ede2b7e4d07e88bd6a2a2d5d6fd1a166bcc" Mar 13 13:58:44 crc kubenswrapper[4898]: I0313 13:58:44.650436 4898 scope.go:117] "RemoveContainer" containerID="5ca8f8a8a536aca56f73dd6928361e5dd5f98f66d3bc35762461d5d87c0c3022" Mar 13 13:58:44 crc kubenswrapper[4898]: E0313 13:58:44.650705 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed 
container=ovnkube-controller pod=ovnkube-node-qqqs5_openshift-ovn-kubernetes(e7d6afc0-d9b5-41b2-a55f-57621c300cbb)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" Mar 13 13:58:44 crc kubenswrapper[4898]: I0313 13:58:44.667469 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71c55972-53a1-4a22-85ab-c38bffbaf629\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cc01a31ae51aefc4cc7ea6a77eea7d0885ebf0f9fdcbdfbeebfcc7e69a33755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kub
e\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4971276739c6cece5f7d6ad57da8c9e6d67d9ffe7c85e22d1d78344045c08d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4971276739c6cece5f7d6ad57da8c9e6d67d9ffe7c85e22d1d78344045c08d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:44Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:44 crc kubenswrapper[4898]: I0313 13:58:44.681865 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:44Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:44 crc kubenswrapper[4898]: I0313 13:58:44.694811 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:44Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:44 crc kubenswrapper[4898]: I0313 13:58:44.710824 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6llfs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e521c857-9711-4f68-886f-38b233d7b05b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04f48cdfeeb82223cb0cab3fb50d3338225f39b1d78eadc3c18a46350ae28770\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de3eb6944d1978b4471d9968724d0312047079b10ae547df513246554e8a705f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T13:58:36Z\\\",\\\"message\\\":\\\"2026-03-13T13:57:51+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c0887954-cd2c-445a-b95c-6f7339f19f61\\\\n2026-03-13T13:57:51+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c0887954-cd2c-445a-b95c-6f7339f19f61 to /host/opt/cni/bin/\\\\n2026-03-13T13:57:51Z [verbose] multus-daemon started\\\\n2026-03-13T13:57:51Z [verbose] 
Readiness Indicator file check\\\\n2026-03-13T13:58:36Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:58:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzgmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6llfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:44Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:44 crc kubenswrapper[4898]: I0313 13:58:44.736343 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ca8f8a8a536aca56f73dd6928361e5dd5f98f66d3bc35762461d5d87c0c3022\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8122355233f3be13689264ea41f8ede2b7e4d07e88bd6a2a2d5d6fd1a166bcc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"message\\\":\\\"Template:(*services.Template)(nil)}}}}, Templates:services.TemplateMap(nil), 
Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string(nil), Groups:[]string(nil)}}\\\\nI0313 13:58:14.697412 7170 services_controller.go:452] Built service openshift-console-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0313 13:58:14.697420 7170 services_controller.go:453] Built service default/kubernetes template LB for network=default: []services.LB{}\\\\nI0313 13:58:14.697429 7170 services_controller.go:454] Service default/kubernetes for network=default has 0 cluster-wide, 1 per-node configs, 0 template configs, making 0 (cluster) 2 (per node) and 0 (template) load balancers\\\\nI0313 13:58:14.697434 7170 services_controller.go:453] Built service openshift-console-operator/metrics template LB for network=default: []services.LB{}\\\\nI0313 13:58:14.697450 7170 lb_config.go:1031] Cluster endpoints for openshift-authentication/oauth-openshift for network=default are: map[]\\\\nI0313 13:58:14.697454 7170 services_controller.go:454] Service openshift-console-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0313 13:58:14.697478 7170 services_controller.go:443] Built service openshift-authentication/oauth-openshift LB cluster-wide configs for netw\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:58:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ca8f8a8a536aca56f73dd6928361e5dd5f98f66d3bc35762461d5d87c0c3022\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T13:58:44Z\\\",\\\"message\\\":\\\"94 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/network-metrics-daemon-fwrwc\\\\nI0313 13:58:44.575642 7494 ovn.go:134] Ensuring zone local for Pod openshift-multus/network-metrics-daemon-fwrwc in node crc\\\\nI0313 13:58:44.575664 7494 
base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-fwrwc] creating logical port openshift-multus_network-metrics-daemon-fwrwc for pod on switch crc\\\\nI0313 13:58:44.575715 7494 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns/dns-default\\\\\\\"}\\\\nI0313 13:58:44.575730 7494 services_controller.go:360] Finished syncing service dns-default on namespace openshift-dns for network=default : 2.852375ms\\\\nI0313 13:58:44.575713 7494 base_network_controller_pods.go:916] Annotation values: ip=[10.217.0.3/23] ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nF0313 13:58:44.574595 7494 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: 
failed\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9
786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qqqs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:44Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:44 crc kubenswrapper[4898]: I0313 13:58:44.752811 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58657703efd5d66740cb17c4bde23f3d5377f197fe751ded7d1b0674b2aac5a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\"
,\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, /tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:44Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:44 crc kubenswrapper[4898]: I0313 13:58:44.772456 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:44Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:44 crc kubenswrapper[4898]: I0313 13:58:44.788455 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpbhb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0427af73-3ee1-4f8b-aa31-915d8ff53e94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd8b394eb4b9a2d5b8a55dfe9073e48789bb72fb2dab87f76f41c48c1714feab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpbhb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:44Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:44 crc kubenswrapper[4898]: I0313 13:58:44.805264 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a81ec4e1d92c8ebd60a1e31e39fc5a639e7561e63cdd9ec99ada92fe3896522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8568b0d4122e606a851ae23a97395b29784dd29138349c86f59191196e7b0f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8k6xj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:44Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:44 crc kubenswrapper[4898]: I0313 13:58:44.820627 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:44Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:44 crc kubenswrapper[4898]: I0313 13:58:44.834291 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:44Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:44 crc kubenswrapper[4898]: I0313 13:58:44.852022 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e527967a-003e-4dbe-aade-d9f882239cb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e0093b9e7670d289e34bc225cf1650906fdaf57e7d3c83ab8897fd0eed7204\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae3d8
e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5qb65\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:44Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:44 crc kubenswrapper[4898]: I0313 13:58:44.864047 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b46ld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1f79182-c06d-47d7-bed8-109c0cc4784e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa39cb1a5f792a575519b3616cff170f5e303a1bf05b207578725aa1711117b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-13T13:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b46ld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:44Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:44 crc kubenswrapper[4898]: I0313 13:58:44.875382 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wh2lt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73b8846971b589f5619b239746bd2f5953d12af7f3fa6543042da89561930dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjttd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://421d50fcd0a69c2b53067ae09bbea100b5321
74c1d76f641c79c58a5fa3f9a3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjttd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:58:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wh2lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:44Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:44 crc kubenswrapper[4898]: I0313 13:58:44.886667 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fwrwc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9k7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9k7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fwrwc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:44Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:44 crc 
kubenswrapper[4898]: I0313 13:58:44.903458 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5bd6119-d7a3-4363-add5-7eb62180e1ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00526d41ebb304bac57e24a91007919427bb623e8cbce6cc25d7b1a5195871a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac7ea5df99bb3004d7582910739297c1ee1913de5399a5be9391bbf4c96be32f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:04Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0313 13:56:35.870763 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0313 13:56:35.872344 1 observer_polling.go:159] Starting file observer\\\\nI0313 13:56:35.873967 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0313 13:56:35.875073 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0313 13:57:02.288879 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0313 13:57:04.933123 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0313 13:57:04.933351 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:35Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://172c20ea4299e1b913b2675ac14f7ec3a4c80a0f6f2428af0d030aff64eb89ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2524f0ae02e7d3c2569744846e2d99e8c808c9e14afeab6415910ee731794609\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://511cb3e4e8a16bcdfdf5840917a0a9a5a2e2a81419278449f613c9a41d164c1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:44Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:44 crc kubenswrapper[4898]: I0313 13:58:44.923761 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3a9fb7c-9705-43b2-a2d8-663d19d50cda\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0be461eabdbb6c2414f8a9805c9537cea8595a509f241a72015802159baaa9b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://612d879ff268a2130e16ebb42a2e402a0e2d7ef04248322761b873bc6fe026c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f7a5f6c4c1f3ccd5b10a77ca72055844d63966f39e14fbba46d8b6074f4d7da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8772cfcf785db35cec05c4acb35d750d63feee0a1b192a79fd2f9d3f9b33c433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://8772cfcf785db35cec05c4acb35d750d63feee0a1b192a79fd2f9d3f9b33c433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:44Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:44 crc kubenswrapper[4898]: I0313 13:58:44.943310 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:44Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:44 crc kubenswrapper[4898]: I0313 13:58:44.964363 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc4f3def-518e-420d-95b2-e9d7f3caf3b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088f42f6f19b31579419481228562c142aa1480795b6ffa4bd1cba05cb35cb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc
/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2720007bcc3ac54674e20042cdbbe64be6cd24b9556c841e609d541065f883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d062f36316872a1ce1d211650b661771b3ef7ff25e420a9db22c3191160fa9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://157541b623c7dc0eeb157191f93fb07af61fb66891ac3231f90f14690179cb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f
58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffea82d1334de10deacc44ca75008d2dc4f59ca58355bd166f2034d24a0f5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b
4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"20
26-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:44Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.557203 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.557240 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.557248 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.557260 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.557270 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:45Z","lastTransitionTime":"2026-03-13T13:58:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:58:45 crc kubenswrapper[4898]: E0313 13:58:45.576160 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:45Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.579779 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.579813 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.579822 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.579837 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.579856 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:45Z","lastTransitionTime":"2026-03-13T13:58:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:45 crc kubenswrapper[4898]: E0313 13:58:45.597661 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:45Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.602188 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.602251 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.602274 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.602306 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.602332 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:45Z","lastTransitionTime":"2026-03-13T13:58:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:45 crc kubenswrapper[4898]: E0313 13:58:45.620881 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:45Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.625304 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.625340 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.625350 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.625362 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.625372 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:45Z","lastTransitionTime":"2026-03-13T13:58:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:45 crc kubenswrapper[4898]: E0313 13:58:45.637270 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:45Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.640878 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.640933 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.640946 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.640964 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.640972 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:45Z","lastTransitionTime":"2026-03-13T13:58:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:45 crc kubenswrapper[4898]: E0313 13:58:45.651834 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:45Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:45 crc kubenswrapper[4898]: E0313 13:58:45.652020 4898 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.655468 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qqqs5_e7d6afc0-d9b5-41b2-a55f-57621c300cbb/ovnkube-controller/3.log" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.739480 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.739538 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.739594 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:58:45 crc kubenswrapper[4898]: E0313 13:58:45.739708 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.739809 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:58:45 crc kubenswrapper[4898]: E0313 13:58:45.740101 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:58:45 crc kubenswrapper[4898]: E0313 13:58:45.740437 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:58:45 crc kubenswrapper[4898]: E0313 13:58:45.740860 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.757882 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-d
ir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58657703efd5d66740cb17c4bde23f3d5377f197fe751ded7d1b0674b2aac5a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshi
ft-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\",\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, /tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:45Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.769023 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71c55972-53a1-4a22-85ab-c38bffbaf629\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cc01a31ae51aefc4cc7ea6a77eea7d0885ebf0f9fdcbdfbeebfcc7e69a33755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745
f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4971276739c6cece5f7d6ad57da8c9e6d67d9ffe7c85e22d1d78344045c08d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4971276739c6cece5f7d6ad57da8c9e6d67d9ffe7c85e22d1d78344045c08d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": 
tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:45Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.783090 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:45Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.799138 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:45Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.816833 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6llfs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e521c857-9711-4f68-886f-38b233d7b05b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04f48cdfeeb82223cb0cab3fb50d3338225f39b1d78eadc3c18a46350ae28770\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de3eb6944d1978b4471d9968724d0312047079b10ae547df513246554e8a705f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T13:58:36Z\\\",\\\"message\\\":\\\"2026-03-13T13:57:51+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c0887954-cd2c-445a-b95c-6f7339f19f61\\\\n2026-03-13T13:57:51+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c0887954-cd2c-445a-b95c-6f7339f19f61 to /host/opt/cni/bin/\\\\n2026-03-13T13:57:51Z [verbose] multus-daemon started\\\\n2026-03-13T13:57:51Z [verbose] 
Readiness Indicator file check\\\\n2026-03-13T13:58:36Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:58:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzgmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6llfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:45Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.838324 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ca8f8a8a536aca56f73dd6928361e5dd5f98f66d3bc35762461d5d87c0c3022\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8122355233f3be13689264ea41f8ede2b7e4d07e88bd6a2a2d5d6fd1a166bcc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"message\\\":\\\"Template:(*services.Template)(nil)}}}}, Templates:services.TemplateMap(nil), 
Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string(nil), Groups:[]string(nil)}}\\\\nI0313 13:58:14.697412 7170 services_controller.go:452] Built service openshift-console-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0313 13:58:14.697420 7170 services_controller.go:453] Built service default/kubernetes template LB for network=default: []services.LB{}\\\\nI0313 13:58:14.697429 7170 services_controller.go:454] Service default/kubernetes for network=default has 0 cluster-wide, 1 per-node configs, 0 template configs, making 0 (cluster) 2 (per node) and 0 (template) load balancers\\\\nI0313 13:58:14.697434 7170 services_controller.go:453] Built service openshift-console-operator/metrics template LB for network=default: []services.LB{}\\\\nI0313 13:58:14.697450 7170 lb_config.go:1031] Cluster endpoints for openshift-authentication/oauth-openshift for network=default are: map[]\\\\nI0313 13:58:14.697454 7170 services_controller.go:454] Service openshift-console-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0313 13:58:14.697478 7170 services_controller.go:443] Built service openshift-authentication/oauth-openshift LB cluster-wide configs for netw\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:58:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ca8f8a8a536aca56f73dd6928361e5dd5f98f66d3bc35762461d5d87c0c3022\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T13:58:44Z\\\",\\\"message\\\":\\\"94 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/network-metrics-daemon-fwrwc\\\\nI0313 13:58:44.575642 7494 ovn.go:134] Ensuring zone local for Pod openshift-multus/network-metrics-daemon-fwrwc in node crc\\\\nI0313 13:58:44.575664 7494 
base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-fwrwc] creating logical port openshift-multus_network-metrics-daemon-fwrwc for pod on switch crc\\\\nI0313 13:58:44.575715 7494 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns/dns-default\\\\\\\"}\\\\nI0313 13:58:44.575730 7494 services_controller.go:360] Finished syncing service dns-default on namespace openshift-dns for network=default : 2.852375ms\\\\nI0313 13:58:44.575713 7494 base_network_controller_pods.go:916] Annotation values: ip=[10.217.0.3/23] ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nF0313 13:58:44.574595 7494 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: 
failed\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9
786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qqqs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:45Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.858779 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:45Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.876374 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T13:58:45Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:45 crc kubenswrapper[4898]: E0313 13:58:45.878491 4898 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.891059 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpbhb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0427af73-3ee1-4f8b-aa31-915d8ff53e94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd8b394eb4b9a2d5b8a55dfe9073e48789bb72fb2dab87f76f41c48c1714feab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node
-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpbhb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:45Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.904851 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a81ec4e1d92c8ebd60a1e31e39fc5a639e7561e63cdd9ec99ada92fe3896522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8568b0d4122e606a851ae23a97395b29784dd291
38349c86f59191196e7b0f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8k6xj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:45Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.917470 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5bd6119-d7a3-4363-add5-7eb62180e1ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00526d41ebb304bac57e24a91007919427bb623e8cbce6cc25d7b1a5195871a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac7ea5df99bb3004d7582910739297c1ee1913de5399a5be9391bbf4c96be32f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:04Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0313 13:56:35.870763 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0313 13:56:35.872344 1 observer_polling.go:159] Starting file observer\\\\nI0313 13:56:35.873967 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0313 13:56:35.875073 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0313 13:57:02.288879 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0313 13:57:04.933123 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0313 13:57:04.933351 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:35Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://172c20ea4299e1b913b2675ac14f7ec3a4c80a0f6f2428af0d030aff64eb89ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2524f0ae02e7d3c2569744846e2d99e8c808c9e14afeab6415910ee731794609\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://511cb3e4e8a16bcdfdf5840917a0a9a5a2e2a81419278449f613c9a41d164c1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:45Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.936231 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:45Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.954404 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e527967a-003e-4dbe-aade-d9f882239cb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e0093b9e7670d289e34bc225cf1650906fdaf57e7d3c83ab8897fd0eed7204\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae3d8
e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5qb65\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:45Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.968161 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b46ld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1f79182-c06d-47d7-bed8-109c0cc4784e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa39cb1a5f792a575519b3616cff170f5e303a1bf05b207578725aa1711117b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-13T13:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b46ld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:45Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.984261 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wh2lt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73b8846971b589f5619b239746bd2f5953d12af7f3fa6543042da89561930dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjttd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://421d50fcd0a69c2b53067ae09bbea100b5321
74c1d76f641c79c58a5fa3f9a3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjttd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:58:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wh2lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:45Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.996293 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fwrwc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9k7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9k7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fwrwc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:45Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:46 crc 
kubenswrapper[4898]: I0313 13:58:46.022603 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc4f3def-518e-420d-95b2-e9d7f3caf3b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088f42f6f19b31579419481228562c142aa1480795b6ffa4bd1cba05cb35cb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://ad2720007bcc3ac54674e20042cdbbe64be6cd24b9556c841e609d541065f883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d062f36316872a1ce1d211650b661771b3ef7ff25e420a9db22c3191160fa9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://157541b623c7dc0eeb157191f93fb07af61fb66891ac3231f90f14690179cb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffea82d1334de10deacc44ca75008d2dc4f59ca58355bd166f2034d24a0f5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:46Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:46 crc kubenswrapper[4898]: I0313 13:58:46.039654 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3a9fb7c-9705-43b2-a2d8-663d19d50cda\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0be461eabdbb6c2414f8a9805c9537cea8595a509f241a72015802159baaa9b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://612d879ff268a2130e16ebb42a2e402a0e2d7ef04248322761b873bc6fe026c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f7a5f6c4c1f3ccd5b10a77ca72055844d63966f39e14fbba46d8b6074f4d7da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8772cfcf785db35cec05c4acb35d750d63feee0a1b192a79fd2f9d3f9b33c433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://8772cfcf785db35cec05c4acb35d750d63feee0a1b192a79fd2f9d3f9b33c433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:46Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:46 crc kubenswrapper[4898]: I0313 13:58:46.056815 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:46Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:47 crc kubenswrapper[4898]: I0313 13:58:47.739004 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:58:47 crc kubenswrapper[4898]: I0313 13:58:47.739106 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:58:47 crc kubenswrapper[4898]: I0313 13:58:47.739121 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:58:47 crc kubenswrapper[4898]: E0313 13:58:47.739261 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:58:47 crc kubenswrapper[4898]: I0313 13:58:47.739531 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:47 crc kubenswrapper[4898]: E0313 13:58:47.739643 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:58:47 crc kubenswrapper[4898]: E0313 13:58:47.739853 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:58:47 crc kubenswrapper[4898]: E0313 13:58:47.740151 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:58:49 crc kubenswrapper[4898]: I0313 13:58:49.513213 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:58:49 crc kubenswrapper[4898]: I0313 13:58:49.514627 4898 scope.go:117] "RemoveContainer" containerID="5ca8f8a8a536aca56f73dd6928361e5dd5f98f66d3bc35762461d5d87c0c3022" Mar 13 13:58:49 crc kubenswrapper[4898]: E0313 13:58:49.514999 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-qqqs5_openshift-ovn-kubernetes(e7d6afc0-d9b5-41b2-a55f-57621c300cbb)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" Mar 13 13:58:49 crc kubenswrapper[4898]: I0313 13:58:49.543143 4898 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-5qb65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e527967a-003e-4dbe-aade-d9f882239cb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e0093b9e7670d289e34bc225cf1650906fdaf57e7d3c83ab8897fd0eed7204\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"contai
nerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\
",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\
\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-5qb65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:49 crc kubenswrapper[4898]: I0313 13:58:49.562250 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b46ld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1f79182-c06d-47d7-bed8-109c0cc4784e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa39cb1a5f792a575519b3616cff170f5e303a1bf05b207578725aa1711117b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b46ld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:49 crc kubenswrapper[4898]: I0313 13:58:49.579176 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wh2lt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73b8846971b589f5619b239746bd2f5953d12af7f3fa6543042da89561930dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjttd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://421d50fcd0a69c2b53067ae09bbea100b5321
74c1d76f641c79c58a5fa3f9a3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjttd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:58:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wh2lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:49 crc kubenswrapper[4898]: I0313 13:58:49.592492 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fwrwc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9k7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9k7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fwrwc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:49 crc 
kubenswrapper[4898]: I0313 13:58:49.605174 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5bd6119-d7a3-4363-add5-7eb62180e1ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00526d41ebb304bac57e24a91007919427bb623e8cbce6cc25d7b1a5195871a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac7ea5df99bb3004d7582910739297c1ee1913de5399a5be9391bbf4c96be32f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:04Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0313 13:56:35.870763 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0313 13:56:35.872344 1 observer_polling.go:159] Starting file observer\\\\nI0313 13:56:35.873967 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0313 13:56:35.875073 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0313 13:57:02.288879 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0313 13:57:04.933123 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0313 13:57:04.933351 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:35Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://172c20ea4299e1b913b2675ac14f7ec3a4c80a0f6f2428af0d030aff64eb89ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2524f0ae02e7d3c2569744846e2d99e8c808c9e14afeab6415910ee731794609\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://511cb3e4e8a16bcdfdf5840917a0a9a5a2e2a81419278449f613c9a41d164c1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:49 crc kubenswrapper[4898]: I0313 13:58:49.619301 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:49 crc kubenswrapper[4898]: I0313 13:58:49.631792 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:49 crc kubenswrapper[4898]: I0313 13:58:49.657459 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc4f3def-518e-420d-95b2-e9d7f3caf3b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088f42f6f19b31579419481228562c142aa1480795b6ffa4bd1cba05cb35cb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc
/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2720007bcc3ac54674e20042cdbbe64be6cd24b9556c841e609d541065f883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d062f36316872a1ce1d211650b661771b3ef7ff25e420a9db22c3191160fa9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://157541b623c7dc0eeb157191f93fb07af61fb66891ac3231f90f14690179cb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f
58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffea82d1334de10deacc44ca75008d2dc4f59ca58355bd166f2034d24a0f5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b
4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"20
26-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:49 crc kubenswrapper[4898]: I0313 13:58:49.675549 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3a9fb7c-9705-43b2-a2d8-663d19d50cda\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0be461eabdbb6c2414f8a9805c9537cea8595a509f241a72015802159baaa9b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://612d879ff268a2130e16ebb42a2e402a0e2d7ef04248322761b873bc6fe026c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f7a5f6c4c1f3ccd5b10a77ca72055844d63966f39e14fbba46d8b6074f4d7da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8772cfcf785db35cec05c4acb35d750d63feee0a1b192a79fd2f9d3f9b33c433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://8772cfcf785db35cec05c4acb35d750d63feee0a1b192a79fd2f9d3f9b33c433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:49 crc kubenswrapper[4898]: I0313 13:58:49.692876 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:49 crc kubenswrapper[4898]: I0313 13:58:49.708001 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:49 crc kubenswrapper[4898]: I0313 13:58:49.729673 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6llfs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e521c857-9711-4f68-886f-38b233d7b05b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04f48cdfeeb82223cb0cab3fb50d3338225f39b1d78eadc3c18a46350ae28770\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de3eb6944d1978b4471d9968724d0312047079b10ae547df513246554e8a705f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T13:58:36Z\\\",\\\"message\\\":\\\"2026-03-13T13:57:51+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c0887954-cd2c-445a-b95c-6f7339f19f61\\\\n2026-03-13T13:57:51+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c0887954-cd2c-445a-b95c-6f7339f19f61 to /host/opt/cni/bin/\\\\n2026-03-13T13:57:51Z [verbose] multus-daemon started\\\\n2026-03-13T13:57:51Z [verbose] 
Readiness Indicator file check\\\\n2026-03-13T13:58:36Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:58:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzgmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6llfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:49 crc kubenswrapper[4898]: I0313 13:58:49.740270 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:58:49 crc kubenswrapper[4898]: I0313 13:58:49.740303 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:58:49 crc kubenswrapper[4898]: E0313 13:58:49.740386 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:58:49 crc kubenswrapper[4898]: I0313 13:58:49.740273 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:58:49 crc kubenswrapper[4898]: I0313 13:58:49.740413 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:49 crc kubenswrapper[4898]: E0313 13:58:49.740541 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:58:49 crc kubenswrapper[4898]: E0313 13:58:49.740578 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:58:49 crc kubenswrapper[4898]: E0313 13:58:49.740652 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:58:49 crc kubenswrapper[4898]: I0313 13:58:49.754637 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ca8f8a8a536aca56f73dd6928361e5dd5f98f66d3bc35762461d5d87c0c3022\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ca8f8a8a536aca56f73dd6928361e5dd5f98f66d3bc35762461d5d87c0c3022\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T13:58:44Z\\\",\\\"message\\\":\\\"94 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/network-metrics-daemon-fwrwc\\\\nI0313 13:58:44.575642 7494 ovn.go:134] Ensuring zone local for Pod openshift-multus/network-metrics-daemon-fwrwc in node crc\\\\nI0313 13:58:44.575664 7494 base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-fwrwc] 
creating logical port openshift-multus_network-metrics-daemon-fwrwc for pod on switch crc\\\\nI0313 13:58:44.575715 7494 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns/dns-default\\\\\\\"}\\\\nI0313 13:58:44.575730 7494 services_controller.go:360] Finished syncing service dns-default on namespace openshift-dns for network=default : 2.852375ms\\\\nI0313 13:58:44.575713 7494 base_network_controller_pods.go:916] Annotation values: ip=[10.217.0.3/23] ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nF0313 13:58:44.574595 7494 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:58:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qqqs5_openshift-ovn-kubernetes(e7d6afc0-d9b5-41b2-a55f-57621c300cbb)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c
0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qqqs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:49 crc kubenswrapper[4898]: I0313 13:58:49.772358 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58657703efd5d66740cb17c4bde23f3d5377f197fe751ded7d1b0674b2aac5a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\"
,\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, /tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:49 crc kubenswrapper[4898]: I0313 13:58:49.789204 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71c55972-53a1-4a22-85ab-c38bffbaf629\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cc01a31ae51aefc4cc7ea6a77eea7d0885ebf0f9fdcbdfbeebfcc7e69a33755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745
f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4971276739c6cece5f7d6ad57da8c9e6d67d9ffe7c85e22d1d78344045c08d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4971276739c6cece5f7d6ad57da8c9e6d67d9ffe7c85e22d1d78344045c08d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": 
tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:49 crc kubenswrapper[4898]: I0313 13:58:49.800710 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpbhb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0427af73-3ee1-4f8b-aa31-915d8ff53e94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd8b394eb4b9a2d5b8a55dfe9073e48789bb72fb2dab87f76f41c48c1714feab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpbhb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:49 crc kubenswrapper[4898]: I0313 13:58:49.813219 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a81ec4e1d92c8ebd60a1e31e39fc5a639e7561e63cdd9ec99ada92fe3896522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8568b0d4122e606a851ae23a97395b29784dd291
38349c86f59191196e7b0f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8k6xj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:49 crc kubenswrapper[4898]: I0313 13:58:49.829739 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:49 crc kubenswrapper[4898]: I0313 13:58:49.843443 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T13:58:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:50 crc kubenswrapper[4898]: E0313 13:58:50.879612 4898 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 13:58:51 crc kubenswrapper[4898]: I0313 13:58:51.739180 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:58:51 crc kubenswrapper[4898]: I0313 13:58:51.739194 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:51 crc kubenswrapper[4898]: I0313 13:58:51.739225 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:58:51 crc kubenswrapper[4898]: I0313 13:58:51.739290 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:58:51 crc kubenswrapper[4898]: E0313 13:58:51.739413 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:58:51 crc kubenswrapper[4898]: E0313 13:58:51.739772 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:58:51 crc kubenswrapper[4898]: E0313 13:58:51.740329 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:58:51 crc kubenswrapper[4898]: E0313 13:58:51.740195 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:58:53 crc kubenswrapper[4898]: I0313 13:58:53.738962 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:53 crc kubenswrapper[4898]: I0313 13:58:53.739008 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:58:53 crc kubenswrapper[4898]: E0313 13:58:53.739621 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:58:53 crc kubenswrapper[4898]: I0313 13:58:53.739119 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:58:53 crc kubenswrapper[4898]: I0313 13:58:53.739064 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:58:53 crc kubenswrapper[4898]: E0313 13:58:53.739759 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:58:53 crc kubenswrapper[4898]: E0313 13:58:53.739848 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:58:53 crc kubenswrapper[4898]: E0313 13:58:53.740050 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:58:55 crc kubenswrapper[4898]: I0313 13:58:55.739068 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:58:55 crc kubenswrapper[4898]: I0313 13:58:55.739121 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:58:55 crc kubenswrapper[4898]: E0313 13:58:55.739196 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:58:55 crc kubenswrapper[4898]: I0313 13:58:55.739207 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:58:55 crc kubenswrapper[4898]: I0313 13:58:55.739254 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:55 crc kubenswrapper[4898]: E0313 13:58:55.739362 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:58:55 crc kubenswrapper[4898]: E0313 13:58:55.739408 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:58:55 crc kubenswrapper[4898]: E0313 13:58:55.739731 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:58:55 crc kubenswrapper[4898]: I0313 13:58:55.774956 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=75.774940353 podStartE2EDuration="1m15.774940353s" podCreationTimestamp="2026-03-13 13:57:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:58:55.774518494 +0000 UTC m=+170.776106753" watchObservedRunningTime="2026-03-13 13:58:55.774940353 +0000 UTC m=+170.776528592" Mar 13 13:58:55 crc kubenswrapper[4898]: I0313 13:58:55.793290 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=29.793268955 podStartE2EDuration="29.793268955s" podCreationTimestamp="2026-03-13 13:58:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:58:55.792947707 +0000 UTC m=+170.794536016" watchObservedRunningTime="2026-03-13 13:58:55.793268955 +0000 UTC m=+170.794857224" Mar 13 13:58:55 crc kubenswrapper[4898]: I0313 13:58:55.840065 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:55 crc kubenswrapper[4898]: I0313 13:58:55.840104 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:55 crc kubenswrapper[4898]: I0313 13:58:55.840115 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:55 crc kubenswrapper[4898]: I0313 13:58:55.840133 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:55 crc kubenswrapper[4898]: 
I0313 13:58:55.840144 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:55Z","lastTransitionTime":"2026-03-13T13:58:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:58:55 crc kubenswrapper[4898]: E0313 13:58:55.881363 4898 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 13:58:55 crc kubenswrapper[4898]: I0313 13:58:55.895726 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-6llfs" podStartSLOduration=104.895702889 podStartE2EDuration="1m44.895702889s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:58:55.848157416 +0000 UTC m=+170.849745665" watchObservedRunningTime="2026-03-13 13:58:55.895702889 +0000 UTC m=+170.897291128" Mar 13 13:58:55 crc kubenswrapper[4898]: I0313 13:58:55.895973 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-fthwv"] Mar 13 13:58:55 crc kubenswrapper[4898]: I0313 13:58:55.896345 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fthwv" Mar 13 13:58:55 crc kubenswrapper[4898]: I0313 13:58:55.898360 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 13 13:58:55 crc kubenswrapper[4898]: I0313 13:58:55.898525 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 13 13:58:55 crc kubenswrapper[4898]: I0313 13:58:55.899857 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 13 13:58:55 crc kubenswrapper[4898]: I0313 13:58:55.900344 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 13 13:58:55 crc kubenswrapper[4898]: I0313 13:58:55.914494 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fb107ebf-df14-44ee-8c21-06fd3c080f7b-service-ca\") pod \"cluster-version-operator-5c965bbfc6-fthwv\" (UID: \"fb107ebf-df14-44ee-8c21-06fd3c080f7b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fthwv" Mar 13 13:58:55 crc kubenswrapper[4898]: I0313 13:58:55.914545 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb107ebf-df14-44ee-8c21-06fd3c080f7b-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-fthwv\" (UID: \"fb107ebf-df14-44ee-8c21-06fd3c080f7b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fthwv" Mar 13 13:58:55 crc kubenswrapper[4898]: I0313 13:58:55.914622 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/fb107ebf-df14-44ee-8c21-06fd3c080f7b-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-fthwv\" (UID: \"fb107ebf-df14-44ee-8c21-06fd3c080f7b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fthwv" Mar 13 13:58:55 crc kubenswrapper[4898]: I0313 13:58:55.914664 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fb107ebf-df14-44ee-8c21-06fd3c080f7b-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-fthwv\" (UID: \"fb107ebf-df14-44ee-8c21-06fd3c080f7b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fthwv" Mar 13 13:58:55 crc kubenswrapper[4898]: I0313 13:58:55.914689 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/fb107ebf-df14-44ee-8c21-06fd3c080f7b-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-fthwv\" (UID: \"fb107ebf-df14-44ee-8c21-06fd3c080f7b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fthwv" Mar 13 13:58:55 crc kubenswrapper[4898]: I0313 13:58:55.974998 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podStartSLOduration=104.97498271 podStartE2EDuration="1m44.97498271s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:58:55.97452888 +0000 UTC m=+170.976117129" watchObservedRunningTime="2026-03-13 13:58:55.97498271 +0000 UTC m=+170.976570949" Mar 13 13:58:55 crc kubenswrapper[4898]: I0313 13:58:55.975328 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-xpbhb" podStartSLOduration=104.975323408 
podStartE2EDuration="1m44.975323408s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:58:55.957170601 +0000 UTC m=+170.958758840" watchObservedRunningTime="2026-03-13 13:58:55.975323408 +0000 UTC m=+170.976911637" Mar 13 13:58:55 crc kubenswrapper[4898]: I0313 13:58:55.987260 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=26.987250022 podStartE2EDuration="26.987250022s" podCreationTimestamp="2026-03-13 13:58:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:58:55.987133989 +0000 UTC m=+170.988722248" watchObservedRunningTime="2026-03-13 13:58:55.987250022 +0000 UTC m=+170.988838261" Mar 13 13:58:56 crc kubenswrapper[4898]: I0313 13:58:56.015338 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb107ebf-df14-44ee-8c21-06fd3c080f7b-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-fthwv\" (UID: \"fb107ebf-df14-44ee-8c21-06fd3c080f7b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fthwv" Mar 13 13:58:56 crc kubenswrapper[4898]: I0313 13:58:56.015419 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/fb107ebf-df14-44ee-8c21-06fd3c080f7b-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-fthwv\" (UID: \"fb107ebf-df14-44ee-8c21-06fd3c080f7b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fthwv" Mar 13 13:58:56 crc kubenswrapper[4898]: I0313 13:58:56.015450 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/fb107ebf-df14-44ee-8c21-06fd3c080f7b-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-fthwv\" (UID: \"fb107ebf-df14-44ee-8c21-06fd3c080f7b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fthwv" Mar 13 13:58:56 crc kubenswrapper[4898]: I0313 13:58:56.015477 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/fb107ebf-df14-44ee-8c21-06fd3c080f7b-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-fthwv\" (UID: \"fb107ebf-df14-44ee-8c21-06fd3c080f7b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fthwv" Mar 13 13:58:56 crc kubenswrapper[4898]: I0313 13:58:56.015525 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fb107ebf-df14-44ee-8c21-06fd3c080f7b-service-ca\") pod \"cluster-version-operator-5c965bbfc6-fthwv\" (UID: \"fb107ebf-df14-44ee-8c21-06fd3c080f7b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fthwv" Mar 13 13:58:56 crc kubenswrapper[4898]: I0313 13:58:56.015758 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/fb107ebf-df14-44ee-8c21-06fd3c080f7b-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-fthwv\" (UID: \"fb107ebf-df14-44ee-8c21-06fd3c080f7b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fthwv" Mar 13 13:58:56 crc kubenswrapper[4898]: I0313 13:58:56.015885 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/fb107ebf-df14-44ee-8c21-06fd3c080f7b-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-fthwv\" (UID: \"fb107ebf-df14-44ee-8c21-06fd3c080f7b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fthwv" Mar 13 13:58:56 crc 
kubenswrapper[4898]: I0313 13:58:56.016479 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fb107ebf-df14-44ee-8c21-06fd3c080f7b-service-ca\") pod \"cluster-version-operator-5c965bbfc6-fthwv\" (UID: \"fb107ebf-df14-44ee-8c21-06fd3c080f7b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fthwv" Mar 13 13:58:56 crc kubenswrapper[4898]: I0313 13:58:56.024049 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb107ebf-df14-44ee-8c21-06fd3c080f7b-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-fthwv\" (UID: \"fb107ebf-df14-44ee-8c21-06fd3c080f7b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fthwv" Mar 13 13:58:56 crc kubenswrapper[4898]: I0313 13:58:56.035019 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fb107ebf-df14-44ee-8c21-06fd3c080f7b-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-fthwv\" (UID: \"fb107ebf-df14-44ee-8c21-06fd3c080f7b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fthwv" Mar 13 13:58:56 crc kubenswrapper[4898]: I0313 13:58:56.042525 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-5qb65" podStartSLOduration=105.042503462 podStartE2EDuration="1m45.042503462s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:58:56.04156458 +0000 UTC m=+171.043152859" watchObservedRunningTime="2026-03-13 13:58:56.042503462 +0000 UTC m=+171.044091721" Mar 13 13:58:56 crc kubenswrapper[4898]: I0313 13:58:56.065844 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-image-registry/node-ca-b46ld" podStartSLOduration=105.065826228 podStartE2EDuration="1m45.065826228s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:58:56.055531361 +0000 UTC m=+171.057119640" watchObservedRunningTime="2026-03-13 13:58:56.065826228 +0000 UTC m=+171.067414467" Mar 13 13:58:56 crc kubenswrapper[4898]: I0313 13:58:56.076338 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wh2lt" podStartSLOduration=105.076319249 podStartE2EDuration="1m45.076319249s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:58:56.065980071 +0000 UTC m=+171.067568320" watchObservedRunningTime="2026-03-13 13:58:56.076319249 +0000 UTC m=+171.077907488" Mar 13 13:58:56 crc kubenswrapper[4898]: I0313 13:58:56.102615 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=75.102598923 podStartE2EDuration="1m15.102598923s" podCreationTimestamp="2026-03-13 13:57:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:58:56.102018099 +0000 UTC m=+171.103606348" watchObservedRunningTime="2026-03-13 13:58:56.102598923 +0000 UTC m=+171.104187162" Mar 13 13:58:56 crc kubenswrapper[4898]: I0313 13:58:56.114736 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=53.114709171 podStartE2EDuration="53.114709171s" podCreationTimestamp="2026-03-13 13:58:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:58:56.113532214 +0000 UTC m=+171.115120473" watchObservedRunningTime="2026-03-13 13:58:56.114709171 +0000 UTC m=+171.116297440" Mar 13 13:58:56 crc kubenswrapper[4898]: I0313 13:58:56.212269 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fthwv" Mar 13 13:58:56 crc kubenswrapper[4898]: W0313 13:58:56.231072 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb107ebf_df14_44ee_8c21_06fd3c080f7b.slice/crio-392fbd59682633d8443f3dc4a7913fe45ecb995750f8d1d40f2b7688c91443e0 WatchSource:0}: Error finding container 392fbd59682633d8443f3dc4a7913fe45ecb995750f8d1d40f2b7688c91443e0: Status 404 returned error can't find the container with id 392fbd59682633d8443f3dc4a7913fe45ecb995750f8d1d40f2b7688c91443e0 Mar 13 13:58:56 crc kubenswrapper[4898]: I0313 13:58:56.703576 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fthwv" event={"ID":"fb107ebf-df14-44ee-8c21-06fd3c080f7b","Type":"ContainerStarted","Data":"bb93c63d790b280fdde81d552598d5784527fe6696948be6b926c2c0aeceb7e5"} Mar 13 13:58:56 crc kubenswrapper[4898]: I0313 13:58:56.703662 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fthwv" event={"ID":"fb107ebf-df14-44ee-8c21-06fd3c080f7b","Type":"ContainerStarted","Data":"392fbd59682633d8443f3dc4a7913fe45ecb995750f8d1d40f2b7688c91443e0"} Mar 13 13:58:56 crc kubenswrapper[4898]: I0313 13:58:56.777354 4898 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 13 13:58:56 crc kubenswrapper[4898]: I0313 13:58:56.790263 4898 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 13 13:58:57 crc 
kubenswrapper[4898]: I0313 13:58:57.725699 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fthwv" podStartSLOduration=106.725677058 podStartE2EDuration="1m46.725677058s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:58:57.724644194 +0000 UTC m=+172.726232463" watchObservedRunningTime="2026-03-13 13:58:57.725677058 +0000 UTC m=+172.727265307" Mar 13 13:58:57 crc kubenswrapper[4898]: I0313 13:58:57.739124 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:58:57 crc kubenswrapper[4898]: I0313 13:58:57.739258 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:58:57 crc kubenswrapper[4898]: I0313 13:58:57.739383 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:58:57 crc kubenswrapper[4898]: E0313 13:58:57.739313 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:58:57 crc kubenswrapper[4898]: E0313 13:58:57.739540 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:58:57 crc kubenswrapper[4898]: E0313 13:58:57.739616 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:58:57 crc kubenswrapper[4898]: I0313 13:58:57.739738 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:57 crc kubenswrapper[4898]: E0313 13:58:57.739841 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:58:59 crc kubenswrapper[4898]: I0313 13:58:59.739503 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:59 crc kubenswrapper[4898]: I0313 13:58:59.739530 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:58:59 crc kubenswrapper[4898]: I0313 13:58:59.739586 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:58:59 crc kubenswrapper[4898]: I0313 13:58:59.739665 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:58:59 crc kubenswrapper[4898]: E0313 13:58:59.740644 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:58:59 crc kubenswrapper[4898]: E0313 13:58:59.740634 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:58:59 crc kubenswrapper[4898]: E0313 13:58:59.740706 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:58:59 crc kubenswrapper[4898]: E0313 13:58:59.740540 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:59:00 crc kubenswrapper[4898]: E0313 13:59:00.882287 4898 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 13:59:01 crc kubenswrapper[4898]: I0313 13:59:01.739573 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:59:01 crc kubenswrapper[4898]: I0313 13:59:01.739652 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:59:01 crc kubenswrapper[4898]: I0313 13:59:01.739581 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:59:01 crc kubenswrapper[4898]: E0313 13:59:01.739754 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 13 13:59:01 crc kubenswrapper[4898]: I0313 13:59:01.739830 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 13 13:59:01 crc kubenswrapper[4898]: E0313 13:59:01.740132 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 13 13:59:01 crc kubenswrapper[4898]: E0313 13:59:01.740209 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869"
Mar 13 13:59:01 crc kubenswrapper[4898]: E0313 13:59:01.740261 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 13 13:59:02 crc kubenswrapper[4898]: I0313 13:59:02.740345 4898 scope.go:117] "RemoveContainer" containerID="5ca8f8a8a536aca56f73dd6928361e5dd5f98f66d3bc35762461d5d87c0c3022"
Mar 13 13:59:02 crc kubenswrapper[4898]: E0313 13:59:02.740636 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-qqqs5_openshift-ovn-kubernetes(e7d6afc0-d9b5-41b2-a55f-57621c300cbb)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb"
Mar 13 13:59:03 crc kubenswrapper[4898]: I0313 13:59:03.738926 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 13 13:59:03 crc kubenswrapper[4898]: I0313 13:59:03.738966 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 13 13:59:03 crc kubenswrapper[4898]: E0313 13:59:03.739150 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 13 13:59:03 crc kubenswrapper[4898]: I0313 13:59:03.739182 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 13 13:59:03 crc kubenswrapper[4898]: I0313 13:59:03.739240 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc"
Mar 13 13:59:03 crc kubenswrapper[4898]: E0313 13:59:03.739364 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 13 13:59:03 crc kubenswrapper[4898]: E0313 13:59:03.739728 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869"
Mar 13 13:59:03 crc kubenswrapper[4898]: E0313 13:59:03.739830 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 13 13:59:05 crc kubenswrapper[4898]: I0313 13:59:05.739252 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 13 13:59:05 crc kubenswrapper[4898]: I0313 13:59:05.739311 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 13 13:59:05 crc kubenswrapper[4898]: I0313 13:59:05.739311 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc"
Mar 13 13:59:05 crc kubenswrapper[4898]: I0313 13:59:05.739276 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 13 13:59:05 crc kubenswrapper[4898]: E0313 13:59:05.740593 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 13 13:59:05 crc kubenswrapper[4898]: E0313 13:59:05.740720 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 13 13:59:05 crc kubenswrapper[4898]: E0313 13:59:05.740840 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869"
Mar 13 13:59:05 crc kubenswrapper[4898]: E0313 13:59:05.740921 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 13 13:59:05 crc kubenswrapper[4898]: I0313 13:59:05.875474 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869-metrics-certs\") pod \"network-metrics-daemon-fwrwc\" (UID: \"9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869\") " pod="openshift-multus/network-metrics-daemon-fwrwc"
Mar 13 13:59:05 crc kubenswrapper[4898]: E0313 13:59:05.875700 4898 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 13 13:59:05 crc kubenswrapper[4898]: E0313 13:59:05.875789 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869-metrics-certs podName:9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869 nodeName:}" failed. No retries permitted until 2026-03-13 14:00:09.875767585 +0000 UTC m=+244.877355834 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869-metrics-certs") pod "network-metrics-daemon-fwrwc" (UID: "9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 13 13:59:05 crc kubenswrapper[4898]: E0313 13:59:05.882956 4898 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 13 13:59:07 crc kubenswrapper[4898]: I0313 13:59:07.738895 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc"
Mar 13 13:59:07 crc kubenswrapper[4898]: I0313 13:59:07.739037 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 13 13:59:07 crc kubenswrapper[4898]: I0313 13:59:07.738937 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 13 13:59:07 crc kubenswrapper[4898]: E0313 13:59:07.739142 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869"
Mar 13 13:59:07 crc kubenswrapper[4898]: I0313 13:59:07.739040 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 13 13:59:07 crc kubenswrapper[4898]: E0313 13:59:07.739252 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 13 13:59:07 crc kubenswrapper[4898]: E0313 13:59:07.739382 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 13 13:59:07 crc kubenswrapper[4898]: E0313 13:59:07.739511 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 13 13:59:09 crc kubenswrapper[4898]: I0313 13:59:09.739116 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 13 13:59:09 crc kubenswrapper[4898]: I0313 13:59:09.739213 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc"
Mar 13 13:59:09 crc kubenswrapper[4898]: I0313 13:59:09.739242 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 13 13:59:09 crc kubenswrapper[4898]: E0313 13:59:09.739358 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 13 13:59:09 crc kubenswrapper[4898]: I0313 13:59:09.739385 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 13 13:59:09 crc kubenswrapper[4898]: E0313 13:59:09.739517 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869"
Mar 13 13:59:09 crc kubenswrapper[4898]: E0313 13:59:09.739678 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 13 13:59:09 crc kubenswrapper[4898]: E0313 13:59:09.739786 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 13 13:59:10 crc kubenswrapper[4898]: E0313 13:59:10.884762 4898 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 13 13:59:11 crc kubenswrapper[4898]: I0313 13:59:11.739444 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 13 13:59:11 crc kubenswrapper[4898]: I0313 13:59:11.739641 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc"
Mar 13 13:59:11 crc kubenswrapper[4898]: E0313 13:59:11.739818 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 13 13:59:11 crc kubenswrapper[4898]: I0313 13:59:11.739867 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 13 13:59:11 crc kubenswrapper[4898]: I0313 13:59:11.739846 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 13 13:59:11 crc kubenswrapper[4898]: E0313 13:59:11.740000 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869"
Mar 13 13:59:11 crc kubenswrapper[4898]: E0313 13:59:11.740163 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 13 13:59:11 crc kubenswrapper[4898]: E0313 13:59:11.740340 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 13 13:59:13 crc kubenswrapper[4898]: I0313 13:59:13.739303 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 13 13:59:13 crc kubenswrapper[4898]: I0313 13:59:13.739418 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 13 13:59:13 crc kubenswrapper[4898]: I0313 13:59:13.739303 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc"
Mar 13 13:59:13 crc kubenswrapper[4898]: E0313 13:59:13.739501 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 13 13:59:13 crc kubenswrapper[4898]: E0313 13:59:13.740071 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 13 13:59:13 crc kubenswrapper[4898]: I0313 13:59:13.740087 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 13 13:59:13 crc kubenswrapper[4898]: E0313 13:59:13.740259 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869"
Mar 13 13:59:13 crc kubenswrapper[4898]: E0313 13:59:13.740396 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 13 13:59:13 crc kubenswrapper[4898]: I0313 13:59:13.740619 4898 scope.go:117] "RemoveContainer" containerID="5ca8f8a8a536aca56f73dd6928361e5dd5f98f66d3bc35762461d5d87c0c3022"
Mar 13 13:59:13 crc kubenswrapper[4898]: E0313 13:59:13.740882 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-qqqs5_openshift-ovn-kubernetes(e7d6afc0-d9b5-41b2-a55f-57621c300cbb)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb"
Mar 13 13:59:15 crc kubenswrapper[4898]: I0313 13:59:15.738582 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 13 13:59:15 crc kubenswrapper[4898]: E0313 13:59:15.740451 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 13 13:59:15 crc kubenswrapper[4898]: I0313 13:59:15.740479 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 13 13:59:15 crc kubenswrapper[4898]: I0313 13:59:15.740544 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc"
Mar 13 13:59:15 crc kubenswrapper[4898]: I0313 13:59:15.740572 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 13 13:59:15 crc kubenswrapper[4898]: E0313 13:59:15.740955 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 13 13:59:15 crc kubenswrapper[4898]: E0313 13:59:15.741189 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869"
Mar 13 13:59:15 crc kubenswrapper[4898]: E0313 13:59:15.741372 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 13 13:59:15 crc kubenswrapper[4898]: E0313 13:59:15.885571 4898 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 13 13:59:17 crc kubenswrapper[4898]: I0313 13:59:17.739316 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 13 13:59:17 crc kubenswrapper[4898]: I0313 13:59:17.739350 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 13 13:59:17 crc kubenswrapper[4898]: I0313 13:59:17.739436 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 13 13:59:17 crc kubenswrapper[4898]: E0313 13:59:17.739625 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 13 13:59:17 crc kubenswrapper[4898]: I0313 13:59:17.739648 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc"
Mar 13 13:59:17 crc kubenswrapper[4898]: E0313 13:59:17.739760 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 13 13:59:17 crc kubenswrapper[4898]: E0313 13:59:17.739984 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869"
Mar 13 13:59:17 crc kubenswrapper[4898]: E0313 13:59:17.740051 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 13 13:59:19 crc kubenswrapper[4898]: I0313 13:59:19.738732 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 13 13:59:19 crc kubenswrapper[4898]: I0313 13:59:19.738827 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc"
Mar 13 13:59:19 crc kubenswrapper[4898]: E0313 13:59:19.738987 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 13 13:59:19 crc kubenswrapper[4898]: I0313 13:59:19.739081 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 13 13:59:19 crc kubenswrapper[4898]: E0313 13:59:19.739246 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 13 13:59:19 crc kubenswrapper[4898]: E0313 13:59:19.739316 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869"
Mar 13 13:59:19 crc kubenswrapper[4898]: I0313 13:59:19.740022 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 13 13:59:19 crc kubenswrapper[4898]: E0313 13:59:19.740234 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 13 13:59:20 crc kubenswrapper[4898]: E0313 13:59:20.887406 4898 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 13 13:59:21 crc kubenswrapper[4898]: I0313 13:59:21.738825 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 13 13:59:21 crc kubenswrapper[4898]: I0313 13:59:21.738983 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc"
Mar 13 13:59:21 crc kubenswrapper[4898]: I0313 13:59:21.739069 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 13 13:59:21 crc kubenswrapper[4898]: E0313 13:59:21.739005 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 13 13:59:21 crc kubenswrapper[4898]: I0313 13:59:21.739100 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 13 13:59:21 crc kubenswrapper[4898]: E0313 13:59:21.739177 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869"
Mar 13 13:59:21 crc kubenswrapper[4898]: E0313 13:59:21.739254 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 13 13:59:21 crc kubenswrapper[4898]: E0313 13:59:21.739313 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 13 13:59:23 crc kubenswrapper[4898]: I0313 13:59:23.738526 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc"
Mar 13 13:59:23 crc kubenswrapper[4898]: I0313 13:59:23.738601 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 13 13:59:23 crc kubenswrapper[4898]: I0313 13:59:23.738645 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 13 13:59:23 crc kubenswrapper[4898]: I0313 13:59:23.738704 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 13 13:59:23 crc kubenswrapper[4898]: E0313 13:59:23.739302 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 13 13:59:23 crc kubenswrapper[4898]: E0313 13:59:23.739475 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869"
Mar 13 13:59:23 crc kubenswrapper[4898]: E0313 13:59:23.739588 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 13 13:59:23 crc kubenswrapper[4898]: E0313 13:59:23.739752 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 13 13:59:23 crc kubenswrapper[4898]: I0313 13:59:23.794179 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6llfs_e521c857-9711-4f68-886f-38b233d7b05b/kube-multus/1.log"
Mar 13 13:59:23 crc kubenswrapper[4898]: I0313 13:59:23.795068 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6llfs_e521c857-9711-4f68-886f-38b233d7b05b/kube-multus/0.log"
Mar 13 13:59:23 crc kubenswrapper[4898]: I0313 13:59:23.795137 4898 generic.go:334] "Generic (PLEG): container finished" podID="e521c857-9711-4f68-886f-38b233d7b05b" containerID="04f48cdfeeb82223cb0cab3fb50d3338225f39b1d78eadc3c18a46350ae28770" exitCode=1
Mar 13 13:59:23 crc kubenswrapper[4898]: I0313 13:59:23.795186 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6llfs" event={"ID":"e521c857-9711-4f68-886f-38b233d7b05b","Type":"ContainerDied","Data":"04f48cdfeeb82223cb0cab3fb50d3338225f39b1d78eadc3c18a46350ae28770"}
Mar 13 13:59:23 crc kubenswrapper[4898]: I0313 13:59:23.795263 4898 scope.go:117] "RemoveContainer" containerID="de3eb6944d1978b4471d9968724d0312047079b10ae547df513246554e8a705f"
Mar 13 13:59:23 crc kubenswrapper[4898]: I0313 13:59:23.795681 4898 scope.go:117] "RemoveContainer" containerID="04f48cdfeeb82223cb0cab3fb50d3338225f39b1d78eadc3c18a46350ae28770"
Mar 13 13:59:23 crc kubenswrapper[4898]: E0313 13:59:23.795899 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-6llfs_openshift-multus(e521c857-9711-4f68-886f-38b233d7b05b)\"" pod="openshift-multus/multus-6llfs" podUID="e521c857-9711-4f68-886f-38b233d7b05b"
Mar 13 13:59:24 crc kubenswrapper[4898]: I0313 13:59:24.801689 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6llfs_e521c857-9711-4f68-886f-38b233d7b05b/kube-multus/1.log"
Mar 13 13:59:25 crc kubenswrapper[4898]: I0313 13:59:25.739480 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 13 13:59:25 crc kubenswrapper[4898]: I0313 13:59:25.739567 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 13 13:59:25 crc kubenswrapper[4898]: E0313 13:59:25.739655 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 13 13:59:25 crc kubenswrapper[4898]: I0313 13:59:25.739668 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 13 13:59:25 crc kubenswrapper[4898]: I0313 13:59:25.739727 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc"
Mar 13 13:59:25 crc kubenswrapper[4898]: E0313 13:59:25.739972 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 13 13:59:25 crc kubenswrapper[4898]: E0313 13:59:25.740093 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 13 13:59:25 crc kubenswrapper[4898]: E0313 13:59:25.742025 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869"
Mar 13 13:59:25 crc kubenswrapper[4898]: E0313 13:59:25.888099 4898 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 13 13:59:27 crc kubenswrapper[4898]: I0313 13:59:27.738938 4898 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:59:27 crc kubenswrapper[4898]: I0313 13:59:27.739001 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:59:27 crc kubenswrapper[4898]: I0313 13:59:27.739168 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:59:27 crc kubenswrapper[4898]: E0313 13:59:27.739318 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:59:27 crc kubenswrapper[4898]: I0313 13:59:27.739365 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:59:27 crc kubenswrapper[4898]: E0313 13:59:27.739630 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:59:27 crc kubenswrapper[4898]: E0313 13:59:27.739738 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:59:27 crc kubenswrapper[4898]: E0313 13:59:27.740294 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:59:27 crc kubenswrapper[4898]: I0313 13:59:27.741050 4898 scope.go:117] "RemoveContainer" containerID="5ca8f8a8a536aca56f73dd6928361e5dd5f98f66d3bc35762461d5d87c0c3022" Mar 13 13:59:28 crc kubenswrapper[4898]: I0313 13:59:28.599043 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-fwrwc"] Mar 13 13:59:28 crc kubenswrapper[4898]: I0313 13:59:28.599144 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:59:28 crc kubenswrapper[4898]: E0313 13:59:28.599239 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:59:28 crc kubenswrapper[4898]: I0313 13:59:28.817287 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qqqs5_e7d6afc0-d9b5-41b2-a55f-57621c300cbb/ovnkube-controller/3.log" Mar 13 13:59:28 crc kubenswrapper[4898]: I0313 13:59:28.820953 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" event={"ID":"e7d6afc0-d9b5-41b2-a55f-57621c300cbb","Type":"ContainerStarted","Data":"16ce106ae8a28f129efde86037be3a1f3a7bbf53e4df0b306b61a11eff910aed"} Mar 13 13:59:28 crc kubenswrapper[4898]: I0313 13:59:28.821456 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:59:29 crc kubenswrapper[4898]: I0313 13:59:29.738793 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:59:29 crc kubenswrapper[4898]: I0313 13:59:29.738859 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:59:29 crc kubenswrapper[4898]: E0313 13:59:29.738957 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:59:29 crc kubenswrapper[4898]: I0313 13:59:29.738996 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:59:29 crc kubenswrapper[4898]: E0313 13:59:29.739077 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:59:29 crc kubenswrapper[4898]: I0313 13:59:29.739138 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:59:29 crc kubenswrapper[4898]: E0313 13:59:29.739152 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:59:29 crc kubenswrapper[4898]: E0313 13:59:29.739347 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:59:30 crc kubenswrapper[4898]: E0313 13:59:30.890480 4898 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" Mar 13 13:59:31 crc kubenswrapper[4898]: I0313 13:59:31.739415 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:59:31 crc kubenswrapper[4898]: I0313 13:59:31.739456 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:59:31 crc kubenswrapper[4898]: I0313 13:59:31.739573 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:59:31 crc kubenswrapper[4898]: E0313 13:59:31.739731 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:59:31 crc kubenswrapper[4898]: I0313 13:59:31.739772 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:59:31 crc kubenswrapper[4898]: E0313 13:59:31.739978 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:59:31 crc kubenswrapper[4898]: E0313 13:59:31.740089 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:59:31 crc kubenswrapper[4898]: E0313 13:59:31.740292 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:59:33 crc kubenswrapper[4898]: I0313 13:59:33.739539 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:59:33 crc kubenswrapper[4898]: I0313 13:59:33.739563 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:59:33 crc kubenswrapper[4898]: I0313 13:59:33.739892 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:59:33 crc kubenswrapper[4898]: I0313 13:59:33.739895 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:59:33 crc kubenswrapper[4898]: E0313 13:59:33.740087 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:59:33 crc kubenswrapper[4898]: E0313 13:59:33.740266 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:59:33 crc kubenswrapper[4898]: E0313 13:59:33.740433 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:59:33 crc kubenswrapper[4898]: E0313 13:59:33.740542 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:59:34 crc kubenswrapper[4898]: I0313 13:59:34.739695 4898 scope.go:117] "RemoveContainer" containerID="04f48cdfeeb82223cb0cab3fb50d3338225f39b1d78eadc3c18a46350ae28770" Mar 13 13:59:34 crc kubenswrapper[4898]: I0313 13:59:34.767157 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" podStartSLOduration=143.767130112 podStartE2EDuration="2m23.767130112s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:28.85283923 +0000 UTC m=+203.854427499" watchObservedRunningTime="2026-03-13 13:59:34.767130112 +0000 UTC m=+209.768718391" Mar 13 13:59:35 crc kubenswrapper[4898]: I0313 13:59:35.738777 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:59:35 crc kubenswrapper[4898]: I0313 13:59:35.738850 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:59:35 crc kubenswrapper[4898]: I0313 13:59:35.738937 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:59:35 crc kubenswrapper[4898]: E0313 13:59:35.739041 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:59:35 crc kubenswrapper[4898]: I0313 13:59:35.739058 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:59:35 crc kubenswrapper[4898]: E0313 13:59:35.741883 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:59:35 crc kubenswrapper[4898]: E0313 13:59:35.742069 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:59:35 crc kubenswrapper[4898]: E0313 13:59:35.742099 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:59:35 crc kubenswrapper[4898]: I0313 13:59:35.851652 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6llfs_e521c857-9711-4f68-886f-38b233d7b05b/kube-multus/1.log" Mar 13 13:59:35 crc kubenswrapper[4898]: I0313 13:59:35.851731 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6llfs" event={"ID":"e521c857-9711-4f68-886f-38b233d7b05b","Type":"ContainerStarted","Data":"725f30c48676665ebc628a8b35e81161dc13d717e27cad14806022f5ad267e0e"} Mar 13 13:59:35 crc kubenswrapper[4898]: E0313 13:59:35.891265 4898 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 13:59:37 crc kubenswrapper[4898]: I0313 13:59:37.738492 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:59:37 crc kubenswrapper[4898]: I0313 13:59:37.738546 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:59:37 crc kubenswrapper[4898]: I0313 13:59:37.738582 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:59:37 crc kubenswrapper[4898]: I0313 13:59:37.738648 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:59:37 crc kubenswrapper[4898]: E0313 13:59:37.739120 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:59:37 crc kubenswrapper[4898]: E0313 13:59:37.739240 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:59:37 crc kubenswrapper[4898]: E0313 13:59:37.739394 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:59:37 crc kubenswrapper[4898]: E0313 13:59:37.739568 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:59:39 crc kubenswrapper[4898]: I0313 13:59:39.739576 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:59:39 crc kubenswrapper[4898]: I0313 13:59:39.739595 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:59:39 crc kubenswrapper[4898]: I0313 13:59:39.739700 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:59:39 crc kubenswrapper[4898]: E0313 13:59:39.739712 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:59:39 crc kubenswrapper[4898]: E0313 13:59:39.739863 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:59:39 crc kubenswrapper[4898]: E0313 13:59:39.739950 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:59:39 crc kubenswrapper[4898]: I0313 13:59:39.740087 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:59:39 crc kubenswrapper[4898]: E0313 13:59:39.740150 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:59:41 crc kubenswrapper[4898]: I0313 13:59:41.738593 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:59:41 crc kubenswrapper[4898]: I0313 13:59:41.738667 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:59:41 crc kubenswrapper[4898]: I0313 13:59:41.738686 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:59:41 crc kubenswrapper[4898]: I0313 13:59:41.738694 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:59:41 crc kubenswrapper[4898]: I0313 13:59:41.741614 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 13 13:59:41 crc kubenswrapper[4898]: I0313 13:59:41.742186 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 13 13:59:41 crc kubenswrapper[4898]: I0313 13:59:41.742270 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 13 13:59:41 crc kubenswrapper[4898]: I0313 13:59:41.742290 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 13 13:59:41 crc kubenswrapper[4898]: I0313 13:59:41.742357 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 13 13:59:41 crc kubenswrapper[4898]: I0313 13:59:41.745503 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.217813 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.267888 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-v9lxv"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.268390 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-9vdwm"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.268949 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9vdwm" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.269472 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-v9lxv" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.279416 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-whtgq"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.280153 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-whtgq" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.282590 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8rx7x"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.283820 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8rx7x" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.284498 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-cx59b"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.285291 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-cx59b" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.288649 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5jp6r"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.289408 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5jp6r" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.292565 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.292838 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.293066 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.293970 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t4j9v"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.294747 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t4j9v" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.310322 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.310689 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.311140 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.311563 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-7l2pm"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.311748 4898 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication-operator"/"authentication-operator-config" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.312097 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.312272 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-7l2pm" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.312370 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.313723 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.313464 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.324776 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.325386 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.325547 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.325674 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.325577 4898 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication-operator"/"service-ca-bundle" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.325770 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.325821 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.325929 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.326134 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-g559r"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.326709 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g559r" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.328110 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-7zpw2"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.328601 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7zpw2" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.329088 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.329245 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.329376 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.329591 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.329777 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.329871 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.330055 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.330473 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.330508 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.330608 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" 
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.330677 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pvbpt"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.330696 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.331023 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-pvbpt" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.333122 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.334456 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-6plhg"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.334962 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.335249 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.335473 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.335708 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-6plhg" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.339773 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-vk7r4"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.340412 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vk7r4" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.348207 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.348436 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xxjrs"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.348757 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.348827 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-kgnxj"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.349120 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-kgnxj" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.349188 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xxjrs" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.350104 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/ade74420-c7a1-4b89-b6c8-7970d7b6c17c-machine-approver-tls\") pod \"machine-approver-56656f9798-9vdwm\" (UID: \"ade74420-c7a1-4b89-b6c8-7970d7b6c17c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9vdwm" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.350137 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f12557e-02f5-4445-988f-b19f16672e3b-config\") pod \"authentication-operator-69f744f599-v9lxv\" (UID: \"6f12557e-02f5-4445-988f-b19f16672e3b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v9lxv" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.350163 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/096d3786-85e8-4fe5-82b3-57cd1be251a1-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-whtgq\" (UID: \"096d3786-85e8-4fe5-82b3-57cd1be251a1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-whtgq" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.350190 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/096d3786-85e8-4fe5-82b3-57cd1be251a1-images\") pod \"machine-api-operator-5694c8668f-whtgq\" (UID: \"096d3786-85e8-4fe5-82b3-57cd1be251a1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-whtgq" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.350211 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4cvb\" (UniqueName: \"kubernetes.io/projected/ade74420-c7a1-4b89-b6c8-7970d7b6c17c-kube-api-access-j4cvb\") pod \"machine-approver-56656f9798-9vdwm\" (UID: \"ade74420-c7a1-4b89-b6c8-7970d7b6c17c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9vdwm" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.350236 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f12557e-02f5-4445-988f-b19f16672e3b-serving-cert\") pod \"authentication-operator-69f744f599-v9lxv\" (UID: \"6f12557e-02f5-4445-988f-b19f16672e3b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v9lxv" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.350255 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f12557e-02f5-4445-988f-b19f16672e3b-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-v9lxv\" (UID: \"6f12557e-02f5-4445-988f-b19f16672e3b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v9lxv" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.350305 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ade74420-c7a1-4b89-b6c8-7970d7b6c17c-config\") pod \"machine-approver-56656f9798-9vdwm\" (UID: \"ade74420-c7a1-4b89-b6c8-7970d7b6c17c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9vdwm" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.350334 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ade74420-c7a1-4b89-b6c8-7970d7b6c17c-auth-proxy-config\") pod 
\"machine-approver-56656f9798-9vdwm\" (UID: \"ade74420-c7a1-4b89-b6c8-7970d7b6c17c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9vdwm" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.350355 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a2ea84ca-5ca6-432a-aa7d-c6350e0e52e8-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-8rx7x\" (UID: \"a2ea84ca-5ca6-432a-aa7d-c6350e0e52e8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8rx7x" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.350376 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f12557e-02f5-4445-988f-b19f16672e3b-service-ca-bundle\") pod \"authentication-operator-69f744f599-v9lxv\" (UID: \"6f12557e-02f5-4445-988f-b19f16672e3b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v9lxv" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.350400 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4zh6\" (UniqueName: \"kubernetes.io/projected/a2ea84ca-5ca6-432a-aa7d-c6350e0e52e8-kube-api-access-x4zh6\") pod \"cluster-samples-operator-665b6dd947-8rx7x\" (UID: \"a2ea84ca-5ca6-432a-aa7d-c6350e0e52e8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8rx7x" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.350431 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75725\" (UniqueName: \"kubernetes.io/projected/096d3786-85e8-4fe5-82b3-57cd1be251a1-kube-api-access-75725\") pod \"machine-api-operator-5694c8668f-whtgq\" (UID: \"096d3786-85e8-4fe5-82b3-57cd1be251a1\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-whtgq" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.350453 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/096d3786-85e8-4fe5-82b3-57cd1be251a1-config\") pod \"machine-api-operator-5694c8668f-whtgq\" (UID: \"096d3786-85e8-4fe5-82b3-57cd1be251a1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-whtgq" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.350475 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjc96\" (UniqueName: \"kubernetes.io/projected/6f12557e-02f5-4445-988f-b19f16672e3b-kube-api-access-pjc96\") pod \"authentication-operator-69f744f599-v9lxv\" (UID: \"6f12557e-02f5-4445-988f-b19f16672e3b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v9lxv" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.352487 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.352625 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.352917 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.353046 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.353155 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.353288 4898 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.353384 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.353671 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.353776 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.353801 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.353851 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.353884 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.353934 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.353941 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.353983 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.354012 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.354140 4898 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.354175 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.354384 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.354832 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.354957 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-gwvk4"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.354976 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.355071 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.355194 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.355322 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.355421 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.355533 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gwvk4" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.355541 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.355570 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.355604 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.355634 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.355660 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.355700 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.355733 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.355819 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.356051 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.366521 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-t2s2h"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.367000 4898 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-z5vf2"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.367635 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-z5vf2" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.367682 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-m2ntx"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.368220 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-t2s2h" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.368408 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-m2ntx" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.371022 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6n228"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.371586 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.372780 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.373022 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.373045 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.373172 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.373465 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.374097 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.374376 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.374388 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.374468 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.374494 4898 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-route-controller-manager"/"config" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.374503 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.374568 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.374569 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-djn5q"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.379423 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.380090 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.381884 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.382159 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.382194 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.382162 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.382452 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.382803 4898 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.383039 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.383052 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.383197 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.383285 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.384192 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.384275 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.384588 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.385806 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.386943 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lwzpn"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.387026 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.396743 
4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.398842 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xjx7v"]
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.399280 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xjx7v"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.399734 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lwzpn"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.399810 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xwdnn"]
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.400290 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xwdnn"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.401501 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-c6rz7"]
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.401650 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.402211 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-c6rz7"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.403306 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.403487 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.405491 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qt7gm"]
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.404209 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.404385 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.406373 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qt7gm"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.408523 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-nqrmd"]
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.409191 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nqrmd"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.410515 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-98qz7"]
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.411791 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.412578 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-p8r99"]
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.412891 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-98qz7"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.415220 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-xglpf"]
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.415330 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-p8r99"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.416578 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qtdtw"]
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.416684 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-xglpf"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.417453 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x4nxn"]
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.418805 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qtdtw"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.419712 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6lrz"]
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.419877 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x4nxn"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.420302 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd"]
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.420382 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6lrz"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.420969 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-z5r8j"]
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.421048 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.421460 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-cvbms"]
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.421541 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-z5r8j"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.422471 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mrr5j"]
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.422786 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-7l2pm"]
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.422807 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556838-h7pkr"]
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.422961 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-cvbms"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.423236 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mrr5j"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.423624 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556838-h7pkr"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.423892 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556825-92fd8"]
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.424304 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556825-92fd8"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.424731 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8rx7x"]
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.425617 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t4j9v"]
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.426521 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-whtgq"]
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.427492 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-v9lxv"]
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.428480 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-rd22p"]
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.429653 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-rd22p"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.431255 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-kgnxj"]
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.431407 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.433087 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-vk7r4"]
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.434679 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-t2s2h"]
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.435779 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xxjrs"]
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.438954 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-cx59b"]
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.441991 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-c6rz7"]
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.443422 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xjx7v"]
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.444708 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pvbpt"]
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.445753 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6lrz"]
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.446792 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-gwvk4"]
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.448285 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lwzpn"]
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.449591 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-g559r"]
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.450541 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qtdtw"]
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.450974 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/096d3786-85e8-4fe5-82b3-57cd1be251a1-images\") pod \"machine-api-operator-5694c8668f-whtgq\" (UID: \"096d3786-85e8-4fe5-82b3-57cd1be251a1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-whtgq"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451001 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/096d3786-85e8-4fe5-82b3-57cd1be251a1-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-whtgq\" (UID: \"096d3786-85e8-4fe5-82b3-57cd1be251a1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-whtgq"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451022 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f446713d-03e3-461f-989f-eb6bdef32b30-serving-cert\") pod \"apiserver-76f77b778f-z5vf2\" (UID: \"f446713d-03e3-461f-989f-eb6bdef32b30\") " pod="openshift-apiserver/apiserver-76f77b778f-z5vf2"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451039 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42j8n\" (UniqueName: \"kubernetes.io/projected/a402522c-e891-477d-a2cc-5aa7c6944e06-kube-api-access-42j8n\") pod \"openshift-apiserver-operator-796bbdcf4f-5jp6r\" (UID: \"a402522c-e891-477d-a2cc-5aa7c6944e06\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5jp6r"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451057 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4cvb\" (UniqueName: \"kubernetes.io/projected/ade74420-c7a1-4b89-b6c8-7970d7b6c17c-kube-api-access-j4cvb\") pod \"machine-approver-56656f9798-9vdwm\" (UID: \"ade74420-c7a1-4b89-b6c8-7970d7b6c17c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9vdwm"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451075 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451090 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/9f5a2d7c-1d38-4e82-89e0-039d0f515ac6-etcd-service-ca\") pod \"etcd-operator-b45778765-t2s2h\" (UID: \"9f5a2d7c-1d38-4e82-89e0-039d0f515ac6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t2s2h"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451104 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fp6m\" (UniqueName: \"kubernetes.io/projected/1bf3837f-dac8-49c7-9ac2-f5d6d0a087b8-kube-api-access-8fp6m\") pod \"ingress-operator-5b745b69d9-vk7r4\" (UID: \"1bf3837f-dac8-49c7-9ac2-f5d6d0a087b8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vk7r4"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451120 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451137 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f446713d-03e3-461f-989f-eb6bdef32b30-node-pullsecrets\") pod \"apiserver-76f77b778f-z5vf2\" (UID: \"f446713d-03e3-461f-989f-eb6bdef32b30\") " pod="openshift-apiserver/apiserver-76f77b778f-z5vf2"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451152 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqfg5\" (UniqueName: \"kubernetes.io/projected/eedd2260-f339-4e2f-83e8-13a56cee2ce6-kube-api-access-gqfg5\") pod \"console-operator-58897d9998-kgnxj\" (UID: \"eedd2260-f339-4e2f-83e8-13a56cee2ce6\") " pod="openshift-console-operator/console-operator-58897d9998-kgnxj"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451166 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451186 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f12557e-02f5-4445-988f-b19f16672e3b-serving-cert\") pod \"authentication-operator-69f744f599-v9lxv\" (UID: \"6f12557e-02f5-4445-988f-b19f16672e3b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v9lxv"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451200 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksjvx\" (UniqueName: \"kubernetes.io/projected/9f5a2d7c-1d38-4e82-89e0-039d0f515ac6-kube-api-access-ksjvx\") pod \"etcd-operator-b45778765-t2s2h\" (UID: \"9f5a2d7c-1d38-4e82-89e0-039d0f515ac6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t2s2h"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451214 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a402522c-e891-477d-a2cc-5aa7c6944e06-config\") pod \"openshift-apiserver-operator-796bbdcf4f-5jp6r\" (UID: \"a402522c-e891-477d-a2cc-5aa7c6944e06\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5jp6r"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451229 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f12557e-02f5-4445-988f-b19f16672e3b-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-v9lxv\" (UID: \"6f12557e-02f5-4445-988f-b19f16672e3b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v9lxv"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451244 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451257 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/9f5a2d7c-1d38-4e82-89e0-039d0f515ac6-etcd-ca\") pod \"etcd-operator-b45778765-t2s2h\" (UID: \"9f5a2d7c-1d38-4e82-89e0-039d0f515ac6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t2s2h"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451273 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451287 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/f446713d-03e3-461f-989f-eb6bdef32b30-audit\") pod \"apiserver-76f77b778f-z5vf2\" (UID: \"f446713d-03e3-461f-989f-eb6bdef32b30\") " pod="openshift-apiserver/apiserver-76f77b778f-z5vf2"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451310 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a402522c-e891-477d-a2cc-5aa7c6944e06-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-5jp6r\" (UID: \"a402522c-e891-477d-a2cc-5aa7c6944e06\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5jp6r"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451325 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1607f924-1e24-4848-b811-21ac3a7f8999-client-ca\") pod \"route-controller-manager-6576b87f9c-gwvk4\" (UID: \"1607f924-1e24-4848-b811-21ac3a7f8999\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gwvk4"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451342 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/18e5c8bf-9fe0-465e-af8f-9e7ec7400be8-default-certificate\") pod \"router-default-5444994796-6plhg\" (UID: \"18e5c8bf-9fe0-465e-af8f-9e7ec7400be8\") " pod="openshift-ingress/router-default-5444994796-6plhg"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451364 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b26a4d77-f170-467e-ad96-4741cc5a8f23-audit-dir\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451385 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ade74420-c7a1-4b89-b6c8-7970d7b6c17c-config\") pod \"machine-approver-56656f9798-9vdwm\" (UID: \"ade74420-c7a1-4b89-b6c8-7970d7b6c17c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9vdwm"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451402 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8w4m\" (UniqueName: \"kubernetes.io/projected/cfd8810f-79f1-4634-9e4d-245348fba016-kube-api-access-h8w4m\") pod \"openshift-controller-manager-operator-756b6f6bc6-xxjrs\" (UID: \"cfd8810f-79f1-4634-9e4d-245348fba016\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xxjrs"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451424 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451440 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ade74420-c7a1-4b89-b6c8-7970d7b6c17c-auth-proxy-config\") pod \"machine-approver-56656f9798-9vdwm\" (UID: \"ade74420-c7a1-4b89-b6c8-7970d7b6c17c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9vdwm"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451457 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cfd8810f-79f1-4634-9e4d-245348fba016-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-xxjrs\" (UID: \"cfd8810f-79f1-4634-9e4d-245348fba016\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xxjrs"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451475 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f446713d-03e3-461f-989f-eb6bdef32b30-encryption-config\") pod \"apiserver-76f77b778f-z5vf2\" (UID: \"f446713d-03e3-461f-989f-eb6bdef32b30\") " pod="openshift-apiserver/apiserver-76f77b778f-z5vf2"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451492 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a2ea84ca-5ca6-432a-aa7d-c6350e0e52e8-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-8rx7x\" (UID: \"a2ea84ca-5ca6-432a-aa7d-c6350e0e52e8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8rx7x"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451508 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b26a4d77-f170-467e-ad96-4741cc5a8f23-audit-policies\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451524 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f446713d-03e3-461f-989f-eb6bdef32b30-trusted-ca-bundle\") pod \"apiserver-76f77b778f-z5vf2\" (UID: \"f446713d-03e3-461f-989f-eb6bdef32b30\") " pod="openshift-apiserver/apiserver-76f77b778f-z5vf2"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451541 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/18e5c8bf-9fe0-465e-af8f-9e7ec7400be8-service-ca-bundle\") pod \"router-default-5444994796-6plhg\" (UID: \"18e5c8bf-9fe0-465e-af8f-9e7ec7400be8\") " pod="openshift-ingress/router-default-5444994796-6plhg"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451557 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzr6p\" (UniqueName: \"kubernetes.io/projected/18e5c8bf-9fe0-465e-af8f-9e7ec7400be8-kube-api-access-lzr6p\") pod \"router-default-5444994796-6plhg\" (UID: \"18e5c8bf-9fe0-465e-af8f-9e7ec7400be8\") " pod="openshift-ingress/router-default-5444994796-6plhg"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451575 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/22f99dde-8f14-4e43-af7d-fe6e5ec2a908-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-t4j9v\" (UID: \"22f99dde-8f14-4e43-af7d-fe6e5ec2a908\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t4j9v"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451613 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzgzk\" (UniqueName: \"kubernetes.io/projected/b26a4d77-f170-467e-ad96-4741cc5a8f23-kube-api-access-vzgzk\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451632 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f446713d-03e3-461f-989f-eb6bdef32b30-config\") pod \"apiserver-76f77b778f-z5vf2\" (UID: \"f446713d-03e3-461f-989f-eb6bdef32b30\") " pod="openshift-apiserver/apiserver-76f77b778f-z5vf2"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451649 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkqfk\" (UniqueName: \"kubernetes.io/projected/1607f924-1e24-4848-b811-21ac3a7f8999-kube-api-access-bkqfk\") pod \"route-controller-manager-6576b87f9c-gwvk4\" (UID: \"1607f924-1e24-4848-b811-21ac3a7f8999\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gwvk4"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451663 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/22f99dde-8f14-4e43-af7d-fe6e5ec2a908-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-t4j9v\" (UID: \"22f99dde-8f14-4e43-af7d-fe6e5ec2a908\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t4j9v"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451679 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f12557e-02f5-4445-988f-b19f16672e3b-service-ca-bundle\") pod \"authentication-operator-69f744f599-v9lxv\" (UID: \"6f12557e-02f5-4445-988f-b19f16672e3b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v9lxv"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451695 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9f5a2d7c-1d38-4e82-89e0-039d0f515ac6-etcd-client\") pod \"etcd-operator-b45778765-t2s2h\" (UID: \"9f5a2d7c-1d38-4e82-89e0-039d0f515ac6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t2s2h"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451709 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1607f924-1e24-4848-b811-21ac3a7f8999-serving-cert\") pod \"route-controller-manager-6576b87f9c-gwvk4\" (UID: \"1607f924-1e24-4848-b811-21ac3a7f8999\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gwvk4"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451724 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4zh6\" (UniqueName: \"kubernetes.io/projected/a2ea84ca-5ca6-432a-aa7d-c6350e0e52e8-kube-api-access-x4zh6\") pod \"cluster-samples-operator-665b6dd947-8rx7x\" (UID: \"a2ea84ca-5ca6-432a-aa7d-c6350e0e52e8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8rx7x"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451740 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f5a2d7c-1d38-4e82-89e0-039d0f515ac6-serving-cert\") pod \"etcd-operator-b45778765-t2s2h\" (UID: \"9f5a2d7c-1d38-4e82-89e0-039d0f515ac6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t2s2h"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451756 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451772 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/071d8651-2a2d-4eed-9023-cfe636be09a0-metrics-tls\") pod \"dns-operator-744455d44c-m2ntx\" (UID: \"071d8651-2a2d-4eed-9023-cfe636be09a0\") " pod="openshift-dns-operator/dns-operator-744455d44c-m2ntx"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451785 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/18e5c8bf-9fe0-465e-af8f-9e7ec7400be8-stats-auth\") pod \"router-default-5444994796-6plhg\" (UID: \"18e5c8bf-9fe0-465e-af8f-9e7ec7400be8\") " pod="openshift-ingress/router-default-5444994796-6plhg"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451806 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eedd2260-f339-4e2f-83e8-13a56cee2ce6-config\") pod \"console-operator-58897d9998-kgnxj\" (UID: \"eedd2260-f339-4e2f-83e8-13a56cee2ce6\") " pod="openshift-console-operator/console-operator-58897d9998-kgnxj"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451821 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75725\" (UniqueName: \"kubernetes.io/projected/096d3786-85e8-4fe5-82b3-57cd1be251a1-kube-api-access-75725\") pod \"machine-api-operator-5694c8668f-whtgq\" (UID: \"096d3786-85e8-4fe5-82b3-57cd1be251a1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-whtgq"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451837 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451852 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgrh2\" (UniqueName: \"kubernetes.io/projected/071d8651-2a2d-4eed-9023-cfe636be09a0-kube-api-access-kgrh2\") pod \"dns-operator-744455d44c-m2ntx\" (UID: \"071d8651-2a2d-4eed-9023-cfe636be09a0\") " pod="openshift-dns-operator/dns-operator-744455d44c-m2ntx"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451869 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451883 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6bvm\" (UniqueName: \"kubernetes.io/projected/f446713d-03e3-461f-989f-eb6bdef32b30-kube-api-access-r6bvm\") pod \"apiserver-76f77b778f-z5vf2\" (UID: \"f446713d-03e3-461f-989f-eb6bdef32b30\") " pod="openshift-apiserver/apiserver-76f77b778f-z5vf2"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451912 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/18e5c8bf-9fe0-465e-af8f-9e7ec7400be8-metrics-certs\") pod \"router-default-5444994796-6plhg\" (UID: \"18e5c8bf-9fe0-465e-af8f-9e7ec7400be8\") " pod="openshift-ingress/router-default-5444994796-6plhg"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451927 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1bf3837f-dac8-49c7-9ac2-f5d6d0a087b8-bound-sa-token\") pod \"ingress-operator-5b745b69d9-vk7r4\" (UID: \"1bf3837f-dac8-49c7-9ac2-f5d6d0a087b8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vk7r4"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451942 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/096d3786-85e8-4fe5-82b3-57cd1be251a1-config\") pod \"machine-api-operator-5694c8668f-whtgq\" (UID: \"096d3786-85e8-4fe5-82b3-57cd1be251a1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-whtgq"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451958 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eedd2260-f339-4e2f-83e8-13a56cee2ce6-trusted-ca\") pod \"console-operator-58897d9998-kgnxj\" (UID: \"eedd2260-f339-4e2f-83e8-13a56cee2ce6\") " pod="openshift-console-operator/console-operator-58897d9998-kgnxj"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451974 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f446713d-03e3-461f-989f-eb6bdef32b30-etcd-client\") pod \"apiserver-76f77b778f-z5vf2\" (UID: \"f446713d-03e3-461f-989f-eb6bdef32b30\") " pod="openshift-apiserver/apiserver-76f77b778f-z5vf2"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451987 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f5a2d7c-1d38-4e82-89e0-039d0f515ac6-config\") pod \"etcd-operator-b45778765-t2s2h\" (UID: \"9f5a2d7c-1d38-4e82-89e0-039d0f515ac6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t2s2h"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.452004 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjc96\" (UniqueName: \"kubernetes.io/projected/6f12557e-02f5-4445-988f-b19f16672e3b-kube-api-access-pjc96\") pod \"authentication-operator-69f744f599-v9lxv\" (UID: \"6f12557e-02f5-4445-988f-b19f16672e3b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v9lxv"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.452019 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfd8810f-79f1-4634-9e4d-245348fba016-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-xxjrs\" (UID: \"cfd8810f-79f1-4634-9e4d-245348fba016\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xxjrs"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.452033 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.452048 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1bf3837f-dac8-49c7-9ac2-f5d6d0a087b8-trusted-ca\") pod \"ingress-operator-5b745b69d9-vk7r4\" (UID: \"1bf3837f-dac8-49c7-9ac2-f5d6d0a087b8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vk7r4"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.452063 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.452078 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/f446713d-03e3-461f-989f-eb6bdef32b30-image-import-ca\") pod \"apiserver-76f77b778f-z5vf2\" (UID:
\"f446713d-03e3-461f-989f-eb6bdef32b30\") " pod="openshift-apiserver/apiserver-76f77b778f-z5vf2" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.452104 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1bf3837f-dac8-49c7-9ac2-f5d6d0a087b8-metrics-tls\") pod \"ingress-operator-5b745b69d9-vk7r4\" (UID: \"1bf3837f-dac8-49c7-9ac2-f5d6d0a087b8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vk7r4" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.452119 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgtzj\" (UniqueName: \"kubernetes.io/projected/22f99dde-8f14-4e43-af7d-fe6e5ec2a908-kube-api-access-rgtzj\") pod \"cluster-image-registry-operator-dc59b4c8b-t4j9v\" (UID: \"22f99dde-8f14-4e43-af7d-fe6e5ec2a908\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t4j9v" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.452133 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1607f924-1e24-4848-b811-21ac3a7f8999-config\") pod \"route-controller-manager-6576b87f9c-gwvk4\" (UID: \"1607f924-1e24-4848-b811-21ac3a7f8999\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gwvk4" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.452146 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/22f99dde-8f14-4e43-af7d-fe6e5ec2a908-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-t4j9v\" (UID: \"22f99dde-8f14-4e43-af7d-fe6e5ec2a908\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t4j9v" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.452161 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eedd2260-f339-4e2f-83e8-13a56cee2ce6-serving-cert\") pod \"console-operator-58897d9998-kgnxj\" (UID: \"eedd2260-f339-4e2f-83e8-13a56cee2ce6\") " pod="openshift-console-operator/console-operator-58897d9998-kgnxj" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.452174 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f446713d-03e3-461f-989f-eb6bdef32b30-etcd-serving-ca\") pod \"apiserver-76f77b778f-z5vf2\" (UID: \"f446713d-03e3-461f-989f-eb6bdef32b30\") " pod="openshift-apiserver/apiserver-76f77b778f-z5vf2" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.452188 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f446713d-03e3-461f-989f-eb6bdef32b30-audit-dir\") pod \"apiserver-76f77b778f-z5vf2\" (UID: \"f446713d-03e3-461f-989f-eb6bdef32b30\") " pod="openshift-apiserver/apiserver-76f77b778f-z5vf2" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.452207 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/ade74420-c7a1-4b89-b6c8-7970d7b6c17c-machine-approver-tls\") pod \"machine-approver-56656f9798-9vdwm\" (UID: \"ade74420-c7a1-4b89-b6c8-7970d7b6c17c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9vdwm" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.452222 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f12557e-02f5-4445-988f-b19f16672e3b-config\") pod \"authentication-operator-69f744f599-v9lxv\" (UID: \"6f12557e-02f5-4445-988f-b19f16672e3b\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-v9lxv" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.452723 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f12557e-02f5-4445-988f-b19f16672e3b-config\") pod \"authentication-operator-69f744f599-v9lxv\" (UID: \"6f12557e-02f5-4445-988f-b19f16672e3b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v9lxv" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.453273 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/096d3786-85e8-4fe5-82b3-57cd1be251a1-images\") pod \"machine-api-operator-5694c8668f-whtgq\" (UID: \"096d3786-85e8-4fe5-82b3-57cd1be251a1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-whtgq" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.453414 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qt7gm"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.453464 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.453476 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-98qz7"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.455116 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-7zpw2"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.456252 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-z5vf2"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.457130 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/096d3786-85e8-4fe5-82b3-57cd1be251a1-config\") pod \"machine-api-operator-5694c8668f-whtgq\" (UID: \"096d3786-85e8-4fe5-82b3-57cd1be251a1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-whtgq" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.457298 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ade74420-c7a1-4b89-b6c8-7970d7b6c17c-config\") pod \"machine-approver-56656f9798-9vdwm\" (UID: \"ade74420-c7a1-4b89-b6c8-7970d7b6c17c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9vdwm" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.457514 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6n228"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.458143 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ade74420-c7a1-4b89-b6c8-7970d7b6c17c-auth-proxy-config\") pod \"machine-approver-56656f9798-9vdwm\" (UID: \"ade74420-c7a1-4b89-b6c8-7970d7b6c17c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9vdwm" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.459060 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f12557e-02f5-4445-988f-b19f16672e3b-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-v9lxv\" (UID: \"6f12557e-02f5-4445-988f-b19f16672e3b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v9lxv" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.459050 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.459243 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5jp6r"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.459879 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/096d3786-85e8-4fe5-82b3-57cd1be251a1-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-whtgq\" (UID: \"096d3786-85e8-4fe5-82b3-57cd1be251a1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-whtgq" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.460105 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a2ea84ca-5ca6-432a-aa7d-c6350e0e52e8-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-8rx7x\" (UID: \"a2ea84ca-5ca6-432a-aa7d-c6350e0e52e8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8rx7x" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.460248 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-m2ntx"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.461187 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-z5r8j"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.462166 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-p8r99"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.463133 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xwdnn"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.463565 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f12557e-02f5-4445-988f-b19f16672e3b-service-ca-bundle\") pod 
\"authentication-operator-69f744f599-v9lxv\" (UID: \"6f12557e-02f5-4445-988f-b19f16672e3b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v9lxv" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.464539 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-nqrmd"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.465547 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-xglpf"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.466553 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-cvbms"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.467667 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556825-92fd8"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.468654 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f12557e-02f5-4445-988f-b19f16672e3b-serving-cert\") pod \"authentication-operator-69f744f599-v9lxv\" (UID: \"6f12557e-02f5-4445-988f-b19f16672e3b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v9lxv" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.468659 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/ade74420-c7a1-4b89-b6c8-7970d7b6c17c-machine-approver-tls\") pod \"machine-approver-56656f9798-9vdwm\" (UID: \"ade74420-c7a1-4b89-b6c8-7970d7b6c17c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9vdwm" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.469173 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556838-h7pkr"] Mar 13 13:59:47 crc 
kubenswrapper[4898]: I0313 13:59:47.470343 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-djn5q"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.471339 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.471848 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-rd22p"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.472973 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x4nxn"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.474412 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mrr5j"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.475575 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-hqcs6"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.476796 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-hqcs6" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.480657 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-mr499"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.481065 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-mr499" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.482229 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-hqcs6"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.498279 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.511754 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.512729 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-2ps4n"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.513285 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2ps4n" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.520737 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2ps4n"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.532020 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.552085 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.552597 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f446713d-03e3-461f-989f-eb6bdef32b30-audit-dir\") pod \"apiserver-76f77b778f-z5vf2\" (UID: \"f446713d-03e3-461f-989f-eb6bdef32b30\") " pod="openshift-apiserver/apiserver-76f77b778f-z5vf2" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.552637 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-42j8n\" (UniqueName: \"kubernetes.io/projected/a402522c-e891-477d-a2cc-5aa7c6944e06-kube-api-access-42j8n\") pod \"openshift-apiserver-operator-796bbdcf4f-5jp6r\" (UID: \"a402522c-e891-477d-a2cc-5aa7c6944e06\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5jp6r" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.552655 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f446713d-03e3-461f-989f-eb6bdef32b30-serving-cert\") pod \"apiserver-76f77b778f-z5vf2\" (UID: \"f446713d-03e3-461f-989f-eb6bdef32b30\") " pod="openshift-apiserver/apiserver-76f77b778f-z5vf2" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.552680 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.552696 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/9f5a2d7c-1d38-4e82-89e0-039d0f515ac6-etcd-service-ca\") pod \"etcd-operator-b45778765-t2s2h\" (UID: \"9f5a2d7c-1d38-4e82-89e0-039d0f515ac6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t2s2h" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.552701 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f446713d-03e3-461f-989f-eb6bdef32b30-audit-dir\") pod \"apiserver-76f77b778f-z5vf2\" (UID: \"f446713d-03e3-461f-989f-eb6bdef32b30\") " pod="openshift-apiserver/apiserver-76f77b778f-z5vf2" Mar 13 
13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.552711 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fp6m\" (UniqueName: \"kubernetes.io/projected/1bf3837f-dac8-49c7-9ac2-f5d6d0a087b8-kube-api-access-8fp6m\") pod \"ingress-operator-5b745b69d9-vk7r4\" (UID: \"1bf3837f-dac8-49c7-9ac2-f5d6d0a087b8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vk7r4" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.552772 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.552836 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f446713d-03e3-461f-989f-eb6bdef32b30-node-pullsecrets\") pod \"apiserver-76f77b778f-z5vf2\" (UID: \"f446713d-03e3-461f-989f-eb6bdef32b30\") " pod="openshift-apiserver/apiserver-76f77b778f-z5vf2" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.552866 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqfg5\" (UniqueName: \"kubernetes.io/projected/eedd2260-f339-4e2f-83e8-13a56cee2ce6-kube-api-access-gqfg5\") pod \"console-operator-58897d9998-kgnxj\" (UID: \"eedd2260-f339-4e2f-83e8-13a56cee2ce6\") " pod="openshift-console-operator/console-operator-58897d9998-kgnxj" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.552891 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.552929 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f446713d-03e3-461f-989f-eb6bdef32b30-node-pullsecrets\") pod \"apiserver-76f77b778f-z5vf2\" (UID: \"f446713d-03e3-461f-989f-eb6bdef32b30\") " pod="openshift-apiserver/apiserver-76f77b778f-z5vf2" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.552955 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksjvx\" (UniqueName: \"kubernetes.io/projected/9f5a2d7c-1d38-4e82-89e0-039d0f515ac6-kube-api-access-ksjvx\") pod \"etcd-operator-b45778765-t2s2h\" (UID: \"9f5a2d7c-1d38-4e82-89e0-039d0f515ac6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t2s2h" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.552979 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a402522c-e891-477d-a2cc-5aa7c6944e06-config\") pod \"openshift-apiserver-operator-796bbdcf4f-5jp6r\" (UID: \"a402522c-e891-477d-a2cc-5aa7c6944e06\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5jp6r" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553003 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 
13:59:47.553026 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/9f5a2d7c-1d38-4e82-89e0-039d0f515ac6-etcd-ca\") pod \"etcd-operator-b45778765-t2s2h\" (UID: \"9f5a2d7c-1d38-4e82-89e0-039d0f515ac6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t2s2h" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553049 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553069 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/f446713d-03e3-461f-989f-eb6bdef32b30-audit\") pod \"apiserver-76f77b778f-z5vf2\" (UID: \"f446713d-03e3-461f-989f-eb6bdef32b30\") " pod="openshift-apiserver/apiserver-76f77b778f-z5vf2" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553112 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a402522c-e891-477d-a2cc-5aa7c6944e06-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-5jp6r\" (UID: \"a402522c-e891-477d-a2cc-5aa7c6944e06\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5jp6r" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553134 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1607f924-1e24-4848-b811-21ac3a7f8999-client-ca\") pod \"route-controller-manager-6576b87f9c-gwvk4\" (UID: \"1607f924-1e24-4848-b811-21ac3a7f8999\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gwvk4" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553159 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/18e5c8bf-9fe0-465e-af8f-9e7ec7400be8-default-certificate\") pod \"router-default-5444994796-6plhg\" (UID: \"18e5c8bf-9fe0-465e-af8f-9e7ec7400be8\") " pod="openshift-ingress/router-default-5444994796-6plhg" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553197 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b26a4d77-f170-467e-ad96-4741cc5a8f23-audit-dir\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553222 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8w4m\" (UniqueName: \"kubernetes.io/projected/cfd8810f-79f1-4634-9e4d-245348fba016-kube-api-access-h8w4m\") pod \"openshift-controller-manager-operator-756b6f6bc6-xxjrs\" (UID: \"cfd8810f-79f1-4634-9e4d-245348fba016\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xxjrs" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553263 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553298 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/cfd8810f-79f1-4634-9e4d-245348fba016-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-xxjrs\" (UID: \"cfd8810f-79f1-4634-9e4d-245348fba016\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xxjrs" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553321 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f446713d-03e3-461f-989f-eb6bdef32b30-encryption-config\") pod \"apiserver-76f77b778f-z5vf2\" (UID: \"f446713d-03e3-461f-989f-eb6bdef32b30\") " pod="openshift-apiserver/apiserver-76f77b778f-z5vf2" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553366 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b26a4d77-f170-467e-ad96-4741cc5a8f23-audit-policies\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553389 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f446713d-03e3-461f-989f-eb6bdef32b30-trusted-ca-bundle\") pod \"apiserver-76f77b778f-z5vf2\" (UID: \"f446713d-03e3-461f-989f-eb6bdef32b30\") " pod="openshift-apiserver/apiserver-76f77b778f-z5vf2" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553410 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/18e5c8bf-9fe0-465e-af8f-9e7ec7400be8-service-ca-bundle\") pod \"router-default-5444994796-6plhg\" (UID: \"18e5c8bf-9fe0-465e-af8f-9e7ec7400be8\") " pod="openshift-ingress/router-default-5444994796-6plhg" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553435 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-lzr6p\" (UniqueName: \"kubernetes.io/projected/18e5c8bf-9fe0-465e-af8f-9e7ec7400be8-kube-api-access-lzr6p\") pod \"router-default-5444994796-6plhg\" (UID: \"18e5c8bf-9fe0-465e-af8f-9e7ec7400be8\") " pod="openshift-ingress/router-default-5444994796-6plhg" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553460 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/22f99dde-8f14-4e43-af7d-fe6e5ec2a908-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-t4j9v\" (UID: \"22f99dde-8f14-4e43-af7d-fe6e5ec2a908\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t4j9v" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553486 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzgzk\" (UniqueName: \"kubernetes.io/projected/b26a4d77-f170-467e-ad96-4741cc5a8f23-kube-api-access-vzgzk\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553507 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f446713d-03e3-461f-989f-eb6bdef32b30-config\") pod \"apiserver-76f77b778f-z5vf2\" (UID: \"f446713d-03e3-461f-989f-eb6bdef32b30\") " pod="openshift-apiserver/apiserver-76f77b778f-z5vf2" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553528 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkqfk\" (UniqueName: \"kubernetes.io/projected/1607f924-1e24-4848-b811-21ac3a7f8999-kube-api-access-bkqfk\") pod \"route-controller-manager-6576b87f9c-gwvk4\" (UID: \"1607f924-1e24-4848-b811-21ac3a7f8999\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gwvk4" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553551 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/22f99dde-8f14-4e43-af7d-fe6e5ec2a908-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-t4j9v\" (UID: \"22f99dde-8f14-4e43-af7d-fe6e5ec2a908\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t4j9v" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553579 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9f5a2d7c-1d38-4e82-89e0-039d0f515ac6-etcd-client\") pod \"etcd-operator-b45778765-t2s2h\" (UID: \"9f5a2d7c-1d38-4e82-89e0-039d0f515ac6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t2s2h" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553581 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a402522c-e891-477d-a2cc-5aa7c6944e06-config\") pod \"openshift-apiserver-operator-796bbdcf4f-5jp6r\" (UID: \"a402522c-e891-477d-a2cc-5aa7c6944e06\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5jp6r" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553599 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1607f924-1e24-4848-b811-21ac3a7f8999-serving-cert\") pod \"route-controller-manager-6576b87f9c-gwvk4\" (UID: \"1607f924-1e24-4848-b811-21ac3a7f8999\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gwvk4" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553630 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/9f5a2d7c-1d38-4e82-89e0-039d0f515ac6-etcd-service-ca\") pod \"etcd-operator-b45778765-t2s2h\" (UID: \"9f5a2d7c-1d38-4e82-89e0-039d0f515ac6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t2s2h" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553632 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f5a2d7c-1d38-4e82-89e0-039d0f515ac6-serving-cert\") pod \"etcd-operator-b45778765-t2s2h\" (UID: \"9f5a2d7c-1d38-4e82-89e0-039d0f515ac6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t2s2h" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553681 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553687 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/9f5a2d7c-1d38-4e82-89e0-039d0f515ac6-etcd-ca\") pod \"etcd-operator-b45778765-t2s2h\" (UID: \"9f5a2d7c-1d38-4e82-89e0-039d0f515ac6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t2s2h" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553699 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/071d8651-2a2d-4eed-9023-cfe636be09a0-metrics-tls\") pod \"dns-operator-744455d44c-m2ntx\" (UID: \"071d8651-2a2d-4eed-9023-cfe636be09a0\") " pod="openshift-dns-operator/dns-operator-744455d44c-m2ntx" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553730 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"stats-auth\" (UniqueName: \"kubernetes.io/secret/18e5c8bf-9fe0-465e-af8f-9e7ec7400be8-stats-auth\") pod \"router-default-5444994796-6plhg\" (UID: \"18e5c8bf-9fe0-465e-af8f-9e7ec7400be8\") " pod="openshift-ingress/router-default-5444994796-6plhg" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553771 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eedd2260-f339-4e2f-83e8-13a56cee2ce6-config\") pod \"console-operator-58897d9998-kgnxj\" (UID: \"eedd2260-f339-4e2f-83e8-13a56cee2ce6\") " pod="openshift-console-operator/console-operator-58897d9998-kgnxj" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553799 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553817 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgrh2\" (UniqueName: \"kubernetes.io/projected/071d8651-2a2d-4eed-9023-cfe636be09a0-kube-api-access-kgrh2\") pod \"dns-operator-744455d44c-m2ntx\" (UID: \"071d8651-2a2d-4eed-9023-cfe636be09a0\") " pod="openshift-dns-operator/dns-operator-744455d44c-m2ntx" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553837 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" Mar 13 13:59:47 crc 
kubenswrapper[4898]: I0313 13:59:47.553853 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6bvm\" (UniqueName: \"kubernetes.io/projected/f446713d-03e3-461f-989f-eb6bdef32b30-kube-api-access-r6bvm\") pod \"apiserver-76f77b778f-z5vf2\" (UID: \"f446713d-03e3-461f-989f-eb6bdef32b30\") " pod="openshift-apiserver/apiserver-76f77b778f-z5vf2" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553867 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/18e5c8bf-9fe0-465e-af8f-9e7ec7400be8-metrics-certs\") pod \"router-default-5444994796-6plhg\" (UID: \"18e5c8bf-9fe0-465e-af8f-9e7ec7400be8\") " pod="openshift-ingress/router-default-5444994796-6plhg" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553883 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1bf3837f-dac8-49c7-9ac2-f5d6d0a087b8-bound-sa-token\") pod \"ingress-operator-5b745b69d9-vk7r4\" (UID: \"1bf3837f-dac8-49c7-9ac2-f5d6d0a087b8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vk7r4" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553922 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eedd2260-f339-4e2f-83e8-13a56cee2ce6-trusted-ca\") pod \"console-operator-58897d9998-kgnxj\" (UID: \"eedd2260-f339-4e2f-83e8-13a56cee2ce6\") " pod="openshift-console-operator/console-operator-58897d9998-kgnxj" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553936 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f446713d-03e3-461f-989f-eb6bdef32b30-etcd-client\") pod \"apiserver-76f77b778f-z5vf2\" (UID: \"f446713d-03e3-461f-989f-eb6bdef32b30\") " pod="openshift-apiserver/apiserver-76f77b778f-z5vf2" 
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553950 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f5a2d7c-1d38-4e82-89e0-039d0f515ac6-config\") pod \"etcd-operator-b45778765-t2s2h\" (UID: \"9f5a2d7c-1d38-4e82-89e0-039d0f515ac6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t2s2h" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553974 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfd8810f-79f1-4634-9e4d-245348fba016-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-xxjrs\" (UID: \"cfd8810f-79f1-4634-9e4d-245348fba016\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xxjrs" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553989 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.554004 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1bf3837f-dac8-49c7-9ac2-f5d6d0a087b8-trusted-ca\") pod \"ingress-operator-5b745b69d9-vk7r4\" (UID: \"1bf3837f-dac8-49c7-9ac2-f5d6d0a087b8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vk7r4" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.554025 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-session\") pod 
\"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.554041 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/f446713d-03e3-461f-989f-eb6bdef32b30-image-import-ca\") pod \"apiserver-76f77b778f-z5vf2\" (UID: \"f446713d-03e3-461f-989f-eb6bdef32b30\") " pod="openshift-apiserver/apiserver-76f77b778f-z5vf2" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.554064 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1bf3837f-dac8-49c7-9ac2-f5d6d0a087b8-metrics-tls\") pod \"ingress-operator-5b745b69d9-vk7r4\" (UID: \"1bf3837f-dac8-49c7-9ac2-f5d6d0a087b8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vk7r4" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.554086 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgtzj\" (UniqueName: \"kubernetes.io/projected/22f99dde-8f14-4e43-af7d-fe6e5ec2a908-kube-api-access-rgtzj\") pod \"cluster-image-registry-operator-dc59b4c8b-t4j9v\" (UID: \"22f99dde-8f14-4e43-af7d-fe6e5ec2a908\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t4j9v" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.554111 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1607f924-1e24-4848-b811-21ac3a7f8999-config\") pod \"route-controller-manager-6576b87f9c-gwvk4\" (UID: \"1607f924-1e24-4848-b811-21ac3a7f8999\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gwvk4" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.554134 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/22f99dde-8f14-4e43-af7d-fe6e5ec2a908-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-t4j9v\" (UID: \"22f99dde-8f14-4e43-af7d-fe6e5ec2a908\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t4j9v" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.554156 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eedd2260-f339-4e2f-83e8-13a56cee2ce6-serving-cert\") pod \"console-operator-58897d9998-kgnxj\" (UID: \"eedd2260-f339-4e2f-83e8-13a56cee2ce6\") " pod="openshift-console-operator/console-operator-58897d9998-kgnxj" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.554174 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f446713d-03e3-461f-989f-eb6bdef32b30-etcd-serving-ca\") pod \"apiserver-76f77b778f-z5vf2\" (UID: \"f446713d-03e3-461f-989f-eb6bdef32b30\") " pod="openshift-apiserver/apiserver-76f77b778f-z5vf2" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.554277 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/f446713d-03e3-461f-989f-eb6bdef32b30-audit\") pod \"apiserver-76f77b778f-z5vf2\" (UID: \"f446713d-03e3-461f-989f-eb6bdef32b30\") " pod="openshift-apiserver/apiserver-76f77b778f-z5vf2" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.554324 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b26a4d77-f170-467e-ad96-4741cc5a8f23-audit-dir\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.554979 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/1607f924-1e24-4848-b811-21ac3a7f8999-client-ca\") pod \"route-controller-manager-6576b87f9c-gwvk4\" (UID: \"1607f924-1e24-4848-b811-21ac3a7f8999\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gwvk4" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.555229 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f446713d-03e3-461f-989f-eb6bdef32b30-serving-cert\") pod \"apiserver-76f77b778f-z5vf2\" (UID: \"f446713d-03e3-461f-989f-eb6bdef32b30\") " pod="openshift-apiserver/apiserver-76f77b778f-z5vf2" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.555510 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f446713d-03e3-461f-989f-eb6bdef32b30-etcd-serving-ca\") pod \"apiserver-76f77b778f-z5vf2\" (UID: \"f446713d-03e3-461f-989f-eb6bdef32b30\") " pod="openshift-apiserver/apiserver-76f77b778f-z5vf2" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.557814 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eedd2260-f339-4e2f-83e8-13a56cee2ce6-trusted-ca\") pod \"console-operator-58897d9998-kgnxj\" (UID: \"eedd2260-f339-4e2f-83e8-13a56cee2ce6\") " pod="openshift-console-operator/console-operator-58897d9998-kgnxj" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.558553 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f446713d-03e3-461f-989f-eb6bdef32b30-encryption-config\") pod \"apiserver-76f77b778f-z5vf2\" (UID: \"f446713d-03e3-461f-989f-eb6bdef32b30\") " pod="openshift-apiserver/apiserver-76f77b778f-z5vf2" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.558672 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/18e5c8bf-9fe0-465e-af8f-9e7ec7400be8-service-ca-bundle\") pod \"router-default-5444994796-6plhg\" (UID: \"18e5c8bf-9fe0-465e-af8f-9e7ec7400be8\") " pod="openshift-ingress/router-default-5444994796-6plhg" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.558782 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f5a2d7c-1d38-4e82-89e0-039d0f515ac6-serving-cert\") pod \"etcd-operator-b45778765-t2s2h\" (UID: \"9f5a2d7c-1d38-4e82-89e0-039d0f515ac6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t2s2h" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.559524 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cfd8810f-79f1-4634-9e4d-245348fba016-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-xxjrs\" (UID: \"cfd8810f-79f1-4634-9e4d-245348fba016\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xxjrs" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.559804 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f446713d-03e3-461f-989f-eb6bdef32b30-etcd-client\") pod \"apiserver-76f77b778f-z5vf2\" (UID: \"f446713d-03e3-461f-989f-eb6bdef32b30\") " pod="openshift-apiserver/apiserver-76f77b778f-z5vf2" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.559890 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/22f99dde-8f14-4e43-af7d-fe6e5ec2a908-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-t4j9v\" (UID: \"22f99dde-8f14-4e43-af7d-fe6e5ec2a908\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t4j9v" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.560086 4898 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/18e5c8bf-9fe0-465e-af8f-9e7ec7400be8-stats-auth\") pod \"router-default-5444994796-6plhg\" (UID: \"18e5c8bf-9fe0-465e-af8f-9e7ec7400be8\") " pod="openshift-ingress/router-default-5444994796-6plhg" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.560759 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfd8810f-79f1-4634-9e4d-245348fba016-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-xxjrs\" (UID: \"cfd8810f-79f1-4634-9e4d-245348fba016\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xxjrs" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.560971 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9f5a2d7c-1d38-4e82-89e0-039d0f515ac6-etcd-client\") pod \"etcd-operator-b45778765-t2s2h\" (UID: \"9f5a2d7c-1d38-4e82-89e0-039d0f515ac6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t2s2h" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.561236 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eedd2260-f339-4e2f-83e8-13a56cee2ce6-config\") pod \"console-operator-58897d9998-kgnxj\" (UID: \"eedd2260-f339-4e2f-83e8-13a56cee2ce6\") " pod="openshift-console-operator/console-operator-58897d9998-kgnxj" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.561605 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f446713d-03e3-461f-989f-eb6bdef32b30-config\") pod \"apiserver-76f77b778f-z5vf2\" (UID: \"f446713d-03e3-461f-989f-eb6bdef32b30\") " pod="openshift-apiserver/apiserver-76f77b778f-z5vf2" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.562269 4898 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/f446713d-03e3-461f-989f-eb6bdef32b30-image-import-ca\") pod \"apiserver-76f77b778f-z5vf2\" (UID: \"f446713d-03e3-461f-989f-eb6bdef32b30\") " pod="openshift-apiserver/apiserver-76f77b778f-z5vf2" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.562371 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/18e5c8bf-9fe0-465e-af8f-9e7ec7400be8-default-certificate\") pod \"router-default-5444994796-6plhg\" (UID: \"18e5c8bf-9fe0-465e-af8f-9e7ec7400be8\") " pod="openshift-ingress/router-default-5444994796-6plhg" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.562664 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f446713d-03e3-461f-989f-eb6bdef32b30-trusted-ca-bundle\") pod \"apiserver-76f77b778f-z5vf2\" (UID: \"f446713d-03e3-461f-989f-eb6bdef32b30\") " pod="openshift-apiserver/apiserver-76f77b778f-z5vf2" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.563128 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a402522c-e891-477d-a2cc-5aa7c6944e06-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-5jp6r\" (UID: \"a402522c-e891-477d-a2cc-5aa7c6944e06\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5jp6r" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.563262 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/22f99dde-8f14-4e43-af7d-fe6e5ec2a908-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-t4j9v\" (UID: \"22f99dde-8f14-4e43-af7d-fe6e5ec2a908\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t4j9v" Mar 13 13:59:47 crc 
kubenswrapper[4898]: I0313 13:59:47.563541 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eedd2260-f339-4e2f-83e8-13a56cee2ce6-serving-cert\") pod \"console-operator-58897d9998-kgnxj\" (UID: \"eedd2260-f339-4e2f-83e8-13a56cee2ce6\") " pod="openshift-console-operator/console-operator-58897d9998-kgnxj" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.563747 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1607f924-1e24-4848-b811-21ac3a7f8999-serving-cert\") pod \"route-controller-manager-6576b87f9c-gwvk4\" (UID: \"1607f924-1e24-4848-b811-21ac3a7f8999\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gwvk4" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.565003 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1607f924-1e24-4848-b811-21ac3a7f8999-config\") pod \"route-controller-manager-6576b87f9c-gwvk4\" (UID: \"1607f924-1e24-4848-b811-21ac3a7f8999\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gwvk4" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.566417 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/18e5c8bf-9fe0-465e-af8f-9e7ec7400be8-metrics-certs\") pod \"router-default-5444994796-6plhg\" (UID: \"18e5c8bf-9fe0-465e-af8f-9e7ec7400be8\") " pod="openshift-ingress/router-default-5444994796-6plhg" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.568694 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1bf3837f-dac8-49c7-9ac2-f5d6d0a087b8-trusted-ca\") pod \"ingress-operator-5b745b69d9-vk7r4\" (UID: \"1bf3837f-dac8-49c7-9ac2-f5d6d0a087b8\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vk7r4" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.570135 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1bf3837f-dac8-49c7-9ac2-f5d6d0a087b8-metrics-tls\") pod \"ingress-operator-5b745b69d9-vk7r4\" (UID: \"1bf3837f-dac8-49c7-9ac2-f5d6d0a087b8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vk7r4" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.571534 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.611254 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.616486 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/071d8651-2a2d-4eed-9023-cfe636be09a0-metrics-tls\") pod \"dns-operator-744455d44c-m2ntx\" (UID: \"071d8651-2a2d-4eed-9023-cfe636be09a0\") " pod="openshift-dns-operator/dns-operator-744455d44c-m2ntx" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.632029 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.651759 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.661065 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f5a2d7c-1d38-4e82-89e0-039d0f515ac6-config\") pod \"etcd-operator-b45778765-t2s2h\" (UID: \"9f5a2d7c-1d38-4e82-89e0-039d0f515ac6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t2s2h" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 
13:59:47.671978 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.673669 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.691339 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.695771 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.717863 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.725646 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.735768 4898 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.746547 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.751471 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.755701 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:59:47 crc kubenswrapper[4898]: E0313 13:59:47.756239 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 14:01:49.75609764 +0000 UTC m=+344.757685919 (durationBeforeRetry 2m2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.762706 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.779937 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.787013 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.791588 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.811384 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.825135 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.831543 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.851541 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.857494 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.857526 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.857545 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.857568 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.858677 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.861592 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b26a4d77-f170-467e-ad96-4741cc5a8f23-audit-policies\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.861784 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.862306 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.862530 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.872067 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.877070 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.892687 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.897449 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.912423 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.923822 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.931609 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.952767 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.958592 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.992117 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.011874 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.032507 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.052170 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.063890 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.071558 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.075334 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.080655 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.091937 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.111110 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.132201 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.152733 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.173386 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.192501 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.212445 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.232801 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.251399 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.274681 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.304419 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Mar 13 13:59:48 crc kubenswrapper[4898]: W0313 13:59:48.316068 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-bc8fedc4dfcb18561dbb855f3f92d636901e36d2e1152a743ae6370dca5742ad WatchSource:0}: Error finding container bc8fedc4dfcb18561dbb855f3f92d636901e36d2e1152a743ae6370dca5742ad: Status 404 returned error can't find the container with id bc8fedc4dfcb18561dbb855f3f92d636901e36d2e1152a743ae6370dca5742ad
Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.317192 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.331797 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.351966 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Mar 13 13:59:48 crc kubenswrapper[4898]: W0313 13:59:48.359738 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-d32d55c6f7feceb2714fa2f5fdd13c5fa15b68553291e50c6eae9aef459d0067 WatchSource:0}: Error finding container d32d55c6f7feceb2714fa2f5fdd13c5fa15b68553291e50c6eae9aef459d0067: Status 404 returned error can't find the container with id d32d55c6f7feceb2714fa2f5fdd13c5fa15b68553291e50c6eae9aef459d0067
Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.372277 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.392749 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.411649 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.429828 4898 request.go:700] Waited for 1.015170279s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/configmaps?fieldSelector=metadata.name%3Dmachine-config-operator-images&limit=500&resourceVersion=0
Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.431176 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.452625 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.472644 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.491800 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.516864 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.531415 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.551164 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.571168 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.591699 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.612098 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.631379 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.651431 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.672095 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.692020 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.712048 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.732090 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.752832 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.772555 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.793664 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.813355 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.833201 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.852572 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.873144 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.892721 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.907303 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"057a32285031b97e8136a089f968663acc6f61e7493bd8c07413977c3178b92b"}
Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.907379 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"bc8fedc4dfcb18561dbb855f3f92d636901e36d2e1152a743ae6370dca5742ad"}
Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.909536 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"c0ca998f58d0078098893b001e96d917adab7933ab6e364aad678ddf2942fdf0"}
Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.909560 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"d32d55c6f7feceb2714fa2f5fdd13c5fa15b68553291e50c6eae9aef459d0067"}
Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.909859 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.912080 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.913681 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"6ffff5ce359da09accce3736b45d0b52d9dd016da11fb2a399d00a204cb51e15"}
Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.913779 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"a925ab650703812668df21ce8bfb6cb4e7119903285412289b894645d90a70e5"}
Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.931348 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.952369 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.971395 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.992354 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.011204 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.032524 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.050916 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.071589 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.091787 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.112307 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.132309 4898 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.134158 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.134211 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.151880 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.186711 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4cvb\" (UniqueName: \"kubernetes.io/projected/ade74420-c7a1-4b89-b6c8-7970d7b6c17c-kube-api-access-j4cvb\") pod \"machine-approver-56656f9798-9vdwm\" (UID: \"ade74420-c7a1-4b89-b6c8-7970d7b6c17c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9vdwm"
Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.218179 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75725\" (UniqueName: \"kubernetes.io/projected/096d3786-85e8-4fe5-82b3-57cd1be251a1-kube-api-access-75725\") pod \"machine-api-operator-5694c8668f-whtgq\" (UID: \"096d3786-85e8-4fe5-82b3-57cd1be251a1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-whtgq"
Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.239308 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjc96\" (UniqueName: \"kubernetes.io/projected/6f12557e-02f5-4445-988f-b19f16672e3b-kube-api-access-pjc96\") pod \"authentication-operator-69f744f599-v9lxv\" (UID: \"6f12557e-02f5-4445-988f-b19f16672e3b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v9lxv"
Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.245206 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4zh6\" (UniqueName: \"kubernetes.io/projected/a2ea84ca-5ca6-432a-aa7d-c6350e0e52e8-kube-api-access-x4zh6\") pod \"cluster-samples-operator-665b6dd947-8rx7x\" (UID: \"a2ea84ca-5ca6-432a-aa7d-c6350e0e52e8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8rx7x"
Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.253407 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.271406 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.291562 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.310629 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.331568 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.351597 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.372604 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.391786 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.412011 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.426864 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9vdwm"
Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.434707 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Mar 13 13:59:49 crc kubenswrapper[4898]: W0313 13:59:49.441697 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podade74420_c7a1_4b89_b6c8_7970d7b6c17c.slice/crio-c3bb23c34123cbab89cddad63cb25b898b7ddce6417f31ab796a400379737021 WatchSource:0}: Error finding container c3bb23c34123cbab89cddad63cb25b898b7ddce6417f31ab796a400379737021: Status 404 returned error can't find the container with id c3bb23c34123cbab89cddad63cb25b898b7ddce6417f31ab796a400379737021
Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.450487 4898 request.go:700] Waited for 1.897685154s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-apiserver-operator/serviceaccounts/openshift-apiserver-operator/token
Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.451638 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-v9lxv"
Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.476679 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42j8n\" (UniqueName: \"kubernetes.io/projected/a402522c-e891-477d-a2cc-5aa7c6944e06-kube-api-access-42j8n\") pod \"openshift-apiserver-operator-796bbdcf4f-5jp6r\" (UID: \"a402522c-e891-477d-a2cc-5aa7c6944e06\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5jp6r"
Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.484643 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-whtgq"
Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.488499 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fp6m\" (UniqueName: \"kubernetes.io/projected/1bf3837f-dac8-49c7-9ac2-f5d6d0a087b8-kube-api-access-8fp6m\") pod \"ingress-operator-5b745b69d9-vk7r4\" (UID: \"1bf3837f-dac8-49c7-9ac2-f5d6d0a087b8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vk7r4"
Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.508526 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8rx7x"
Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.511890 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqfg5\" (UniqueName: \"kubernetes.io/projected/eedd2260-f339-4e2f-83e8-13a56cee2ce6-kube-api-access-gqfg5\") pod \"console-operator-58897d9998-kgnxj\" (UID: \"eedd2260-f339-4e2f-83e8-13a56cee2ce6\") " pod="openshift-console-operator/console-operator-58897d9998-kgnxj"
Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.523379 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5jp6r"
Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.527283 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksjvx\" (UniqueName: \"kubernetes.io/projected/9f5a2d7c-1d38-4e82-89e0-039d0f515ac6-kube-api-access-ksjvx\") pod \"etcd-operator-b45778765-t2s2h\" (UID: \"9f5a2d7c-1d38-4e82-89e0-039d0f515ac6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t2s2h"
Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.548384 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgtzj\" (UniqueName: \"kubernetes.io/projected/22f99dde-8f14-4e43-af7d-fe6e5ec2a908-kube-api-access-rgtzj\") pod \"cluster-image-registry-operator-dc59b4c8b-t4j9v\" (UID: \"22f99dde-8f14-4e43-af7d-fe6e5ec2a908\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t4j9v"
Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.563263 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5"
Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.568774 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzr6p\" (UniqueName: \"kubernetes.io/projected/18e5c8bf-9fe0-465e-af8f-9e7ec7400be8-kube-api-access-lzr6p\") pod \"router-default-5444994796-6plhg\" (UID: \"18e5c8bf-9fe0-465e-af8f-9e7ec7400be8\") " pod="openshift-ingress/router-default-5444994796-6plhg"
Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.586137 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6bvm\" (UniqueName: \"kubernetes.io/projected/f446713d-03e3-461f-989f-eb6bdef32b30-kube-api-access-r6bvm\") pod \"apiserver-76f77b778f-z5vf2\" (UID: \"f446713d-03e3-461f-989f-eb6bdef32b30\") " pod="openshift-apiserver/apiserver-76f77b778f-z5vf2"
Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.607835 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1bf3837f-dac8-49c7-9ac2-f5d6d0a087b8-bound-sa-token\") pod \"ingress-operator-5b745b69d9-vk7r4\" (UID: \"1bf3837f-dac8-49c7-9ac2-f5d6d0a087b8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vk7r4"
Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.609737 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-6plhg"
Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.618803 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vk7r4"
Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.626357 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-kgnxj"
Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.628523 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkqfk\" (UniqueName: \"kubernetes.io/projected/1607f924-1e24-4848-b811-21ac3a7f8999-kube-api-access-bkqfk\") pod \"route-controller-manager-6576b87f9c-gwvk4\" (UID: \"1607f924-1e24-4848-b811-21ac3a7f8999\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gwvk4"
Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.641336 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-v9lxv"]
Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.641470 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gwvk4"
Mar 13 13:59:49 crc kubenswrapper[4898]: W0313 13:59:49.643866 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18e5c8bf_9fe0_465e_af8f_9e7ec7400be8.slice/crio-07d462c95de220b26b61c71fe04f7db640c2a0648bbcb012450f83b22cbe5d4a WatchSource:0}: Error finding container 07d462c95de220b26b61c71fe04f7db640c2a0648bbcb012450f83b22cbe5d4a: Status 404 returned error can't find the container with id 07d462c95de220b26b61c71fe04f7db640c2a0648bbcb012450f83b22cbe5d4a
Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.646856 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzgzk\" (UniqueName: \"kubernetes.io/projected/b26a4d77-f170-467e-ad96-4741cc5a8f23-kube-api-access-vzgzk\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q"
Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.650537 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-z5vf2"
Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.665883 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgrh2\" (UniqueName: \"kubernetes.io/projected/071d8651-2a2d-4eed-9023-cfe636be09a0-kube-api-access-kgrh2\") pod \"dns-operator-744455d44c-m2ntx\" (UID: \"071d8651-2a2d-4eed-9023-cfe636be09a0\") " pod="openshift-dns-operator/dns-operator-744455d44c-m2ntx"
Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.689172 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-t2s2h"
Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.694224 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-m2ntx"
Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.698688 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/22f99dde-8f14-4e43-af7d-fe6e5ec2a908-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-t4j9v\" (UID: \"22f99dde-8f14-4e43-af7d-fe6e5ec2a908\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t4j9v"
Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.709445 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-djn5q"
Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.714559 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8w4m\" (UniqueName: \"kubernetes.io/projected/cfd8810f-79f1-4634-9e4d-245348fba016-kube-api-access-h8w4m\") pod \"openshift-controller-manager-operator-756b6f6bc6-xxjrs\" (UID: \"cfd8810f-79f1-4634-9e4d-245348fba016\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xxjrs"
Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.715979 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-whtgq"]
Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.792014 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-ca-trust-extracted\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228"
Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.792318 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\"
(UniqueName: \"kubernetes.io/configmap/1a01ab05-7178-48c7-892b-b91cf60432f8-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-g559r\" (UID: \"1a01ab05-7178-48c7-892b-b91cf60432f8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g559r" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.792333 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a01ab05-7178-48c7-892b-b91cf60432f8-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-g559r\" (UID: \"1a01ab05-7178-48c7-892b-b91cf60432f8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g559r" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.792349 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0ea2e803-34d0-429b-b943-ece0b9e38b63-console-oauth-config\") pod \"console-f9d7485db-7l2pm\" (UID: \"0ea2e803-34d0-429b-b943-ece0b9e38b63\") " pod="openshift-console/console-f9d7485db-7l2pm" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.792365 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1a01ab05-7178-48c7-892b-b91cf60432f8-encryption-config\") pod \"apiserver-7bbb656c7d-g559r\" (UID: \"1a01ab05-7178-48c7-892b-b91cf60432f8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g559r" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.792381 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4372b422-23c7-46bc-aec4-aef665acbda1-client-ca\") pod \"controller-manager-879f6c89f-pvbpt\" (UID: \"4372b422-23c7-46bc-aec4-aef665acbda1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pvbpt" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.792415 4898 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4372b422-23c7-46bc-aec4-aef665acbda1-serving-cert\") pod \"controller-manager-879f6c89f-pvbpt\" (UID: \"4372b422-23c7-46bc-aec4-aef665acbda1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pvbpt" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.792428 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4372b422-23c7-46bc-aec4-aef665acbda1-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-pvbpt\" (UID: \"4372b422-23c7-46bc-aec4-aef665acbda1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pvbpt" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.792444 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-registry-certificates\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.792461 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0ea2e803-34d0-429b-b943-ece0b9e38b63-service-ca\") pod \"console-f9d7485db-7l2pm\" (UID: \"0ea2e803-34d0-429b-b943-ece0b9e38b63\") " pod="openshift-console/console-f9d7485db-7l2pm" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.792480 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2w5p\" (UniqueName: \"kubernetes.io/projected/9c7e70de-de85-421c-aaeb-476450d8e0ee-kube-api-access-h2w5p\") pod \"openshift-config-operator-7777fb866f-7zpw2\" (UID: 
\"9c7e70de-de85-421c-aaeb-476450d8e0ee\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7zpw2" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.792495 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1a01ab05-7178-48c7-892b-b91cf60432f8-etcd-client\") pod \"apiserver-7bbb656c7d-g559r\" (UID: \"1a01ab05-7178-48c7-892b-b91cf60432f8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g559r" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.792512 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0ea2e803-34d0-429b-b943-ece0b9e38b63-trusted-ca-bundle\") pod \"console-f9d7485db-7l2pm\" (UID: \"0ea2e803-34d0-429b-b943-ece0b9e38b63\") " pod="openshift-console/console-f9d7485db-7l2pm" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.792529 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1a01ab05-7178-48c7-892b-b91cf60432f8-audit-policies\") pod \"apiserver-7bbb656c7d-g559r\" (UID: \"1a01ab05-7178-48c7-892b-b91cf60432f8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g559r" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.792545 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-installation-pull-secrets\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.792561 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f726v\" 
(UniqueName: \"kubernetes.io/projected/1a01ab05-7178-48c7-892b-b91cf60432f8-kube-api-access-f726v\") pod \"apiserver-7bbb656c7d-g559r\" (UID: \"1a01ab05-7178-48c7-892b-b91cf60432f8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g559r" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.792578 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vv689\" (UniqueName: \"kubernetes.io/projected/4372b422-23c7-46bc-aec4-aef665acbda1-kube-api-access-vv689\") pod \"controller-manager-879f6c89f-pvbpt\" (UID: \"4372b422-23c7-46bc-aec4-aef665acbda1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pvbpt" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.792604 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1a01ab05-7178-48c7-892b-b91cf60432f8-audit-dir\") pod \"apiserver-7bbb656c7d-g559r\" (UID: \"1a01ab05-7178-48c7-892b-b91cf60432f8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g559r" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.792631 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvtz8\" (UniqueName: \"kubernetes.io/projected/f4f26c0f-992a-4eb4-86d2-58e42a5b2b68-kube-api-access-tvtz8\") pod \"downloads-7954f5f757-cx59b\" (UID: \"f4f26c0f-992a-4eb4-86d2-58e42a5b2b68\") " pod="openshift-console/downloads-7954f5f757-cx59b" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.792647 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4372b422-23c7-46bc-aec4-aef665acbda1-config\") pod \"controller-manager-879f6c89f-pvbpt\" (UID: \"4372b422-23c7-46bc-aec4-aef665acbda1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pvbpt" Mar 13 13:59:49 crc kubenswrapper[4898]: 
I0313 13:59:49.792669 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-trusted-ca\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.792690 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0ea2e803-34d0-429b-b943-ece0b9e38b63-console-config\") pod \"console-f9d7485db-7l2pm\" (UID: \"0ea2e803-34d0-429b-b943-ece0b9e38b63\") " pod="openshift-console/console-f9d7485db-7l2pm" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.792717 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0ea2e803-34d0-429b-b943-ece0b9e38b63-oauth-serving-cert\") pod \"console-f9d7485db-7l2pm\" (UID: \"0ea2e803-34d0-429b-b943-ece0b9e38b63\") " pod="openshift-console/console-f9d7485db-7l2pm" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.792734 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gq4w8\" (UniqueName: \"kubernetes.io/projected/0ea2e803-34d0-429b-b943-ece0b9e38b63-kube-api-access-gq4w8\") pod \"console-f9d7485db-7l2pm\" (UID: \"0ea2e803-34d0-429b-b943-ece0b9e38b63\") " pod="openshift-console/console-f9d7485db-7l2pm" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.792769 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a01ab05-7178-48c7-892b-b91cf60432f8-serving-cert\") pod \"apiserver-7bbb656c7d-g559r\" (UID: \"1a01ab05-7178-48c7-892b-b91cf60432f8\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g559r" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.792790 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.792807 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xt5s8\" (UniqueName: \"kubernetes.io/projected/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-kube-api-access-xt5s8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.792822 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/9c7e70de-de85-421c-aaeb-476450d8e0ee-available-featuregates\") pod \"openshift-config-operator-7777fb866f-7zpw2\" (UID: \"9c7e70de-de85-421c-aaeb-476450d8e0ee\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7zpw2" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.792850 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c7e70de-de85-421c-aaeb-476450d8e0ee-serving-cert\") pod \"openshift-config-operator-7777fb866f-7zpw2\" (UID: \"9c7e70de-de85-421c-aaeb-476450d8e0ee\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7zpw2" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.792867 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-bound-sa-token\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.792884 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0ea2e803-34d0-429b-b943-ece0b9e38b63-console-serving-cert\") pod \"console-f9d7485db-7l2pm\" (UID: \"0ea2e803-34d0-429b-b943-ece0b9e38b63\") " pod="openshift-console/console-f9d7485db-7l2pm" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.792913 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-registry-tls\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:49 crc kubenswrapper[4898]: E0313 13:59:49.794775 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 13:59:50.294750358 +0000 UTC m=+225.296338617 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6n228" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.833147 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t4j9v" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.893562 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.893854 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0ea2e803-34d0-429b-b943-ece0b9e38b63-console-serving-cert\") pod \"console-f9d7485db-7l2pm\" (UID: \"0ea2e803-34d0-429b-b943-ece0b9e38b63\") " pod="openshift-console/console-f9d7485db-7l2pm" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.893886 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/d1225410-7280-4409-8934-c6766eae5088-signing-key\") pod \"service-ca-9c57cc56f-z5r8j\" (UID: \"d1225410-7280-4409-8934-c6766eae5088\") " pod="openshift-service-ca/service-ca-9c57cc56f-z5r8j" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.893916 4898 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-registry-tls\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.893950 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-ca-trust-extracted\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.893965 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a01ab05-7178-48c7-892b-b91cf60432f8-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-g559r\" (UID: \"1a01ab05-7178-48c7-892b-b91cf60432f8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g559r" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.894009 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztzgg\" (UniqueName: \"kubernetes.io/projected/d1225410-7280-4409-8934-c6766eae5088-kube-api-access-ztzgg\") pod \"service-ca-9c57cc56f-z5r8j\" (UID: \"d1225410-7280-4409-8934-c6766eae5088\") " pod="openshift-service-ca/service-ca-9c57cc56f-z5r8j" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.894027 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6444bf97-84ef-49df-afcd-4e939a5de2ad-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-qt7gm\" (UID: \"6444bf97-84ef-49df-afcd-4e939a5de2ad\") " 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qt7gm" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.894048 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4372b422-23c7-46bc-aec4-aef665acbda1-client-ca\") pod \"controller-manager-879f6c89f-pvbpt\" (UID: \"4372b422-23c7-46bc-aec4-aef665acbda1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pvbpt" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.894062 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c8b0b1cf-022c-4181-a957-2f7e172a3294-srv-cert\") pod \"olm-operator-6b444d44fb-k6lrz\" (UID: \"c8b0b1cf-022c-4181-a957-2f7e172a3294\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6lrz" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.894092 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/22cb0051-a6f4-4790-b51c-3da149327edd-cert\") pod \"ingress-canary-2ps4n\" (UID: \"22cb0051-a6f4-4790-b51c-3da149327edd\") " pod="openshift-ingress-canary/ingress-canary-2ps4n" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.894107 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/8675f0f0-7d3b-41d9-959e-e73f78f32c5c-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-x4nxn\" (UID: \"8675f0f0-7d3b-41d9-959e-e73f78f32c5c\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x4nxn" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.894135 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/4372b422-23c7-46bc-aec4-aef665acbda1-serving-cert\") pod \"controller-manager-879f6c89f-pvbpt\" (UID: \"4372b422-23c7-46bc-aec4-aef665acbda1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pvbpt" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.894150 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/55a96934-e740-402f-b4af-488a7eba53ae-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-xglpf\" (UID: \"55a96934-e740-402f-b4af-488a7eba53ae\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-xglpf" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.894167 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hckk7\" (UniqueName: \"kubernetes.io/projected/55a96934-e740-402f-b4af-488a7eba53ae-kube-api-access-hckk7\") pod \"multus-admission-controller-857f4d67dd-xglpf\" (UID: \"55a96934-e740-402f-b4af-488a7eba53ae\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-xglpf" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.894182 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f52c1025-32e7-4eba-8af4-5c5cce1918da-secret-volume\") pod \"collect-profiles-29556825-92fd8\" (UID: \"f52c1025-32e7-4eba-8af4-5c5cce1918da\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556825-92fd8" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.894197 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e6be656-c448-4b38-b5a8-2401ab767c54-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-xwdnn\" (UID: \"2e6be656-c448-4b38-b5a8-2401ab767c54\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xwdnn" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.894211 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/41000ce4-1a84-44de-b283-1fe0350b1c17-plugins-dir\") pod \"csi-hostpathplugin-rd22p\" (UID: \"41000ce4-1a84-44de-b283-1fe0350b1c17\") " pod="hostpath-provisioner/csi-hostpathplugin-rd22p" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.894226 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/66ebc90f-88a0-476c-98d6-c595517196b3-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xjx7v\" (UID: \"66ebc90f-88a0-476c-98d6-c595517196b3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xjx7v" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.894242 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-registry-certificates\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.894261 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/41000ce4-1a84-44de-b283-1fe0350b1c17-csi-data-dir\") pod \"csi-hostpathplugin-rd22p\" (UID: \"41000ce4-1a84-44de-b283-1fe0350b1c17\") " pod="hostpath-provisioner/csi-hostpathplugin-rd22p" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.894295 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/0ea2e803-34d0-429b-b943-ece0b9e38b63-service-ca\") pod \"console-f9d7485db-7l2pm\" (UID: \"0ea2e803-34d0-429b-b943-ece0b9e38b63\") " pod="openshift-console/console-f9d7485db-7l2pm" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.894311 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5gcl\" (UniqueName: \"kubernetes.io/projected/9af6faad-479e-481b-9f66-d074c1c20ce8-kube-api-access-b5gcl\") pod \"machine-config-operator-74547568cd-98qz7\" (UID: \"9af6faad-479e-481b-9f66-d074c1c20ce8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-98qz7" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.894326 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66ebc90f-88a0-476c-98d6-c595517196b3-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xjx7v\" (UID: \"66ebc90f-88a0-476c-98d6-c595517196b3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xjx7v" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.894351 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbhjt\" (UniqueName: \"kubernetes.io/projected/2e6be656-c448-4b38-b5a8-2401ab767c54-kube-api-access-qbhjt\") pod \"kube-storage-version-migrator-operator-b67b599dd-xwdnn\" (UID: \"2e6be656-c448-4b38-b5a8-2401ab767c54\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xwdnn" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.894383 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0ea2e803-34d0-429b-b943-ece0b9e38b63-trusted-ca-bundle\") pod \"console-f9d7485db-7l2pm\" (UID: \"0ea2e803-34d0-429b-b943-ece0b9e38b63\") " 
pod="openshift-console/console-f9d7485db-7l2pm" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.894408 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfaa00dc-cff6-47b5-878f-886fab80071b-config\") pod \"service-ca-operator-777779d784-cvbms\" (UID: \"dfaa00dc-cff6-47b5-878f-886fab80071b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cvbms" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.894426 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-installation-pull-secrets\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.894440 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dfaa00dc-cff6-47b5-878f-886fab80071b-serving-cert\") pod \"service-ca-operator-777779d784-cvbms\" (UID: \"dfaa00dc-cff6-47b5-878f-886fab80071b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cvbms" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.894518 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8chc\" (UniqueName: \"kubernetes.io/projected/a0416bba-76e3-4312-94e0-ac5b77c6ace0-kube-api-access-j8chc\") pod \"dns-default-hqcs6\" (UID: \"a0416bba-76e3-4312-94e0-ac5b77c6ace0\") " pod="openshift-dns/dns-default-hqcs6" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.894534 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: 
\"kubernetes.io/host-path/41000ce4-1a84-44de-b283-1fe0350b1c17-mountpoint-dir\") pod \"csi-hostpathplugin-rd22p\" (UID: \"41000ce4-1a84-44de-b283-1fe0350b1c17\") " pod="hostpath-provisioner/csi-hostpathplugin-rd22p" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.894567 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/41000ce4-1a84-44de-b283-1fe0350b1c17-registration-dir\") pod \"csi-hostpathplugin-rd22p\" (UID: \"41000ce4-1a84-44de-b283-1fe0350b1c17\") " pod="hostpath-provisioner/csi-hostpathplugin-rd22p" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.894581 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fk2f8\" (UniqueName: \"kubernetes.io/projected/dfaa00dc-cff6-47b5-878f-886fab80071b-kube-api-access-fk2f8\") pod \"service-ca-operator-777779d784-cvbms\" (UID: \"dfaa00dc-cff6-47b5-878f-886fab80071b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cvbms" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.894597 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7d27543e-df10-41f7-be85-dfe319aaec8a-profile-collector-cert\") pod \"catalog-operator-68c6474976-qtdtw\" (UID: \"7d27543e-df10-41f7-be85-dfe319aaec8a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qtdtw" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.894610 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/41000ce4-1a84-44de-b283-1fe0350b1c17-socket-dir\") pod \"csi-hostpathplugin-rd22p\" (UID: \"41000ce4-1a84-44de-b283-1fe0350b1c17\") " pod="hostpath-provisioner/csi-hostpathplugin-rd22p" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 
13:59:49.894627 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/d1df7055-9dee-4cde-a787-bc18a276b777-node-bootstrap-token\") pod \"machine-config-server-mr499\" (UID: \"d1df7055-9dee-4cde-a787-bc18a276b777\") " pod="openshift-machine-config-operator/machine-config-server-mr499" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.894642 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7667c5a1-aecb-4ccd-b8fd-e20c2c049472-apiservice-cert\") pod \"packageserver-d55dfcdfc-x85vd\" (UID: \"7667c5a1-aecb-4ccd-b8fd-e20c2c049472\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.894658 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4372b422-23c7-46bc-aec4-aef665acbda1-config\") pod \"controller-manager-879f6c89f-pvbpt\" (UID: \"4372b422-23c7-46bc-aec4-aef665acbda1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pvbpt" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.894701 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0ea2e803-34d0-429b-b943-ece0b9e38b63-console-config\") pod \"console-f9d7485db-7l2pm\" (UID: \"0ea2e803-34d0-429b-b943-ece0b9e38b63\") " pod="openshift-console/console-f9d7485db-7l2pm" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.894716 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7667c5a1-aecb-4ccd-b8fd-e20c2c049472-webhook-cert\") pod \"packageserver-d55dfcdfc-x85vd\" (UID: \"7667c5a1-aecb-4ccd-b8fd-e20c2c049472\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.894793 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gq4w8\" (UniqueName: \"kubernetes.io/projected/0ea2e803-34d0-429b-b943-ece0b9e38b63-kube-api-access-gq4w8\") pod \"console-f9d7485db-7l2pm\" (UID: \"0ea2e803-34d0-429b-b943-ece0b9e38b63\") " pod="openshift-console/console-f9d7485db-7l2pm" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.894857 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c8d8e11d-3717-47fd-a5c6-b8f52f19147b-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-nqrmd\" (UID: \"c8d8e11d-3717-47fd-a5c6-b8f52f19147b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nqrmd" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.894924 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xt5s8\" (UniqueName: \"kubernetes.io/projected/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-kube-api-access-xt5s8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.894942 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zr5q\" (UniqueName: \"kubernetes.io/projected/f794406f-fc28-4e2f-953d-ab45e36cc754-kube-api-access-9zr5q\") pod \"migrator-59844c95c7-c6rz7\" (UID: \"f794406f-fc28-4e2f-953d-ab45e36cc754\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-c6rz7" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.894955 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" 
(UniqueName: \"kubernetes.io/secret/7d27543e-df10-41f7-be85-dfe319aaec8a-srv-cert\") pod \"catalog-operator-68c6474976-qtdtw\" (UID: \"7d27543e-df10-41f7-be85-dfe319aaec8a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qtdtw" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.894997 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c7e70de-de85-421c-aaeb-476450d8e0ee-serving-cert\") pod \"openshift-config-operator-7777fb866f-7zpw2\" (UID: \"9c7e70de-de85-421c-aaeb-476450d8e0ee\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7zpw2" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.895038 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-bound-sa-token\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.895053 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0a78868f-1786-430d-8df8-18bb1c2019b3-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-p8r99\" (UID: \"0a78868f-1786-430d-8df8-18bb1c2019b3\") " pod="openshift-marketplace/marketplace-operator-79b997595-p8r99" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.895070 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6456920b-69b3-4ce9-9eaa-ad8e0fde2aa4-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-lwzpn\" (UID: \"6456920b-69b3-4ce9-9eaa-ad8e0fde2aa4\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lwzpn" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.895086 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/66ebc90f-88a0-476c-98d6-c595517196b3-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xjx7v\" (UID: \"66ebc90f-88a0-476c-98d6-c595517196b3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xjx7v" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.895101 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4sls\" (UniqueName: \"kubernetes.io/projected/7667c5a1-aecb-4ccd-b8fd-e20c2c049472-kube-api-access-f4sls\") pod \"packageserver-d55dfcdfc-x85vd\" (UID: \"7667c5a1-aecb-4ccd-b8fd-e20c2c049472\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.895136 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1a01ab05-7178-48c7-892b-b91cf60432f8-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-g559r\" (UID: \"1a01ab05-7178-48c7-892b-b91cf60432f8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g559r" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.895152 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrxwz\" (UniqueName: \"kubernetes.io/projected/7d27543e-df10-41f7-be85-dfe319aaec8a-kube-api-access-nrxwz\") pod \"catalog-operator-68c6474976-qtdtw\" (UID: \"7d27543e-df10-41f7-be85-dfe319aaec8a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qtdtw" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.895168 4898 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0ea2e803-34d0-429b-b943-ece0b9e38b63-console-oauth-config\") pod \"console-f9d7485db-7l2pm\" (UID: \"0ea2e803-34d0-429b-b943-ece0b9e38b63\") " pod="openshift-console/console-f9d7485db-7l2pm" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.895184 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9dnr\" (UniqueName: \"kubernetes.io/projected/d1df7055-9dee-4cde-a787-bc18a276b777-kube-api-access-k9dnr\") pod \"machine-config-server-mr499\" (UID: \"d1df7055-9dee-4cde-a787-bc18a276b777\") " pod="openshift-machine-config-operator/machine-config-server-mr499" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.895200 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9af6faad-479e-481b-9f66-d074c1c20ce8-images\") pod \"machine-config-operator-74547568cd-98qz7\" (UID: \"9af6faad-479e-481b-9f66-d074c1c20ce8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-98qz7" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.895234 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbnpf\" (UniqueName: \"kubernetes.io/projected/0a78868f-1786-430d-8df8-18bb1c2019b3-kube-api-access-rbnpf\") pod \"marketplace-operator-79b997595-p8r99\" (UID: \"0a78868f-1786-430d-8df8-18bb1c2019b3\") " pod="openshift-marketplace/marketplace-operator-79b997595-p8r99" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.895248 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6456920b-69b3-4ce9-9eaa-ad8e0fde2aa4-config\") pod \"kube-apiserver-operator-766d6c64bb-lwzpn\" (UID: \"6456920b-69b3-4ce9-9eaa-ad8e0fde2aa4\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lwzpn" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.895262 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqqxt\" (UniqueName: \"kubernetes.io/projected/c8b0b1cf-022c-4181-a957-2f7e172a3294-kube-api-access-cqqxt\") pod \"olm-operator-6b444d44fb-k6lrz\" (UID: \"c8b0b1cf-022c-4181-a957-2f7e172a3294\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6lrz" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.895278 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1a01ab05-7178-48c7-892b-b91cf60432f8-encryption-config\") pod \"apiserver-7bbb656c7d-g559r\" (UID: \"1a01ab05-7178-48c7-892b-b91cf60432f8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g559r" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.895312 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4372b422-23c7-46bc-aec4-aef665acbda1-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-pvbpt\" (UID: \"4372b422-23c7-46bc-aec4-aef665acbda1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pvbpt" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.895328 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ad924960-c3fd-4412-9b39-0723a598d86d-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-mrr5j\" (UID: \"ad924960-c3fd-4412-9b39-0723a598d86d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mrr5j" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.895396 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-t9zcr\" (UniqueName: \"kubernetes.io/projected/8675f0f0-7d3b-41d9-959e-e73f78f32c5c-kube-api-access-t9zcr\") pod \"package-server-manager-789f6589d5-x4nxn\" (UID: \"8675f0f0-7d3b-41d9-959e-e73f78f32c5c\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x4nxn" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.895419 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9af6faad-479e-481b-9f66-d074c1c20ce8-proxy-tls\") pod \"machine-config-operator-74547568cd-98qz7\" (UID: \"9af6faad-479e-481b-9f66-d074c1c20ce8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-98qz7" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.895440 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nv52f\" (UniqueName: \"kubernetes.io/projected/6444bf97-84ef-49df-afcd-4e939a5de2ad-kube-api-access-nv52f\") pod \"control-plane-machine-set-operator-78cbb6b69f-qt7gm\" (UID: \"6444bf97-84ef-49df-afcd-4e939a5de2ad\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qt7gm" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.895489 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2w5p\" (UniqueName: \"kubernetes.io/projected/9c7e70de-de85-421c-aaeb-476450d8e0ee-kube-api-access-h2w5p\") pod \"openshift-config-operator-7777fb866f-7zpw2\" (UID: \"9c7e70de-de85-421c-aaeb-476450d8e0ee\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7zpw2" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.895506 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f52c1025-32e7-4eba-8af4-5c5cce1918da-config-volume\") pod \"collect-profiles-29556825-92fd8\" 
(UID: \"f52c1025-32e7-4eba-8af4-5c5cce1918da\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556825-92fd8" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.897404 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4372b422-23c7-46bc-aec4-aef665acbda1-config\") pod \"controller-manager-879f6c89f-pvbpt\" (UID: \"4372b422-23c7-46bc-aec4-aef665acbda1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pvbpt" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.897483 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1a01ab05-7178-48c7-892b-b91cf60432f8-etcd-client\") pod \"apiserver-7bbb656c7d-g559r\" (UID: \"1a01ab05-7178-48c7-892b-b91cf60432f8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g559r" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.897532 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-285rt\" (UniqueName: \"kubernetes.io/projected/22cb0051-a6f4-4790-b51c-3da149327edd-kube-api-access-285rt\") pod \"ingress-canary-2ps4n\" (UID: \"22cb0051-a6f4-4790-b51c-3da149327edd\") " pod="openshift-ingress-canary/ingress-canary-2ps4n" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.897556 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0a78868f-1786-430d-8df8-18bb1c2019b3-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-p8r99\" (UID: \"0a78868f-1786-430d-8df8-18bb1c2019b3\") " pod="openshift-marketplace/marketplace-operator-79b997595-p8r99" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.897591 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dq68p\" (UniqueName: 
\"kubernetes.io/projected/c8d8e11d-3717-47fd-a5c6-b8f52f19147b-kube-api-access-dq68p\") pod \"machine-config-controller-84d6567774-nqrmd\" (UID: \"c8d8e11d-3717-47fd-a5c6-b8f52f19147b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nqrmd" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.897627 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1a01ab05-7178-48c7-892b-b91cf60432f8-audit-policies\") pod \"apiserver-7bbb656c7d-g559r\" (UID: \"1a01ab05-7178-48c7-892b-b91cf60432f8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g559r" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.897651 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e6be656-c448-4b38-b5a8-2401ab767c54-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-xwdnn\" (UID: \"2e6be656-c448-4b38-b5a8-2401ab767c54\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xwdnn" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.897673 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c8b0b1cf-022c-4181-a957-2f7e172a3294-profile-collector-cert\") pod \"olm-operator-6b444d44fb-k6lrz\" (UID: \"c8b0b1cf-022c-4181-a957-2f7e172a3294\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6lrz" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.897689 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9969w\" (UniqueName: \"kubernetes.io/projected/41000ce4-1a84-44de-b283-1fe0350b1c17-kube-api-access-9969w\") pod \"csi-hostpathplugin-rd22p\" (UID: \"41000ce4-1a84-44de-b283-1fe0350b1c17\") " 
pod="hostpath-provisioner/csi-hostpathplugin-rd22p" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.897733 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f726v\" (UniqueName: \"kubernetes.io/projected/1a01ab05-7178-48c7-892b-b91cf60432f8-kube-api-access-f726v\") pod \"apiserver-7bbb656c7d-g559r\" (UID: \"1a01ab05-7178-48c7-892b-b91cf60432f8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g559r" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.897753 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a0416bba-76e3-4312-94e0-ac5b77c6ace0-metrics-tls\") pod \"dns-default-hqcs6\" (UID: \"a0416bba-76e3-4312-94e0-ac5b77c6ace0\") " pod="openshift-dns/dns-default-hqcs6" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.897796 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vv689\" (UniqueName: \"kubernetes.io/projected/4372b422-23c7-46bc-aec4-aef665acbda1-kube-api-access-vv689\") pod \"controller-manager-879f6c89f-pvbpt\" (UID: \"4372b422-23c7-46bc-aec4-aef665acbda1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pvbpt" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.897812 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1a01ab05-7178-48c7-892b-b91cf60432f8-audit-dir\") pod \"apiserver-7bbb656c7d-g559r\" (UID: \"1a01ab05-7178-48c7-892b-b91cf60432f8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g559r" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.897861 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/d1225410-7280-4409-8934-c6766eae5088-signing-cabundle\") pod \"service-ca-9c57cc56f-z5r8j\" (UID: 
\"d1225410-7280-4409-8934-c6766eae5088\") " pod="openshift-service-ca/service-ca-9c57cc56f-z5r8j" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.897912 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvtz8\" (UniqueName: \"kubernetes.io/projected/f4f26c0f-992a-4eb4-86d2-58e42a5b2b68-kube-api-access-tvtz8\") pod \"downloads-7954f5f757-cx59b\" (UID: \"f4f26c0f-992a-4eb4-86d2-58e42a5b2b68\") " pod="openshift-console/downloads-7954f5f757-cx59b" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.897931 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a0416bba-76e3-4312-94e0-ac5b77c6ace0-config-volume\") pod \"dns-default-hqcs6\" (UID: \"a0416bba-76e3-4312-94e0-ac5b77c6ace0\") " pod="openshift-dns/dns-default-hqcs6" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.897985 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-trusted-ca\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.898003 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad924960-c3fd-4412-9b39-0723a598d86d-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-mrr5j\" (UID: \"ad924960-c3fd-4412-9b39-0723a598d86d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mrr5j" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.898060 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ad924960-c3fd-4412-9b39-0723a598d86d-config\") pod \"kube-controller-manager-operator-78b949d7b-mrr5j\" (UID: \"ad924960-c3fd-4412-9b39-0723a598d86d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mrr5j" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.898087 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/7667c5a1-aecb-4ccd-b8fd-e20c2c049472-tmpfs\") pod \"packageserver-d55dfcdfc-x85vd\" (UID: \"7667c5a1-aecb-4ccd-b8fd-e20c2c049472\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.898203 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0ea2e803-34d0-429b-b943-ece0b9e38b63-oauth-serving-cert\") pod \"console-f9d7485db-7l2pm\" (UID: \"0ea2e803-34d0-429b-b943-ece0b9e38b63\") " pod="openshift-console/console-f9d7485db-7l2pm" Mar 13 13:59:49 crc kubenswrapper[4898]: E0313 13:59:49.898253 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:59:50.398235637 +0000 UTC m=+225.399823866 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.898308 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6456920b-69b3-4ce9-9eaa-ad8e0fde2aa4-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-lwzpn\" (UID: \"6456920b-69b3-4ce9-9eaa-ad8e0fde2aa4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lwzpn" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.898442 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqn5q\" (UniqueName: \"kubernetes.io/projected/f52c1025-32e7-4eba-8af4-5c5cce1918da-kube-api-access-bqn5q\") pod \"collect-profiles-29556825-92fd8\" (UID: \"f52c1025-32e7-4eba-8af4-5c5cce1918da\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556825-92fd8" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.898474 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7p5d\" (UniqueName: \"kubernetes.io/projected/aa1ed4c8-e4bd-4352-bee3-404f16244ea3-kube-api-access-c7p5d\") pod \"auto-csr-approver-29556838-h7pkr\" (UID: \"aa1ed4c8-e4bd-4352-bee3-404f16244ea3\") " pod="openshift-infra/auto-csr-approver-29556838-h7pkr" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.898743 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1a01ab05-7178-48c7-892b-b91cf60432f8-serving-cert\") pod \"apiserver-7bbb656c7d-g559r\" (UID: \"1a01ab05-7178-48c7-892b-b91cf60432f8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g559r" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.898778 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/9c7e70de-de85-421c-aaeb-476450d8e0ee-available-featuregates\") pod \"openshift-config-operator-7777fb866f-7zpw2\" (UID: \"9c7e70de-de85-421c-aaeb-476450d8e0ee\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7zpw2" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.898827 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c8d8e11d-3717-47fd-a5c6-b8f52f19147b-proxy-tls\") pod \"machine-config-controller-84d6567774-nqrmd\" (UID: \"c8d8e11d-3717-47fd-a5c6-b8f52f19147b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nqrmd" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.898859 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/d1df7055-9dee-4cde-a787-bc18a276b777-certs\") pod \"machine-config-server-mr499\" (UID: \"d1df7055-9dee-4cde-a787-bc18a276b777\") " pod="openshift-machine-config-operator/machine-config-server-mr499" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.898916 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9af6faad-479e-481b-9f66-d074c1c20ce8-auth-proxy-config\") pod \"machine-config-operator-74547568cd-98qz7\" (UID: \"9af6faad-479e-481b-9f66-d074c1c20ce8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-98qz7" Mar 13 
13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.900161 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/9c7e70de-de85-421c-aaeb-476450d8e0ee-available-featuregates\") pod \"openshift-config-operator-7777fb866f-7zpw2\" (UID: \"9c7e70de-de85-421c-aaeb-476450d8e0ee\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7zpw2" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.900274 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0ea2e803-34d0-429b-b943-ece0b9e38b63-oauth-serving-cert\") pod \"console-f9d7485db-7l2pm\" (UID: \"0ea2e803-34d0-429b-b943-ece0b9e38b63\") " pod="openshift-console/console-f9d7485db-7l2pm" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.900998 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0ea2e803-34d0-429b-b943-ece0b9e38b63-console-config\") pod \"console-f9d7485db-7l2pm\" (UID: \"0ea2e803-34d0-429b-b943-ece0b9e38b63\") " pod="openshift-console/console-f9d7485db-7l2pm" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.901070 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4372b422-23c7-46bc-aec4-aef665acbda1-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-pvbpt\" (UID: \"4372b422-23c7-46bc-aec4-aef665acbda1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pvbpt" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.902851 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-trusted-ca\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.903196 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1a01ab05-7178-48c7-892b-b91cf60432f8-audit-dir\") pod \"apiserver-7bbb656c7d-g559r\" (UID: \"1a01ab05-7178-48c7-892b-b91cf60432f8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g559r" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.903214 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1a01ab05-7178-48c7-892b-b91cf60432f8-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-g559r\" (UID: \"1a01ab05-7178-48c7-892b-b91cf60432f8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g559r" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.903372 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1a01ab05-7178-48c7-892b-b91cf60432f8-audit-policies\") pod \"apiserver-7bbb656c7d-g559r\" (UID: \"1a01ab05-7178-48c7-892b-b91cf60432f8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g559r" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.904219 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-ca-trust-extracted\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.904293 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0ea2e803-34d0-429b-b943-ece0b9e38b63-trusted-ca-bundle\") pod \"console-f9d7485db-7l2pm\" (UID: \"0ea2e803-34d0-429b-b943-ece0b9e38b63\") " 
pod="openshift-console/console-f9d7485db-7l2pm" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.905047 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a01ab05-7178-48c7-892b-b91cf60432f8-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-g559r\" (UID: \"1a01ab05-7178-48c7-892b-b91cf60432f8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g559r" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.905517 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-registry-certificates\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.905646 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4372b422-23c7-46bc-aec4-aef665acbda1-client-ca\") pod \"controller-manager-879f6c89f-pvbpt\" (UID: \"4372b422-23c7-46bc-aec4-aef665acbda1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pvbpt" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.905781 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0ea2e803-34d0-429b-b943-ece0b9e38b63-service-ca\") pod \"console-f9d7485db-7l2pm\" (UID: \"0ea2e803-34d0-429b-b943-ece0b9e38b63\") " pod="openshift-console/console-f9d7485db-7l2pm" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.905867 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a01ab05-7178-48c7-892b-b91cf60432f8-serving-cert\") pod \"apiserver-7bbb656c7d-g559r\" (UID: \"1a01ab05-7178-48c7-892b-b91cf60432f8\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g559r" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.906064 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c7e70de-de85-421c-aaeb-476450d8e0ee-serving-cert\") pod \"openshift-config-operator-7777fb866f-7zpw2\" (UID: \"9c7e70de-de85-421c-aaeb-476450d8e0ee\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7zpw2" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.906730 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5jp6r"] Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.911648 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1a01ab05-7178-48c7-892b-b91cf60432f8-encryption-config\") pod \"apiserver-7bbb656c7d-g559r\" (UID: \"1a01ab05-7178-48c7-892b-b91cf60432f8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g559r" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.911818 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-registry-tls\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.911875 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-installation-pull-secrets\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.913047 4898 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0ea2e803-34d0-429b-b943-ece0b9e38b63-console-oauth-config\") pod \"console-f9d7485db-7l2pm\" (UID: \"0ea2e803-34d0-429b-b943-ece0b9e38b63\") " pod="openshift-console/console-f9d7485db-7l2pm" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.914298 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0ea2e803-34d0-429b-b943-ece0b9e38b63-console-serving-cert\") pod \"console-f9d7485db-7l2pm\" (UID: \"0ea2e803-34d0-429b-b943-ece0b9e38b63\") " pod="openshift-console/console-f9d7485db-7l2pm" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.916352 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4372b422-23c7-46bc-aec4-aef665acbda1-serving-cert\") pod \"controller-manager-879f6c89f-pvbpt\" (UID: \"4372b422-23c7-46bc-aec4-aef665acbda1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pvbpt" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.917661 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1a01ab05-7178-48c7-892b-b91cf60432f8-etcd-client\") pod \"apiserver-7bbb656c7d-g559r\" (UID: \"1a01ab05-7178-48c7-892b-b91cf60432f8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g559r" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.922519 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-6plhg" event={"ID":"18e5c8bf-9fe0-465e-af8f-9e7ec7400be8","Type":"ContainerStarted","Data":"c62d0b0db8a023eca0369719b9dd81ab2eadb339e637fa7223f0ac36593bb07b"} Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.922566 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-6plhg" 
event={"ID":"18e5c8bf-9fe0-465e-af8f-9e7ec7400be8","Type":"ContainerStarted","Data":"07d462c95de220b26b61c71fe04f7db640c2a0648bbcb012450f83b22cbe5d4a"} Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.926558 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9vdwm" event={"ID":"ade74420-c7a1-4b89-b6c8-7970d7b6c17c","Type":"ContainerStarted","Data":"a90eb9072360ab6ffcc9dd0976c83ee1c38d5248c10036f6319955fcd85b0714"} Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.926588 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9vdwm" event={"ID":"ade74420-c7a1-4b89-b6c8-7970d7b6c17c","Type":"ContainerStarted","Data":"c3bb23c34123cbab89cddad63cb25b898b7ddce6417f31ab796a400379737021"} Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.927534 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2w5p\" (UniqueName: \"kubernetes.io/projected/9c7e70de-de85-421c-aaeb-476450d8e0ee-kube-api-access-h2w5p\") pod \"openshift-config-operator-7777fb866f-7zpw2\" (UID: \"9c7e70de-de85-421c-aaeb-476450d8e0ee\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7zpw2" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.929345 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-whtgq" event={"ID":"096d3786-85e8-4fe5-82b3-57cd1be251a1","Type":"ContainerStarted","Data":"ddb16b36ce9e24c86f9837e204782820857278d7a7741a6518de23e51d82b48d"} Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.931462 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-v9lxv" event={"ID":"6f12557e-02f5-4445-988f-b19f16672e3b","Type":"ContainerStarted","Data":"10a096fb4e024f13b0375adff8f9af56e97cc2b0078dae71d54e42f6db24f3c3"} Mar 13 13:59:49 crc 
kubenswrapper[4898]: I0313 13:59:49.933879 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xxjrs" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.957326 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-bound-sa-token\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.979227 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8rx7x"] Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.987705 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f726v\" (UniqueName: \"kubernetes.io/projected/1a01ab05-7178-48c7-892b-b91cf60432f8-kube-api-access-f726v\") pod \"apiserver-7bbb656c7d-g559r\" (UID: \"1a01ab05-7178-48c7-892b-b91cf60432f8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g559r" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.000557 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztzgg\" (UniqueName: \"kubernetes.io/projected/d1225410-7280-4409-8934-c6766eae5088-kube-api-access-ztzgg\") pod \"service-ca-9c57cc56f-z5r8j\" (UID: \"d1225410-7280-4409-8934-c6766eae5088\") " pod="openshift-service-ca/service-ca-9c57cc56f-z5r8j" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.000601 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6444bf97-84ef-49df-afcd-4e939a5de2ad-control-plane-machine-set-operator-tls\") pod 
\"control-plane-machine-set-operator-78cbb6b69f-qt7gm\" (UID: \"6444bf97-84ef-49df-afcd-4e939a5de2ad\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qt7gm" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.000627 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c8b0b1cf-022c-4181-a957-2f7e172a3294-srv-cert\") pod \"olm-operator-6b444d44fb-k6lrz\" (UID: \"c8b0b1cf-022c-4181-a957-2f7e172a3294\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6lrz" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.000649 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/22cb0051-a6f4-4790-b51c-3da149327edd-cert\") pod \"ingress-canary-2ps4n\" (UID: \"22cb0051-a6f4-4790-b51c-3da149327edd\") " pod="openshift-ingress-canary/ingress-canary-2ps4n" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.000670 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/8675f0f0-7d3b-41d9-959e-e73f78f32c5c-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-x4nxn\" (UID: \"8675f0f0-7d3b-41d9-959e-e73f78f32c5c\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x4nxn" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.000695 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/55a96934-e740-402f-b4af-488a7eba53ae-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-xglpf\" (UID: \"55a96934-e740-402f-b4af-488a7eba53ae\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-xglpf" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.000720 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-hckk7\" (UniqueName: \"kubernetes.io/projected/55a96934-e740-402f-b4af-488a7eba53ae-kube-api-access-hckk7\") pod \"multus-admission-controller-857f4d67dd-xglpf\" (UID: \"55a96934-e740-402f-b4af-488a7eba53ae\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-xglpf" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.000740 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f52c1025-32e7-4eba-8af4-5c5cce1918da-secret-volume\") pod \"collect-profiles-29556825-92fd8\" (UID: \"f52c1025-32e7-4eba-8af4-5c5cce1918da\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556825-92fd8" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.000759 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/66ebc90f-88a0-476c-98d6-c595517196b3-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xjx7v\" (UID: \"66ebc90f-88a0-476c-98d6-c595517196b3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xjx7v" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.000780 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e6be656-c448-4b38-b5a8-2401ab767c54-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-xwdnn\" (UID: \"2e6be656-c448-4b38-b5a8-2401ab767c54\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xwdnn" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.000802 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/41000ce4-1a84-44de-b283-1fe0350b1c17-plugins-dir\") pod \"csi-hostpathplugin-rd22p\" (UID: \"41000ce4-1a84-44de-b283-1fe0350b1c17\") " 
pod="hostpath-provisioner/csi-hostpathplugin-rd22p" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.000824 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/41000ce4-1a84-44de-b283-1fe0350b1c17-csi-data-dir\") pod \"csi-hostpathplugin-rd22p\" (UID: \"41000ce4-1a84-44de-b283-1fe0350b1c17\") " pod="hostpath-provisioner/csi-hostpathplugin-rd22p" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.000847 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5gcl\" (UniqueName: \"kubernetes.io/projected/9af6faad-479e-481b-9f66-d074c1c20ce8-kube-api-access-b5gcl\") pod \"machine-config-operator-74547568cd-98qz7\" (UID: \"9af6faad-479e-481b-9f66-d074c1c20ce8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-98qz7" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.000866 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66ebc90f-88a0-476c-98d6-c595517196b3-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xjx7v\" (UID: \"66ebc90f-88a0-476c-98d6-c595517196b3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xjx7v" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.000888 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbhjt\" (UniqueName: \"kubernetes.io/projected/2e6be656-c448-4b38-b5a8-2401ab767c54-kube-api-access-qbhjt\") pod \"kube-storage-version-migrator-operator-b67b599dd-xwdnn\" (UID: \"2e6be656-c448-4b38-b5a8-2401ab767c54\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xwdnn" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.000974 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/dfaa00dc-cff6-47b5-878f-886fab80071b-config\") pod \"service-ca-operator-777779d784-cvbms\" (UID: \"dfaa00dc-cff6-47b5-878f-886fab80071b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cvbms" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.000996 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dfaa00dc-cff6-47b5-878f-886fab80071b-serving-cert\") pod \"service-ca-operator-777779d784-cvbms\" (UID: \"dfaa00dc-cff6-47b5-878f-886fab80071b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cvbms" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.001321 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/41000ce4-1a84-44de-b283-1fe0350b1c17-plugins-dir\") pod \"csi-hostpathplugin-rd22p\" (UID: \"41000ce4-1a84-44de-b283-1fe0350b1c17\") " pod="hostpath-provisioner/csi-hostpathplugin-rd22p" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.001758 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8chc\" (UniqueName: \"kubernetes.io/projected/a0416bba-76e3-4312-94e0-ac5b77c6ace0-kube-api-access-j8chc\") pod \"dns-default-hqcs6\" (UID: \"a0416bba-76e3-4312-94e0-ac5b77c6ace0\") " pod="openshift-dns/dns-default-hqcs6" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.001795 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/41000ce4-1a84-44de-b283-1fe0350b1c17-mountpoint-dir\") pod \"csi-hostpathplugin-rd22p\" (UID: \"41000ce4-1a84-44de-b283-1fe0350b1c17\") " pod="hostpath-provisioner/csi-hostpathplugin-rd22p" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.001824 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/41000ce4-1a84-44de-b283-1fe0350b1c17-registration-dir\") pod \"csi-hostpathplugin-rd22p\" (UID: \"41000ce4-1a84-44de-b283-1fe0350b1c17\") " pod="hostpath-provisioner/csi-hostpathplugin-rd22p" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.001848 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fk2f8\" (UniqueName: \"kubernetes.io/projected/dfaa00dc-cff6-47b5-878f-886fab80071b-kube-api-access-fk2f8\") pod \"service-ca-operator-777779d784-cvbms\" (UID: \"dfaa00dc-cff6-47b5-878f-886fab80071b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cvbms" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.001871 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7d27543e-df10-41f7-be85-dfe319aaec8a-profile-collector-cert\") pod \"catalog-operator-68c6474976-qtdtw\" (UID: \"7d27543e-df10-41f7-be85-dfe319aaec8a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qtdtw" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.001892 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/41000ce4-1a84-44de-b283-1fe0350b1c17-socket-dir\") pod \"csi-hostpathplugin-rd22p\" (UID: \"41000ce4-1a84-44de-b283-1fe0350b1c17\") " pod="hostpath-provisioner/csi-hostpathplugin-rd22p" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.001959 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/d1df7055-9dee-4cde-a787-bc18a276b777-node-bootstrap-token\") pod \"machine-config-server-mr499\" (UID: \"d1df7055-9dee-4cde-a787-bc18a276b777\") " pod="openshift-machine-config-operator/machine-config-server-mr499" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.001981 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7667c5a1-aecb-4ccd-b8fd-e20c2c049472-apiservice-cert\") pod \"packageserver-d55dfcdfc-x85vd\" (UID: \"7667c5a1-aecb-4ccd-b8fd-e20c2c049472\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.002006 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7667c5a1-aecb-4ccd-b8fd-e20c2c049472-webhook-cert\") pod \"packageserver-d55dfcdfc-x85vd\" (UID: \"7667c5a1-aecb-4ccd-b8fd-e20c2c049472\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.002052 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c8d8e11d-3717-47fd-a5c6-b8f52f19147b-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-nqrmd\" (UID: \"c8d8e11d-3717-47fd-a5c6-b8f52f19147b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nqrmd" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.002083 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.002112 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zr5q\" (UniqueName: \"kubernetes.io/projected/f794406f-fc28-4e2f-953d-ab45e36cc754-kube-api-access-9zr5q\") pod \"migrator-59844c95c7-c6rz7\" (UID: \"f794406f-fc28-4e2f-953d-ab45e36cc754\") " 
pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-c6rz7" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.002137 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7d27543e-df10-41f7-be85-dfe319aaec8a-srv-cert\") pod \"catalog-operator-68c6474976-qtdtw\" (UID: \"7d27543e-df10-41f7-be85-dfe319aaec8a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qtdtw" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.002200 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0a78868f-1786-430d-8df8-18bb1c2019b3-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-p8r99\" (UID: \"0a78868f-1786-430d-8df8-18bb1c2019b3\") " pod="openshift-marketplace/marketplace-operator-79b997595-p8r99" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.002230 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6456920b-69b3-4ce9-9eaa-ad8e0fde2aa4-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-lwzpn\" (UID: \"6456920b-69b3-4ce9-9eaa-ad8e0fde2aa4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lwzpn" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.002248 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/66ebc90f-88a0-476c-98d6-c595517196b3-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xjx7v\" (UID: \"66ebc90f-88a0-476c-98d6-c595517196b3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xjx7v" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.002266 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4sls\" 
(UniqueName: \"kubernetes.io/projected/7667c5a1-aecb-4ccd-b8fd-e20c2c049472-kube-api-access-f4sls\") pod \"packageserver-d55dfcdfc-x85vd\" (UID: \"7667c5a1-aecb-4ccd-b8fd-e20c2c049472\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.002283 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrxwz\" (UniqueName: \"kubernetes.io/projected/7d27543e-df10-41f7-be85-dfe319aaec8a-kube-api-access-nrxwz\") pod \"catalog-operator-68c6474976-qtdtw\" (UID: \"7d27543e-df10-41f7-be85-dfe319aaec8a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qtdtw" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.002300 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9dnr\" (UniqueName: \"kubernetes.io/projected/d1df7055-9dee-4cde-a787-bc18a276b777-kube-api-access-k9dnr\") pod \"machine-config-server-mr499\" (UID: \"d1df7055-9dee-4cde-a787-bc18a276b777\") " pod="openshift-machine-config-operator/machine-config-server-mr499" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.002318 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9af6faad-479e-481b-9f66-d074c1c20ce8-images\") pod \"machine-config-operator-74547568cd-98qz7\" (UID: \"9af6faad-479e-481b-9f66-d074c1c20ce8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-98qz7" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.002337 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbnpf\" (UniqueName: \"kubernetes.io/projected/0a78868f-1786-430d-8df8-18bb1c2019b3-kube-api-access-rbnpf\") pod \"marketplace-operator-79b997595-p8r99\" (UID: \"0a78868f-1786-430d-8df8-18bb1c2019b3\") " pod="openshift-marketplace/marketplace-operator-79b997595-p8r99" Mar 13 13:59:50 crc 
kubenswrapper[4898]: I0313 13:59:50.002353 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6456920b-69b3-4ce9-9eaa-ad8e0fde2aa4-config\") pod \"kube-apiserver-operator-766d6c64bb-lwzpn\" (UID: \"6456920b-69b3-4ce9-9eaa-ad8e0fde2aa4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lwzpn" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.002371 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqqxt\" (UniqueName: \"kubernetes.io/projected/c8b0b1cf-022c-4181-a957-2f7e172a3294-kube-api-access-cqqxt\") pod \"olm-operator-6b444d44fb-k6lrz\" (UID: \"c8b0b1cf-022c-4181-a957-2f7e172a3294\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6lrz" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.002398 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ad924960-c3fd-4412-9b39-0723a598d86d-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-mrr5j\" (UID: \"ad924960-c3fd-4412-9b39-0723a598d86d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mrr5j" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.002423 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9zcr\" (UniqueName: \"kubernetes.io/projected/8675f0f0-7d3b-41d9-959e-e73f78f32c5c-kube-api-access-t9zcr\") pod \"package-server-manager-789f6589d5-x4nxn\" (UID: \"8675f0f0-7d3b-41d9-959e-e73f78f32c5c\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x4nxn" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.002594 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9af6faad-479e-481b-9f66-d074c1c20ce8-proxy-tls\") pod 
\"machine-config-operator-74547568cd-98qz7\" (UID: \"9af6faad-479e-481b-9f66-d074c1c20ce8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-98qz7" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.002612 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nv52f\" (UniqueName: \"kubernetes.io/projected/6444bf97-84ef-49df-afcd-4e939a5de2ad-kube-api-access-nv52f\") pod \"control-plane-machine-set-operator-78cbb6b69f-qt7gm\" (UID: \"6444bf97-84ef-49df-afcd-4e939a5de2ad\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qt7gm" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.002634 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f52c1025-32e7-4eba-8af4-5c5cce1918da-config-volume\") pod \"collect-profiles-29556825-92fd8\" (UID: \"f52c1025-32e7-4eba-8af4-5c5cce1918da\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556825-92fd8" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.002658 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-285rt\" (UniqueName: \"kubernetes.io/projected/22cb0051-a6f4-4790-b51c-3da149327edd-kube-api-access-285rt\") pod \"ingress-canary-2ps4n\" (UID: \"22cb0051-a6f4-4790-b51c-3da149327edd\") " pod="openshift-ingress-canary/ingress-canary-2ps4n" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.002678 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0a78868f-1786-430d-8df8-18bb1c2019b3-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-p8r99\" (UID: \"0a78868f-1786-430d-8df8-18bb1c2019b3\") " pod="openshift-marketplace/marketplace-operator-79b997595-p8r99" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.002703 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-dq68p\" (UniqueName: \"kubernetes.io/projected/c8d8e11d-3717-47fd-a5c6-b8f52f19147b-kube-api-access-dq68p\") pod \"machine-config-controller-84d6567774-nqrmd\" (UID: \"c8d8e11d-3717-47fd-a5c6-b8f52f19147b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nqrmd" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.002746 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e6be656-c448-4b38-b5a8-2401ab767c54-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-xwdnn\" (UID: \"2e6be656-c448-4b38-b5a8-2401ab767c54\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xwdnn" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.002770 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c8b0b1cf-022c-4181-a957-2f7e172a3294-profile-collector-cert\") pod \"olm-operator-6b444d44fb-k6lrz\" (UID: \"c8b0b1cf-022c-4181-a957-2f7e172a3294\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6lrz" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.002792 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9969w\" (UniqueName: \"kubernetes.io/projected/41000ce4-1a84-44de-b283-1fe0350b1c17-kube-api-access-9969w\") pod \"csi-hostpathplugin-rd22p\" (UID: \"41000ce4-1a84-44de-b283-1fe0350b1c17\") " pod="hostpath-provisioner/csi-hostpathplugin-rd22p" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.002820 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a0416bba-76e3-4312-94e0-ac5b77c6ace0-metrics-tls\") pod \"dns-default-hqcs6\" (UID: \"a0416bba-76e3-4312-94e0-ac5b77c6ace0\") " 
pod="openshift-dns/dns-default-hqcs6" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.002865 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/d1225410-7280-4409-8934-c6766eae5088-signing-cabundle\") pod \"service-ca-9c57cc56f-z5r8j\" (UID: \"d1225410-7280-4409-8934-c6766eae5088\") " pod="openshift-service-ca/service-ca-9c57cc56f-z5r8j" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.002929 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a0416bba-76e3-4312-94e0-ac5b77c6ace0-config-volume\") pod \"dns-default-hqcs6\" (UID: \"a0416bba-76e3-4312-94e0-ac5b77c6ace0\") " pod="openshift-dns/dns-default-hqcs6" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.002957 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad924960-c3fd-4412-9b39-0723a598d86d-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-mrr5j\" (UID: \"ad924960-c3fd-4412-9b39-0723a598d86d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mrr5j" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.002981 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad924960-c3fd-4412-9b39-0723a598d86d-config\") pod \"kube-controller-manager-operator-78b949d7b-mrr5j\" (UID: \"ad924960-c3fd-4412-9b39-0723a598d86d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mrr5j" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.003002 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/7667c5a1-aecb-4ccd-b8fd-e20c2c049472-tmpfs\") pod \"packageserver-d55dfcdfc-x85vd\" (UID: 
\"7667c5a1-aecb-4ccd-b8fd-e20c2c049472\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.003038 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6456920b-69b3-4ce9-9eaa-ad8e0fde2aa4-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-lwzpn\" (UID: \"6456920b-69b3-4ce9-9eaa-ad8e0fde2aa4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lwzpn" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.003065 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqn5q\" (UniqueName: \"kubernetes.io/projected/f52c1025-32e7-4eba-8af4-5c5cce1918da-kube-api-access-bqn5q\") pod \"collect-profiles-29556825-92fd8\" (UID: \"f52c1025-32e7-4eba-8af4-5c5cce1918da\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556825-92fd8" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.003117 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7p5d\" (UniqueName: \"kubernetes.io/projected/aa1ed4c8-e4bd-4352-bee3-404f16244ea3-kube-api-access-c7p5d\") pod \"auto-csr-approver-29556838-h7pkr\" (UID: \"aa1ed4c8-e4bd-4352-bee3-404f16244ea3\") " pod="openshift-infra/auto-csr-approver-29556838-h7pkr" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.003158 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c8d8e11d-3717-47fd-a5c6-b8f52f19147b-proxy-tls\") pod \"machine-config-controller-84d6567774-nqrmd\" (UID: \"c8d8e11d-3717-47fd-a5c6-b8f52f19147b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nqrmd" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.003184 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: 
\"kubernetes.io/secret/d1df7055-9dee-4cde-a787-bc18a276b777-certs\") pod \"machine-config-server-mr499\" (UID: \"d1df7055-9dee-4cde-a787-bc18a276b777\") " pod="openshift-machine-config-operator/machine-config-server-mr499" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.003206 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9af6faad-479e-481b-9f66-d074c1c20ce8-auth-proxy-config\") pod \"machine-config-operator-74547568cd-98qz7\" (UID: \"9af6faad-479e-481b-9f66-d074c1c20ce8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-98qz7" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.003229 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/d1225410-7280-4409-8934-c6766eae5088-signing-key\") pod \"service-ca-9c57cc56f-z5r8j\" (UID: \"d1225410-7280-4409-8934-c6766eae5088\") " pod="openshift-service-ca/service-ca-9c57cc56f-z5r8j" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.005070 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/41000ce4-1a84-44de-b283-1fe0350b1c17-socket-dir\") pod \"csi-hostpathplugin-rd22p\" (UID: \"41000ce4-1a84-44de-b283-1fe0350b1c17\") " pod="hostpath-provisioner/csi-hostpathplugin-rd22p" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.005146 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfaa00dc-cff6-47b5-878f-886fab80071b-config\") pod \"service-ca-operator-777779d784-cvbms\" (UID: \"dfaa00dc-cff6-47b5-878f-886fab80071b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cvbms" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.005230 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: 
\"kubernetes.io/host-path/41000ce4-1a84-44de-b283-1fe0350b1c17-csi-data-dir\") pod \"csi-hostpathplugin-rd22p\" (UID: \"41000ce4-1a84-44de-b283-1fe0350b1c17\") " pod="hostpath-provisioner/csi-hostpathplugin-rd22p" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.005762 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66ebc90f-88a0-476c-98d6-c595517196b3-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xjx7v\" (UID: \"66ebc90f-88a0-476c-98d6-c595517196b3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xjx7v" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.008232 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e6be656-c448-4b38-b5a8-2401ab767c54-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-xwdnn\" (UID: \"2e6be656-c448-4b38-b5a8-2401ab767c54\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xwdnn" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.008837 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f52c1025-32e7-4eba-8af4-5c5cce1918da-config-volume\") pod \"collect-profiles-29556825-92fd8\" (UID: \"f52c1025-32e7-4eba-8af4-5c5cce1918da\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556825-92fd8" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.010401 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a0416bba-76e3-4312-94e0-ac5b77c6ace0-config-volume\") pod \"dns-default-hqcs6\" (UID: \"a0416bba-76e3-4312-94e0-ac5b77c6ace0\") " pod="openshift-dns/dns-default-hqcs6" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.010438 4898 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/41000ce4-1a84-44de-b283-1fe0350b1c17-mountpoint-dir\") pod \"csi-hostpathplugin-rd22p\" (UID: \"41000ce4-1a84-44de-b283-1fe0350b1c17\") " pod="hostpath-provisioner/csi-hostpathplugin-rd22p" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.010489 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/41000ce4-1a84-44de-b283-1fe0350b1c17-registration-dir\") pod \"csi-hostpathplugin-rd22p\" (UID: \"41000ce4-1a84-44de-b283-1fe0350b1c17\") " pod="hostpath-provisioner/csi-hostpathplugin-rd22p" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.011028 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9af6faad-479e-481b-9f66-d074c1c20ce8-images\") pod \"machine-config-operator-74547568cd-98qz7\" (UID: \"9af6faad-479e-481b-9f66-d074c1c20ce8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-98qz7" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.015061 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7667c5a1-aecb-4ccd-b8fd-e20c2c049472-webhook-cert\") pod \"packageserver-d55dfcdfc-x85vd\" (UID: \"7667c5a1-aecb-4ccd-b8fd-e20c2c049472\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.012221 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6456920b-69b3-4ce9-9eaa-ad8e0fde2aa4-config\") pod \"kube-apiserver-operator-766d6c64bb-lwzpn\" (UID: \"6456920b-69b3-4ce9-9eaa-ad8e0fde2aa4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lwzpn" Mar 13 13:59:50 crc kubenswrapper[4898]: E0313 13:59:50.012474 4898 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 13:59:50.512458202 +0000 UTC m=+225.514046441 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6n228" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.012510 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/66ebc90f-88a0-476c-98d6-c595517196b3-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xjx7v\" (UID: \"66ebc90f-88a0-476c-98d6-c595517196b3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xjx7v" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.013118 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c8d8e11d-3717-47fd-a5c6-b8f52f19147b-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-nqrmd\" (UID: \"c8d8e11d-3717-47fd-a5c6-b8f52f19147b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nqrmd" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.013242 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9af6faad-479e-481b-9f66-d074c1c20ce8-auth-proxy-config\") pod \"machine-config-operator-74547568cd-98qz7\" (UID: \"9af6faad-479e-481b-9f66-d074c1c20ce8\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-98qz7" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.013266 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/7667c5a1-aecb-4ccd-b8fd-e20c2c049472-tmpfs\") pod \"packageserver-d55dfcdfc-x85vd\" (UID: \"7667c5a1-aecb-4ccd-b8fd-e20c2c049472\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.014400 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad924960-c3fd-4412-9b39-0723a598d86d-config\") pod \"kube-controller-manager-operator-78b949d7b-mrr5j\" (UID: \"ad924960-c3fd-4412-9b39-0723a598d86d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mrr5j" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.011195 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0a78868f-1786-430d-8df8-18bb1c2019b3-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-p8r99\" (UID: \"0a78868f-1786-430d-8df8-18bb1c2019b3\") " pod="openshift-marketplace/marketplace-operator-79b997595-p8r99" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.015678 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/d1225410-7280-4409-8934-c6766eae5088-signing-cabundle\") pod \"service-ca-9c57cc56f-z5r8j\" (UID: \"d1225410-7280-4409-8934-c6766eae5088\") " pod="openshift-service-ca/service-ca-9c57cc56f-z5r8j" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.017702 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/d1df7055-9dee-4cde-a787-bc18a276b777-node-bootstrap-token\") pod 
\"machine-config-server-mr499\" (UID: \"d1df7055-9dee-4cde-a787-bc18a276b777\") " pod="openshift-machine-config-operator/machine-config-server-mr499" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.018042 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a0416bba-76e3-4312-94e0-ac5b77c6ace0-metrics-tls\") pod \"dns-default-hqcs6\" (UID: \"a0416bba-76e3-4312-94e0-ac5b77c6ace0\") " pod="openshift-dns/dns-default-hqcs6" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.018420 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7d27543e-df10-41f7-be85-dfe319aaec8a-srv-cert\") pod \"catalog-operator-68c6474976-qtdtw\" (UID: \"7d27543e-df10-41f7-be85-dfe319aaec8a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qtdtw" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.018813 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9af6faad-479e-481b-9f66-d074c1c20ce8-proxy-tls\") pod \"machine-config-operator-74547568cd-98qz7\" (UID: \"9af6faad-479e-481b-9f66-d074c1c20ce8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-98qz7" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.019384 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/d1df7055-9dee-4cde-a787-bc18a276b777-certs\") pod \"machine-config-server-mr499\" (UID: \"d1df7055-9dee-4cde-a787-bc18a276b777\") " pod="openshift-machine-config-operator/machine-config-server-mr499" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.019637 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7d27543e-df10-41f7-be85-dfe319aaec8a-profile-collector-cert\") pod 
\"catalog-operator-68c6474976-qtdtw\" (UID: \"7d27543e-df10-41f7-be85-dfe319aaec8a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qtdtw" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.020362 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/55a96934-e740-402f-b4af-488a7eba53ae-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-xglpf\" (UID: \"55a96934-e740-402f-b4af-488a7eba53ae\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-xglpf" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.020462 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e6be656-c448-4b38-b5a8-2401ab767c54-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-xwdnn\" (UID: \"2e6be656-c448-4b38-b5a8-2401ab767c54\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xwdnn" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.020555 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f52c1025-32e7-4eba-8af4-5c5cce1918da-secret-volume\") pod \"collect-profiles-29556825-92fd8\" (UID: \"f52c1025-32e7-4eba-8af4-5c5cce1918da\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556825-92fd8" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.020645 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c8b0b1cf-022c-4181-a957-2f7e172a3294-srv-cert\") pod \"olm-operator-6b444d44fb-k6lrz\" (UID: \"c8b0b1cf-022c-4181-a957-2f7e172a3294\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6lrz" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.020841 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"signing-key\" (UniqueName: \"kubernetes.io/secret/d1225410-7280-4409-8934-c6766eae5088-signing-key\") pod \"service-ca-9c57cc56f-z5r8j\" (UID: \"d1225410-7280-4409-8934-c6766eae5088\") " pod="openshift-service-ca/service-ca-9c57cc56f-z5r8j" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.021519 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6456920b-69b3-4ce9-9eaa-ad8e0fde2aa4-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-lwzpn\" (UID: \"6456920b-69b3-4ce9-9eaa-ad8e0fde2aa4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lwzpn" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.022635 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvtz8\" (UniqueName: \"kubernetes.io/projected/f4f26c0f-992a-4eb4-86d2-58e42a5b2b68-kube-api-access-tvtz8\") pod \"downloads-7954f5f757-cx59b\" (UID: \"f4f26c0f-992a-4eb4-86d2-58e42a5b2b68\") " pod="openshift-console/downloads-7954f5f757-cx59b" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.023739 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c8d8e11d-3717-47fd-a5c6-b8f52f19147b-proxy-tls\") pod \"machine-config-controller-84d6567774-nqrmd\" (UID: \"c8d8e11d-3717-47fd-a5c6-b8f52f19147b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nqrmd" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.023835 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0a78868f-1786-430d-8df8-18bb1c2019b3-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-p8r99\" (UID: \"0a78868f-1786-430d-8df8-18bb1c2019b3\") " pod="openshift-marketplace/marketplace-operator-79b997595-p8r99" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.024082 4898 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/22cb0051-a6f4-4790-b51c-3da149327edd-cert\") pod \"ingress-canary-2ps4n\" (UID: \"22cb0051-a6f4-4790-b51c-3da149327edd\") " pod="openshift-ingress-canary/ingress-canary-2ps4n" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.024150 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/8675f0f0-7d3b-41d9-959e-e73f78f32c5c-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-x4nxn\" (UID: \"8675f0f0-7d3b-41d9-959e-e73f78f32c5c\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x4nxn" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.024824 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dfaa00dc-cff6-47b5-878f-886fab80071b-serving-cert\") pod \"service-ca-operator-777779d784-cvbms\" (UID: \"dfaa00dc-cff6-47b5-878f-886fab80071b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cvbms" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.025034 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c8b0b1cf-022c-4181-a957-2f7e172a3294-profile-collector-cert\") pod \"olm-operator-6b444d44fb-k6lrz\" (UID: \"c8b0b1cf-022c-4181-a957-2f7e172a3294\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6lrz" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.035673 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6444bf97-84ef-49df-afcd-4e939a5de2ad-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-qt7gm\" (UID: \"6444bf97-84ef-49df-afcd-4e939a5de2ad\") " 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qt7gm" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.035990 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad924960-c3fd-4412-9b39-0723a598d86d-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-mrr5j\" (UID: \"ad924960-c3fd-4412-9b39-0723a598d86d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mrr5j" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.036383 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7667c5a1-aecb-4ccd-b8fd-e20c2c049472-apiservice-cert\") pod \"packageserver-d55dfcdfc-x85vd\" (UID: \"7667c5a1-aecb-4ccd-b8fd-e20c2c049472\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.036529 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vv689\" (UniqueName: \"kubernetes.io/projected/4372b422-23c7-46bc-aec4-aef665acbda1-kube-api-access-vv689\") pod \"controller-manager-879f6c89f-pvbpt\" (UID: \"4372b422-23c7-46bc-aec4-aef665acbda1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pvbpt" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.046733 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gq4w8\" (UniqueName: \"kubernetes.io/projected/0ea2e803-34d0-429b-b943-ece0b9e38b63-kube-api-access-gq4w8\") pod \"console-f9d7485db-7l2pm\" (UID: \"0ea2e803-34d0-429b-b943-ece0b9e38b63\") " pod="openshift-console/console-f9d7485db-7l2pm" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.085526 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xt5s8\" (UniqueName: 
\"kubernetes.io/projected/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-kube-api-access-xt5s8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.098865 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-cx59b" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.106816 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:59:50 crc kubenswrapper[4898]: E0313 13:59:50.106984 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:59:50.606955626 +0000 UTC m=+225.608543865 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.107152 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:50 crc kubenswrapper[4898]: E0313 13:59:50.107477 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 13:59:50.607470759 +0000 UTC m=+225.609058998 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6n228" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.124022 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztzgg\" (UniqueName: \"kubernetes.io/projected/d1225410-7280-4409-8934-c6766eae5088-kube-api-access-ztzgg\") pod \"service-ca-9c57cc56f-z5r8j\" (UID: \"d1225410-7280-4409-8934-c6766eae5088\") " pod="openshift-service-ca/service-ca-9c57cc56f-z5r8j" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.135226 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nv52f\" (UniqueName: \"kubernetes.io/projected/6444bf97-84ef-49df-afcd-4e939a5de2ad-kube-api-access-nv52f\") pod \"control-plane-machine-set-operator-78cbb6b69f-qt7gm\" (UID: \"6444bf97-84ef-49df-afcd-4e939a5de2ad\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qt7gm" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.138839 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-z5r8j" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.141540 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-7l2pm" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.143672 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-kgnxj"] Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.145499 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5gcl\" (UniqueName: \"kubernetes.io/projected/9af6faad-479e-481b-9f66-d074c1c20ce8-kube-api-access-b5gcl\") pod \"machine-config-operator-74547568cd-98qz7\" (UID: \"9af6faad-479e-481b-9f66-d074c1c20ce8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-98qz7" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.151322 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g559r" Mar 13 13:59:50 crc kubenswrapper[4898]: W0313 13:59:50.157972 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeedd2260_f339_4e2f_83e8_13a56cee2ce6.slice/crio-700c6042be9e525e8749e066e5d1abd797663a173007f1872c37d99868a2e4ae WatchSource:0}: Error finding container 700c6042be9e525e8749e066e5d1abd797663a173007f1872c37d99868a2e4ae: Status 404 returned error can't find the container with id 700c6042be9e525e8749e066e5d1abd797663a173007f1872c37d99868a2e4ae Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.168594 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbhjt\" (UniqueName: \"kubernetes.io/projected/2e6be656-c448-4b38-b5a8-2401ab767c54-kube-api-access-qbhjt\") pod \"kube-storage-version-migrator-operator-b67b599dd-xwdnn\" (UID: \"2e6be656-c448-4b38-b5a8-2401ab767c54\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xwdnn" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.188883 4898 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hckk7\" (UniqueName: \"kubernetes.io/projected/55a96934-e740-402f-b4af-488a7eba53ae-kube-api-access-hckk7\") pod \"multus-admission-controller-857f4d67dd-xglpf\" (UID: \"55a96934-e740-402f-b4af-488a7eba53ae\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-xglpf" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.189135 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7zpw2" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.199525 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-pvbpt" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.215165 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:59:50 crc kubenswrapper[4898]: E0313 13:59:50.216312 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:59:50.716294605 +0000 UTC m=+225.717882834 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.233308 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8chc\" (UniqueName: \"kubernetes.io/projected/a0416bba-76e3-4312-94e0-ac5b77c6ace0-kube-api-access-j8chc\") pod \"dns-default-hqcs6\" (UID: \"a0416bba-76e3-4312-94e0-ac5b77c6ace0\") " pod="openshift-dns/dns-default-hqcs6" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.237308 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-t2s2h"] Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.239861 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqn5q\" (UniqueName: \"kubernetes.io/projected/f52c1025-32e7-4eba-8af4-5c5cce1918da-kube-api-access-bqn5q\") pod \"collect-profiles-29556825-92fd8\" (UID: \"f52c1025-32e7-4eba-8af4-5c5cce1918da\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556825-92fd8" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.249987 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4sls\" (UniqueName: \"kubernetes.io/projected/7667c5a1-aecb-4ccd-b8fd-e20c2c049472-kube-api-access-f4sls\") pod \"packageserver-d55dfcdfc-x85vd\" (UID: \"7667c5a1-aecb-4ccd-b8fd-e20c2c049472\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.256050 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-vk7r4"] Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.258494 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-gwvk4"] Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.272539 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrxwz\" (UniqueName: \"kubernetes.io/projected/7d27543e-df10-41f7-be85-dfe319aaec8a-kube-api-access-nrxwz\") pod \"catalog-operator-68c6474976-qtdtw\" (UID: \"7d27543e-df10-41f7-be85-dfe319aaec8a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qtdtw" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.288614 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-285rt\" (UniqueName: \"kubernetes.io/projected/22cb0051-a6f4-4790-b51c-3da149327edd-kube-api-access-285rt\") pod \"ingress-canary-2ps4n\" (UID: \"22cb0051-a6f4-4790-b51c-3da149327edd\") " pod="openshift-ingress-canary/ingress-canary-2ps4n" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.292314 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t4j9v"] Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.303939 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xxjrs"] Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.315313 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7p5d\" (UniqueName: \"kubernetes.io/projected/aa1ed4c8-e4bd-4352-bee3-404f16244ea3-kube-api-access-c7p5d\") pod \"auto-csr-approver-29556838-h7pkr\" (UID: \"aa1ed4c8-e4bd-4352-bee3-404f16244ea3\") " pod="openshift-infra/auto-csr-approver-29556838-h7pkr" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.317117 4898 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:50 crc kubenswrapper[4898]: E0313 13:59:50.317413 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 13:59:50.817401847 +0000 UTC m=+225.818990086 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6n228" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.333174 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dq68p\" (UniqueName: \"kubernetes.io/projected/c8d8e11d-3717-47fd-a5c6-b8f52f19147b-kube-api-access-dq68p\") pod \"machine-config-controller-84d6567774-nqrmd\" (UID: \"c8d8e11d-3717-47fd-a5c6-b8f52f19147b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nqrmd" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.342922 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xwdnn" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.347608 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9dnr\" (UniqueName: \"kubernetes.io/projected/d1df7055-9dee-4cde-a787-bc18a276b777-kube-api-access-k9dnr\") pod \"machine-config-server-mr499\" (UID: \"d1df7055-9dee-4cde-a787-bc18a276b777\") " pod="openshift-machine-config-operator/machine-config-server-mr499" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.359113 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qt7gm" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.367597 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nqrmd" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.374949 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fk2f8\" (UniqueName: \"kubernetes.io/projected/dfaa00dc-cff6-47b5-878f-886fab80071b-kube-api-access-fk2f8\") pod \"service-ca-operator-777779d784-cvbms\" (UID: \"dfaa00dc-cff6-47b5-878f-886fab80071b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cvbms" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.376645 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-98qz7" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.392315 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-xglpf" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.398172 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbnpf\" (UniqueName: \"kubernetes.io/projected/0a78868f-1786-430d-8df8-18bb1c2019b3-kube-api-access-rbnpf\") pod \"marketplace-operator-79b997595-p8r99\" (UID: \"0a78868f-1786-430d-8df8-18bb1c2019b3\") " pod="openshift-marketplace/marketplace-operator-79b997595-p8r99" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.398566 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-m2ntx"] Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.399510 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-cx59b"] Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.402322 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qtdtw" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.408298 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-z5vf2"] Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.412840 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqqxt\" (UniqueName: \"kubernetes.io/projected/c8b0b1cf-022c-4181-a957-2f7e172a3294-kube-api-access-cqqxt\") pod \"olm-operator-6b444d44fb-k6lrz\" (UID: \"c8b0b1cf-022c-4181-a957-2f7e172a3294\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6lrz" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.417624 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-djn5q"] Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.417993 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:59:50 crc kubenswrapper[4898]: E0313 13:59:50.418383 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:59:50.918365466 +0000 UTC m=+225.919953705 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.428101 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ad924960-c3fd-4412-9b39-0723a598d86d-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-mrr5j\" (UID: \"ad924960-c3fd-4412-9b39-0723a598d86d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mrr5j" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.432014 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6lrz" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.432438 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.446731 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-cvbms" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.453724 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9zcr\" (UniqueName: \"kubernetes.io/projected/8675f0f0-7d3b-41d9-959e-e73f78f32c5c-kube-api-access-t9zcr\") pod \"package-server-manager-789f6589d5-x4nxn\" (UID: \"8675f0f0-7d3b-41d9-959e-e73f78f32c5c\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x4nxn" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.454705 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mrr5j" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.466060 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zr5q\" (UniqueName: \"kubernetes.io/projected/f794406f-fc28-4e2f-953d-ab45e36cc754-kube-api-access-9zr5q\") pod \"migrator-59844c95c7-c6rz7\" (UID: \"f794406f-fc28-4e2f-953d-ab45e36cc754\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-c6rz7" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.477544 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556838-h7pkr" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.486466 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556825-92fd8" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.487487 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9969w\" (UniqueName: \"kubernetes.io/projected/41000ce4-1a84-44de-b283-1fe0350b1c17-kube-api-access-9969w\") pod \"csi-hostpathplugin-rd22p\" (UID: \"41000ce4-1a84-44de-b283-1fe0350b1c17\") " pod="hostpath-provisioner/csi-hostpathplugin-rd22p" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.491359 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-rd22p" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.507670 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-hqcs6" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.515279 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6456920b-69b3-4ce9-9eaa-ad8e0fde2aa4-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-lwzpn\" (UID: \"6456920b-69b3-4ce9-9eaa-ad8e0fde2aa4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lwzpn" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.515521 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-mr499" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.520253 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:50 crc kubenswrapper[4898]: E0313 13:59:50.520595 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 13:59:51.020580695 +0000 UTC m=+226.022168934 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6n228" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.521721 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2ps4n" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.529744 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/66ebc90f-88a0-476c-98d6-c595517196b3-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xjx7v\" (UID: \"66ebc90f-88a0-476c-98d6-c595517196b3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xjx7v" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.530294 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-7zpw2"] Mar 13 13:59:50 crc kubenswrapper[4898]: W0313 13:59:50.570275 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c7e70de_de85_421c_aaeb_476450d8e0ee.slice/crio-c742ce810138484734e421760d4b718ccb00b7e51da4d364477ac71eaf59de83 WatchSource:0}: Error finding container c742ce810138484734e421760d4b718ccb00b7e51da4d364477ac71eaf59de83: Status 404 returned error can't find the container with id c742ce810138484734e421760d4b718ccb00b7e51da4d364477ac71eaf59de83 Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.593483 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pvbpt"] Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.611498 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-6plhg" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.619060 4898 patch_prober.go:28] interesting pod/router-default-5444994796-6plhg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 13:59:50 crc kubenswrapper[4898]: 
[-]has-synced failed: reason withheld Mar 13 13:59:50 crc kubenswrapper[4898]: [+]process-running ok Mar 13 13:59:50 crc kubenswrapper[4898]: healthz check failed Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.619130 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6plhg" podUID="18e5c8bf-9fe0-465e-af8f-9e7ec7400be8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.621581 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:59:50 crc kubenswrapper[4898]: E0313 13:59:50.622174 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:59:51.122153388 +0000 UTC m=+226.123741627 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.627036 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xjx7v" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.636291 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-7l2pm"] Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.642087 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lwzpn" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.666157 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-c6rz7" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.677306 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-z5r8j"] Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.683910 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-p8r99" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.711541 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x4nxn" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.717445 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-g559r"] Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.717489 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qt7gm"] Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.723304 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:50 crc kubenswrapper[4898]: E0313 13:59:50.727579 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 13:59:51.227545372 +0000 UTC m=+226.229133611 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6n228" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:50 crc kubenswrapper[4898]: W0313 13:59:50.742890 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4372b422_23c7_46bc_aec4_aef665acbda1.slice/crio-ee22c587d8deac6e60488ead6927720218bbd5dcec6d2870cf91f10d1559c75e WatchSource:0}: Error finding container ee22c587d8deac6e60488ead6927720218bbd5dcec6d2870cf91f10d1559c75e: Status 404 returned error can't find the container with id ee22c587d8deac6e60488ead6927720218bbd5dcec6d2870cf91f10d1559c75e Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.822199 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-nqrmd"] Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.829422 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:59:50 crc kubenswrapper[4898]: E0313 13:59:50.829549 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:59:51.329531246 +0000 UTC m=+226.331119485 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.829715 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:50 crc kubenswrapper[4898]: E0313 13:59:50.830024 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 13:59:51.330013007 +0000 UTC m=+226.331601246 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6n228" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.872563 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xwdnn"] Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.930274 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6lrz"] Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.930314 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:59:50 crc kubenswrapper[4898]: E0313 13:59:50.930438 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:59:51.430415742 +0000 UTC m=+226.432003981 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.930607 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:50 crc kubenswrapper[4898]: E0313 13:59:50.931040 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 13:59:51.431024887 +0000 UTC m=+226.432613126 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6n228" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.959920 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qt7gm" event={"ID":"6444bf97-84ef-49df-afcd-4e939a5de2ad","Type":"ContainerStarted","Data":"85585555f05f37157815eee5486f1b53c522c569866b3a4a908126f95eccb25f"} Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.970716 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-98qz7"] Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.972656 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xxjrs" event={"ID":"cfd8810f-79f1-4634-9e4d-245348fba016","Type":"ContainerStarted","Data":"38c5e00ca050df6338a3ac23b3ae7f41a44bcf7f34981305589959f66bacb4e1"} Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.972710 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xxjrs" event={"ID":"cfd8810f-79f1-4634-9e4d-245348fba016","Type":"ContainerStarted","Data":"9f2f67c90828e74895837ab966c82bca169abd9d449ed4e19462c03b31540ac5"} Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.974617 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8rx7x" 
event={"ID":"a2ea84ca-5ca6-432a-aa7d-c6350e0e52e8","Type":"ContainerStarted","Data":"40db5ead3f35fd7e9f31497cd14e9da8aacbec45b14d38854af9198357e129f3"} Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.974645 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8rx7x" event={"ID":"a2ea84ca-5ca6-432a-aa7d-c6350e0e52e8","Type":"ContainerStarted","Data":"54669a4eb32ce5f819f2ab3e41eeb9e584dc1c1637b7ae1b674adabfa2f3697e"} Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.995183 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd"] Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.001065 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5jp6r" event={"ID":"a402522c-e891-477d-a2cc-5aa7c6944e06","Type":"ContainerStarted","Data":"b71d37a9b192a568417ada3e35a3e4ffd2df69d4e7eb13179c9c41cb0c662f84"} Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.001102 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5jp6r" event={"ID":"a402522c-e891-477d-a2cc-5aa7c6944e06","Type":"ContainerStarted","Data":"8c08024d3c0de35ea9f364b67796b99f75c6db15f5a6f92c9b72d29958091ed3"} Mar 13 13:59:51 crc kubenswrapper[4898]: W0313 13:59:51.001872 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8d8e11d_3717_47fd_a5c6_b8f52f19147b.slice/crio-4e827400971be56892c4197993ef9f19ebab3fc7960d56bb3ff4253587e132b4 WatchSource:0}: Error finding container 4e827400971be56892c4197993ef9f19ebab3fc7960d56bb3ff4253587e132b4: Status 404 returned error can't find the container with id 4e827400971be56892c4197993ef9f19ebab3fc7960d56bb3ff4253587e132b4 Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 
13:59:51.004911 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-t2s2h" event={"ID":"9f5a2d7c-1d38-4e82-89e0-039d0f515ac6","Type":"ContainerStarted","Data":"f1c032477911f00a8677de891818a57f20a171c3f57edeaa089ea6f30ea56258"} Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.004945 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-t2s2h" event={"ID":"9f5a2d7c-1d38-4e82-89e0-039d0f515ac6","Type":"ContainerStarted","Data":"8d6f6a844f1f050f625f8cbe23c894e4c729aaad2381b89fd751edb96ef80439"} Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.019668 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gwvk4" event={"ID":"1607f924-1e24-4848-b811-21ac3a7f8999","Type":"ContainerStarted","Data":"4b7bee413921e7c7eec939e46b8977335a1baead35b676d6082b18bb10857c3e"} Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.019705 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gwvk4" event={"ID":"1607f924-1e24-4848-b811-21ac3a7f8999","Type":"ContainerStarted","Data":"ea436643cbf70a5d9613211ff3822168c4bccbc29674c27f320c9a9a2e6fcdbf"} Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.020148 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gwvk4" Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.025010 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g559r" event={"ID":"1a01ab05-7178-48c7-892b-b91cf60432f8","Type":"ContainerStarted","Data":"73126ce78a1c88f42bd877eae252c360d01ab5e1603e33c3d7b203df67250a9e"} Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.031509 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:59:51 crc kubenswrapper[4898]: E0313 13:59:51.031926 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:59:51.531906174 +0000 UTC m=+226.533494403 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.034746 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-kgnxj" event={"ID":"eedd2260-f339-4e2f-83e8-13a56cee2ce6","Type":"ContainerStarted","Data":"3b8a8b92fcb6d2069c22b60f0771b1cd2665489593ac838d1c6a3929d94d05cf"} Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.034782 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-kgnxj" event={"ID":"eedd2260-f339-4e2f-83e8-13a56cee2ce6","Type":"ContainerStarted","Data":"700c6042be9e525e8749e066e5d1abd797663a173007f1872c37d99868a2e4ae"} Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.035115 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-kgnxj" Mar 13 13:59:51 crc 
kubenswrapper[4898]: I0313 13:59:51.038471 4898 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-gwvk4 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.23:8443/healthz\": dial tcp 10.217.0.23:8443: connect: connection refused" start-of-body= Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.038516 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gwvk4" podUID="1607f924-1e24-4848-b811-21ac3a7f8999" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.23:8443/healthz\": dial tcp 10.217.0.23:8443: connect: connection refused" Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.038936 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t4j9v" event={"ID":"22f99dde-8f14-4e43-af7d-fe6e5ec2a908","Type":"ContainerStarted","Data":"25cc4ffc6f33a805f22b0eea26475a9975c83c06bae94a8b4f47b778d0824cd3"} Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.038969 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t4j9v" event={"ID":"22f99dde-8f14-4e43-af7d-fe6e5ec2a908","Type":"ContainerStarted","Data":"c952792a9aea146bfa3cac8b34ea6e3871545424014dd47e14dc032521c2bd37"} Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.042659 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" event={"ID":"b26a4d77-f170-467e-ad96-4741cc5a8f23","Type":"ContainerStarted","Data":"2d5714977afe363a0af3e9631742fe59289a173d68823730a2b889e9e03736c1"} Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.048971 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-pvbpt" 
event={"ID":"4372b422-23c7-46bc-aec4-aef665acbda1","Type":"ContainerStarted","Data":"ee22c587d8deac6e60488ead6927720218bbd5dcec6d2870cf91f10d1559c75e"} Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.050165 4898 patch_prober.go:28] interesting pod/console-operator-58897d9998-kgnxj container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/readyz\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.050227 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-kgnxj" podUID="eedd2260-f339-4e2f-83e8-13a56cee2ce6" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.13:8443/readyz\": dial tcp 10.217.0.13:8443: connect: connection refused" Mar 13 13:59:51 crc kubenswrapper[4898]: W0313 13:59:51.062379 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8b0b1cf_022c_4181_a957_2f7e172a3294.slice/crio-34583260d4b9f0b9859c8abe64e8c439dab719ff163e6f9eeb1d0488f02e5215 WatchSource:0}: Error finding container 34583260d4b9f0b9859c8abe64e8c439dab719ff163e6f9eeb1d0488f02e5215: Status 404 returned error can't find the container with id 34583260d4b9f0b9859c8abe64e8c439dab719ff163e6f9eeb1d0488f02e5215 Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.074314 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-m2ntx" event={"ID":"071d8651-2a2d-4eed-9023-cfe636be09a0","Type":"ContainerStarted","Data":"e37479bda13beaf24311527eafecb516274243b704d6fb5081a97097ad473802"} Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.078451 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qtdtw"] Mar 13 13:59:51 crc 
kubenswrapper[4898]: I0313 13:59:51.103054 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-whtgq" event={"ID":"096d3786-85e8-4fe5-82b3-57cd1be251a1","Type":"ContainerStarted","Data":"4b483eb3df207667ce0ef08ecc9d5f04b00b1a650240b528c865cd0dd84c46c9"} Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.103092 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-whtgq" event={"ID":"096d3786-85e8-4fe5-82b3-57cd1be251a1","Type":"ContainerStarted","Data":"126fe2722619d1ccfd5aab732a8d0103c0af471c906f8945e4a069432feb1124"} Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.126287 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-z5r8j" event={"ID":"d1225410-7280-4409-8934-c6766eae5088","Type":"ContainerStarted","Data":"33593c468815b9c8e9f15ac21cf80464c097a31540fa561b5bb4e34d82d97c79"} Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.135101 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:51 crc kubenswrapper[4898]: E0313 13:59:51.136843 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 13:59:51.636828937 +0000 UTC m=+226.638417176 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6n228" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.138657 4898 ???:1] "http: TLS handshake error from 192.168.126.11:44062: no serving certificate available for the kubelet" Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.149127 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-cx59b" event={"ID":"f4f26c0f-992a-4eb4-86d2-58e42a5b2b68","Type":"ContainerStarted","Data":"4ab7d6b85e3099e796ecabfa9fc2660711ba6296b5f8685afad4dadd2df7e123"} Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.154669 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9vdwm" event={"ID":"ade74420-c7a1-4b89-b6c8-7970d7b6c17c","Type":"ContainerStarted","Data":"ffd655709f699fd50969931ddb740298c0099a8571dae80aacbc1b11abed3487"} Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.163861 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vk7r4" event={"ID":"1bf3837f-dac8-49c7-9ac2-f5d6d0a087b8","Type":"ContainerStarted","Data":"c5207dc2392a3957a9d88bb0f82c2a6de94b77565a714483cee8bf8ce748578b"} Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.163921 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vk7r4" event={"ID":"1bf3837f-dac8-49c7-9ac2-f5d6d0a087b8","Type":"ContainerStarted","Data":"fef57ee8d21a36c6a5d2018e3226c94863ce917c74665f33796c13791d23df10"} Mar 13 13:59:51 crc 
kubenswrapper[4898]: I0313 13:59:51.163936 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vk7r4" event={"ID":"1bf3837f-dac8-49c7-9ac2-f5d6d0a087b8","Type":"ContainerStarted","Data":"6bbe0d469a3994ecb4a46add078ab4d0aa025076f703ba20434e200fc6c06c81"} Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.184124 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-7l2pm" event={"ID":"0ea2e803-34d0-429b-b943-ece0b9e38b63","Type":"ContainerStarted","Data":"f80f9d0a69e3b6c8de8df5e105815c2ea6a5c4fed2a8e106511494e31c10c8bf"} Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.186268 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7zpw2" event={"ID":"9c7e70de-de85-421c-aaeb-476450d8e0ee","Type":"ContainerStarted","Data":"c742ce810138484734e421760d4b718ccb00b7e51da4d364477ac71eaf59de83"} Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.190373 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-z5vf2" event={"ID":"f446713d-03e3-461f-989f-eb6bdef32b30","Type":"ContainerStarted","Data":"3243f08e88e10cd81ef2c04e13602c3855a5f11779515760120bd35a4d40801a"} Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.197235 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-v9lxv" event={"ID":"6f12557e-02f5-4445-988f-b19f16672e3b","Type":"ContainerStarted","Data":"04b36de235cccfc39b10d95f29d1b8eb5397d41b37472245250afa44100523ba"} Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.227691 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mrr5j"] Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.234965 4898 ???:1] "http: TLS handshake error from 192.168.126.11:44074: 
no serving certificate available for the kubelet" Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.238615 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:59:51 crc kubenswrapper[4898]: E0313 13:59:51.239554 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:59:51.739535588 +0000 UTC m=+226.741123827 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.335313 4898 ???:1] "http: TLS handshake error from 192.168.126.11:44082: no serving certificate available for the kubelet" Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.349718 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:51 crc kubenswrapper[4898]: E0313 13:59:51.350875 4898 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 13:59:51.850843922 +0000 UTC m=+226.852432221 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6n228" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.430534 4898 ???:1] "http: TLS handshake error from 192.168.126.11:44086: no serving certificate available for the kubelet" Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.454371 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:59:51 crc kubenswrapper[4898]: E0313 13:59:51.454818 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:59:51.954802622 +0000 UTC m=+226.956390851 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.534238 4898 ???:1] "http: TLS handshake error from 192.168.126.11:44092: no serving certificate available for the kubelet" Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.556775 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:51 crc kubenswrapper[4898]: E0313 13:59:51.557170 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 13:59:52.057158504 +0000 UTC m=+227.058746733 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6n228" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.630495 4898 patch_prober.go:28] interesting pod/router-default-5444994796-6plhg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 13:59:51 crc kubenswrapper[4898]: [-]has-synced failed: reason withheld Mar 13 13:59:51 crc kubenswrapper[4898]: [+]process-running ok Mar 13 13:59:51 crc kubenswrapper[4898]: healthz check failed Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.630860 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6plhg" podUID="18e5c8bf-9fe0-465e-af8f-9e7ec7400be8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.686503 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:59:51 crc kubenswrapper[4898]: E0313 13:59:51.686882 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-13 13:59:52.186865239 +0000 UTC m=+227.188453478 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.689649 4898 ???:1] "http: TLS handshake error from 192.168.126.11:44096: no serving certificate available for the kubelet" Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.749974 4898 ???:1] "http: TLS handshake error from 192.168.126.11:44106: no serving certificate available for the kubelet" Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.789586 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:51 crc kubenswrapper[4898]: E0313 13:59:51.790138 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 13:59:52.290110662 +0000 UTC m=+227.291698901 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6n228" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.894846 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:59:51 crc kubenswrapper[4898]: E0313 13:59:51.895199 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:59:52.395182399 +0000 UTC m=+227.396770638 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.916009 4898 ???:1] "http: TLS handshake error from 192.168.126.11:44112: no serving certificate available for the kubelet" Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.920379 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-xglpf"] Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.944265 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-cvbms"] Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:51.997689 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:52 crc kubenswrapper[4898]: E0313 13:59:51.997972 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 13:59:52.497961501 +0000 UTC m=+227.499549740 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6n228" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.002602 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-rd22p"] Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.045998 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556825-92fd8"] Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.086726 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556838-h7pkr"] Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.100739 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:59:52 crc kubenswrapper[4898]: E0313 13:59:52.100994 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:59:52.600968809 +0000 UTC m=+227.602557048 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.149913 4898 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.169242 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xjx7v"] Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.179782 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-hqcs6"] Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.195402 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-kgnxj" podStartSLOduration=161.195379261 podStartE2EDuration="2m41.195379261s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:52.173072169 +0000 UTC m=+227.174660408" watchObservedRunningTime="2026-03-13 13:59:52.195379261 +0000 UTC m=+227.196967500" Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.208385 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:52 crc kubenswrapper[4898]: E0313 13:59:52.208795 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 13:59:52.708782111 +0000 UTC m=+227.710370350 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6n228" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.209312 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gwvk4" podStartSLOduration=161.209297543 podStartE2EDuration="2m41.209297543s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:52.208105045 +0000 UTC m=+227.209693294" watchObservedRunningTime="2026-03-13 13:59:52.209297543 +0000 UTC m=+227.210885782" Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.219642 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6lrz" event={"ID":"c8b0b1cf-022c-4181-a957-2f7e172a3294","Type":"ContainerStarted","Data":"34583260d4b9f0b9859c8abe64e8c439dab719ff163e6f9eeb1d0488f02e5215"} Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.239414 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd" event={"ID":"7667c5a1-aecb-4ccd-b8fd-e20c2c049472","Type":"ContainerStarted","Data":"6ce16b962cdf6095fe075410dd0e3e4b0df623a44b72871189f0b1d5d8146085"} Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.239474 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd" event={"ID":"7667c5a1-aecb-4ccd-b8fd-e20c2c049472","Type":"ContainerStarted","Data":"33fd4f5f9a06222767db7dc3489718fbb676a3336f67eacb714356ad05127307"} Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.241765 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd" Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.247408 4898 generic.go:334] "Generic (PLEG): container finished" podID="9c7e70de-de85-421c-aaeb-476450d8e0ee" containerID="edc0dd60bc8dc83f763583bafafa796b1b5dd9cb2886beb15e2be0e325d957cf" exitCode=0 Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.247567 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7zpw2" event={"ID":"9c7e70de-de85-421c-aaeb-476450d8e0ee","Type":"ContainerDied","Data":"edc0dd60bc8dc83f763583bafafa796b1b5dd9cb2886beb15e2be0e325d957cf"} Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.252275 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vk7r4" podStartSLOduration=161.252254608 podStartE2EDuration="2m41.252254608s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:52.251573902 +0000 UTC m=+227.253162171" watchObservedRunningTime="2026-03-13 13:59:52.252254608 +0000 UTC m=+227.253842847" Mar 13 13:59:52 crc 
kubenswrapper[4898]: I0313 13:59:52.271993 4898 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-x85vd container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.29:5443/healthz\": dial tcp 10.217.0.29:5443: connect: connection refused" start-of-body= Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.272027 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd" podUID="7667c5a1-aecb-4ccd-b8fd-e20c2c049472" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.29:5443/healthz\": dial tcp 10.217.0.29:5443: connect: connection refused" Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.302256 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xxjrs" podStartSLOduration=161.302240681 podStartE2EDuration="2m41.302240681s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:52.288132934 +0000 UTC m=+227.289721173" watchObservedRunningTime="2026-03-13 13:59:52.302240681 +0000 UTC m=+227.303828920" Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.302664 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-c6rz7"] Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.302699 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-cvbms" event={"ID":"dfaa00dc-cff6-47b5-878f-886fab80071b","Type":"ContainerStarted","Data":"0aa2002536ab789e03e60f2e703aea476046c489cae1f4fd2e903b848b7d698a"} Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.309541 4898 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 13 13:59:52 crc kubenswrapper[4898]: E0313 13:59:52.310184 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:59:52.81016519 +0000 UTC m=+227.811753429 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.324610 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nqrmd" event={"ID":"c8d8e11d-3717-47fd-a5c6-b8f52f19147b","Type":"ContainerStarted","Data":"b7cd43b9b282819ed33d91bde36d4d25f251bb880f318471fa92ff31a8352b62"}
Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.324858 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nqrmd" event={"ID":"c8d8e11d-3717-47fd-a5c6-b8f52f19147b","Type":"ContainerStarted","Data":"4e827400971be56892c4197993ef9f19ebab3fc7960d56bb3ff4253587e132b4"}
Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.325862 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556825-92fd8" event={"ID":"f52c1025-32e7-4eba-8af4-5c5cce1918da","Type":"ContainerStarted","Data":"423e6617bc8cc210d91629b1d5580f9b4f8c3137b80892ad74315408fb41680c"}
Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.341645 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xwdnn" event={"ID":"2e6be656-c448-4b38-b5a8-2401ab767c54","Type":"ContainerStarted","Data":"9a11bce5f2e92e273dac618f990097c73c4a96c29ab227dcb58f4558e7ec86a8"}
Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.342926 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-98qz7" event={"ID":"9af6faad-479e-481b-9f66-d074c1c20ce8","Type":"ContainerStarted","Data":"76978f1451b566f71fc2abbb7c343f7a61a83b6035e763754114ff95e11eb8d2"}
Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.343248 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5jp6r" podStartSLOduration=162.343234429 podStartE2EDuration="2m42.343234429s" podCreationTimestamp="2026-03-13 13:57:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:52.340928104 +0000 UTC m=+227.342516353" watchObservedRunningTime="2026-03-13 13:59:52.343234429 +0000 UTC m=+227.344822668"
Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.346434 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8rx7x" event={"ID":"a2ea84ca-5ca6-432a-aa7d-c6350e0e52e8","Type":"ContainerStarted","Data":"23e98bcd837ca0cb8249dcc8ed51cabc9953f3983b564532cc25114bc763ae32"}
Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.364238 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-pvbpt" event={"ID":"4372b422-23c7-46bc-aec4-aef665acbda1","Type":"ContainerStarted","Data":"4b1fbbf5e660b92a0c53b78de1566661ef67eefec09d4d3752584910bdc778c9"}
Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.365337 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-pvbpt"
Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.367644 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x4nxn"]
Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.385475 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t4j9v" podStartSLOduration=161.385460256 podStartE2EDuration="2m41.385460256s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:52.384202316 +0000 UTC m=+227.385790555" watchObservedRunningTime="2026-03-13 13:59:52.385460256 +0000 UTC m=+227.387048495"
Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.397047 4898 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-pvbpt container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.16:8443/healthz\": dial tcp 10.217.0.16:8443: connect: connection refused" start-of-body=
Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.397106 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-pvbpt" podUID="4372b422-23c7-46bc-aec4-aef665acbda1" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.16:8443/healthz\": dial tcp 10.217.0.16:8443: connect: connection refused"
Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.414555 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556838-h7pkr" event={"ID":"aa1ed4c8-e4bd-4352-bee3-404f16244ea3","Type":"ContainerStarted","Data":"a7c95316f7425af660b27292d15441df4c43eb94ff232136664abdd1d5a272eb"}
Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.419578 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228"
Mar 13 13:59:52 crc kubenswrapper[4898]: E0313 13:59:52.420434 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 13:59:52.92042056 +0000 UTC m=+227.922008799 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6n228" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.422418 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-xglpf" event={"ID":"55a96934-e740-402f-b4af-488a7eba53ae","Type":"ContainerStarted","Data":"3b3138220a23e9803eb1ca1a6b020a7b0dd6879719bc2fbe404eff2fe4efc939"}
Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.449354 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-mr499" event={"ID":"d1df7055-9dee-4cde-a787-bc18a276b777","Type":"ContainerStarted","Data":"7bd79891274d50826bbb04d76734841d3066293f61a67dc86d7cbbdd1dc480d3"}
Mar 13 13:59:52 crc kubenswrapper[4898]: W0313 13:59:52.466056 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8675f0f0_7d3b_41d9_959e_e73f78f32c5c.slice/crio-f535ac1d451a11e6e21776c440e07c6e0112ff22847a6ab7095f38893121a79e WatchSource:0}: Error finding container f535ac1d451a11e6e21776c440e07c6e0112ff22847a6ab7095f38893121a79e: Status 404 returned error can't find the container with id f535ac1d451a11e6e21776c440e07c6e0112ff22847a6ab7095f38893121a79e
Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.472003 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hqcs6" event={"ID":"a0416bba-76e3-4312-94e0-ac5b77c6ace0","Type":"ContainerStarted","Data":"8561c0bc1283dbf759bd31dd3f74d1e87fff5cd64a0ea3c288390b3b3b05c12c"}
Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.479581 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mrr5j" event={"ID":"ad924960-c3fd-4412-9b39-0723a598d86d","Type":"ContainerStarted","Data":"876f10ccbfb56710ccdb89c75bfa8467316fa88c4af75dc2b1cc0d366ea21ebb"}
Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.480157 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-6plhg" podStartSLOduration=161.480147285 podStartE2EDuration="2m41.480147285s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:52.438305917 +0000 UTC m=+227.439894156" watchObservedRunningTime="2026-03-13 13:59:52.480147285 +0000 UTC m=+227.481735524"
Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.482622 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-whtgq" podStartSLOduration=161.482613634 podStartE2EDuration="2m41.482613634s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:52.479254564 +0000 UTC m=+227.480842813" watchObservedRunningTime="2026-03-13 13:59:52.482613634 +0000 UTC m=+227.484201873"
Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.484016 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2ps4n"]
Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.502491 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-rd22p" event={"ID":"41000ce4-1a84-44de-b283-1fe0350b1c17","Type":"ContainerStarted","Data":"5d7df65ed10cc02d6a22524199f9b24947355bb7c601bfddcebc3b5b19e1aa8f"}
Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.503375 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-p8r99"]
Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.513318 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-t2s2h" podStartSLOduration=161.513301376 podStartE2EDuration="2m41.513301376s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:52.512850425 +0000 UTC m=+227.514438674" watchObservedRunningTime="2026-03-13 13:59:52.513301376 +0000 UTC m=+227.514889605"
Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.513578 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lwzpn"]
Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.518432 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qtdtw" event={"ID":"7d27543e-df10-41f7-be85-dfe319aaec8a","Type":"ContainerStarted","Data":"13bb133bf7096639e00c031dd9fdfa46c4c4b70040b20821c04b022084397e7f"}
Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.520553 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-cx59b" event={"ID":"f4f26c0f-992a-4eb4-86d2-58e42a5b2b68","Type":"ContainerStarted","Data":"68f2d5a5e0268af93ce98f3742986642ab5cf81659e9919ba09dd20dc1ddf9a9"}
Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.521433 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-cx59b"
Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.522609 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 13 13:59:52 crc kubenswrapper[4898]: E0313 13:59:52.524535 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:59:53.024517804 +0000 UTC m=+228.026106043 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.551706 4898 patch_prober.go:28] interesting pod/downloads-7954f5f757-cx59b container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body=
Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.552015 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-cx59b" podUID="f4f26c0f-992a-4eb4-86d2-58e42a5b2b68" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused"
Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.553082 4898 generic.go:334] "Generic (PLEG): container finished" podID="f446713d-03e3-461f-989f-eb6bdef32b30" containerID="240733896e8454525dba9569b24980e27aade8613f126f3a63438c9e9d8e7534" exitCode=0
Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.553337 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-z5vf2" event={"ID":"f446713d-03e3-461f-989f-eb6bdef32b30","Type":"ContainerDied","Data":"240733896e8454525dba9569b24980e27aade8613f126f3a63438c9e9d8e7534"}
Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.554406 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9vdwm" podStartSLOduration=162.554386546 podStartE2EDuration="2m42.554386546s" podCreationTimestamp="2026-03-13 13:57:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:52.552581113 +0000 UTC m=+227.554169352" watchObservedRunningTime="2026-03-13 13:59:52.554386546 +0000 UTC m=+227.555974785"
Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.566744 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xjx7v" event={"ID":"66ebc90f-88a0-476c-98d6-c595517196b3","Type":"ContainerStarted","Data":"68420b522d54c9a1d3e18efff552a67f7a9121eca607b63a478f1c66c056daff"}
Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.578711 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-v9lxv" podStartSLOduration=162.578693506 podStartE2EDuration="2m42.578693506s" podCreationTimestamp="2026-03-13 13:57:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:52.577110378 +0000 UTC m=+227.578698627" watchObservedRunningTime="2026-03-13 13:59:52.578693506 +0000 UTC m=+227.580281745"
Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.591298 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gwvk4"
Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.628330 4898 patch_prober.go:28] interesting pod/router-default-5444994796-6plhg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 13 13:59:52 crc kubenswrapper[4898]: [-]has-synced failed: reason withheld
Mar 13 13:59:52 crc kubenswrapper[4898]: [+]process-running ok
Mar 13 13:59:52 crc kubenswrapper[4898]: healthz check failed
Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.628394 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6plhg" podUID="18e5c8bf-9fe0-465e-af8f-9e7ec7400be8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.628653 4898 ???:1] "http: TLS handshake error from 192.168.126.11:44122: no serving certificate available for the kubelet"
Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.634866 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228"
Mar 13 13:59:52 crc kubenswrapper[4898]: E0313 13:59:52.639323 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 13:59:53.139308352 +0000 UTC m=+228.140896591 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6n228" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 13:59:52 crc kubenswrapper[4898]: W0313 13:59:52.676862 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6456920b_69b3_4ce9_9eaa_ad8e0fde2aa4.slice/crio-2160c8e1fb1829a683300b17998f45a9fce220d710c9e5e5237b085cb28cf8ea WatchSource:0}: Error finding container 2160c8e1fb1829a683300b17998f45a9fce220d710c9e5e5237b085cb28cf8ea: Status 404 returned error can't find the container with id 2160c8e1fb1829a683300b17998f45a9fce220d710c9e5e5237b085cb28cf8ea
Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.687941 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd" podStartSLOduration=161.687920322 podStartE2EDuration="2m41.687920322s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:52.629657362 +0000 UTC m=+227.631245611" watchObservedRunningTime="2026-03-13 13:59:52.687920322 +0000 UTC m=+227.689508561"
Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.713978 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8rx7x" podStartSLOduration=161.713960713 podStartE2EDuration="2m41.713960713s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:52.689659824 +0000 UTC m=+227.691248073" watchObservedRunningTime="2026-03-13 13:59:52.713960713 +0000 UTC m=+227.715548952"
Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.735974 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 13 13:59:52 crc kubenswrapper[4898]: E0313 13:59:52.737127 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:59:53.237087865 +0000 UTC m=+228.238676104 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.749506 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-pvbpt" podStartSLOduration=161.749486951 podStartE2EDuration="2m41.749486951s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:52.745183578 +0000 UTC m=+227.746771817" watchObservedRunningTime="2026-03-13 13:59:52.749486951 +0000 UTC m=+227.751075190"
Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.770639 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-cx59b" podStartSLOduration=161.770620065 podStartE2EDuration="2m41.770620065s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:52.770365739 +0000 UTC m=+227.771953988" watchObservedRunningTime="2026-03-13 13:59:52.770620065 +0000 UTC m=+227.772208304"
Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.801066 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-mr499" podStartSLOduration=5.801048971 podStartE2EDuration="5.801048971s" podCreationTimestamp="2026-03-13 13:59:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:52.79891171 +0000 UTC m=+227.800499969" watchObservedRunningTime="2026-03-13 13:59:52.801048971 +0000 UTC m=+227.802637210"
Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.838256 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228"
Mar 13 13:59:52 crc kubenswrapper[4898]: E0313 13:59:52.840601 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 13:59:53.340585564 +0000 UTC m=+228.342173803 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6n228" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.869002 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-kgnxj"
Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.947221 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 13 13:59:52 crc kubenswrapper[4898]: E0313 13:59:52.947608 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:59:53.447593017 +0000 UTC m=+228.449181256 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 13:59:53 crc kubenswrapper[4898]: I0313 13:59:53.048593 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228"
Mar 13 13:59:53 crc kubenswrapper[4898]: E0313 13:59:53.049227 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 13:59:53.549196781 +0000 UTC m=+228.550785020 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6n228" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 13:59:53 crc kubenswrapper[4898]: I0313 13:59:53.150841 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 13 13:59:53 crc kubenswrapper[4898]: E0313 13:59:53.151508 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:59:53.651476962 +0000 UTC m=+228.653065211 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 13:59:53 crc kubenswrapper[4898]: I0313 13:59:53.151619 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228"
Mar 13 13:59:53 crc kubenswrapper[4898]: E0313 13:59:53.152495 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 13:59:53.652487056 +0000 UTC m=+228.654075295 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6n228" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 13:59:53 crc kubenswrapper[4898]: I0313 13:59:53.253061 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 13 13:59:53 crc kubenswrapper[4898]: E0313 13:59:53.253367 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:59:53.753343242 +0000 UTC m=+228.754931481 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 13:59:53 crc kubenswrapper[4898]: I0313 13:59:53.262768 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pvbpt"]
Mar 13 13:59:53 crc kubenswrapper[4898]: I0313 13:59:53.303312 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-gwvk4"]
Mar 13 13:59:53 crc kubenswrapper[4898]: I0313 13:59:53.354691 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228"
Mar 13 13:59:53 crc kubenswrapper[4898]: E0313 13:59:53.355162 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 13:59:53.855144671 +0000 UTC m=+228.856732980 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6n228" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 13:59:53 crc kubenswrapper[4898]: I0313 13:59:53.455845 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 13 13:59:53 crc kubenswrapper[4898]: E0313 13:59:53.456221 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:59:53.956184601 +0000 UTC m=+228.957772840 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 13:59:53 crc kubenswrapper[4898]: I0313 13:59:53.557648 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228"
Mar 13 13:59:53 crc kubenswrapper[4898]: E0313 13:59:53.558205 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 13:59:54.05800274 +0000 UTC m=+229.059590969 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6n228" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 13:59:53 crc kubenswrapper[4898]: I0313 13:59:53.614766 4898 patch_prober.go:28] interesting pod/router-default-5444994796-6plhg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 13 13:59:53 crc kubenswrapper[4898]: [-]has-synced failed: reason withheld
Mar 13 13:59:53 crc kubenswrapper[4898]: [+]process-running ok
Mar 13 13:59:53 crc kubenswrapper[4898]: healthz check failed
Mar 13 13:59:53 crc kubenswrapper[4898]: I0313 13:59:53.615051 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6plhg" podUID="18e5c8bf-9fe0-465e-af8f-9e7ec7400be8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 13 13:59:53 crc kubenswrapper[4898]: I0313 13:59:53.642503 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xwdnn" event={"ID":"2e6be656-c448-4b38-b5a8-2401ab767c54","Type":"ContainerStarted","Data":"a1c32abd97dd846ca3e993c5a91ad71bd349858f81e83b81092a61a9354172a2"}
Mar 13 13:59:53 crc kubenswrapper[4898]: I0313 13:59:53.645385 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556825-92fd8" event={"ID":"f52c1025-32e7-4eba-8af4-5c5cce1918da","Type":"ContainerStarted","Data":"f1f6a6de92d72265f3f98ec19ac9f23972e5558307ce72451eab813ae76ff6a4"}
Mar 13 13:59:53 crc kubenswrapper[4898]: I0313 13:59:53.654072 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2ps4n" event={"ID":"22cb0051-a6f4-4790-b51c-3da149327edd","Type":"ContainerStarted","Data":"59364e2f79d4987c7022bfb3575d49bc1b133d0063ed88f1f04f0b961f1501d9"}
Mar 13 13:59:53 crc kubenswrapper[4898]: I0313 13:59:53.654124 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2ps4n" event={"ID":"22cb0051-a6f4-4790-b51c-3da149327edd","Type":"ContainerStarted","Data":"10eb5bf6ae08d26f1b282a1735794b2c96f9f7f0e680a841fecd1ac22fc191c0"}
Mar 13 13:59:53 crc kubenswrapper[4898]: I0313 13:59:53.660975 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 13 13:59:53 crc kubenswrapper[4898]: E0313 13:59:53.661369 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:59:54.161351256 +0000 UTC m=+229.162939495 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 13:59:53 crc kubenswrapper[4898]: I0313 13:59:53.665122 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xwdnn" podStartSLOduration=162.665106566 podStartE2EDuration="2m42.665106566s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:53.663863616 +0000 UTC m=+228.665451855" watchObservedRunningTime="2026-03-13 13:59:53.665106566 +0000 UTC m=+228.666694805"
Mar 13 13:59:53 crc kubenswrapper[4898]: I0313 13:59:53.728642 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" event={"ID":"b26a4d77-f170-467e-ad96-4741cc5a8f23","Type":"ContainerStarted","Data":"e2d4415c83b3a1cbbccfa0bc4f10a48d58d464da90f0fa04517b4115ecd57886"}
Mar 13 13:59:53 crc kubenswrapper[4898]: I0313 13:59:53.729493 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-djn5q"
Mar 13 13:59:53 crc kubenswrapper[4898]: I0313 13:59:53.730395 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-2ps4n" podStartSLOduration=6.730380113 podStartE2EDuration="6.730380113s" podCreationTimestamp="2026-03-13 13:59:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:53.729683647 +0000 UTC m=+228.731271886" watchObservedRunningTime="2026-03-13 13:59:53.730380113 +0000 UTC m=+228.731968352" Mar 13 13:59:53 crc kubenswrapper[4898]: I0313 13:59:53.736908 4898 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-djn5q container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.38:6443/healthz\": dial tcp 10.217.0.38:6443: connect: connection refused" start-of-body= Mar 13 13:59:53 crc kubenswrapper[4898]: I0313 13:59:53.736969 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" podUID="b26a4d77-f170-467e-ad96-4741cc5a8f23" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.38:6443/healthz\": dial tcp 10.217.0.38:6443: connect: connection refused" Mar 13 13:59:53 crc kubenswrapper[4898]: I0313 13:59:53.758889 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29556825-92fd8" podStartSLOduration=162.758867783 podStartE2EDuration="2m42.758867783s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:53.756713861 +0000 UTC m=+228.758302110" watchObservedRunningTime="2026-03-13 13:59:53.758867783 +0000 UTC m=+228.760456022" Mar 13 13:59:53 crc kubenswrapper[4898]: I0313 13:59:53.761090 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hqcs6" event={"ID":"a0416bba-76e3-4312-94e0-ac5b77c6ace0","Type":"ContainerStarted","Data":"d938659f66c504e3a53f10404f4cb9b8a7f81da245338b7fc04aa817f3df0247"} Mar 13 13:59:53 crc kubenswrapper[4898]: I0313 13:59:53.763792 4898 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:53 crc kubenswrapper[4898]: E0313 13:59:53.765053 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 13:59:54.26503464 +0000 UTC m=+229.266622879 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6n228" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:53 crc kubenswrapper[4898]: I0313 13:59:53.774800 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6lrz" event={"ID":"c8b0b1cf-022c-4181-a957-2f7e172a3294","Type":"ContainerStarted","Data":"533f9c0d5d010c2b5018f71aab66fa3025d2e9a7d36d33fe0f8634b0b2bbb7ec"} Mar 13 13:59:53 crc kubenswrapper[4898]: I0313 13:59:53.775755 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6lrz" Mar 13 13:59:53 crc kubenswrapper[4898]: I0313 13:59:53.808751 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6lrz" Mar 13 13:59:53 crc kubenswrapper[4898]: I0313 13:59:53.845209 4898 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" podStartSLOduration=163.845196142 podStartE2EDuration="2m43.845196142s" podCreationTimestamp="2026-03-13 13:57:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:53.844124497 +0000 UTC m=+228.845712736" watchObservedRunningTime="2026-03-13 13:59:53.845196142 +0000 UTC m=+228.846784381" Mar 13 13:59:53 crc kubenswrapper[4898]: I0313 13:59:53.847281 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-z5vf2" event={"ID":"f446713d-03e3-461f-989f-eb6bdef32b30","Type":"ContainerStarted","Data":"3d78f2eae4339faf9da85d3605b5240a3df3c676cc009629b7c0efcf3ad06e0b"} Mar 13 13:59:53 crc kubenswrapper[4898]: I0313 13:59:53.866271 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mrr5j" event={"ID":"ad924960-c3fd-4412-9b39-0723a598d86d","Type":"ContainerStarted","Data":"6101b0b070abdd17359194fe8ea3f377c21f03dac740dfad3caca468dc4d3c9e"} Mar 13 13:59:53 crc kubenswrapper[4898]: I0313 13:59:53.868581 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:59:53 crc kubenswrapper[4898]: E0313 13:59:53.869772 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:59:54.369737798 +0000 UTC m=+229.371326037 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:53 crc kubenswrapper[4898]: I0313 13:59:53.913509 4898 generic.go:334] "Generic (PLEG): container finished" podID="1a01ab05-7178-48c7-892b-b91cf60432f8" containerID="ea0552ede8b56b12eb135cbe5901bf93d8c601f35fa5916a4dd9fe23a332df86" exitCode=0 Mar 13 13:59:53 crc kubenswrapper[4898]: I0313 13:59:53.913614 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g559r" event={"ID":"1a01ab05-7178-48c7-892b-b91cf60432f8","Type":"ContainerDied","Data":"ea0552ede8b56b12eb135cbe5901bf93d8c601f35fa5916a4dd9fe23a332df86"} Mar 13 13:59:53 crc kubenswrapper[4898]: I0313 13:59:53.933195 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-cvbms" event={"ID":"dfaa00dc-cff6-47b5-878f-886fab80071b","Type":"ContainerStarted","Data":"d17d5d04b082cf9325120049501e4885bddbd9a55bc9df7ea4ad2f54754d8fc5"} Mar 13 13:59:53 crc kubenswrapper[4898]: I0313 13:59:53.964911 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-z5r8j" event={"ID":"d1225410-7280-4409-8934-c6766eae5088","Type":"ContainerStarted","Data":"e3d55e6af58dc24c9eb37bd3f1da62cc7fa0169219865d3d83df1e09ea590f59"} Mar 13 13:59:53 crc kubenswrapper[4898]: I0313 13:59:53.973739 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") 
pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:53 crc kubenswrapper[4898]: E0313 13:59:53.974062 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 13:59:54.474051457 +0000 UTC m=+229.475639696 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6n228" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.010945 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-p8r99" event={"ID":"0a78868f-1786-430d-8df8-18bb1c2019b3","Type":"ContainerStarted","Data":"66a8b7ab3a08de395e71354315a06c65dae8eb185d93ccdee2c35ad093ab2e67"} Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.011192 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-p8r99" event={"ID":"0a78868f-1786-430d-8df8-18bb1c2019b3","Type":"ContainerStarted","Data":"5ce4caec01bc9ee8df0b59f3f0251f9037b82e485a55597652071608caca296b"} Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.011942 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-p8r99" Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.023360 4898 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-p8r99 
container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.28:8080/healthz\": dial tcp 10.217.0.28:8080: connect: connection refused" start-of-body= Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.023413 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-p8r99" podUID="0a78868f-1786-430d-8df8-18bb1c2019b3" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.28:8080/healthz\": dial tcp 10.217.0.28:8080: connect: connection refused" Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.058534 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6lrz" podStartSLOduration=163.058517582 podStartE2EDuration="2m43.058517582s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:53.880564596 +0000 UTC m=+228.882152845" watchObservedRunningTime="2026-03-13 13:59:54.058517582 +0000 UTC m=+229.060105821" Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.061650 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qtdtw" event={"ID":"7d27543e-df10-41f7-be85-dfe319aaec8a","Type":"ContainerStarted","Data":"f6246f6fbdbf725301f1e7d37222d66624c9ce34cacd75b24a3bb142ed798d64"} Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.062438 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qtdtw" Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.074571 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:59:54 crc kubenswrapper[4898]: E0313 13:59:54.075498 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:59:54.575482407 +0000 UTC m=+229.577070646 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.109923 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-98qz7" event={"ID":"9af6faad-479e-481b-9f66-d074c1c20ce8","Type":"ContainerStarted","Data":"bfffbc1d47cf562161ad401b19d456f5803628cdc3e9b93364d303051da7fdc5"} Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.109965 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-98qz7" event={"ID":"9af6faad-479e-481b-9f66-d074c1c20ce8","Type":"ContainerStarted","Data":"add114cee616c51a7db53d7f91b4b937bf93df887dadd93903309e7753a0dcc6"} Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.120032 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mrr5j" 
podStartSLOduration=163.120017719 podStartE2EDuration="2m43.120017719s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:54.057337274 +0000 UTC m=+229.058925523" watchObservedRunningTime="2026-03-13 13:59:54.120017719 +0000 UTC m=+229.121605948" Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.121941 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qtdtw" Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.123730 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-mr499" event={"ID":"d1df7055-9dee-4cde-a787-bc18a276b777","Type":"ContainerStarted","Data":"132a21d8d3ccecb68f50362d993af69d42f4875b110ef71f981d6813b55a54b3"} Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.156733 4898 ???:1] "http: TLS handshake error from 192.168.126.11:44132: no serving certificate available for the kubelet" Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.162114 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-c6rz7" event={"ID":"f794406f-fc28-4e2f-953d-ab45e36cc754","Type":"ContainerStarted","Data":"4d3e29597b67d8aa33de474d704625dc02977b6f8f28478f0d39a71a3fa00e40"} Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.162156 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-c6rz7" event={"ID":"f794406f-fc28-4e2f-953d-ab45e36cc754","Type":"ContainerStarted","Data":"1f5592600f897f3e06e6f8ad157f32be4832808e055795a6024b5a5e83426cec"} Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.162165 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-c6rz7" 
event={"ID":"f794406f-fc28-4e2f-953d-ab45e36cc754","Type":"ContainerStarted","Data":"39769e11acaa214e87052d1bb5ad3d3b71dc8c697697151a19a7d0822241a4f3"} Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.167221 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-p8r99" podStartSLOduration=163.167206385 podStartE2EDuration="2m43.167206385s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:54.145344293 +0000 UTC m=+229.146932532" watchObservedRunningTime="2026-03-13 13:59:54.167206385 +0000 UTC m=+229.168794614" Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.177674 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:54 crc kubenswrapper[4898]: E0313 13:59:54.178504 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 13:59:54.678493704 +0000 UTC m=+229.680081943 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6n228" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.180409 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qt7gm" event={"ID":"6444bf97-84ef-49df-afcd-4e939a5de2ad","Type":"ContainerStarted","Data":"e724d5addb1cd6dacb56928656961ec003b4cee18b292aff9da189cba7cb8b7f"} Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.199630 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-z5r8j" podStartSLOduration=163.199613418 podStartE2EDuration="2m43.199613418s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:54.168379953 +0000 UTC m=+229.169968192" watchObservedRunningTime="2026-03-13 13:59:54.199613418 +0000 UTC m=+229.201201657" Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.200880 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-cvbms" podStartSLOduration=163.200873318 podStartE2EDuration="2m43.200873318s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:54.1984477 +0000 UTC m=+229.200035959" watchObservedRunningTime="2026-03-13 13:59:54.200873318 +0000 UTC m=+229.202461557" Mar 13 
13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.217785 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nqrmd" event={"ID":"c8d8e11d-3717-47fd-a5c6-b8f52f19147b","Type":"ContainerStarted","Data":"09fe000f8256c50a8bbef94a79acc7883412066f8bca0684a09af0e3a280eab6"} Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.249374 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lwzpn" event={"ID":"6456920b-69b3-4ce9-9eaa-ad8e0fde2aa4","Type":"ContainerStarted","Data":"2160c8e1fb1829a683300b17998f45a9fce220d710c9e5e5237b085cb28cf8ea"} Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.283789 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:59:54 crc kubenswrapper[4898]: E0313 13:59:54.284871 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:59:54.784856932 +0000 UTC m=+229.786445171 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.325194 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-7l2pm" event={"ID":"0ea2e803-34d0-429b-b943-ece0b9e38b63","Type":"ContainerStarted","Data":"5f0a71c3382b8e97b2f21cf59a246a72cf36bc90c37659a0655800a7772d93ae"} Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.327811 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nqrmd" podStartSLOduration=163.327797926 podStartE2EDuration="2m43.327797926s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:54.27722418 +0000 UTC m=+229.278812419" watchObservedRunningTime="2026-03-13 13:59:54.327797926 +0000 UTC m=+229.329386165" Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.336236 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-m2ntx" event={"ID":"071d8651-2a2d-4eed-9023-cfe636be09a0","Type":"ContainerStarted","Data":"bd32fe2cef365eaaed01cc9a800c7901ae0b0ae4565b3977e2bf9b20a1257ddb"} Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.336281 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-m2ntx" 
event={"ID":"071d8651-2a2d-4eed-9023-cfe636be09a0","Type":"ContainerStarted","Data":"8a68531c4432e244d55d51c54210fb9cf9a046d91a04990d04c320d1d1ca0793"} Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.339952 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7zpw2" event={"ID":"9c7e70de-de85-421c-aaeb-476450d8e0ee","Type":"ContainerStarted","Data":"ccd7403e1f9b432e946d026a2dfcc99e0f3285803db614b12fc60317b1f5fb3e"} Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.340416 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7zpw2" Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.341338 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xjx7v" event={"ID":"66ebc90f-88a0-476c-98d6-c595517196b3","Type":"ContainerStarted","Data":"0f877df58d3f631f3764abf0d3113bee627f9006e57760ab88188bf736ee1928"} Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.342589 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x4nxn" event={"ID":"8675f0f0-7d3b-41d9-959e-e73f78f32c5c","Type":"ContainerStarted","Data":"f9fd46726abc0faa9f794ff1133bafe631cc262546158876056b7a4740033c49"} Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.342607 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x4nxn" event={"ID":"8675f0f0-7d3b-41d9-959e-e73f78f32c5c","Type":"ContainerStarted","Data":"f535ac1d451a11e6e21776c440e07c6e0112ff22847a6ab7095f38893121a79e"} Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.342942 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x4nxn" Mar 13 
13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.359878 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-c6rz7" podStartSLOduration=163.359862641 podStartE2EDuration="2m43.359862641s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:54.327927179 +0000 UTC m=+229.329515428" watchObservedRunningTime="2026-03-13 13:59:54.359862641 +0000 UTC m=+229.361450880" Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.369290 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-xglpf" event={"ID":"55a96934-e740-402f-b4af-488a7eba53ae","Type":"ContainerStarted","Data":"9c5eb8d0b347ab2a909e923065a6b311c93560fd69fbc82c9cff1576bbb6917e"} Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.376305 4898 patch_prober.go:28] interesting pod/downloads-7954f5f757-cx59b container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.376350 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-cx59b" podUID="f4f26c0f-992a-4eb4-86d2-58e42a5b2b68" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.384072 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qt7gm" podStartSLOduration=163.384055728 podStartE2EDuration="2m43.384055728s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:54.361002268 +0000 UTC m=+229.362590517" watchObservedRunningTime="2026-03-13 13:59:54.384055728 +0000 UTC m=+229.385643967" Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.385312 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:54 crc kubenswrapper[4898]: E0313 13:59:54.386769 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 13:59:54.886758633 +0000 UTC m=+229.888346872 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6n228" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.387168 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qtdtw" podStartSLOduration=163.387157282 podStartE2EDuration="2m43.387157282s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:54.383575537 +0000 UTC m=+229.385163786" watchObservedRunningTime="2026-03-13 13:59:54.387157282 +0000 UTC m=+229.388745521" Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.435256 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-pvbpt" Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.442512 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-98qz7" podStartSLOduration=163.442492043 podStartE2EDuration="2m43.442492043s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:54.442470282 +0000 UTC m=+229.444058521" watchObservedRunningTime="2026-03-13 13:59:54.442492043 +0000 UTC m=+229.444080272" Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.482841 4898 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-console/console-f9d7485db-7l2pm" podStartSLOduration=163.482827455 podStartE2EDuration="2m43.482827455s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:54.482266812 +0000 UTC m=+229.483855061" watchObservedRunningTime="2026-03-13 13:59:54.482827455 +0000 UTC m=+229.484415694" Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.490430 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:59:54 crc kubenswrapper[4898]: E0313 13:59:54.490541 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:59:54.990525009 +0000 UTC m=+229.992113248 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.490768 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.492132 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd" Mar 13 13:59:54 crc kubenswrapper[4898]: E0313 13:59:54.496795 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 13:59:54.996777088 +0000 UTC m=+229.998365407 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6n228" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.529190 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lwzpn" podStartSLOduration=163.529166961 podStartE2EDuration="2m43.529166961s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:54.508791364 +0000 UTC m=+229.510379593" watchObservedRunningTime="2026-03-13 13:59:54.529166961 +0000 UTC m=+229.530755200" Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.553117 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7zpw2" podStartSLOduration=163.553099731 podStartE2EDuration="2m43.553099731s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:54.550572621 +0000 UTC m=+229.552160890" watchObservedRunningTime="2026-03-13 13:59:54.553099731 +0000 UTC m=+229.554687970" Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.588980 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-m2ntx" podStartSLOduration=163.588963887 podStartE2EDuration="2m43.588963887s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:54.58740961 +0000 UTC m=+229.588997859" watchObservedRunningTime="2026-03-13 13:59:54.588963887 +0000 UTC m=+229.590552126" Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.603522 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:59:54 crc kubenswrapper[4898]: E0313 13:59:54.603853 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:59:55.103837992 +0000 UTC m=+230.105426231 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.620629 4898 patch_prober.go:28] interesting pod/router-default-5444994796-6plhg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 13:59:54 crc kubenswrapper[4898]: [-]has-synced failed: reason withheld Mar 13 13:59:54 crc kubenswrapper[4898]: [+]process-running ok Mar 13 13:59:54 crc kubenswrapper[4898]: healthz check failed Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.620698 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6plhg" podUID="18e5c8bf-9fe0-465e-af8f-9e7ec7400be8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.666166 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xjx7v" podStartSLOduration=163.666151849 podStartE2EDuration="2m43.666151849s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:54.628774207 +0000 UTC m=+229.630362446" watchObservedRunningTime="2026-03-13 13:59:54.666151849 +0000 UTC m=+229.667740088" Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.696722 4898 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x4nxn" podStartSLOduration=163.696709118 podStartE2EDuration="2m43.696709118s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:54.695265513 +0000 UTC m=+229.696853752" watchObservedRunningTime="2026-03-13 13:59:54.696709118 +0000 UTC m=+229.698297357" Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.697426 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-xglpf" podStartSLOduration=163.697421755 podStartE2EDuration="2m43.697421755s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:54.666228981 +0000 UTC m=+229.667817220" watchObservedRunningTime="2026-03-13 13:59:54.697421755 +0000 UTC m=+229.699009994" Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.705014 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:54 crc kubenswrapper[4898]: E0313 13:59:54.705564 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 13:59:55.205532328 +0000 UTC m=+230.207120567 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6n228" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.806634 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:59:54 crc kubenswrapper[4898]: E0313 13:59:54.806993 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:59:55.306972498 +0000 UTC m=+230.308560737 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.908536 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:54 crc kubenswrapper[4898]: E0313 13:59:54.908860 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 13:59:55.408848788 +0000 UTC m=+230.410437027 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6n228" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.010377 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:59:55 crc kubenswrapper[4898]: E0313 13:59:55.010733 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:59:55.510718258 +0000 UTC m=+230.512306497 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.111745 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:55 crc kubenswrapper[4898]: E0313 13:59:55.112065 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 13:59:55.612051446 +0000 UTC m=+230.613639685 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6n228" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.212730 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:59:55 crc kubenswrapper[4898]: E0313 13:59:55.212891 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:59:55.712869371 +0000 UTC m=+230.714457610 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.212960 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:55 crc kubenswrapper[4898]: E0313 13:59:55.213274 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 13:59:55.713266421 +0000 UTC m=+230.714854650 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6n228" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.313964 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:59:55 crc kubenswrapper[4898]: E0313 13:59:55.314148 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:59:55.814123217 +0000 UTC m=+230.815711456 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.314260 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:55 crc kubenswrapper[4898]: E0313 13:59:55.314557 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 13:59:55.814544867 +0000 UTC m=+230.816133156 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6n228" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.396556 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x4nxn" event={"ID":"8675f0f0-7d3b-41d9-959e-e73f78f32c5c","Type":"ContainerStarted","Data":"df17383199ef22fb611b55de12da7719f69c6330cdc883aedd63b68d3ea9f8af"} Mar 13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.403215 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-rd22p" event={"ID":"41000ce4-1a84-44de-b283-1fe0350b1c17","Type":"ContainerStarted","Data":"d188a49a89641237218187334ef909cdfab8bca3de66a1f8cca460d12d796c34"} Mar 13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.405250 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lwzpn" event={"ID":"6456920b-69b3-4ce9-9eaa-ad8e0fde2aa4","Type":"ContainerStarted","Data":"9ebdc21a89c7e542370d6c291c23a4de0ed142f28fe324f890633805c6eafc8e"} Mar 13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.415119 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:59:55 crc kubenswrapper[4898]: E0313 13:59:55.415294 4898 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:59:55.91527509 +0000 UTC m=+230.916863329 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.439786 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-xglpf" event={"ID":"55a96934-e740-402f-b4af-488a7eba53ae","Type":"ContainerStarted","Data":"642f2f07742827ed95ef31168363d6545f55661dce6a87f6018188393f80a952"} Mar 13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.454582 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-z5vf2" event={"ID":"f446713d-03e3-461f-989f-eb6bdef32b30","Type":"ContainerStarted","Data":"f099af6c5b50a4be197d635f5850b6183333a053a6af427c07f76da96d5e7c7b"} Mar 13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.465470 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hqcs6" event={"ID":"a0416bba-76e3-4312-94e0-ac5b77c6ace0","Type":"ContainerStarted","Data":"c522c92b343ad9fc99d411c2d428c314bb922018994e10fce74dbb04fb415613"} Mar 13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.466035 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-hqcs6" Mar 13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.480165 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g559r" event={"ID":"1a01ab05-7178-48c7-892b-b91cf60432f8","Type":"ContainerStarted","Data":"454ed8ccf9bcb2dc3908edb26a397efc55c2e238824780ec87ba7d56db753dbe"} Mar 13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.481348 4898 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-p8r99 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.28:8080/healthz\": dial tcp 10.217.0.28:8080: connect: connection refused" start-of-body= Mar 13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.481400 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-p8r99" podUID="0a78868f-1786-430d-8df8-18bb1c2019b3" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.28:8080/healthz\": dial tcp 10.217.0.28:8080: connect: connection refused" Mar 13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.481798 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-pvbpt" podUID="4372b422-23c7-46bc-aec4-aef665acbda1" containerName="controller-manager" containerID="cri-o://4b1fbbf5e660b92a0c53b78de1566661ef67eefec09d4d3752584910bdc778c9" gracePeriod=30 Mar 13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.482864 4898 patch_prober.go:28] interesting pod/downloads-7954f5f757-cx59b container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Mar 13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.482921 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-cx59b" podUID="f4f26c0f-992a-4eb4-86d2-58e42a5b2b68" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": 
dial tcp 10.217.0.10:8080: connect: connection refused" Mar 13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.482959 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gwvk4" podUID="1607f924-1e24-4848-b811-21ac3a7f8999" containerName="route-controller-manager" containerID="cri-o://4b7bee413921e7c7eec939e46b8977335a1baead35b676d6082b18bb10857c3e" gracePeriod=30 Mar 13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.500789 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" Mar 13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.516863 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:55 crc kubenswrapper[4898]: E0313 13:59:55.517280 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 13:59:56.017265424 +0000 UTC m=+231.018853673 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6n228" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.587428 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g559r" podStartSLOduration=164.587410287 podStartE2EDuration="2m44.587410287s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:55.572646785 +0000 UTC m=+230.574235034" watchObservedRunningTime="2026-03-13 13:59:55.587410287 +0000 UTC m=+230.588998526" Mar 13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.588954 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-z5vf2" podStartSLOduration=164.588947984 podStartE2EDuration="2m44.588947984s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:55.529451514 +0000 UTC m=+230.531039763" watchObservedRunningTime="2026-03-13 13:59:55.588947984 +0000 UTC m=+230.590536223" Mar 13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.608158 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-hqcs6" podStartSLOduration=8.608141012 podStartE2EDuration="8.608141012s" podCreationTimestamp="2026-03-13 13:59:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:55.606845151 +0000 UTC m=+230.608433400" watchObservedRunningTime="2026-03-13 13:59:55.608141012 +0000 UTC m=+230.609729251"
Mar 13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.614968 4898 patch_prober.go:28] interesting pod/router-default-5444994796-6plhg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 13 13:59:55 crc kubenswrapper[4898]: [-]has-synced failed: reason withheld
Mar 13 13:59:55 crc kubenswrapper[4898]: [+]process-running ok
Mar 13 13:59:55 crc kubenswrapper[4898]: healthz check failed
Mar 13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.615005 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6plhg" podUID="18e5c8bf-9fe0-465e-af8f-9e7ec7400be8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.619732 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 13 13:59:55 crc kubenswrapper[4898]: E0313 13:59:55.619862 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:59:56.119835681 +0000 UTC m=+231.121423920 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.620037 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228"
Mar 13 13:59:55 crc kubenswrapper[4898]: E0313 13:59:55.623174 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 13:59:56.12316314 +0000 UTC m=+231.124751379 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6n228" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.722572 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 13 13:59:55 crc kubenswrapper[4898]: E0313 13:59:55.722965 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:59:56.222949811 +0000 UTC m=+231.224538050 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.823588 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228"
Mar 13 13:59:55 crc kubenswrapper[4898]: E0313 13:59:55.823915 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 13:59:56.323889679 +0000 UTC m=+231.325477918 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6n228" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.883862 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-twh8h"]
Mar 13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.884749 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-twh8h"
Mar 13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.895572 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Mar 13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.907176 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-twh8h"]
Mar 13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.927283 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.927557 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f81bcfc-3c35-48e8-a584-961351e8c0e2-catalog-content\") pod \"community-operators-twh8h\" (UID: \"8f81bcfc-3c35-48e8-a584-961351e8c0e2\") " pod="openshift-marketplace/community-operators-twh8h"
Mar 13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.927607 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f81bcfc-3c35-48e8-a584-961351e8c0e2-utilities\") pod \"community-operators-twh8h\" (UID: \"8f81bcfc-3c35-48e8-a584-961351e8c0e2\") " pod="openshift-marketplace/community-operators-twh8h"
Mar 13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.927644 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5x728\" (UniqueName: \"kubernetes.io/projected/8f81bcfc-3c35-48e8-a584-961351e8c0e2-kube-api-access-5x728\") pod \"community-operators-twh8h\" (UID: \"8f81bcfc-3c35-48e8-a584-961351e8c0e2\") " pod="openshift-marketplace/community-operators-twh8h"
Mar 13 13:59:55 crc kubenswrapper[4898]: E0313 13:59:55.927734 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:59:56.427721576 +0000 UTC m=+231.429309815 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.039778 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228"
Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.039833 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f81bcfc-3c35-48e8-a584-961351e8c0e2-catalog-content\") pod \"community-operators-twh8h\" (UID: \"8f81bcfc-3c35-48e8-a584-961351e8c0e2\") " pod="openshift-marketplace/community-operators-twh8h"
Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.039874 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f81bcfc-3c35-48e8-a584-961351e8c0e2-utilities\") pod \"community-operators-twh8h\" (UID: \"8f81bcfc-3c35-48e8-a584-961351e8c0e2\") " pod="openshift-marketplace/community-operators-twh8h"
Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.039929 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5x728\" (UniqueName: \"kubernetes.io/projected/8f81bcfc-3c35-48e8-a584-961351e8c0e2-kube-api-access-5x728\") pod \"community-operators-twh8h\" (UID: \"8f81bcfc-3c35-48e8-a584-961351e8c0e2\") " pod="openshift-marketplace/community-operators-twh8h"
Mar 13 13:59:56 crc kubenswrapper[4898]: E0313 13:59:56.040195 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 13:59:56.540177329 +0000 UTC m=+231.541765568 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6n228" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.040602 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f81bcfc-3c35-48e8-a584-961351e8c0e2-catalog-content\") pod \"community-operators-twh8h\" (UID: \"8f81bcfc-3c35-48e8-a584-961351e8c0e2\") " pod="openshift-marketplace/community-operators-twh8h"
Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.040656 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f81bcfc-3c35-48e8-a584-961351e8c0e2-utilities\") pod \"community-operators-twh8h\" (UID: \"8f81bcfc-3c35-48e8-a584-961351e8c0e2\") " pod="openshift-marketplace/community-operators-twh8h"
Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.066217 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dvvz2"]
Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.068253 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dvvz2"
Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.070163 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.079763 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dvvz2"]
Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.086866 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5x728\" (UniqueName: \"kubernetes.io/projected/8f81bcfc-3c35-48e8-a584-961351e8c0e2-kube-api-access-5x728\") pod \"community-operators-twh8h\" (UID: \"8f81bcfc-3c35-48e8-a584-961351e8c0e2\") " pod="openshift-marketplace/community-operators-twh8h"
Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.140204 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.140390 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhlq2\" (UniqueName: \"kubernetes.io/projected/43acaee8-efc8-4156-b28c-b493f241ac53-kube-api-access-zhlq2\") pod \"certified-operators-dvvz2\" (UID: \"43acaee8-efc8-4156-b28c-b493f241ac53\") " pod="openshift-marketplace/certified-operators-dvvz2"
Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.140460 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43acaee8-efc8-4156-b28c-b493f241ac53-utilities\") pod \"certified-operators-dvvz2\" (UID: \"43acaee8-efc8-4156-b28c-b493f241ac53\") " pod="openshift-marketplace/certified-operators-dvvz2"
Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.140487 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43acaee8-efc8-4156-b28c-b493f241ac53-catalog-content\") pod \"certified-operators-dvvz2\" (UID: \"43acaee8-efc8-4156-b28c-b493f241ac53\") " pod="openshift-marketplace/certified-operators-dvvz2"
Mar 13 13:59:56 crc kubenswrapper[4898]: E0313 13:59:56.140568 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:59:56.640554454 +0000 UTC m=+231.642142693 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.157934 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gwvk4"
Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.165001 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-pvbpt"
Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.195633 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-779788b65f-vkvqq"]
Mar 13 13:59:56 crc kubenswrapper[4898]: E0313 13:59:56.195811 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1607f924-1e24-4848-b811-21ac3a7f8999" containerName="route-controller-manager"
Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.195823 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="1607f924-1e24-4848-b811-21ac3a7f8999" containerName="route-controller-manager"
Mar 13 13:59:56 crc kubenswrapper[4898]: E0313 13:59:56.195835 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4372b422-23c7-46bc-aec4-aef665acbda1" containerName="controller-manager"
Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.195843 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="4372b422-23c7-46bc-aec4-aef665acbda1" containerName="controller-manager"
Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.196221 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="1607f924-1e24-4848-b811-21ac3a7f8999" containerName="route-controller-manager"
Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.196246 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="4372b422-23c7-46bc-aec4-aef665acbda1" containerName="controller-manager"
Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.196562 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-779788b65f-vkvqq"
Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.217224 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-779788b65f-vkvqq"]
Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.236157 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-twh8h"
Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.241053 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1607f924-1e24-4848-b811-21ac3a7f8999-client-ca\") pod \"1607f924-1e24-4848-b811-21ac3a7f8999\" (UID: \"1607f924-1e24-4848-b811-21ac3a7f8999\") "
Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.241096 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bkqfk\" (UniqueName: \"kubernetes.io/projected/1607f924-1e24-4848-b811-21ac3a7f8999-kube-api-access-bkqfk\") pod \"1607f924-1e24-4848-b811-21ac3a7f8999\" (UID: \"1607f924-1e24-4848-b811-21ac3a7f8999\") "
Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.241144 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4372b422-23c7-46bc-aec4-aef665acbda1-config\") pod \"4372b422-23c7-46bc-aec4-aef665acbda1\" (UID: \"4372b422-23c7-46bc-aec4-aef665acbda1\") "
Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.241167 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4372b422-23c7-46bc-aec4-aef665acbda1-proxy-ca-bundles\") pod \"4372b422-23c7-46bc-aec4-aef665acbda1\" (UID: \"4372b422-23c7-46bc-aec4-aef665acbda1\") "
Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.241190 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1607f924-1e24-4848-b811-21ac3a7f8999-serving-cert\") pod \"1607f924-1e24-4848-b811-21ac3a7f8999\" (UID: \"1607f924-1e24-4848-b811-21ac3a7f8999\") "
Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.241220 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vv689\" (UniqueName: \"kubernetes.io/projected/4372b422-23c7-46bc-aec4-aef665acbda1-kube-api-access-vv689\") pod \"4372b422-23c7-46bc-aec4-aef665acbda1\" (UID: \"4372b422-23c7-46bc-aec4-aef665acbda1\") "
Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.241254 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4372b422-23c7-46bc-aec4-aef665acbda1-client-ca\") pod \"4372b422-23c7-46bc-aec4-aef665acbda1\" (UID: \"4372b422-23c7-46bc-aec4-aef665acbda1\") "
Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.241279 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1607f924-1e24-4848-b811-21ac3a7f8999-config\") pod \"1607f924-1e24-4848-b811-21ac3a7f8999\" (UID: \"1607f924-1e24-4848-b811-21ac3a7f8999\") "
Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.241365 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4372b422-23c7-46bc-aec4-aef665acbda1-serving-cert\") pod \"4372b422-23c7-46bc-aec4-aef665acbda1\" (UID: \"4372b422-23c7-46bc-aec4-aef665acbda1\") "
Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.241463 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhlq2\" (UniqueName: \"kubernetes.io/projected/43acaee8-efc8-4156-b28c-b493f241ac53-kube-api-access-zhlq2\") pod \"certified-operators-dvvz2\" (UID: \"43acaee8-efc8-4156-b28c-b493f241ac53\") " pod="openshift-marketplace/certified-operators-dvvz2"
Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.241488 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd-client-ca\") pod \"route-controller-manager-779788b65f-vkvqq\" (UID: \"07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd\") " pod="openshift-route-controller-manager/route-controller-manager-779788b65f-vkvqq"
Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.241526 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4gbm\" (UniqueName: \"kubernetes.io/projected/07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd-kube-api-access-n4gbm\") pod \"route-controller-manager-779788b65f-vkvqq\" (UID: \"07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd\") " pod="openshift-route-controller-manager/route-controller-manager-779788b65f-vkvqq"
Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.241544 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd-serving-cert\") pod \"route-controller-manager-779788b65f-vkvqq\" (UID: \"07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd\") " pod="openshift-route-controller-manager/route-controller-manager-779788b65f-vkvqq"
Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.241576 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd-config\") pod \"route-controller-manager-779788b65f-vkvqq\" (UID: \"07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd\") " pod="openshift-route-controller-manager/route-controller-manager-779788b65f-vkvqq"
Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.241598 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43acaee8-efc8-4156-b28c-b493f241ac53-utilities\") pod \"certified-operators-dvvz2\" (UID: \"43acaee8-efc8-4156-b28c-b493f241ac53\") " pod="openshift-marketplace/certified-operators-dvvz2"
Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.241619 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228"
Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.241637 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43acaee8-efc8-4156-b28c-b493f241ac53-catalog-content\") pod \"certified-operators-dvvz2\" (UID: \"43acaee8-efc8-4156-b28c-b493f241ac53\") " pod="openshift-marketplace/certified-operators-dvvz2"
Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.241986 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43acaee8-efc8-4156-b28c-b493f241ac53-catalog-content\") pod \"certified-operators-dvvz2\" (UID: \"43acaee8-efc8-4156-b28c-b493f241ac53\") " pod="openshift-marketplace/certified-operators-dvvz2"
Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.242532 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1607f924-1e24-4848-b811-21ac3a7f8999-client-ca" (OuterVolumeSpecName: "client-ca") pod "1607f924-1e24-4848-b811-21ac3a7f8999" (UID: "1607f924-1e24-4848-b811-21ac3a7f8999"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.244787 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4372b422-23c7-46bc-aec4-aef665acbda1-client-ca" (OuterVolumeSpecName: "client-ca") pod "4372b422-23c7-46bc-aec4-aef665acbda1" (UID: "4372b422-23c7-46bc-aec4-aef665acbda1"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.244785 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1607f924-1e24-4848-b811-21ac3a7f8999-config" (OuterVolumeSpecName: "config") pod "1607f924-1e24-4848-b811-21ac3a7f8999" (UID: "1607f924-1e24-4848-b811-21ac3a7f8999"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.244817 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43acaee8-efc8-4156-b28c-b493f241ac53-utilities\") pod \"certified-operators-dvvz2\" (UID: \"43acaee8-efc8-4156-b28c-b493f241ac53\") " pod="openshift-marketplace/certified-operators-dvvz2"
Mar 13 13:59:56 crc kubenswrapper[4898]: E0313 13:59:56.245096 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 13:59:56.745083488 +0000 UTC m=+231.746671727 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6n228" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.245450 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4372b422-23c7-46bc-aec4-aef665acbda1-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "4372b422-23c7-46bc-aec4-aef665acbda1" (UID: "4372b422-23c7-46bc-aec4-aef665acbda1"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.245831 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4372b422-23c7-46bc-aec4-aef665acbda1-config" (OuterVolumeSpecName: "config") pod "4372b422-23c7-46bc-aec4-aef665acbda1" (UID: "4372b422-23c7-46bc-aec4-aef665acbda1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.251683 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4372b422-23c7-46bc-aec4-aef665acbda1-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4372b422-23c7-46bc-aec4-aef665acbda1" (UID: "4372b422-23c7-46bc-aec4-aef665acbda1"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.251779 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1607f924-1e24-4848-b811-21ac3a7f8999-kube-api-access-bkqfk" (OuterVolumeSpecName: "kube-api-access-bkqfk") pod "1607f924-1e24-4848-b811-21ac3a7f8999" (UID: "1607f924-1e24-4848-b811-21ac3a7f8999"). InnerVolumeSpecName "kube-api-access-bkqfk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.262784 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4372b422-23c7-46bc-aec4-aef665acbda1-kube-api-access-vv689" (OuterVolumeSpecName: "kube-api-access-vv689") pod "4372b422-23c7-46bc-aec4-aef665acbda1" (UID: "4372b422-23c7-46bc-aec4-aef665acbda1"). InnerVolumeSpecName "kube-api-access-vv689". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.262847 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1607f924-1e24-4848-b811-21ac3a7f8999-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1607f924-1e24-4848-b811-21ac3a7f8999" (UID: "1607f924-1e24-4848-b811-21ac3a7f8999"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.268938 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhlq2\" (UniqueName: \"kubernetes.io/projected/43acaee8-efc8-4156-b28c-b493f241ac53-kube-api-access-zhlq2\") pod \"certified-operators-dvvz2\" (UID: \"43acaee8-efc8-4156-b28c-b493f241ac53\") " pod="openshift-marketplace/certified-operators-dvvz2"
Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.269944 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ppq6v"]
Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.270744 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ppq6v"
Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.288273 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ppq6v"]
Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.342650 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.342919 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a990881e-0caf-4096-a372-4cdad69006c1-utilities\") pod \"community-operators-ppq6v\" (UID: \"a990881e-0caf-4096-a372-4cdad69006c1\") " pod="openshift-marketplace/community-operators-ppq6v"
Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.342956 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd-config\") pod \"route-controller-manager-779788b65f-vkvqq\" (UID: \"07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd\") " pod="openshift-route-controller-manager/route-controller-manager-779788b65f-vkvqq"
Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.342975 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a990881e-0caf-4096-a372-4cdad69006c1-catalog-content\") pod \"community-operators-ppq6v\" (UID: \"a990881e-0caf-4096-a372-4cdad69006c1\") " pod="openshift-marketplace/community-operators-ppq6v"
Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.343024 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd-client-ca\") pod \"route-controller-manager-779788b65f-vkvqq\" (UID: \"07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd\") " pod="openshift-route-controller-manager/route-controller-manager-779788b65f-vkvqq"
Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.343067 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4gbm\" (UniqueName: \"kubernetes.io/projected/07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd-kube-api-access-n4gbm\") pod \"route-controller-manager-779788b65f-vkvqq\" (UID: \"07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd\") " pod="openshift-route-controller-manager/route-controller-manager-779788b65f-vkvqq"
Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.343085 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7vmd\" (UniqueName: \"kubernetes.io/projected/a990881e-0caf-4096-a372-4cdad69006c1-kube-api-access-x7vmd\") pod \"community-operators-ppq6v\" (UID: \"a990881e-0caf-4096-a372-4cdad69006c1\") " pod="openshift-marketplace/community-operators-ppq6v"
Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.343102 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd-serving-cert\") pod \"route-controller-manager-779788b65f-vkvqq\" (UID: \"07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd\") " pod="openshift-route-controller-manager/route-controller-manager-779788b65f-vkvqq"
Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.343137 4898 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1607f924-1e24-4848-b811-21ac3a7f8999-client-ca\") on node \"crc\" DevicePath \"\""
Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.343149 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bkqfk\" (UniqueName: \"kubernetes.io/projected/1607f924-1e24-4848-b811-21ac3a7f8999-kube-api-access-bkqfk\") on node \"crc\" DevicePath \"\""
Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.343160 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4372b422-23c7-46bc-aec4-aef665acbda1-config\") on node \"crc\" DevicePath \"\""
Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.343170 4898 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4372b422-23c7-46bc-aec4-aef665acbda1-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.343179 4898 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1607f924-1e24-4848-b811-21ac3a7f8999-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.343186 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vv689\" (UniqueName: \"kubernetes.io/projected/4372b422-23c7-46bc-aec4-aef665acbda1-kube-api-access-vv689\") on node \"crc\" DevicePath \"\""
Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.343194 4898 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4372b422-23c7-46bc-aec4-aef665acbda1-client-ca\") on node \"crc\" DevicePath \"\""
Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.343204 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1607f924-1e24-4848-b811-21ac3a7f8999-config\") on node \"crc\" DevicePath \"\""
Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.343212 4898 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4372b422-23c7-46bc-aec4-aef665acbda1-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 13 13:59:56 crc kubenswrapper[4898]: E0313 13:59:56.343561 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:59:56.843546717 +0000 UTC m=+231.845134956 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.344291 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd-client-ca\") pod \"route-controller-manager-779788b65f-vkvqq\" (UID: \"07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd\") " pod="openshift-route-controller-manager/route-controller-manager-779788b65f-vkvqq"
Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.344479 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd-config\") pod \"route-controller-manager-779788b65f-vkvqq\" (UID: \"07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd\") " pod="openshift-route-controller-manager/route-controller-manager-779788b65f-vkvqq"
Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.348233 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd-serving-cert\") pod \"route-controller-manager-779788b65f-vkvqq\" (UID: \"07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd\") " pod="openshift-route-controller-manager/route-controller-manager-779788b65f-vkvqq"
Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.370583 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4gbm\" (UniqueName: \"kubernetes.io/projected/07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd-kube-api-access-n4gbm\") pod
\"route-controller-manager-779788b65f-vkvqq\" (UID: \"07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd\") " pod="openshift-route-controller-manager/route-controller-manager-779788b65f-vkvqq" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.440415 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dvvz2" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.444987 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7vmd\" (UniqueName: \"kubernetes.io/projected/a990881e-0caf-4096-a372-4cdad69006c1-kube-api-access-x7vmd\") pod \"community-operators-ppq6v\" (UID: \"a990881e-0caf-4096-a372-4cdad69006c1\") " pod="openshift-marketplace/community-operators-ppq6v" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.445049 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a990881e-0caf-4096-a372-4cdad69006c1-utilities\") pod \"community-operators-ppq6v\" (UID: \"a990881e-0caf-4096-a372-4cdad69006c1\") " pod="openshift-marketplace/community-operators-ppq6v" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.445075 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a990881e-0caf-4096-a372-4cdad69006c1-catalog-content\") pod \"community-operators-ppq6v\" (UID: \"a990881e-0caf-4096-a372-4cdad69006c1\") " pod="openshift-marketplace/community-operators-ppq6v" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.445106 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:56 crc kubenswrapper[4898]: E0313 13:59:56.445463 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 13:59:56.945449658 +0000 UTC m=+231.947037897 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6n228" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.446139 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a990881e-0caf-4096-a372-4cdad69006c1-utilities\") pod \"community-operators-ppq6v\" (UID: \"a990881e-0caf-4096-a372-4cdad69006c1\") " pod="openshift-marketplace/community-operators-ppq6v" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.446319 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a990881e-0caf-4096-a372-4cdad69006c1-catalog-content\") pod \"community-operators-ppq6v\" (UID: \"a990881e-0caf-4096-a372-4cdad69006c1\") " pod="openshift-marketplace/community-operators-ppq6v" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.470790 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7vmd\" (UniqueName: \"kubernetes.io/projected/a990881e-0caf-4096-a372-4cdad69006c1-kube-api-access-x7vmd\") pod \"community-operators-ppq6v\" (UID: \"a990881e-0caf-4096-a372-4cdad69006c1\") " 
pod="openshift-marketplace/community-operators-ppq6v" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.480938 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xh84s"] Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.482022 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xh84s" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.489514 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xh84s"] Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.521248 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7zpw2" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.545322 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-779788b65f-vkvqq" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.546121 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.546311 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ae77efc-55ca-4eee-8817-9c21d0bafa6e-utilities\") pod \"certified-operators-xh84s\" (UID: \"4ae77efc-55ca-4eee-8817-9c21d0bafa6e\") " pod="openshift-marketplace/certified-operators-xh84s" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.546359 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-5nn9n\" (UniqueName: \"kubernetes.io/projected/4ae77efc-55ca-4eee-8817-9c21d0bafa6e-kube-api-access-5nn9n\") pod \"certified-operators-xh84s\" (UID: \"4ae77efc-55ca-4eee-8817-9c21d0bafa6e\") " pod="openshift-marketplace/certified-operators-xh84s" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.546399 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ae77efc-55ca-4eee-8817-9c21d0bafa6e-catalog-content\") pod \"certified-operators-xh84s\" (UID: \"4ae77efc-55ca-4eee-8817-9c21d0bafa6e\") " pod="openshift-marketplace/certified-operators-xh84s" Mar 13 13:59:56 crc kubenswrapper[4898]: E0313 13:59:56.546520 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:59:57.046506169 +0000 UTC m=+232.048094408 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.614154 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ppq6v" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.631685 4898 generic.go:334] "Generic (PLEG): container finished" podID="1607f924-1e24-4848-b811-21ac3a7f8999" containerID="4b7bee413921e7c7eec939e46b8977335a1baead35b676d6082b18bb10857c3e" exitCode=0 Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.631739 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gwvk4" event={"ID":"1607f924-1e24-4848-b811-21ac3a7f8999","Type":"ContainerDied","Data":"4b7bee413921e7c7eec939e46b8977335a1baead35b676d6082b18bb10857c3e"} Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.631802 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gwvk4" event={"ID":"1607f924-1e24-4848-b811-21ac3a7f8999","Type":"ContainerDied","Data":"ea436643cbf70a5d9613211ff3822168c4bccbc29674c27f320c9a9a2e6fcdbf"} Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.631801 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gwvk4" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.631824 4898 scope.go:117] "RemoveContainer" containerID="4b7bee413921e7c7eec939e46b8977335a1baead35b676d6082b18bb10857c3e" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.635973 4898 patch_prober.go:28] interesting pod/router-default-5444994796-6plhg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 13:59:56 crc kubenswrapper[4898]: [-]has-synced failed: reason withheld Mar 13 13:59:56 crc kubenswrapper[4898]: [+]process-running ok Mar 13 13:59:56 crc kubenswrapper[4898]: healthz check failed Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.636039 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6plhg" podUID="18e5c8bf-9fe0-465e-af8f-9e7ec7400be8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.644430 4898 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.649992 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ae77efc-55ca-4eee-8817-9c21d0bafa6e-catalog-content\") pod \"certified-operators-xh84s\" (UID: \"4ae77efc-55ca-4eee-8817-9c21d0bafa6e\") " pod="openshift-marketplace/certified-operators-xh84s" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.650076 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.650139 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ae77efc-55ca-4eee-8817-9c21d0bafa6e-utilities\") pod \"certified-operators-xh84s\" (UID: \"4ae77efc-55ca-4eee-8817-9c21d0bafa6e\") " pod="openshift-marketplace/certified-operators-xh84s" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.650377 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nn9n\" (UniqueName: \"kubernetes.io/projected/4ae77efc-55ca-4eee-8817-9c21d0bafa6e-kube-api-access-5nn9n\") pod \"certified-operators-xh84s\" (UID: \"4ae77efc-55ca-4eee-8817-9c21d0bafa6e\") " pod="openshift-marketplace/certified-operators-xh84s" Mar 13 13:59:56 crc kubenswrapper[4898]: E0313 13:59:56.651434 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 13:59:57.151419082 +0000 UTC m=+232.153007321 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6n228" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.651804 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ae77efc-55ca-4eee-8817-9c21d0bafa6e-utilities\") pod \"certified-operators-xh84s\" (UID: \"4ae77efc-55ca-4eee-8817-9c21d0bafa6e\") " pod="openshift-marketplace/certified-operators-xh84s" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.653166 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ae77efc-55ca-4eee-8817-9c21d0bafa6e-catalog-content\") pod \"certified-operators-xh84s\" (UID: \"4ae77efc-55ca-4eee-8817-9c21d0bafa6e\") " pod="openshift-marketplace/certified-operators-xh84s" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.672503 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-rd22p" event={"ID":"41000ce4-1a84-44de-b283-1fe0350b1c17","Type":"ContainerStarted","Data":"7edbe6333ef083f7ac085e082e2b761601f88273dd2bfb4af58267d359fc3a4d"} Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.672766 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-rd22p" event={"ID":"41000ce4-1a84-44de-b283-1fe0350b1c17","Type":"ContainerStarted","Data":"0e43a3819750890872b613d51ec964234dbb52d5010a001f26b9ad677d458ca0"} Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.679998 4898 generic.go:334] "Generic (PLEG): container finished" 
podID="4372b422-23c7-46bc-aec4-aef665acbda1" containerID="4b1fbbf5e660b92a0c53b78de1566661ef67eefec09d4d3752584910bdc778c9" exitCode=0 Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.680270 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-pvbpt" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.680626 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-pvbpt" event={"ID":"4372b422-23c7-46bc-aec4-aef665acbda1","Type":"ContainerDied","Data":"4b1fbbf5e660b92a0c53b78de1566661ef67eefec09d4d3752584910bdc778c9"} Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.680784 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-pvbpt" event={"ID":"4372b422-23c7-46bc-aec4-aef665acbda1","Type":"ContainerDied","Data":"ee22c587d8deac6e60488ead6927720218bbd5dcec6d2870cf91f10d1559c75e"} Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.680631 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nn9n\" (UniqueName: \"kubernetes.io/projected/4ae77efc-55ca-4eee-8817-9c21d0bafa6e-kube-api-access-5nn9n\") pod \"certified-operators-xh84s\" (UID: \"4ae77efc-55ca-4eee-8817-9c21d0bafa6e\") " pod="openshift-marketplace/certified-operators-xh84s" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.708561 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-p8r99" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.716818 4898 scope.go:117] "RemoveContainer" containerID="4b7bee413921e7c7eec939e46b8977335a1baead35b676d6082b18bb10857c3e" Mar 13 13:59:56 crc kubenswrapper[4898]: E0313 13:59:56.717304 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"4b7bee413921e7c7eec939e46b8977335a1baead35b676d6082b18bb10857c3e\": container with ID starting with 4b7bee413921e7c7eec939e46b8977335a1baead35b676d6082b18bb10857c3e not found: ID does not exist" containerID="4b7bee413921e7c7eec939e46b8977335a1baead35b676d6082b18bb10857c3e" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.717371 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b7bee413921e7c7eec939e46b8977335a1baead35b676d6082b18bb10857c3e"} err="failed to get container status \"4b7bee413921e7c7eec939e46b8977335a1baead35b676d6082b18bb10857c3e\": rpc error: code = NotFound desc = could not find container \"4b7bee413921e7c7eec939e46b8977335a1baead35b676d6082b18bb10857c3e\": container with ID starting with 4b7bee413921e7c7eec939e46b8977335a1baead35b676d6082b18bb10857c3e not found: ID does not exist" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.717545 4898 scope.go:117] "RemoveContainer" containerID="4b1fbbf5e660b92a0c53b78de1566661ef67eefec09d4d3752584910bdc778c9" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.740821 4898 scope.go:117] "RemoveContainer" containerID="4b1fbbf5e660b92a0c53b78de1566661ef67eefec09d4d3752584910bdc778c9" Mar 13 13:59:56 crc kubenswrapper[4898]: E0313 13:59:56.742841 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b1fbbf5e660b92a0c53b78de1566661ef67eefec09d4d3752584910bdc778c9\": container with ID starting with 4b1fbbf5e660b92a0c53b78de1566661ef67eefec09d4d3752584910bdc778c9 not found: ID does not exist" containerID="4b1fbbf5e660b92a0c53b78de1566661ef67eefec09d4d3752584910bdc778c9" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.742961 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b1fbbf5e660b92a0c53b78de1566661ef67eefec09d4d3752584910bdc778c9"} err="failed to get container status 
\"4b1fbbf5e660b92a0c53b78de1566661ef67eefec09d4d3752584910bdc778c9\": rpc error: code = NotFound desc = could not find container \"4b1fbbf5e660b92a0c53b78de1566661ef67eefec09d4d3752584910bdc778c9\": container with ID starting with 4b1fbbf5e660b92a0c53b78de1566661ef67eefec09d4d3752584910bdc778c9 not found: ID does not exist" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.750867 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:59:56 crc kubenswrapper[4898]: E0313 13:59:56.752417 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:59:57.252402131 +0000 UTC m=+232.253990370 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.759630 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pvbpt"] Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.765455 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pvbpt"] Mar 13 13:59:56 crc kubenswrapper[4898]: E0313 13:59:56.772107 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1607f924_1e24_4848_b811_21ac3a7f8999.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1607f924_1e24_4848_b811_21ac3a7f8999.slice/crio-ea436643cbf70a5d9613211ff3822168c4bccbc29674c27f320c9a9a2e6fcdbf\": RecentStats: unable to find data in memory cache]" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.772878 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-gwvk4"] Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.772996 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-gwvk4"] Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.785019 4898 ???:1] "http: TLS handshake error from 192.168.126.11:44142: no serving certificate available for the kubelet" Mar 13 13:59:56 crc 
kubenswrapper[4898]: I0313 13:59:56.803986 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xh84s" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.833121 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-twh8h"] Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.859099 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:56 crc kubenswrapper[4898]: E0313 13:59:56.859504 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 13:59:57.359490506 +0000 UTC m=+232.361078745 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6n228" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.956956 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-779788b65f-vkvqq"] Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.959765 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:59:56 crc kubenswrapper[4898]: E0313 13:59:56.960168 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:59:57.460151258 +0000 UTC m=+232.461739497 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.002979 4898 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-13T13:59:56.644446516Z","Handler":null,"Name":""} Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.006653 4898 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.006688 4898 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.019838 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ppq6v"] Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.043052 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dvvz2"] Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.061265 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.064629 4898 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.064669 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.073517 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xh84s"] Mar 13 13:59:57 crc kubenswrapper[4898]: W0313 13:59:57.084789 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ae77efc_55ca_4eee_8817_9c21d0bafa6e.slice/crio-45fc69d27eaeb1e52f659215a5860c090893736d0fa5aca134749f73422aadc9 WatchSource:0}: Error finding container 45fc69d27eaeb1e52f659215a5860c090893736d0fa5aca134749f73422aadc9: Status 404 returned error can't find the container with id 45fc69d27eaeb1e52f659215a5860c090893736d0fa5aca134749f73422aadc9 Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.120026 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.162878 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.171461 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.201681 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.423602 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6n228"] Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.613623 4898 patch_prober.go:28] interesting pod/router-default-5444994796-6plhg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 13:59:57 crc kubenswrapper[4898]: [-]has-synced failed: reason withheld Mar 13 13:59:57 crc kubenswrapper[4898]: [+]process-running ok Mar 13 13:59:57 crc kubenswrapper[4898]: healthz check failed Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.613889 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6plhg" podUID="18e5c8bf-9fe0-465e-af8f-9e7ec7400be8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.704485 4898 generic.go:334] "Generic (PLEG): container finished" podID="8f81bcfc-3c35-48e8-a584-961351e8c0e2" containerID="af2b26d62c785f829d6f729b05b9a482b0b3ab930c91a34e55f9b679910cf380" exitCode=0 Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.704547 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-twh8h" event={"ID":"8f81bcfc-3c35-48e8-a584-961351e8c0e2","Type":"ContainerDied","Data":"af2b26d62c785f829d6f729b05b9a482b0b3ab930c91a34e55f9b679910cf380"} Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.704570 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-twh8h" 
event={"ID":"8f81bcfc-3c35-48e8-a584-961351e8c0e2","Type":"ContainerStarted","Data":"81fb34feaf2adf00d5d07da217b484c8e9d6cdeb7a039901668613864eddf170"} Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.710368 4898 generic.go:334] "Generic (PLEG): container finished" podID="43acaee8-efc8-4156-b28c-b493f241ac53" containerID="2d6a6fc4d86890be4033989a74b2cd86971250c91cd72a349673fbbc352230cf" exitCode=0 Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.710529 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dvvz2" event={"ID":"43acaee8-efc8-4156-b28c-b493f241ac53","Type":"ContainerDied","Data":"2d6a6fc4d86890be4033989a74b2cd86971250c91cd72a349673fbbc352230cf"} Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.710560 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dvvz2" event={"ID":"43acaee8-efc8-4156-b28c-b493f241ac53","Type":"ContainerStarted","Data":"6d70382f54646dad1c6a01020a09851e8f00eda076ad91d5aba2e586ae668444"} Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.712133 4898 generic.go:334] "Generic (PLEG): container finished" podID="4ae77efc-55ca-4eee-8817-9c21d0bafa6e" containerID="359074a54fdd2abe01e5471c8009872f5ca05eb132b157ad005435e3bc55c0f9" exitCode=0 Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.712181 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xh84s" event={"ID":"4ae77efc-55ca-4eee-8817-9c21d0bafa6e","Type":"ContainerDied","Data":"359074a54fdd2abe01e5471c8009872f5ca05eb132b157ad005435e3bc55c0f9"} Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.712201 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xh84s" event={"ID":"4ae77efc-55ca-4eee-8817-9c21d0bafa6e","Type":"ContainerStarted","Data":"45fc69d27eaeb1e52f659215a5860c090893736d0fa5aca134749f73422aadc9"} Mar 13 13:59:57 crc kubenswrapper[4898]: 
I0313 13:59:57.715502 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-rd22p" event={"ID":"41000ce4-1a84-44de-b283-1fe0350b1c17","Type":"ContainerStarted","Data":"3613fad8930f3f95230fbc9488748abf70ebc3c0e1bf2d4df0f2884fc85bed77"} Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.717498 4898 generic.go:334] "Generic (PLEG): container finished" podID="a990881e-0caf-4096-a372-4cdad69006c1" containerID="3f6d76254e697191c7e800bb760967dc1adfad9f4667e33eaf85c00c3d7a9263" exitCode=0 Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.717549 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ppq6v" event={"ID":"a990881e-0caf-4096-a372-4cdad69006c1","Type":"ContainerDied","Data":"3f6d76254e697191c7e800bb760967dc1adfad9f4667e33eaf85c00c3d7a9263"} Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.717569 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ppq6v" event={"ID":"a990881e-0caf-4096-a372-4cdad69006c1","Type":"ContainerStarted","Data":"3acac09dab7fc6e01d8b6bf7a368fc3881544da372e2f3a95826c1fc007510c2"} Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.721207 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-779788b65f-vkvqq" event={"ID":"07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd","Type":"ContainerStarted","Data":"a636a0496339c0fb58170ff34b55715db6af8e4ac5ce6fd1db5551669c78c588"} Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.721236 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-779788b65f-vkvqq" event={"ID":"07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd","Type":"ContainerStarted","Data":"a9e681aa51d97234d007d99a849ac6425a4e895b1487edcb5c9a6fe14935144f"} Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.722453 4898 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-779788b65f-vkvqq" Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.728768 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-6n228" event={"ID":"b08c305d-b9fc-4c5c-85c1-8281b9608bcf","Type":"ContainerStarted","Data":"f91957737e0395e724d38a589c69e42d1186dc2a931994cc08b0d0b2fc46b2b2"} Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.728810 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-6n228" event={"ID":"b08c305d-b9fc-4c5c-85c1-8281b9608bcf","Type":"ContainerStarted","Data":"cb30f09f65c6668eae49d8e2a5f1518ff1c19e2eb8fcc21bf1f743165319e716"} Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.746461 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-779788b65f-vkvqq" podStartSLOduration=4.746443417 podStartE2EDuration="4.746443417s" podCreationTimestamp="2026-03-13 13:59:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:57.743698882 +0000 UTC m=+232.745287141" watchObservedRunningTime="2026-03-13 13:59:57.746443417 +0000 UTC m=+232.748031656" Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.748996 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1607f924-1e24-4848-b811-21ac3a7f8999" path="/var/lib/kubelet/pods/1607f924-1e24-4848-b811-21ac3a7f8999/volumes" Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.749609 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4372b422-23c7-46bc-aec4-aef665acbda1" path="/var/lib/kubelet/pods/4372b422-23c7-46bc-aec4-aef665acbda1/volumes" Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.750226 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.789724 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-rd22p" podStartSLOduration=10.78970996 podStartE2EDuration="10.78970996s" podCreationTimestamp="2026-03-13 13:59:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:57.788358407 +0000 UTC m=+232.789946646" watchObservedRunningTime="2026-03-13 13:59:57.78970996 +0000 UTC m=+232.791298199" Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.810459 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-779788b65f-vkvqq" Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.848404 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-6n228" podStartSLOduration=166.848383299 podStartE2EDuration="2m46.848383299s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:57.825283018 +0000 UTC m=+232.826871267" watchObservedRunningTime="2026-03-13 13:59:57.848383299 +0000 UTC m=+232.849971538" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.068313 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-h97c9"] Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.069207 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h97c9" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.070878 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.083265 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h97c9"] Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.183877 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5s4l\" (UniqueName: \"kubernetes.io/projected/f85f72a8-3887-4867-8a9c-649992ce23f1-kube-api-access-m5s4l\") pod \"redhat-marketplace-h97c9\" (UID: \"f85f72a8-3887-4867-8a9c-649992ce23f1\") " pod="openshift-marketplace/redhat-marketplace-h97c9" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.184019 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f85f72a8-3887-4867-8a9c-649992ce23f1-utilities\") pod \"redhat-marketplace-h97c9\" (UID: \"f85f72a8-3887-4867-8a9c-649992ce23f1\") " pod="openshift-marketplace/redhat-marketplace-h97c9" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.184054 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f85f72a8-3887-4867-8a9c-649992ce23f1-catalog-content\") pod \"redhat-marketplace-h97c9\" (UID: \"f85f72a8-3887-4867-8a9c-649992ce23f1\") " pod="openshift-marketplace/redhat-marketplace-h97c9" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.285621 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5s4l\" (UniqueName: \"kubernetes.io/projected/f85f72a8-3887-4867-8a9c-649992ce23f1-kube-api-access-m5s4l\") pod \"redhat-marketplace-h97c9\" (UID: 
\"f85f72a8-3887-4867-8a9c-649992ce23f1\") " pod="openshift-marketplace/redhat-marketplace-h97c9" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.285672 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f85f72a8-3887-4867-8a9c-649992ce23f1-utilities\") pod \"redhat-marketplace-h97c9\" (UID: \"f85f72a8-3887-4867-8a9c-649992ce23f1\") " pod="openshift-marketplace/redhat-marketplace-h97c9" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.285688 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f85f72a8-3887-4867-8a9c-649992ce23f1-catalog-content\") pod \"redhat-marketplace-h97c9\" (UID: \"f85f72a8-3887-4867-8a9c-649992ce23f1\") " pod="openshift-marketplace/redhat-marketplace-h97c9" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.286457 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f85f72a8-3887-4867-8a9c-649992ce23f1-utilities\") pod \"redhat-marketplace-h97c9\" (UID: \"f85f72a8-3887-4867-8a9c-649992ce23f1\") " pod="openshift-marketplace/redhat-marketplace-h97c9" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.286520 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f85f72a8-3887-4867-8a9c-649992ce23f1-catalog-content\") pod \"redhat-marketplace-h97c9\" (UID: \"f85f72a8-3887-4867-8a9c-649992ce23f1\") " pod="openshift-marketplace/redhat-marketplace-h97c9" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.306721 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5s4l\" (UniqueName: \"kubernetes.io/projected/f85f72a8-3887-4867-8a9c-649992ce23f1-kube-api-access-m5s4l\") pod \"redhat-marketplace-h97c9\" (UID: \"f85f72a8-3887-4867-8a9c-649992ce23f1\") " 
pod="openshift-marketplace/redhat-marketplace-h97c9" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.425964 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h97c9" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.470441 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hn9sl"] Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.471988 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hn9sl" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.498765 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hn9sl"] Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.617775 4898 patch_prober.go:28] interesting pod/router-default-5444994796-6plhg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 13:59:58 crc kubenswrapper[4898]: [-]has-synced failed: reason withheld Mar 13 13:59:58 crc kubenswrapper[4898]: [+]process-running ok Mar 13 13:59:58 crc kubenswrapper[4898]: healthz check failed Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.617833 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6plhg" podUID="18e5c8bf-9fe0-465e-af8f-9e7ec7400be8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.622243 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8bc0c30-71e1-41d2-8991-1ce9d85d50a1-utilities\") pod \"redhat-marketplace-hn9sl\" (UID: \"b8bc0c30-71e1-41d2-8991-1ce9d85d50a1\") " 
pod="openshift-marketplace/redhat-marketplace-hn9sl" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.622298 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8bc0c30-71e1-41d2-8991-1ce9d85d50a1-catalog-content\") pod \"redhat-marketplace-hn9sl\" (UID: \"b8bc0c30-71e1-41d2-8991-1ce9d85d50a1\") " pod="openshift-marketplace/redhat-marketplace-hn9sl" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.622342 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22rm7\" (UniqueName: \"kubernetes.io/projected/b8bc0c30-71e1-41d2-8991-1ce9d85d50a1-kube-api-access-22rm7\") pod \"redhat-marketplace-hn9sl\" (UID: \"b8bc0c30-71e1-41d2-8991-1ce9d85d50a1\") " pod="openshift-marketplace/redhat-marketplace-hn9sl" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.688026 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.688729 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.691405 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.691456 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.697330 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.723588 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8bc0c30-71e1-41d2-8991-1ce9d85d50a1-utilities\") pod \"redhat-marketplace-hn9sl\" (UID: \"b8bc0c30-71e1-41d2-8991-1ce9d85d50a1\") " pod="openshift-marketplace/redhat-marketplace-hn9sl" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.723979 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8bc0c30-71e1-41d2-8991-1ce9d85d50a1-catalog-content\") pod \"redhat-marketplace-hn9sl\" (UID: \"b8bc0c30-71e1-41d2-8991-1ce9d85d50a1\") " pod="openshift-marketplace/redhat-marketplace-hn9sl" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.724035 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22rm7\" (UniqueName: \"kubernetes.io/projected/b8bc0c30-71e1-41d2-8991-1ce9d85d50a1-kube-api-access-22rm7\") pod \"redhat-marketplace-hn9sl\" (UID: \"b8bc0c30-71e1-41d2-8991-1ce9d85d50a1\") " pod="openshift-marketplace/redhat-marketplace-hn9sl" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.724667 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/b8bc0c30-71e1-41d2-8991-1ce9d85d50a1-catalog-content\") pod \"redhat-marketplace-hn9sl\" (UID: \"b8bc0c30-71e1-41d2-8991-1ce9d85d50a1\") " pod="openshift-marketplace/redhat-marketplace-hn9sl" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.724679 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8bc0c30-71e1-41d2-8991-1ce9d85d50a1-utilities\") pod \"redhat-marketplace-hn9sl\" (UID: \"b8bc0c30-71e1-41d2-8991-1ce9d85d50a1\") " pod="openshift-marketplace/redhat-marketplace-hn9sl" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.741274 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.747564 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22rm7\" (UniqueName: \"kubernetes.io/projected/b8bc0c30-71e1-41d2-8991-1ce9d85d50a1-kube-api-access-22rm7\") pod \"redhat-marketplace-hn9sl\" (UID: \"b8bc0c30-71e1-41d2-8991-1ce9d85d50a1\") " pod="openshift-marketplace/redhat-marketplace-hn9sl" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.761269 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6dc964fb55-scb8h"] Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.762052 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6dc964fb55-scb8h" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.768422 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.768606 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.768864 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.769121 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.769171 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.769486 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.774120 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.776946 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6dc964fb55-scb8h"] Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.824863 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0b236b43-7ef1-4447-9182-2a37ee70fb95-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"0b236b43-7ef1-4447-9182-2a37ee70fb95\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 
13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.824967 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b236b43-7ef1-4447-9182-2a37ee70fb95-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"0b236b43-7ef1-4447-9182-2a37ee70fb95\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.838242 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hn9sl" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.927198 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0b236b43-7ef1-4447-9182-2a37ee70fb95-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"0b236b43-7ef1-4447-9182-2a37ee70fb95\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.927311 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b236b43-7ef1-4447-9182-2a37ee70fb95-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"0b236b43-7ef1-4447-9182-2a37ee70fb95\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.927380 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab82b4-6104-416c-a69a-b942da8e5c21-config\") pod \"controller-manager-6dc964fb55-scb8h\" (UID: \"01ab82b4-6104-416c-a69a-b942da8e5c21\") " pod="openshift-controller-manager/controller-manager-6dc964fb55-scb8h" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.927452 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-p6t65\" (UniqueName: \"kubernetes.io/projected/01ab82b4-6104-416c-a69a-b942da8e5c21-kube-api-access-p6t65\") pod \"controller-manager-6dc964fb55-scb8h\" (UID: \"01ab82b4-6104-416c-a69a-b942da8e5c21\") " pod="openshift-controller-manager/controller-manager-6dc964fb55-scb8h" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.927508 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/01ab82b4-6104-416c-a69a-b942da8e5c21-client-ca\") pod \"controller-manager-6dc964fb55-scb8h\" (UID: \"01ab82b4-6104-416c-a69a-b942da8e5c21\") " pod="openshift-controller-manager/controller-manager-6dc964fb55-scb8h" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.927563 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/01ab82b4-6104-416c-a69a-b942da8e5c21-proxy-ca-bundles\") pod \"controller-manager-6dc964fb55-scb8h\" (UID: \"01ab82b4-6104-416c-a69a-b942da8e5c21\") " pod="openshift-controller-manager/controller-manager-6dc964fb55-scb8h" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.927615 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab82b4-6104-416c-a69a-b942da8e5c21-serving-cert\") pod \"controller-manager-6dc964fb55-scb8h\" (UID: \"01ab82b4-6104-416c-a69a-b942da8e5c21\") " pod="openshift-controller-manager/controller-manager-6dc964fb55-scb8h" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.927773 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0b236b43-7ef1-4447-9182-2a37ee70fb95-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"0b236b43-7ef1-4447-9182-2a37ee70fb95\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 
13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.931763 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h97c9"] Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.953629 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b236b43-7ef1-4447-9182-2a37ee70fb95-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"0b236b43-7ef1-4447-9182-2a37ee70fb95\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 13:59:58 crc kubenswrapper[4898]: W0313 13:59:58.963169 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85f72a8_3887_4867_8a9c_649992ce23f1.slice/crio-da11d51940a63fb9fd52ba5896a1fe2ba45d932b66b7a36000029b7816a483fc WatchSource:0}: Error finding container da11d51940a63fb9fd52ba5896a1fe2ba45d932b66b7a36000029b7816a483fc: Status 404 returned error can't find the container with id da11d51940a63fb9fd52ba5896a1fe2ba45d932b66b7a36000029b7816a483fc Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.019772 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.029803 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab82b4-6104-416c-a69a-b942da8e5c21-config\") pod \"controller-manager-6dc964fb55-scb8h\" (UID: \"01ab82b4-6104-416c-a69a-b942da8e5c21\") " pod="openshift-controller-manager/controller-manager-6dc964fb55-scb8h" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.029876 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6t65\" (UniqueName: \"kubernetes.io/projected/01ab82b4-6104-416c-a69a-b942da8e5c21-kube-api-access-p6t65\") pod \"controller-manager-6dc964fb55-scb8h\" (UID: \"01ab82b4-6104-416c-a69a-b942da8e5c21\") " pod="openshift-controller-manager/controller-manager-6dc964fb55-scb8h" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.029928 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/01ab82b4-6104-416c-a69a-b942da8e5c21-client-ca\") pod \"controller-manager-6dc964fb55-scb8h\" (UID: \"01ab82b4-6104-416c-a69a-b942da8e5c21\") " pod="openshift-controller-manager/controller-manager-6dc964fb55-scb8h" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.029970 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/01ab82b4-6104-416c-a69a-b942da8e5c21-proxy-ca-bundles\") pod \"controller-manager-6dc964fb55-scb8h\" (UID: \"01ab82b4-6104-416c-a69a-b942da8e5c21\") " pod="openshift-controller-manager/controller-manager-6dc964fb55-scb8h" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.030008 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/01ab82b4-6104-416c-a69a-b942da8e5c21-serving-cert\") pod \"controller-manager-6dc964fb55-scb8h\" (UID: \"01ab82b4-6104-416c-a69a-b942da8e5c21\") " pod="openshift-controller-manager/controller-manager-6dc964fb55-scb8h" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.031032 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/01ab82b4-6104-416c-a69a-b942da8e5c21-client-ca\") pod \"controller-manager-6dc964fb55-scb8h\" (UID: \"01ab82b4-6104-416c-a69a-b942da8e5c21\") " pod="openshift-controller-manager/controller-manager-6dc964fb55-scb8h" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.032112 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/01ab82b4-6104-416c-a69a-b942da8e5c21-proxy-ca-bundles\") pod \"controller-manager-6dc964fb55-scb8h\" (UID: \"01ab82b4-6104-416c-a69a-b942da8e5c21\") " pod="openshift-controller-manager/controller-manager-6dc964fb55-scb8h" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.032759 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab82b4-6104-416c-a69a-b942da8e5c21-config\") pod \"controller-manager-6dc964fb55-scb8h\" (UID: \"01ab82b4-6104-416c-a69a-b942da8e5c21\") " pod="openshift-controller-manager/controller-manager-6dc964fb55-scb8h" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.046478 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab82b4-6104-416c-a69a-b942da8e5c21-serving-cert\") pod \"controller-manager-6dc964fb55-scb8h\" (UID: \"01ab82b4-6104-416c-a69a-b942da8e5c21\") " pod="openshift-controller-manager/controller-manager-6dc964fb55-scb8h" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.049237 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-p6t65\" (UniqueName: \"kubernetes.io/projected/01ab82b4-6104-416c-a69a-b942da8e5c21-kube-api-access-p6t65\") pod \"controller-manager-6dc964fb55-scb8h\" (UID: \"01ab82b4-6104-416c-a69a-b942da8e5c21\") " pod="openshift-controller-manager/controller-manager-6dc964fb55-scb8h" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.067850 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-974qp"] Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.068931 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-974qp" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.072535 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.076188 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-974qp"] Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.078364 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6dc964fb55-scb8h" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.168918 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hn9sl"] Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.232076 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/183d86e9-cd5c-45ed-a460-bb6169e07c72-catalog-content\") pod \"redhat-operators-974qp\" (UID: \"183d86e9-cd5c-45ed-a460-bb6169e07c72\") " pod="openshift-marketplace/redhat-operators-974qp" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.232136 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnv8v\" (UniqueName: \"kubernetes.io/projected/183d86e9-cd5c-45ed-a460-bb6169e07c72-kube-api-access-vnv8v\") pod \"redhat-operators-974qp\" (UID: \"183d86e9-cd5c-45ed-a460-bb6169e07c72\") " pod="openshift-marketplace/redhat-operators-974qp" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.232201 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/183d86e9-cd5c-45ed-a460-bb6169e07c72-utilities\") pod \"redhat-operators-974qp\" (UID: \"183d86e9-cd5c-45ed-a460-bb6169e07c72\") " pod="openshift-marketplace/redhat-operators-974qp" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.334189 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/183d86e9-cd5c-45ed-a460-bb6169e07c72-utilities\") pod \"redhat-operators-974qp\" (UID: \"183d86e9-cd5c-45ed-a460-bb6169e07c72\") " pod="openshift-marketplace/redhat-operators-974qp" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.334734 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/183d86e9-cd5c-45ed-a460-bb6169e07c72-catalog-content\") pod \"redhat-operators-974qp\" (UID: \"183d86e9-cd5c-45ed-a460-bb6169e07c72\") " pod="openshift-marketplace/redhat-operators-974qp" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.334774 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnv8v\" (UniqueName: \"kubernetes.io/projected/183d86e9-cd5c-45ed-a460-bb6169e07c72-kube-api-access-vnv8v\") pod \"redhat-operators-974qp\" (UID: \"183d86e9-cd5c-45ed-a460-bb6169e07c72\") " pod="openshift-marketplace/redhat-operators-974qp" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.336283 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/183d86e9-cd5c-45ed-a460-bb6169e07c72-utilities\") pod \"redhat-operators-974qp\" (UID: \"183d86e9-cd5c-45ed-a460-bb6169e07c72\") " pod="openshift-marketplace/redhat-operators-974qp" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.337596 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/183d86e9-cd5c-45ed-a460-bb6169e07c72-catalog-content\") pod \"redhat-operators-974qp\" (UID: \"183d86e9-cd5c-45ed-a460-bb6169e07c72\") " pod="openshift-marketplace/redhat-operators-974qp" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.367281 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnv8v\" (UniqueName: \"kubernetes.io/projected/183d86e9-cd5c-45ed-a460-bb6169e07c72-kube-api-access-vnv8v\") pod \"redhat-operators-974qp\" (UID: \"183d86e9-cd5c-45ed-a460-bb6169e07c72\") " pod="openshift-marketplace/redhat-operators-974qp" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.385970 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.406425 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-974qp" Mar 13 13:59:59 crc kubenswrapper[4898]: W0313 13:59:59.409161 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod0b236b43_7ef1_4447_9182_2a37ee70fb95.slice/crio-fc10576c94c994ab6a2d2cd723a2db57c1b36baa9540adf4ca4a9e2c7f87f4bd WatchSource:0}: Error finding container fc10576c94c994ab6a2d2cd723a2db57c1b36baa9540adf4ca4a9e2c7f87f4bd: Status 404 returned error can't find the container with id fc10576c94c994ab6a2d2cd723a2db57c1b36baa9540adf4ca4a9e2c7f87f4bd Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.457504 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6dc964fb55-scb8h"] Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.473885 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-btkxt"] Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.475119 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-btkxt" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.487664 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-btkxt"] Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.610370 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-6plhg" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.614286 4898 patch_prober.go:28] interesting pod/router-default-5444994796-6plhg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 13:59:59 crc kubenswrapper[4898]: [-]has-synced failed: reason withheld Mar 13 13:59:59 crc kubenswrapper[4898]: [+]process-running ok Mar 13 13:59:59 crc kubenswrapper[4898]: healthz check failed Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.614322 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6plhg" podUID="18e5c8bf-9fe0-465e-af8f-9e7ec7400be8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.643653 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7794a943-5fec-485e-86bf-f104ed6ae070-catalog-content\") pod \"redhat-operators-btkxt\" (UID: \"7794a943-5fec-485e-86bf-f104ed6ae070\") " pod="openshift-marketplace/redhat-operators-btkxt" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.643722 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7794a943-5fec-485e-86bf-f104ed6ae070-utilities\") pod \"redhat-operators-btkxt\" (UID: 
\"7794a943-5fec-485e-86bf-f104ed6ae070\") " pod="openshift-marketplace/redhat-operators-btkxt" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.643964 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66sn5\" (UniqueName: \"kubernetes.io/projected/7794a943-5fec-485e-86bf-f104ed6ae070-kube-api-access-66sn5\") pod \"redhat-operators-btkxt\" (UID: \"7794a943-5fec-485e-86bf-f104ed6ae070\") " pod="openshift-marketplace/redhat-operators-btkxt" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.651425 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-z5vf2" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.651497 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-z5vf2" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.670308 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-z5vf2" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.744638 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7794a943-5fec-485e-86bf-f104ed6ae070-utilities\") pod \"redhat-operators-btkxt\" (UID: \"7794a943-5fec-485e-86bf-f104ed6ae070\") " pod="openshift-marketplace/redhat-operators-btkxt" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.744927 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66sn5\" (UniqueName: \"kubernetes.io/projected/7794a943-5fec-485e-86bf-f104ed6ae070-kube-api-access-66sn5\") pod \"redhat-operators-btkxt\" (UID: \"7794a943-5fec-485e-86bf-f104ed6ae070\") " pod="openshift-marketplace/redhat-operators-btkxt" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.744982 4898 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7794a943-5fec-485e-86bf-f104ed6ae070-catalog-content\") pod \"redhat-operators-btkxt\" (UID: \"7794a943-5fec-485e-86bf-f104ed6ae070\") " pod="openshift-marketplace/redhat-operators-btkxt" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.745657 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7794a943-5fec-485e-86bf-f104ed6ae070-utilities\") pod \"redhat-operators-btkxt\" (UID: \"7794a943-5fec-485e-86bf-f104ed6ae070\") " pod="openshift-marketplace/redhat-operators-btkxt" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.746504 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7794a943-5fec-485e-86bf-f104ed6ae070-catalog-content\") pod \"redhat-operators-btkxt\" (UID: \"7794a943-5fec-485e-86bf-f104ed6ae070\") " pod="openshift-marketplace/redhat-operators-btkxt" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.765650 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66sn5\" (UniqueName: \"kubernetes.io/projected/7794a943-5fec-485e-86bf-f104ed6ae070-kube-api-access-66sn5\") pod \"redhat-operators-btkxt\" (UID: \"7794a943-5fec-485e-86bf-f104ed6ae070\") " pod="openshift-marketplace/redhat-operators-btkxt" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.771209 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"0b236b43-7ef1-4447-9182-2a37ee70fb95","Type":"ContainerStarted","Data":"fc10576c94c994ab6a2d2cd723a2db57c1b36baa9540adf4ca4a9e2c7f87f4bd"} Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.779479 4898 generic.go:334] "Generic (PLEG): container finished" podID="b8bc0c30-71e1-41d2-8991-1ce9d85d50a1" containerID="2528aa8f68ede75859cc4272c7396a63edf906831259593563c01d94a61ac7d1" exitCode=0 Mar 
13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.779619 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hn9sl" event={"ID":"b8bc0c30-71e1-41d2-8991-1ce9d85d50a1","Type":"ContainerDied","Data":"2528aa8f68ede75859cc4272c7396a63edf906831259593563c01d94a61ac7d1"} Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.779646 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hn9sl" event={"ID":"b8bc0c30-71e1-41d2-8991-1ce9d85d50a1","Type":"ContainerStarted","Data":"0b8d238e1855df1df599d5c20b2f8c47368ca041ea02bd9d799ff8595124e451"} Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.809003 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6dc964fb55-scb8h" event={"ID":"01ab82b4-6104-416c-a69a-b942da8e5c21","Type":"ContainerStarted","Data":"b272539ffbdc9c2f782b3545e70e2aeb30b2bdc9248fb2dbe91338ac3f59b06a"} Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.809048 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6dc964fb55-scb8h" event={"ID":"01ab82b4-6104-416c-a69a-b942da8e5c21","Type":"ContainerStarted","Data":"bed30031d55363e7b3a1c1f5fd11eab1b82568e99697e58e8ebc0b77de6db828"} Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.809521 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6dc964fb55-scb8h" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.812497 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-btkxt" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.813062 4898 patch_prober.go:28] interesting pod/controller-manager-6dc964fb55-scb8h container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.53:8443/healthz\": dial tcp 10.217.0.53:8443: connect: connection refused" start-of-body= Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.813099 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-6dc964fb55-scb8h" podUID="01ab82b4-6104-416c-a69a-b942da8e5c21" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.53:8443/healthz\": dial tcp 10.217.0.53:8443: connect: connection refused" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.814388 4898 generic.go:334] "Generic (PLEG): container finished" podID="f85f72a8-3887-4867-8a9c-649992ce23f1" containerID="6f2184df8da4b3bf69f4145756b368ff2efd7bf87ea92af146fe995c57cb7485" exitCode=0 Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.815591 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h97c9" event={"ID":"f85f72a8-3887-4867-8a9c-649992ce23f1","Type":"ContainerDied","Data":"6f2184df8da4b3bf69f4145756b368ff2efd7bf87ea92af146fe995c57cb7485"} Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.815623 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h97c9" event={"ID":"f85f72a8-3887-4867-8a9c-649992ce23f1","Type":"ContainerStarted","Data":"da11d51940a63fb9fd52ba5896a1fe2ba45d932b66b7a36000029b7816a483fc"} Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.821100 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-z5vf2" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.827291 4898 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6dc964fb55-scb8h" podStartSLOduration=6.827275061 podStartE2EDuration="6.827275061s" podCreationTimestamp="2026-03-13 13:59:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:59.826710677 +0000 UTC m=+234.828298926" watchObservedRunningTime="2026-03-13 13:59:59.827275061 +0000 UTC m=+234.828863300" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.869343 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-974qp"] Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.101099 4898 patch_prober.go:28] interesting pod/downloads-7954f5f757-cx59b container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.101404 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-cx59b" podUID="f4f26c0f-992a-4eb4-86d2-58e42a5b2b68" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.101134 4898 patch_prober.go:28] interesting pod/downloads-7954f5f757-cx59b container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.101768 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-cx59b" podUID="f4f26c0f-992a-4eb4-86d2-58e42a5b2b68" containerName="download-server" probeResult="failure" 
output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.134412 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556825-92fd8"] Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.134642 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-operator-lifecycle-manager/collect-profiles-29556825-92fd8" podUID="f52c1025-32e7-4eba-8af4-5c5cce1918da" containerName="collect-profiles" containerID="cri-o://f1f6a6de92d72265f3f98ec19ac9f23972e5558307ce72451eab813ae76ff6a4" gracePeriod=30 Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.141976 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556840-vmqqn"] Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.142680 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-7l2pm" Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.142701 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-7l2pm" Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.142827 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556840-vmqqn" Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.144615 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps" Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.145242 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556840-sfz5h"] Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.146020 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556840-sfz5h" Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.149731 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556840-vmqqn"] Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.155114 4898 patch_prober.go:28] interesting pod/console-f9d7485db-7l2pm container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.9:8443/health\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.155181 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-7l2pm" podUID="0ea2e803-34d0-429b-b943-ece0b9e38b63" containerName="console" probeResult="failure" output="Get \"https://10.217.0.9:8443/health\": dial tcp 10.217.0.9:8443: connect: connection refused" Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.156164 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g559r" Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.156802 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g559r" Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.171744 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g559r" Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.179253 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556840-sfz5h"] Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.264821 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtpqf\" (UniqueName: 
\"kubernetes.io/projected/c222126e-abe0-43e6-95c8-cc6946c967ae-kube-api-access-xtpqf\") pod \"collect-profiles-29556840-sfz5h\" (UID: \"c222126e-abe0-43e6-95c8-cc6946c967ae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556840-sfz5h" Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.265025 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlbbz\" (UniqueName: \"kubernetes.io/projected/4b7eb8ef-6f92-4c29-b6ad-3cf5b6919fce-kube-api-access-dlbbz\") pod \"auto-csr-approver-29556840-vmqqn\" (UID: \"4b7eb8ef-6f92-4c29-b6ad-3cf5b6919fce\") " pod="openshift-infra/auto-csr-approver-29556840-vmqqn" Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.265068 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c222126e-abe0-43e6-95c8-cc6946c967ae-secret-volume\") pod \"collect-profiles-29556840-sfz5h\" (UID: \"c222126e-abe0-43e6-95c8-cc6946c967ae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556840-sfz5h" Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.265156 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c222126e-abe0-43e6-95c8-cc6946c967ae-config-volume\") pod \"collect-profiles-29556840-sfz5h\" (UID: \"c222126e-abe0-43e6-95c8-cc6946c967ae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556840-sfz5h" Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.367681 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c222126e-abe0-43e6-95c8-cc6946c967ae-config-volume\") pod \"collect-profiles-29556840-sfz5h\" (UID: \"c222126e-abe0-43e6-95c8-cc6946c967ae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556840-sfz5h" Mar 13 14:00:00 
crc kubenswrapper[4898]: I0313 14:00:00.367741 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtpqf\" (UniqueName: \"kubernetes.io/projected/c222126e-abe0-43e6-95c8-cc6946c967ae-kube-api-access-xtpqf\") pod \"collect-profiles-29556840-sfz5h\" (UID: \"c222126e-abe0-43e6-95c8-cc6946c967ae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556840-sfz5h" Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.367809 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlbbz\" (UniqueName: \"kubernetes.io/projected/4b7eb8ef-6f92-4c29-b6ad-3cf5b6919fce-kube-api-access-dlbbz\") pod \"auto-csr-approver-29556840-vmqqn\" (UID: \"4b7eb8ef-6f92-4c29-b6ad-3cf5b6919fce\") " pod="openshift-infra/auto-csr-approver-29556840-vmqqn" Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.367829 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c222126e-abe0-43e6-95c8-cc6946c967ae-secret-volume\") pod \"collect-profiles-29556840-sfz5h\" (UID: \"c222126e-abe0-43e6-95c8-cc6946c967ae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556840-sfz5h" Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.369202 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c222126e-abe0-43e6-95c8-cc6946c967ae-config-volume\") pod \"collect-profiles-29556840-sfz5h\" (UID: \"c222126e-abe0-43e6-95c8-cc6946c967ae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556840-sfz5h" Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.378681 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c222126e-abe0-43e6-95c8-cc6946c967ae-secret-volume\") pod \"collect-profiles-29556840-sfz5h\" (UID: \"c222126e-abe0-43e6-95c8-cc6946c967ae\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29556840-sfz5h" Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.392777 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlbbz\" (UniqueName: \"kubernetes.io/projected/4b7eb8ef-6f92-4c29-b6ad-3cf5b6919fce-kube-api-access-dlbbz\") pod \"auto-csr-approver-29556840-vmqqn\" (UID: \"4b7eb8ef-6f92-4c29-b6ad-3cf5b6919fce\") " pod="openshift-infra/auto-csr-approver-29556840-vmqqn" Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.427019 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtpqf\" (UniqueName: \"kubernetes.io/projected/c222126e-abe0-43e6-95c8-cc6946c967ae-kube-api-access-xtpqf\") pod \"collect-profiles-29556840-sfz5h\" (UID: \"c222126e-abe0-43e6-95c8-cc6946c967ae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556840-sfz5h" Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.478269 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556840-vmqqn" Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.494251 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556840-sfz5h" Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.598224 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.600194 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.603287 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.609522 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.626281 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.629208 4898 patch_prober.go:28] interesting pod/router-default-5444994796-6plhg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 14:00:00 crc kubenswrapper[4898]: [-]has-synced failed: reason withheld Mar 13 14:00:00 crc kubenswrapper[4898]: [+]process-running ok Mar 13 14:00:00 crc kubenswrapper[4898]: healthz check failed Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.629271 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6plhg" podUID="18e5c8bf-9fe0-465e-af8f-9e7ec7400be8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.672625 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/189d7154-fefa-48d1-b98f-5f86a30682b2-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"189d7154-fefa-48d1-b98f-5f86a30682b2\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.672667 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/189d7154-fefa-48d1-b98f-5f86a30682b2-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"189d7154-fefa-48d1-b98f-5f86a30682b2\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.774700 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/189d7154-fefa-48d1-b98f-5f86a30682b2-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"189d7154-fefa-48d1-b98f-5f86a30682b2\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.774743 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/189d7154-fefa-48d1-b98f-5f86a30682b2-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"189d7154-fefa-48d1-b98f-5f86a30682b2\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.774845 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/189d7154-fefa-48d1-b98f-5f86a30682b2-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"189d7154-fefa-48d1-b98f-5f86a30682b2\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.802729 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/189d7154-fefa-48d1-b98f-5f86a30682b2-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"189d7154-fefa-48d1-b98f-5f86a30682b2\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.858594 4898 generic.go:334] "Generic (PLEG): container finished" 
podID="f52c1025-32e7-4eba-8af4-5c5cce1918da" containerID="f1f6a6de92d72265f3f98ec19ac9f23972e5558307ce72451eab813ae76ff6a4" exitCode=0 Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.858668 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556825-92fd8" event={"ID":"f52c1025-32e7-4eba-8af4-5c5cce1918da","Type":"ContainerDied","Data":"f1f6a6de92d72265f3f98ec19ac9f23972e5558307ce72451eab813ae76ff6a4"} Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.862015 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"0b236b43-7ef1-4447-9182-2a37ee70fb95","Type":"ContainerStarted","Data":"3b906d35fdabd57736b30e79e8733ee02d43842fe161c7f4d822cd34e8ed4f5a"} Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.870594 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6dc964fb55-scb8h" Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.871475 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g559r" Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.923772 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 14:00:01 crc kubenswrapper[4898]: I0313 14:00:01.006847 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=3.006824703 podStartE2EDuration="3.006824703s" podCreationTimestamp="2026-03-13 13:59:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:00:01.003541804 +0000 UTC m=+236.005130043" watchObservedRunningTime="2026-03-13 14:00:01.006824703 +0000 UTC m=+236.008412942" Mar 13 14:00:01 crc kubenswrapper[4898]: I0313 14:00:01.617373 4898 patch_prober.go:28] interesting pod/router-default-5444994796-6plhg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 14:00:01 crc kubenswrapper[4898]: [-]has-synced failed: reason withheld Mar 13 14:00:01 crc kubenswrapper[4898]: [+]process-running ok Mar 13 14:00:01 crc kubenswrapper[4898]: healthz check failed Mar 13 14:00:01 crc kubenswrapper[4898]: I0313 14:00:01.617443 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6plhg" podUID="18e5c8bf-9fe0-465e-af8f-9e7ec7400be8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 14:00:01 crc kubenswrapper[4898]: I0313 14:00:01.892258 4898 generic.go:334] "Generic (PLEG): container finished" podID="0b236b43-7ef1-4447-9182-2a37ee70fb95" containerID="3b906d35fdabd57736b30e79e8733ee02d43842fe161c7f4d822cd34e8ed4f5a" exitCode=0 Mar 13 14:00:01 crc kubenswrapper[4898]: I0313 14:00:01.893096 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" 
event={"ID":"0b236b43-7ef1-4447-9182-2a37ee70fb95","Type":"ContainerDied","Data":"3b906d35fdabd57736b30e79e8733ee02d43842fe161c7f4d822cd34e8ed4f5a"} Mar 13 14:00:01 crc kubenswrapper[4898]: I0313 14:00:01.934536 4898 ???:1] "http: TLS handshake error from 192.168.126.11:49624: no serving certificate available for the kubelet" Mar 13 14:00:02 crc kubenswrapper[4898]: I0313 14:00:02.612050 4898 patch_prober.go:28] interesting pod/router-default-5444994796-6plhg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 14:00:02 crc kubenswrapper[4898]: [-]has-synced failed: reason withheld Mar 13 14:00:02 crc kubenswrapper[4898]: [+]process-running ok Mar 13 14:00:02 crc kubenswrapper[4898]: healthz check failed Mar 13 14:00:02 crc kubenswrapper[4898]: I0313 14:00:02.612136 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6plhg" podUID="18e5c8bf-9fe0-465e-af8f-9e7ec7400be8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 14:00:03 crc kubenswrapper[4898]: I0313 14:00:03.612421 4898 patch_prober.go:28] interesting pod/router-default-5444994796-6plhg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 14:00:03 crc kubenswrapper[4898]: [-]has-synced failed: reason withheld Mar 13 14:00:03 crc kubenswrapper[4898]: [+]process-running ok Mar 13 14:00:03 crc kubenswrapper[4898]: healthz check failed Mar 13 14:00:03 crc kubenswrapper[4898]: I0313 14:00:03.612488 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6plhg" podUID="18e5c8bf-9fe0-465e-af8f-9e7ec7400be8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 
14:00:04 crc kubenswrapper[4898]: I0313 14:00:04.612435 4898 patch_prober.go:28] interesting pod/router-default-5444994796-6plhg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 14:00:04 crc kubenswrapper[4898]: [-]has-synced failed: reason withheld Mar 13 14:00:04 crc kubenswrapper[4898]: [+]process-running ok Mar 13 14:00:04 crc kubenswrapper[4898]: healthz check failed Mar 13 14:00:04 crc kubenswrapper[4898]: I0313 14:00:04.612502 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6plhg" podUID="18e5c8bf-9fe0-465e-af8f-9e7ec7400be8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 14:00:05 crc kubenswrapper[4898]: I0313 14:00:05.474429 4898 ???:1] "http: TLS handshake error from 192.168.126.11:49636: no serving certificate available for the kubelet" Mar 13 14:00:05 crc kubenswrapper[4898]: I0313 14:00:05.511689 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-hqcs6" Mar 13 14:00:05 crc kubenswrapper[4898]: I0313 14:00:05.612950 4898 patch_prober.go:28] interesting pod/router-default-5444994796-6plhg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 14:00:05 crc kubenswrapper[4898]: [-]has-synced failed: reason withheld Mar 13 14:00:05 crc kubenswrapper[4898]: [+]process-running ok Mar 13 14:00:05 crc kubenswrapper[4898]: healthz check failed Mar 13 14:00:05 crc kubenswrapper[4898]: I0313 14:00:05.613017 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6plhg" podUID="18e5c8bf-9fe0-465e-af8f-9e7ec7400be8" containerName="router" probeResult="failure" output="HTTP probe failed with 
statuscode: 500" Mar 13 14:00:06 crc kubenswrapper[4898]: I0313 14:00:06.612957 4898 patch_prober.go:28] interesting pod/router-default-5444994796-6plhg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 14:00:06 crc kubenswrapper[4898]: [-]has-synced failed: reason withheld Mar 13 14:00:06 crc kubenswrapper[4898]: [+]process-running ok Mar 13 14:00:06 crc kubenswrapper[4898]: healthz check failed Mar 13 14:00:06 crc kubenswrapper[4898]: I0313 14:00:06.613314 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6plhg" podUID="18e5c8bf-9fe0-465e-af8f-9e7ec7400be8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 14:00:07 crc kubenswrapper[4898]: W0313 14:00:07.499088 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod183d86e9_cd5c_45ed_a460_bb6169e07c72.slice/crio-3597e8f057d81527e3da3a21c0723e2cabe95896c1c2879fe09fef6825e9aab7 WatchSource:0}: Error finding container 3597e8f057d81527e3da3a21c0723e2cabe95896c1c2879fe09fef6825e9aab7: Status 404 returned error can't find the container with id 3597e8f057d81527e3da3a21c0723e2cabe95896c1c2879fe09fef6825e9aab7 Mar 13 14:00:07 crc kubenswrapper[4898]: I0313 14:00:07.537010 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 14:00:07 crc kubenswrapper[4898]: I0313 14:00:07.613223 4898 patch_prober.go:28] interesting pod/router-default-5444994796-6plhg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 14:00:07 crc kubenswrapper[4898]: [-]has-synced failed: reason withheld Mar 13 14:00:07 crc kubenswrapper[4898]: [+]process-running ok Mar 13 14:00:07 crc kubenswrapper[4898]: healthz check failed Mar 13 14:00:07 crc kubenswrapper[4898]: I0313 14:00:07.613290 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6plhg" podUID="18e5c8bf-9fe0-465e-af8f-9e7ec7400be8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 14:00:07 crc kubenswrapper[4898]: I0313 14:00:07.676168 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b236b43-7ef1-4447-9182-2a37ee70fb95-kube-api-access\") pod \"0b236b43-7ef1-4447-9182-2a37ee70fb95\" (UID: \"0b236b43-7ef1-4447-9182-2a37ee70fb95\") " Mar 13 14:00:07 crc kubenswrapper[4898]: I0313 14:00:07.676372 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0b236b43-7ef1-4447-9182-2a37ee70fb95-kubelet-dir\") pod \"0b236b43-7ef1-4447-9182-2a37ee70fb95\" (UID: \"0b236b43-7ef1-4447-9182-2a37ee70fb95\") " Mar 13 14:00:07 crc kubenswrapper[4898]: I0313 14:00:07.676873 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0b236b43-7ef1-4447-9182-2a37ee70fb95-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "0b236b43-7ef1-4447-9182-2a37ee70fb95" (UID: "0b236b43-7ef1-4447-9182-2a37ee70fb95"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 14:00:07 crc kubenswrapper[4898]: I0313 14:00:07.677580 4898 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0b236b43-7ef1-4447-9182-2a37ee70fb95-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 13 14:00:07 crc kubenswrapper[4898]: I0313 14:00:07.687238 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b236b43-7ef1-4447-9182-2a37ee70fb95-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b236b43-7ef1-4447-9182-2a37ee70fb95" (UID: "0b236b43-7ef1-4447-9182-2a37ee70fb95"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:00:07 crc kubenswrapper[4898]: I0313 14:00:07.779126 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b236b43-7ef1-4447-9182-2a37ee70fb95-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 13 14:00:07 crc kubenswrapper[4898]: I0313 14:00:07.944365 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-974qp" event={"ID":"183d86e9-cd5c-45ed-a460-bb6169e07c72","Type":"ContainerStarted","Data":"3597e8f057d81527e3da3a21c0723e2cabe95896c1c2879fe09fef6825e9aab7"} Mar 13 14:00:07 crc kubenswrapper[4898]: I0313 14:00:07.946074 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"0b236b43-7ef1-4447-9182-2a37ee70fb95","Type":"ContainerDied","Data":"fc10576c94c994ab6a2d2cd723a2db57c1b36baa9540adf4ca4a9e2c7f87f4bd"} Mar 13 14:00:07 crc kubenswrapper[4898]: I0313 14:00:07.946128 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc10576c94c994ab6a2d2cd723a2db57c1b36baa9540adf4ca4a9e2c7f87f4bd" Mar 13 14:00:07 crc kubenswrapper[4898]: I0313 14:00:07.946128 4898 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 14:00:08 crc kubenswrapper[4898]: I0313 14:00:08.613785 4898 patch_prober.go:28] interesting pod/router-default-5444994796-6plhg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 14:00:08 crc kubenswrapper[4898]: [-]has-synced failed: reason withheld Mar 13 14:00:08 crc kubenswrapper[4898]: [+]process-running ok Mar 13 14:00:08 crc kubenswrapper[4898]: healthz check failed Mar 13 14:00:08 crc kubenswrapper[4898]: I0313 14:00:08.613851 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6plhg" podUID="18e5c8bf-9fe0-465e-af8f-9e7ec7400be8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 14:00:09 crc kubenswrapper[4898]: I0313 14:00:09.612115 4898 patch_prober.go:28] interesting pod/router-default-5444994796-6plhg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 14:00:09 crc kubenswrapper[4898]: [-]has-synced failed: reason withheld Mar 13 14:00:09 crc kubenswrapper[4898]: [+]process-running ok Mar 13 14:00:09 crc kubenswrapper[4898]: healthz check failed Mar 13 14:00:09 crc kubenswrapper[4898]: I0313 14:00:09.612412 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6plhg" podUID="18e5c8bf-9fe0-465e-af8f-9e7ec7400be8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 14:00:09 crc kubenswrapper[4898]: I0313 14:00:09.909891 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869-metrics-certs\") pod \"network-metrics-daemon-fwrwc\" (UID: \"9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869\") " pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 14:00:09 crc kubenswrapper[4898]: I0313 14:00:09.911785 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 13 14:00:09 crc kubenswrapper[4898]: I0313 14:00:09.927078 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869-metrics-certs\") pod \"network-metrics-daemon-fwrwc\" (UID: \"9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869\") " pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 14:00:09 crc kubenswrapper[4898]: I0313 14:00:09.958523 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 13 14:00:09 crc kubenswrapper[4898]: I0313 14:00:09.965815 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 14:00:10 crc kubenswrapper[4898]: I0313 14:00:10.104891 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-cx59b" Mar 13 14:00:10 crc kubenswrapper[4898]: I0313 14:00:10.145747 4898 patch_prober.go:28] interesting pod/console-f9d7485db-7l2pm container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.9:8443/health\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Mar 13 14:00:10 crc kubenswrapper[4898]: I0313 14:00:10.146121 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-7l2pm" podUID="0ea2e803-34d0-429b-b943-ece0b9e38b63" containerName="console" probeResult="failure" output="Get \"https://10.217.0.9:8443/health\": dial tcp 10.217.0.9:8443: connect: connection refused" Mar 13 14:00:10 crc kubenswrapper[4898]: I0313 14:00:10.613849 4898 patch_prober.go:28] interesting pod/router-default-5444994796-6plhg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 14:00:10 crc kubenswrapper[4898]: [-]has-synced failed: reason withheld Mar 13 14:00:10 crc kubenswrapper[4898]: [+]process-running ok Mar 13 14:00:10 crc kubenswrapper[4898]: healthz check failed Mar 13 14:00:10 crc kubenswrapper[4898]: I0313 14:00:10.614190 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6plhg" podUID="18e5c8bf-9fe0-465e-af8f-9e7ec7400be8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 14:00:11 crc kubenswrapper[4898]: I0313 14:00:11.628189 4898 patch_prober.go:28] interesting pod/router-default-5444994796-6plhg container/router namespace/openshift-ingress: Startup probe status=failure 
output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 14:00:11 crc kubenswrapper[4898]: [-]has-synced failed: reason withheld Mar 13 14:00:11 crc kubenswrapper[4898]: [+]process-running ok Mar 13 14:00:11 crc kubenswrapper[4898]: healthz check failed Mar 13 14:00:11 crc kubenswrapper[4898]: I0313 14:00:11.628527 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6plhg" podUID="18e5c8bf-9fe0-465e-af8f-9e7ec7400be8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 14:00:12 crc kubenswrapper[4898]: I0313 14:00:12.195007 4898 ???:1] "http: TLS handshake error from 192.168.126.11:51012: no serving certificate available for the kubelet" Mar 13 14:00:12 crc kubenswrapper[4898]: I0313 14:00:12.613559 4898 patch_prober.go:28] interesting pod/router-default-5444994796-6plhg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 14:00:12 crc kubenswrapper[4898]: [-]has-synced failed: reason withheld Mar 13 14:00:12 crc kubenswrapper[4898]: [+]process-running ok Mar 13 14:00:12 crc kubenswrapper[4898]: healthz check failed Mar 13 14:00:12 crc kubenswrapper[4898]: I0313 14:00:12.613655 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6plhg" podUID="18e5c8bf-9fe0-465e-af8f-9e7ec7400be8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 14:00:12 crc kubenswrapper[4898]: I0313 14:00:12.925739 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6dc964fb55-scb8h"] Mar 13 14:00:12 crc kubenswrapper[4898]: I0313 14:00:12.926013 4898 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-controller-manager/controller-manager-6dc964fb55-scb8h" podUID="01ab82b4-6104-416c-a69a-b942da8e5c21" containerName="controller-manager" containerID="cri-o://b272539ffbdc9c2f782b3545e70e2aeb30b2bdc9248fb2dbe91338ac3f59b06a" gracePeriod=30 Mar 13 14:00:12 crc kubenswrapper[4898]: I0313 14:00:12.946082 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-779788b65f-vkvqq"] Mar 13 14:00:12 crc kubenswrapper[4898]: I0313 14:00:12.946316 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-779788b65f-vkvqq" podUID="07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd" containerName="route-controller-manager" containerID="cri-o://a636a0496339c0fb58170ff34b55715db6af8e4ac5ce6fd1db5551669c78c588" gracePeriod=30 Mar 13 14:00:13 crc kubenswrapper[4898]: I0313 14:00:13.612592 4898 patch_prober.go:28] interesting pod/router-default-5444994796-6plhg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 14:00:13 crc kubenswrapper[4898]: [-]has-synced failed: reason withheld Mar 13 14:00:13 crc kubenswrapper[4898]: [+]process-running ok Mar 13 14:00:13 crc kubenswrapper[4898]: healthz check failed Mar 13 14:00:13 crc kubenswrapper[4898]: I0313 14:00:13.612660 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6plhg" podUID="18e5c8bf-9fe0-465e-af8f-9e7ec7400be8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 14:00:13 crc kubenswrapper[4898]: I0313 14:00:13.978510 4898 generic.go:334] "Generic (PLEG): container finished" podID="07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd" containerID="a636a0496339c0fb58170ff34b55715db6af8e4ac5ce6fd1db5551669c78c588" exitCode=0 Mar 13 14:00:13 crc kubenswrapper[4898]: I0313 
14:00:13.978577 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-779788b65f-vkvqq" event={"ID":"07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd","Type":"ContainerDied","Data":"a636a0496339c0fb58170ff34b55715db6af8e4ac5ce6fd1db5551669c78c588"} Mar 13 14:00:14 crc kubenswrapper[4898]: I0313 14:00:14.438014 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556825-92fd8" Mar 13 14:00:14 crc kubenswrapper[4898]: I0313 14:00:14.584581 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f52c1025-32e7-4eba-8af4-5c5cce1918da-config-volume\") pod \"f52c1025-32e7-4eba-8af4-5c5cce1918da\" (UID: \"f52c1025-32e7-4eba-8af4-5c5cce1918da\") " Mar 13 14:00:14 crc kubenswrapper[4898]: I0313 14:00:14.585311 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f52c1025-32e7-4eba-8af4-5c5cce1918da-secret-volume\") pod \"f52c1025-32e7-4eba-8af4-5c5cce1918da\" (UID: \"f52c1025-32e7-4eba-8af4-5c5cce1918da\") " Mar 13 14:00:14 crc kubenswrapper[4898]: I0313 14:00:14.585454 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqn5q\" (UniqueName: \"kubernetes.io/projected/f52c1025-32e7-4eba-8af4-5c5cce1918da-kube-api-access-bqn5q\") pod \"f52c1025-32e7-4eba-8af4-5c5cce1918da\" (UID: \"f52c1025-32e7-4eba-8af4-5c5cce1918da\") " Mar 13 14:00:14 crc kubenswrapper[4898]: I0313 14:00:14.586986 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f52c1025-32e7-4eba-8af4-5c5cce1918da-config-volume" (OuterVolumeSpecName: "config-volume") pod "f52c1025-32e7-4eba-8af4-5c5cce1918da" (UID: "f52c1025-32e7-4eba-8af4-5c5cce1918da"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:00:14 crc kubenswrapper[4898]: I0313 14:00:14.592546 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f52c1025-32e7-4eba-8af4-5c5cce1918da-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f52c1025-32e7-4eba-8af4-5c5cce1918da" (UID: "f52c1025-32e7-4eba-8af4-5c5cce1918da"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:00:14 crc kubenswrapper[4898]: I0313 14:00:14.593481 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f52c1025-32e7-4eba-8af4-5c5cce1918da-kube-api-access-bqn5q" (OuterVolumeSpecName: "kube-api-access-bqn5q") pod "f52c1025-32e7-4eba-8af4-5c5cce1918da" (UID: "f52c1025-32e7-4eba-8af4-5c5cce1918da"). InnerVolumeSpecName "kube-api-access-bqn5q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:00:14 crc kubenswrapper[4898]: I0313 14:00:14.618422 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-6plhg" Mar 13 14:00:14 crc kubenswrapper[4898]: I0313 14:00:14.621072 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-6plhg" Mar 13 14:00:14 crc kubenswrapper[4898]: I0313 14:00:14.687669 4898 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f52c1025-32e7-4eba-8af4-5c5cce1918da-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 13 14:00:14 crc kubenswrapper[4898]: I0313 14:00:14.687703 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqn5q\" (UniqueName: \"kubernetes.io/projected/f52c1025-32e7-4eba-8af4-5c5cce1918da-kube-api-access-bqn5q\") on node \"crc\" DevicePath \"\"" Mar 13 14:00:14 crc kubenswrapper[4898]: I0313 14:00:14.687716 4898 reconciler_common.go:293] "Volume detached for 
volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f52c1025-32e7-4eba-8af4-5c5cce1918da-config-volume\") on node \"crc\" DevicePath \"\"" Mar 13 14:00:15 crc kubenswrapper[4898]: I0313 14:00:14.996422 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556825-92fd8" event={"ID":"f52c1025-32e7-4eba-8af4-5c5cce1918da","Type":"ContainerDied","Data":"423e6617bc8cc210d91629b1d5580f9b4f8c3137b80892ad74315408fb41680c"} Mar 13 14:00:15 crc kubenswrapper[4898]: I0313 14:00:14.996480 4898 scope.go:117] "RemoveContainer" containerID="f1f6a6de92d72265f3f98ec19ac9f23972e5558307ce72451eab813ae76ff6a4" Mar 13 14:00:15 crc kubenswrapper[4898]: I0313 14:00:14.996604 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556825-92fd8" Mar 13 14:00:15 crc kubenswrapper[4898]: I0313 14:00:15.001283 4898 generic.go:334] "Generic (PLEG): container finished" podID="01ab82b4-6104-416c-a69a-b942da8e5c21" containerID="b272539ffbdc9c2f782b3545e70e2aeb30b2bdc9248fb2dbe91338ac3f59b06a" exitCode=0 Mar 13 14:00:15 crc kubenswrapper[4898]: I0313 14:00:15.001359 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6dc964fb55-scb8h" event={"ID":"01ab82b4-6104-416c-a69a-b942da8e5c21","Type":"ContainerDied","Data":"b272539ffbdc9c2f782b3545e70e2aeb30b2bdc9248fb2dbe91338ac3f59b06a"} Mar 13 14:00:15 crc kubenswrapper[4898]: I0313 14:00:15.025170 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556825-92fd8"] Mar 13 14:00:15 crc kubenswrapper[4898]: I0313 14:00:15.031269 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556825-92fd8"] Mar 13 14:00:15 crc kubenswrapper[4898]: I0313 14:00:15.747738 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="f52c1025-32e7-4eba-8af4-5c5cce1918da" path="/var/lib/kubelet/pods/f52c1025-32e7-4eba-8af4-5c5cce1918da/volumes" Mar 13 14:00:17 crc kubenswrapper[4898]: I0313 14:00:17.208945 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 14:00:17 crc kubenswrapper[4898]: I0313 14:00:17.546565 4898 patch_prober.go:28] interesting pod/route-controller-manager-779788b65f-vkvqq container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.47:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 14:00:17 crc kubenswrapper[4898]: I0313 14:00:17.546636 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-779788b65f-vkvqq" podUID="07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.47:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 14:00:18 crc kubenswrapper[4898]: I0313 14:00:18.092747 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 14:00:19 crc kubenswrapper[4898]: I0313 14:00:19.134124 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 14:00:19 crc kubenswrapper[4898]: I0313 14:00:19.134216 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" 
podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 14:00:20 crc kubenswrapper[4898]: I0313 14:00:20.080311 4898 patch_prober.go:28] interesting pod/controller-manager-6dc964fb55-scb8h container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.53:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 14:00:20 crc kubenswrapper[4898]: I0313 14:00:20.080629 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-6dc964fb55-scb8h" podUID="01ab82b4-6104-416c-a69a-b942da8e5c21" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.53:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 14:00:20 crc kubenswrapper[4898]: I0313 14:00:20.168722 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-7l2pm" Mar 13 14:00:20 crc kubenswrapper[4898]: I0313 14:00:20.172709 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-7l2pm" Mar 13 14:00:26 crc kubenswrapper[4898]: I0313 14:00:26.839951 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6dc964fb55-scb8h" Mar 13 14:00:26 crc kubenswrapper[4898]: I0313 14:00:26.846148 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-779788b65f-vkvqq" Mar 13 14:00:26 crc kubenswrapper[4898]: I0313 14:00:26.877440 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-64446bcfb4-56ccg"] Mar 13 14:00:26 crc kubenswrapper[4898]: E0313 14:00:26.877967 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b236b43-7ef1-4447-9182-2a37ee70fb95" containerName="pruner" Mar 13 14:00:26 crc kubenswrapper[4898]: I0313 14:00:26.878003 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b236b43-7ef1-4447-9182-2a37ee70fb95" containerName="pruner" Mar 13 14:00:26 crc kubenswrapper[4898]: E0313 14:00:26.878018 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f52c1025-32e7-4eba-8af4-5c5cce1918da" containerName="collect-profiles" Mar 13 14:00:26 crc kubenswrapper[4898]: I0313 14:00:26.878025 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="f52c1025-32e7-4eba-8af4-5c5cce1918da" containerName="collect-profiles" Mar 13 14:00:26 crc kubenswrapper[4898]: E0313 14:00:26.878036 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01ab82b4-6104-416c-a69a-b942da8e5c21" containerName="controller-manager" Mar 13 14:00:26 crc kubenswrapper[4898]: I0313 14:00:26.878041 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="01ab82b4-6104-416c-a69a-b942da8e5c21" containerName="controller-manager" Mar 13 14:00:26 crc kubenswrapper[4898]: E0313 14:00:26.878053 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd" containerName="route-controller-manager" Mar 13 14:00:26 crc kubenswrapper[4898]: I0313 14:00:26.878060 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd" containerName="route-controller-manager" Mar 13 14:00:26 crc kubenswrapper[4898]: I0313 14:00:26.878194 4898 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="0b236b43-7ef1-4447-9182-2a37ee70fb95" containerName="pruner" Mar 13 14:00:26 crc kubenswrapper[4898]: I0313 14:00:26.878204 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd" containerName="route-controller-manager" Mar 13 14:00:26 crc kubenswrapper[4898]: I0313 14:00:26.878243 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="01ab82b4-6104-416c-a69a-b942da8e5c21" containerName="controller-manager" Mar 13 14:00:26 crc kubenswrapper[4898]: I0313 14:00:26.878253 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="f52c1025-32e7-4eba-8af4-5c5cce1918da" containerName="collect-profiles" Mar 13 14:00:26 crc kubenswrapper[4898]: I0313 14:00:26.878712 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-64446bcfb4-56ccg"] Mar 13 14:00:26 crc kubenswrapper[4898]: I0313 14:00:26.878822 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-64446bcfb4-56ccg"
Mar 13 14:00:26 crc kubenswrapper[4898]: I0313 14:00:26.961272 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/01ab82b4-6104-416c-a69a-b942da8e5c21-proxy-ca-bundles\") pod \"01ab82b4-6104-416c-a69a-b942da8e5c21\" (UID: \"01ab82b4-6104-416c-a69a-b942da8e5c21\") "
Mar 13 14:00:26 crc kubenswrapper[4898]: I0313 14:00:26.961335 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/01ab82b4-6104-416c-a69a-b942da8e5c21-client-ca\") pod \"01ab82b4-6104-416c-a69a-b942da8e5c21\" (UID: \"01ab82b4-6104-416c-a69a-b942da8e5c21\") "
Mar 13 14:00:26 crc kubenswrapper[4898]: I0313 14:00:26.961379 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd-client-ca\") pod \"07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd\" (UID: \"07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd\") "
Mar 13 14:00:26 crc kubenswrapper[4898]: I0313 14:00:26.961402 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab82b4-6104-416c-a69a-b942da8e5c21-config\") pod \"01ab82b4-6104-416c-a69a-b942da8e5c21\" (UID: \"01ab82b4-6104-416c-a69a-b942da8e5c21\") "
Mar 13 14:00:26 crc kubenswrapper[4898]: I0313 14:00:26.961475 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6t65\" (UniqueName: \"kubernetes.io/projected/01ab82b4-6104-416c-a69a-b942da8e5c21-kube-api-access-p6t65\") pod \"01ab82b4-6104-416c-a69a-b942da8e5c21\" (UID: \"01ab82b4-6104-416c-a69a-b942da8e5c21\") "
Mar 13 14:00:26 crc kubenswrapper[4898]: I0313 14:00:26.961495 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd-serving-cert\") pod \"07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd\" (UID: \"07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd\") "
Mar 13 14:00:26 crc kubenswrapper[4898]: I0313 14:00:26.961523 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4gbm\" (UniqueName: \"kubernetes.io/projected/07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd-kube-api-access-n4gbm\") pod \"07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd\" (UID: \"07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd\") "
Mar 13 14:00:26 crc kubenswrapper[4898]: I0313 14:00:26.961550 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd-config\") pod \"07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd\" (UID: \"07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd\") "
Mar 13 14:00:26 crc kubenswrapper[4898]: I0313 14:00:26.961593 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab82b4-6104-416c-a69a-b942da8e5c21-serving-cert\") pod \"01ab82b4-6104-416c-a69a-b942da8e5c21\" (UID: \"01ab82b4-6104-416c-a69a-b942da8e5c21\") "
Mar 13 14:00:26 crc kubenswrapper[4898]: I0313 14:00:26.961773 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9fa4a89-d754-4f84-80be-a552772613dc-serving-cert\") pod \"controller-manager-64446bcfb4-56ccg\" (UID: \"a9fa4a89-d754-4f84-80be-a552772613dc\") " pod="openshift-controller-manager/controller-manager-64446bcfb4-56ccg"
Mar 13 14:00:26 crc kubenswrapper[4898]: I0313 14:00:26.961809 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9fa4a89-d754-4f84-80be-a552772613dc-config\") pod \"controller-manager-64446bcfb4-56ccg\" (UID: \"a9fa4a89-d754-4f84-80be-a552772613dc\") " pod="openshift-controller-manager/controller-manager-64446bcfb4-56ccg"
Mar 13 14:00:26 crc kubenswrapper[4898]: I0313 14:00:26.961887 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jk7gg\" (UniqueName: \"kubernetes.io/projected/a9fa4a89-d754-4f84-80be-a552772613dc-kube-api-access-jk7gg\") pod \"controller-manager-64446bcfb4-56ccg\" (UID: \"a9fa4a89-d754-4f84-80be-a552772613dc\") " pod="openshift-controller-manager/controller-manager-64446bcfb4-56ccg"
Mar 13 14:00:26 crc kubenswrapper[4898]: I0313 14:00:26.961950 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a9fa4a89-d754-4f84-80be-a552772613dc-client-ca\") pod \"controller-manager-64446bcfb4-56ccg\" (UID: \"a9fa4a89-d754-4f84-80be-a552772613dc\") " pod="openshift-controller-manager/controller-manager-64446bcfb4-56ccg"
Mar 13 14:00:26 crc kubenswrapper[4898]: I0313 14:00:26.961971 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a9fa4a89-d754-4f84-80be-a552772613dc-proxy-ca-bundles\") pod \"controller-manager-64446bcfb4-56ccg\" (UID: \"a9fa4a89-d754-4f84-80be-a552772613dc\") " pod="openshift-controller-manager/controller-manager-64446bcfb4-56ccg"
Mar 13 14:00:26 crc kubenswrapper[4898]: I0313 14:00:26.963402 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd-client-ca" (OuterVolumeSpecName: "client-ca") pod "07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd" (UID: "07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 14:00:26 crc kubenswrapper[4898]: I0313 14:00:26.963447 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab82b4-6104-416c-a69a-b942da8e5c21-client-ca" (OuterVolumeSpecName: "client-ca") pod "01ab82b4-6104-416c-a69a-b942da8e5c21" (UID: "01ab82b4-6104-416c-a69a-b942da8e5c21"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 14:00:26 crc kubenswrapper[4898]: I0313 14:00:26.963502 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab82b4-6104-416c-a69a-b942da8e5c21-config" (OuterVolumeSpecName: "config") pod "01ab82b4-6104-416c-a69a-b942da8e5c21" (UID: "01ab82b4-6104-416c-a69a-b942da8e5c21"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 14:00:26 crc kubenswrapper[4898]: I0313 14:00:26.963542 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab82b4-6104-416c-a69a-b942da8e5c21-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "01ab82b4-6104-416c-a69a-b942da8e5c21" (UID: "01ab82b4-6104-416c-a69a-b942da8e5c21"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 14:00:26 crc kubenswrapper[4898]: I0313 14:00:26.963866 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd-config" (OuterVolumeSpecName: "config") pod "07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd" (UID: "07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 14:00:26 crc kubenswrapper[4898]: I0313 14:00:26.967504 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd-kube-api-access-n4gbm" (OuterVolumeSpecName: "kube-api-access-n4gbm") pod "07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd" (UID: "07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd"). InnerVolumeSpecName "kube-api-access-n4gbm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:00:26 crc kubenswrapper[4898]: I0313 14:00:26.967633 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab82b4-6104-416c-a69a-b942da8e5c21-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab82b4-6104-416c-a69a-b942da8e5c21" (UID: "01ab82b4-6104-416c-a69a-b942da8e5c21"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:00:26 crc kubenswrapper[4898]: I0313 14:00:26.967800 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab82b4-6104-416c-a69a-b942da8e5c21-kube-api-access-p6t65" (OuterVolumeSpecName: "kube-api-access-p6t65") pod "01ab82b4-6104-416c-a69a-b942da8e5c21" (UID: "01ab82b4-6104-416c-a69a-b942da8e5c21"). InnerVolumeSpecName "kube-api-access-p6t65". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:00:26 crc kubenswrapper[4898]: I0313 14:00:26.968349 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd" (UID: "07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:00:27 crc kubenswrapper[4898]: I0313 14:00:27.064056 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jk7gg\" (UniqueName: \"kubernetes.io/projected/a9fa4a89-d754-4f84-80be-a552772613dc-kube-api-access-jk7gg\") pod \"controller-manager-64446bcfb4-56ccg\" (UID: \"a9fa4a89-d754-4f84-80be-a552772613dc\") " pod="openshift-controller-manager/controller-manager-64446bcfb4-56ccg"
Mar 13 14:00:27 crc kubenswrapper[4898]: I0313 14:00:27.064460 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a9fa4a89-d754-4f84-80be-a552772613dc-client-ca\") pod \"controller-manager-64446bcfb4-56ccg\" (UID: \"a9fa4a89-d754-4f84-80be-a552772613dc\") " pod="openshift-controller-manager/controller-manager-64446bcfb4-56ccg"
Mar 13 14:00:27 crc kubenswrapper[4898]: I0313 14:00:27.064650 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a9fa4a89-d754-4f84-80be-a552772613dc-proxy-ca-bundles\") pod \"controller-manager-64446bcfb4-56ccg\" (UID: \"a9fa4a89-d754-4f84-80be-a552772613dc\") " pod="openshift-controller-manager/controller-manager-64446bcfb4-56ccg"
Mar 13 14:00:27 crc kubenswrapper[4898]: I0313 14:00:27.066322 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a9fa4a89-d754-4f84-80be-a552772613dc-client-ca\") pod \"controller-manager-64446bcfb4-56ccg\" (UID: \"a9fa4a89-d754-4f84-80be-a552772613dc\") " pod="openshift-controller-manager/controller-manager-64446bcfb4-56ccg"
Mar 13 14:00:27 crc kubenswrapper[4898]: I0313 14:00:27.066552 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a9fa4a89-d754-4f84-80be-a552772613dc-proxy-ca-bundles\") pod \"controller-manager-64446bcfb4-56ccg\" (UID: \"a9fa4a89-d754-4f84-80be-a552772613dc\") " pod="openshift-controller-manager/controller-manager-64446bcfb4-56ccg"
Mar 13 14:00:27 crc kubenswrapper[4898]: I0313 14:00:27.067198 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9fa4a89-d754-4f84-80be-a552772613dc-serving-cert\") pod \"controller-manager-64446bcfb4-56ccg\" (UID: \"a9fa4a89-d754-4f84-80be-a552772613dc\") " pod="openshift-controller-manager/controller-manager-64446bcfb4-56ccg"
Mar 13 14:00:27 crc kubenswrapper[4898]: I0313 14:00:27.067486 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9fa4a89-d754-4f84-80be-a552772613dc-config\") pod \"controller-manager-64446bcfb4-56ccg\" (UID: \"a9fa4a89-d754-4f84-80be-a552772613dc\") " pod="openshift-controller-manager/controller-manager-64446bcfb4-56ccg"
Mar 13 14:00:27 crc kubenswrapper[4898]: I0313 14:00:27.071686 4898 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd-client-ca\") on node \"crc\" DevicePath \"\""
Mar 13 14:00:27 crc kubenswrapper[4898]: I0313 14:00:27.071919 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab82b4-6104-416c-a69a-b942da8e5c21-config\") on node \"crc\" DevicePath \"\""
Mar 13 14:00:27 crc kubenswrapper[4898]: I0313 14:00:27.072059 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6t65\" (UniqueName: \"kubernetes.io/projected/01ab82b4-6104-416c-a69a-b942da8e5c21-kube-api-access-p6t65\") on node \"crc\" DevicePath \"\""
Mar 13 14:00:27 crc kubenswrapper[4898]: I0313 14:00:27.072205 4898 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 13 14:00:27 crc kubenswrapper[4898]: I0313 14:00:27.072319 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n4gbm\" (UniqueName: \"kubernetes.io/projected/07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd-kube-api-access-n4gbm\") on node \"crc\" DevicePath \"\""
Mar 13 14:00:27 crc kubenswrapper[4898]: I0313 14:00:27.072456 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd-config\") on node \"crc\" DevicePath \"\""
Mar 13 14:00:27 crc kubenswrapper[4898]: I0313 14:00:27.072574 4898 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab82b4-6104-416c-a69a-b942da8e5c21-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 13 14:00:27 crc kubenswrapper[4898]: I0313 14:00:27.072699 4898 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/01ab82b4-6104-416c-a69a-b942da8e5c21-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 13 14:00:27 crc kubenswrapper[4898]: I0313 14:00:27.072831 4898 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/01ab82b4-6104-416c-a69a-b942da8e5c21-client-ca\") on node \"crc\" DevicePath \"\""
Mar 13 14:00:27 crc kubenswrapper[4898]: I0313 14:00:27.069723 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9fa4a89-d754-4f84-80be-a552772613dc-config\") pod \"controller-manager-64446bcfb4-56ccg\" (UID: \"a9fa4a89-d754-4f84-80be-a552772613dc\") " pod="openshift-controller-manager/controller-manager-64446bcfb4-56ccg"
Mar 13 14:00:27 crc kubenswrapper[4898]: I0313 14:00:27.073394 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6dc964fb55-scb8h"
Mar 13 14:00:27 crc kubenswrapper[4898]: I0313 14:00:27.073619 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6dc964fb55-scb8h" event={"ID":"01ab82b4-6104-416c-a69a-b942da8e5c21","Type":"ContainerDied","Data":"bed30031d55363e7b3a1c1f5fd11eab1b82568e99697e58e8ebc0b77de6db828"}
Mar 13 14:00:27 crc kubenswrapper[4898]: I0313 14:00:27.075078 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-779788b65f-vkvqq" event={"ID":"07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd","Type":"ContainerDied","Data":"a9e681aa51d97234d007d99a849ac6425a4e895b1487edcb5c9a6fe14935144f"}
Mar 13 14:00:27 crc kubenswrapper[4898]: I0313 14:00:27.075257 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-779788b65f-vkvqq"
Mar 13 14:00:27 crc kubenswrapper[4898]: I0313 14:00:27.083663 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9fa4a89-d754-4f84-80be-a552772613dc-serving-cert\") pod \"controller-manager-64446bcfb4-56ccg\" (UID: \"a9fa4a89-d754-4f84-80be-a552772613dc\") " pod="openshift-controller-manager/controller-manager-64446bcfb4-56ccg"
Mar 13 14:00:27 crc kubenswrapper[4898]: I0313 14:00:27.092340 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jk7gg\" (UniqueName: \"kubernetes.io/projected/a9fa4a89-d754-4f84-80be-a552772613dc-kube-api-access-jk7gg\") pod \"controller-manager-64446bcfb4-56ccg\" (UID: \"a9fa4a89-d754-4f84-80be-a552772613dc\") " pod="openshift-controller-manager/controller-manager-64446bcfb4-56ccg"
Mar 13 14:00:27 crc kubenswrapper[4898]: I0313 14:00:27.124631 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6dc964fb55-scb8h"]
Mar 13 14:00:27 crc kubenswrapper[4898]: I0313 14:00:27.128433 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6dc964fb55-scb8h"]
Mar 13 14:00:27 crc kubenswrapper[4898]: I0313 14:00:27.140665 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-779788b65f-vkvqq"]
Mar 13 14:00:27 crc kubenswrapper[4898]: I0313 14:00:27.143302 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-779788b65f-vkvqq"]
Mar 13 14:00:27 crc kubenswrapper[4898]: I0313 14:00:27.199741 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-64446bcfb4-56ccg"
Mar 13 14:00:27 crc kubenswrapper[4898]: I0313 14:00:27.546589 4898 patch_prober.go:28] interesting pod/route-controller-manager-779788b65f-vkvqq container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.47:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 14:00:27 crc kubenswrapper[4898]: I0313 14:00:27.546676 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-779788b65f-vkvqq" podUID="07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.47:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 13 14:00:27 crc kubenswrapper[4898]: I0313 14:00:27.747270 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab82b4-6104-416c-a69a-b942da8e5c21" path="/var/lib/kubelet/pods/01ab82b4-6104-416c-a69a-b942da8e5c21/volumes"
Mar 13 14:00:27 crc kubenswrapper[4898]: I0313 14:00:27.747870 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd" path="/var/lib/kubelet/pods/07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd/volumes"
Mar 13 14:00:27 crc kubenswrapper[4898]: E0313 14:00:27.794122 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest"
Mar 13 14:00:27 crc kubenswrapper[4898]: E0313 14:00:27.794360 4898 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 13 14:00:27 crc kubenswrapper[4898]: 	container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve
Mar 13 14:00:27 crc kubenswrapper[4898]: 	],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-c7p5d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29556838-h7pkr_openshift-infra(aa1ed4c8-e4bd-4352-bee3-404f16244ea3): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled
Mar 13 14:00:27 crc kubenswrapper[4898]: > logger="UnhandledError"
Mar 13 14:00:27 crc kubenswrapper[4898]: E0313 14:00:27.795537 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29556838-h7pkr" podUID="aa1ed4c8-e4bd-4352-bee3-404f16244ea3"
Mar 13 14:00:28 crc kubenswrapper[4898]: E0313 14:00:28.081522 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29556838-h7pkr" podUID="aa1ed4c8-e4bd-4352-bee3-404f16244ea3"
Mar 13 14:00:29 crc kubenswrapper[4898]: E0313 14:00:29.796148 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18"
Mar 13 14:00:29 crc kubenswrapper[4898]: E0313 14:00:29.796311 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5nn9n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-xh84s_openshift-marketplace(4ae77efc-55ca-4eee-8817-9c21d0bafa6e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Mar 13 14:00:29 crc kubenswrapper[4898]: E0313 14:00:29.797530 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-xh84s" podUID="4ae77efc-55ca-4eee-8817-9c21d0bafa6e"
Mar 13 14:00:30 crc kubenswrapper[4898]: I0313 14:00:30.717457 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x4nxn"
Mar 13 14:00:31 crc kubenswrapper[4898]: E0313 14:00:31.736479 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-xh84s" podUID="4ae77efc-55ca-4eee-8817-9c21d0bafa6e"
Mar 13 14:00:31 crc kubenswrapper[4898]: I0313 14:00:31.785526 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-786d64999b-pd42k"]
Mar 13 14:00:31 crc kubenswrapper[4898]: I0313 14:00:31.796477 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-786d64999b-pd42k"
Mar 13 14:00:31 crc kubenswrapper[4898]: I0313 14:00:31.800555 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 13 14:00:31 crc kubenswrapper[4898]: I0313 14:00:31.803270 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 13 14:00:31 crc kubenswrapper[4898]: I0313 14:00:31.813094 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 13 14:00:31 crc kubenswrapper[4898]: I0313 14:00:31.813623 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 13 14:00:31 crc kubenswrapper[4898]: I0313 14:00:31.822262 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 13 14:00:31 crc kubenswrapper[4898]: I0313 14:00:31.822420 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 13 14:00:31 crc kubenswrapper[4898]: I0313 14:00:31.827564 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-786d64999b-pd42k"]
Mar 13 14:00:31 crc kubenswrapper[4898]: I0313 14:00:31.828039 4898 scope.go:117] "RemoveContainer" containerID="b272539ffbdc9c2f782b3545e70e2aeb30b2bdc9248fb2dbe91338ac3f59b06a"
Mar 13 14:00:31 crc kubenswrapper[4898]: I0313 14:00:31.851562 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c46150e0-fd12-4e99-8de9-82630b55487b-client-ca\") pod \"route-controller-manager-786d64999b-pd42k\" (UID: \"c46150e0-fd12-4e99-8de9-82630b55487b\") " pod="openshift-route-controller-manager/route-controller-manager-786d64999b-pd42k"
Mar 13 14:00:31 crc kubenswrapper[4898]: I0313 14:00:31.851863 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lszm\" (UniqueName: \"kubernetes.io/projected/c46150e0-fd12-4e99-8de9-82630b55487b-kube-api-access-9lszm\") pod \"route-controller-manager-786d64999b-pd42k\" (UID: \"c46150e0-fd12-4e99-8de9-82630b55487b\") " pod="openshift-route-controller-manager/route-controller-manager-786d64999b-pd42k"
Mar 13 14:00:31 crc kubenswrapper[4898]: I0313 14:00:31.852023 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c46150e0-fd12-4e99-8de9-82630b55487b-config\") pod \"route-controller-manager-786d64999b-pd42k\" (UID: \"c46150e0-fd12-4e99-8de9-82630b55487b\") " pod="openshift-route-controller-manager/route-controller-manager-786d64999b-pd42k"
Mar 13 14:00:31 crc kubenswrapper[4898]: I0313 14:00:31.852205 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c46150e0-fd12-4e99-8de9-82630b55487b-serving-cert\") pod \"route-controller-manager-786d64999b-pd42k\" (UID: \"c46150e0-fd12-4e99-8de9-82630b55487b\") " pod="openshift-route-controller-manager/route-controller-manager-786d64999b-pd42k"
Mar 13 14:00:31 crc kubenswrapper[4898]: I0313 14:00:31.954007 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lszm\" (UniqueName: \"kubernetes.io/projected/c46150e0-fd12-4e99-8de9-82630b55487b-kube-api-access-9lszm\") pod \"route-controller-manager-786d64999b-pd42k\" (UID: \"c46150e0-fd12-4e99-8de9-82630b55487b\") " pod="openshift-route-controller-manager/route-controller-manager-786d64999b-pd42k"
Mar 13 14:00:31 crc kubenswrapper[4898]: I0313 14:00:31.954358 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c46150e0-fd12-4e99-8de9-82630b55487b-config\") pod \"route-controller-manager-786d64999b-pd42k\" (UID: \"c46150e0-fd12-4e99-8de9-82630b55487b\") " pod="openshift-route-controller-manager/route-controller-manager-786d64999b-pd42k"
Mar 13 14:00:31 crc kubenswrapper[4898]: I0313 14:00:31.957816 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c46150e0-fd12-4e99-8de9-82630b55487b-serving-cert\") pod \"route-controller-manager-786d64999b-pd42k\" (UID: \"c46150e0-fd12-4e99-8de9-82630b55487b\") " pod="openshift-route-controller-manager/route-controller-manager-786d64999b-pd42k"
Mar 13 14:00:31 crc kubenswrapper[4898]: I0313 14:00:31.957850 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c46150e0-fd12-4e99-8de9-82630b55487b-client-ca\") pod \"route-controller-manager-786d64999b-pd42k\" (UID: \"c46150e0-fd12-4e99-8de9-82630b55487b\") " pod="openshift-route-controller-manager/route-controller-manager-786d64999b-pd42k"
Mar 13 14:00:31 crc kubenswrapper[4898]: I0313 14:00:31.957692 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c46150e0-fd12-4e99-8de9-82630b55487b-config\") pod \"route-controller-manager-786d64999b-pd42k\" (UID: \"c46150e0-fd12-4e99-8de9-82630b55487b\") " pod="openshift-route-controller-manager/route-controller-manager-786d64999b-pd42k"
Mar 13 14:00:31 crc kubenswrapper[4898]: I0313 14:00:31.958862 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c46150e0-fd12-4e99-8de9-82630b55487b-client-ca\") pod \"route-controller-manager-786d64999b-pd42k\" (UID: \"c46150e0-fd12-4e99-8de9-82630b55487b\") " pod="openshift-route-controller-manager/route-controller-manager-786d64999b-pd42k"
Mar 13 14:00:31 crc kubenswrapper[4898]: I0313 14:00:31.964844 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c46150e0-fd12-4e99-8de9-82630b55487b-serving-cert\") pod \"route-controller-manager-786d64999b-pd42k\" (UID: \"c46150e0-fd12-4e99-8de9-82630b55487b\") " pod="openshift-route-controller-manager/route-controller-manager-786d64999b-pd42k"
Mar 13 14:00:31 crc kubenswrapper[4898]: I0313 14:00:31.974557 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lszm\" (UniqueName: \"kubernetes.io/projected/c46150e0-fd12-4e99-8de9-82630b55487b-kube-api-access-9lszm\") pod \"route-controller-manager-786d64999b-pd42k\" (UID: \"c46150e0-fd12-4e99-8de9-82630b55487b\") " pod="openshift-route-controller-manager/route-controller-manager-786d64999b-pd42k"
Mar 13 14:00:32 crc kubenswrapper[4898]: I0313 14:00:32.005565 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556840-sfz5h"]
Mar 13 14:00:32 crc kubenswrapper[4898]: I0313 14:00:32.048124 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-btkxt"]
Mar 13 14:00:32 crc kubenswrapper[4898]: I0313 14:00:32.152375 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-786d64999b-pd42k"
Mar 13 14:00:32 crc kubenswrapper[4898]: E0313 14:00:32.220103 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Mar 13 14:00:32 crc kubenswrapper[4898]: E0313 14:00:32.220271 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5x728,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-twh8h_openshift-marketplace(8f81bcfc-3c35-48e8-a584-961351e8c0e2): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Mar 13 14:00:32 crc kubenswrapper[4898]: E0313 14:00:32.221830 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-twh8h" podUID="8f81bcfc-3c35-48e8-a584-961351e8c0e2"
Mar 13 14:00:32 crc kubenswrapper[4898]: I0313 14:00:32.275414 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556840-vmqqn"]
Mar 13 14:00:32 crc kubenswrapper[4898]: E0313 14:00:32.554057 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Mar 13 14:00:32 crc kubenswrapper[4898]: E0313 14:00:32.554246 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x7vmd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-ppq6v_openshift-marketplace(a990881e-0caf-4096-a372-4cdad69006c1): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Mar 13 14:00:32 crc kubenswrapper[4898]: E0313 14:00:32.555605 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-ppq6v" podUID="a990881e-0caf-4096-a372-4cdad69006c1"
Mar 13 14:00:32 crc kubenswrapper[4898]: E0313 14:00:32.739047 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18"
Mar 13 14:00:32 crc kubenswrapper[4898]: E0313 14:00:32.739204 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zhlq2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-dvvz2_openshift-marketplace(43acaee8-efc8-4156-b28c-b493f241ac53): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Mar 13 14:00:32 crc kubenswrapper[4898]: E0313 14:00:32.740407 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-dvvz2" podUID="43acaee8-efc8-4156-b28c-b493f241ac53"
Mar 13 14:00:32 crc
kubenswrapper[4898]: I0313 14:00:32.916738 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-64446bcfb4-56ccg"] Mar 13 14:00:33 crc kubenswrapper[4898]: I0313 14:00:33.017758 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-786d64999b-pd42k"] Mar 13 14:00:33 crc kubenswrapper[4898]: I0313 14:00:33.198957 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 13 14:00:33 crc kubenswrapper[4898]: I0313 14:00:33.200068 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 14:00:33 crc kubenswrapper[4898]: I0313 14:00:33.207818 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 13 14:00:33 crc kubenswrapper[4898]: I0313 14:00:33.278108 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/111e79bc-00ab-488b-8d9d-862ce8581fa9-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"111e79bc-00ab-488b-8d9d-862ce8581fa9\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 14:00:33 crc kubenswrapper[4898]: I0313 14:00:33.278184 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/111e79bc-00ab-488b-8d9d-862ce8581fa9-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"111e79bc-00ab-488b-8d9d-862ce8581fa9\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 14:00:33 crc kubenswrapper[4898]: I0313 14:00:33.379429 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/111e79bc-00ab-488b-8d9d-862ce8581fa9-kubelet-dir\") pod \"revision-pruner-9-crc\" 
(UID: \"111e79bc-00ab-488b-8d9d-862ce8581fa9\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 14:00:33 crc kubenswrapper[4898]: I0313 14:00:33.379508 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/111e79bc-00ab-488b-8d9d-862ce8581fa9-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"111e79bc-00ab-488b-8d9d-862ce8581fa9\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 14:00:33 crc kubenswrapper[4898]: I0313 14:00:33.379558 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/111e79bc-00ab-488b-8d9d-862ce8581fa9-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"111e79bc-00ab-488b-8d9d-862ce8581fa9\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 14:00:33 crc kubenswrapper[4898]: I0313 14:00:33.412132 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/111e79bc-00ab-488b-8d9d-862ce8581fa9-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"111e79bc-00ab-488b-8d9d-862ce8581fa9\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 14:00:33 crc kubenswrapper[4898]: I0313 14:00:33.530988 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 14:00:34 crc kubenswrapper[4898]: E0313 14:00:34.990149 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-twh8h" podUID="8f81bcfc-3c35-48e8-a584-961351e8c0e2" Mar 13 14:00:34 crc kubenswrapper[4898]: E0313 14:00:34.990174 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-dvvz2" podUID="43acaee8-efc8-4156-b28c-b493f241ac53" Mar 13 14:00:34 crc kubenswrapper[4898]: E0313 14:00:34.990385 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-ppq6v" podUID="a990881e-0caf-4096-a372-4cdad69006c1" Mar 13 14:00:34 crc kubenswrapper[4898]: W0313 14:00:34.996734 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc222126e_abe0_43e6_95c8_cc6946c967ae.slice/crio-a52e531bb6ba827a038762b2905cd91f8d32e0f4df6accde072de13447061bee WatchSource:0}: Error finding container a52e531bb6ba827a038762b2905cd91f8d32e0f4df6accde072de13447061bee: Status 404 returned error can't find the container with id a52e531bb6ba827a038762b2905cd91f8d32e0f4df6accde072de13447061bee Mar 13 14:00:35 crc kubenswrapper[4898]: W0313 14:00:35.002663 4898 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7794a943_5fec_485e_86bf_f104ed6ae070.slice/crio-9107b24c316c7c4b1a47858e8c7bbd33ee4b48e192b020a209e38129a6fd6f89 WatchSource:0}: Error finding container 9107b24c316c7c4b1a47858e8c7bbd33ee4b48e192b020a209e38129a6fd6f89: Status 404 returned error can't find the container with id 9107b24c316c7c4b1a47858e8c7bbd33ee4b48e192b020a209e38129a6fd6f89 Mar 13 14:00:35 crc kubenswrapper[4898]: I0313 14:00:35.035098 4898 scope.go:117] "RemoveContainer" containerID="a636a0496339c0fb58170ff34b55715db6af8e4ac5ce6fd1db5551669c78c588" Mar 13 14:00:35 crc kubenswrapper[4898]: I0313 14:00:35.116529 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-btkxt" event={"ID":"7794a943-5fec-485e-86bf-f104ed6ae070","Type":"ContainerStarted","Data":"9107b24c316c7c4b1a47858e8c7bbd33ee4b48e192b020a209e38129a6fd6f89"} Mar 13 14:00:35 crc kubenswrapper[4898]: I0313 14:00:35.127448 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556840-vmqqn" event={"ID":"4b7eb8ef-6f92-4c29-b6ad-3cf5b6919fce","Type":"ContainerStarted","Data":"4cdc004944646e848df2358e59a867264f56b2a1a5573319599edf0e300fc922"} Mar 13 14:00:35 crc kubenswrapper[4898]: I0313 14:00:35.134287 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556840-sfz5h" event={"ID":"c222126e-abe0-43e6-95c8-cc6946c967ae","Type":"ContainerStarted","Data":"a52e531bb6ba827a038762b2905cd91f8d32e0f4df6accde072de13447061bee"} Mar 13 14:00:35 crc kubenswrapper[4898]: I0313 14:00:35.250128 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 13 14:00:35 crc kubenswrapper[4898]: W0313 14:00:35.258207 4898 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-pod189d7154_fefa_48d1_b98f_5f86a30682b2.slice/crio-687b75fd7d1c7201c0a5fe3f1524aea3bd6ecd8ac0281603109401585ffb43a0 WatchSource:0}: Error finding container 687b75fd7d1c7201c0a5fe3f1524aea3bd6ecd8ac0281603109401585ffb43a0: Status 404 returned error can't find the container with id 687b75fd7d1c7201c0a5fe3f1524aea3bd6ecd8ac0281603109401585ffb43a0 Mar 13 14:00:35 crc kubenswrapper[4898]: I0313 14:00:35.389014 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-fwrwc"] Mar 13 14:00:35 crc kubenswrapper[4898]: W0313 14:00:35.393259 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d1f9d54_7cbb_4233_b3ee_b8d5dfa42869.slice/crio-80e9d51772a8debf3118899299439977e87466dd200b3a2b95ff04013f9825c1 WatchSource:0}: Error finding container 80e9d51772a8debf3118899299439977e87466dd200b3a2b95ff04013f9825c1: Status 404 returned error can't find the container with id 80e9d51772a8debf3118899299439977e87466dd200b3a2b95ff04013f9825c1 Mar 13 14:00:35 crc kubenswrapper[4898]: I0313 14:00:35.517046 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-64446bcfb4-56ccg"] Mar 13 14:00:35 crc kubenswrapper[4898]: W0313 14:00:35.524012 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9fa4a89_d754_4f84_80be_a552772613dc.slice/crio-752fe552b8c17ef86982e87196efe9daa90c3035620e4b9c932c71cb6e722835 WatchSource:0}: Error finding container 752fe552b8c17ef86982e87196efe9daa90c3035620e4b9c932c71cb6e722835: Status 404 returned error can't find the container with id 752fe552b8c17ef86982e87196efe9daa90c3035620e4b9c932c71cb6e722835 Mar 13 14:00:35 crc kubenswrapper[4898]: I0313 14:00:35.551042 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-786d64999b-pd42k"] Mar 13 14:00:35 crc kubenswrapper[4898]: W0313 14:00:35.558877 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc46150e0_fd12_4e99_8de9_82630b55487b.slice/crio-4555d9ea560dda62fa3bb4e1189ec322eb21486dee3eb5d0e98385448c4e0a25 WatchSource:0}: Error finding container 4555d9ea560dda62fa3bb4e1189ec322eb21486dee3eb5d0e98385448c4e0a25: Status 404 returned error can't find the container with id 4555d9ea560dda62fa3bb4e1189ec322eb21486dee3eb5d0e98385448c4e0a25 Mar 13 14:00:35 crc kubenswrapper[4898]: I0313 14:00:35.628804 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 13 14:00:36 crc kubenswrapper[4898]: I0313 14:00:36.148201 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fwrwc" event={"ID":"9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869","Type":"ContainerStarted","Data":"e4f9bc6b836d4b96018e048cdc85f5c276e4cfb89dda989403045d6c5ce31f83"} Mar 13 14:00:36 crc kubenswrapper[4898]: I0313 14:00:36.148256 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fwrwc" event={"ID":"9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869","Type":"ContainerStarted","Data":"80e9d51772a8debf3118899299439977e87466dd200b3a2b95ff04013f9825c1"} Mar 13 14:00:36 crc kubenswrapper[4898]: I0313 14:00:36.149832 4898 generic.go:334] "Generic (PLEG): container finished" podID="c222126e-abe0-43e6-95c8-cc6946c967ae" containerID="dac7072f1900557a02d6c49c5a63ec387e9ac4b9e0b548d071acd63216fda826" exitCode=0 Mar 13 14:00:36 crc kubenswrapper[4898]: I0313 14:00:36.149877 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556840-sfz5h" 
event={"ID":"c222126e-abe0-43e6-95c8-cc6946c967ae","Type":"ContainerDied","Data":"dac7072f1900557a02d6c49c5a63ec387e9ac4b9e0b548d071acd63216fda826"} Mar 13 14:00:36 crc kubenswrapper[4898]: I0313 14:00:36.153285 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"111e79bc-00ab-488b-8d9d-862ce8581fa9","Type":"ContainerStarted","Data":"b6436c5b6d8f152e7a64185eb0822e10182ef56f7e00b5c3cd7e12ec9274c737"} Mar 13 14:00:36 crc kubenswrapper[4898]: I0313 14:00:36.153341 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"111e79bc-00ab-488b-8d9d-862ce8581fa9","Type":"ContainerStarted","Data":"f600f28d004fed11e321fdd1224ecfbcb4d5947d8d946b747518b734300fe345"} Mar 13 14:00:36 crc kubenswrapper[4898]: I0313 14:00:36.156583 4898 generic.go:334] "Generic (PLEG): container finished" podID="183d86e9-cd5c-45ed-a460-bb6169e07c72" containerID="7888e10b86e2b6b4b1a521af790a300e700cb557e90ae4215808993511904248" exitCode=0 Mar 13 14:00:36 crc kubenswrapper[4898]: I0313 14:00:36.156952 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-974qp" event={"ID":"183d86e9-cd5c-45ed-a460-bb6169e07c72","Type":"ContainerDied","Data":"7888e10b86e2b6b4b1a521af790a300e700cb557e90ae4215808993511904248"} Mar 13 14:00:36 crc kubenswrapper[4898]: I0313 14:00:36.164698 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"189d7154-fefa-48d1-b98f-5f86a30682b2","Type":"ContainerStarted","Data":"e650d82e1f53899ecfc7c509cbbcdd64f6c94d3e168b7097ea78182782e3bef2"} Mar 13 14:00:36 crc kubenswrapper[4898]: I0313 14:00:36.164754 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" 
event={"ID":"189d7154-fefa-48d1-b98f-5f86a30682b2","Type":"ContainerStarted","Data":"687b75fd7d1c7201c0a5fe3f1524aea3bd6ecd8ac0281603109401585ffb43a0"} Mar 13 14:00:36 crc kubenswrapper[4898]: I0313 14:00:36.170080 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-786d64999b-pd42k" event={"ID":"c46150e0-fd12-4e99-8de9-82630b55487b","Type":"ContainerStarted","Data":"f5ee2f06681290cd6019466610f90492ac1d27b61c3786e195c3276f8b9bc87f"} Mar 13 14:00:36 crc kubenswrapper[4898]: I0313 14:00:36.170130 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-786d64999b-pd42k" event={"ID":"c46150e0-fd12-4e99-8de9-82630b55487b","Type":"ContainerStarted","Data":"4555d9ea560dda62fa3bb4e1189ec322eb21486dee3eb5d0e98385448c4e0a25"} Mar 13 14:00:36 crc kubenswrapper[4898]: I0313 14:00:36.170250 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-786d64999b-pd42k" podUID="c46150e0-fd12-4e99-8de9-82630b55487b" containerName="route-controller-manager" containerID="cri-o://f5ee2f06681290cd6019466610f90492ac1d27b61c3786e195c3276f8b9bc87f" gracePeriod=30 Mar 13 14:00:36 crc kubenswrapper[4898]: I0313 14:00:36.170593 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-786d64999b-pd42k" Mar 13 14:00:36 crc kubenswrapper[4898]: I0313 14:00:36.182600 4898 generic.go:334] "Generic (PLEG): container finished" podID="7794a943-5fec-485e-86bf-f104ed6ae070" containerID="b05ab9f2e4156ffc544ae5bf8d297fc15f45604caf9f75b4b1a59e033d78a2fc" exitCode=0 Mar 13 14:00:36 crc kubenswrapper[4898]: I0313 14:00:36.182709 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-btkxt" 
event={"ID":"7794a943-5fec-485e-86bf-f104ed6ae070","Type":"ContainerDied","Data":"b05ab9f2e4156ffc544ae5bf8d297fc15f45604caf9f75b4b1a59e033d78a2fc"} Mar 13 14:00:36 crc kubenswrapper[4898]: I0313 14:00:36.185023 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=36.185009494 podStartE2EDuration="36.185009494s" podCreationTimestamp="2026-03-13 14:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:00:36.182944855 +0000 UTC m=+271.184533104" watchObservedRunningTime="2026-03-13 14:00:36.185009494 +0000 UTC m=+271.186597743" Mar 13 14:00:36 crc kubenswrapper[4898]: I0313 14:00:36.186388 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64446bcfb4-56ccg" event={"ID":"a9fa4a89-d754-4f84-80be-a552772613dc","Type":"ContainerStarted","Data":"5450739f432e17c50d2eee20e629b8170cfc52fa713a03177856d3eacd247c1a"} Mar 13 14:00:36 crc kubenswrapper[4898]: I0313 14:00:36.186427 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64446bcfb4-56ccg" event={"ID":"a9fa4a89-d754-4f84-80be-a552772613dc","Type":"ContainerStarted","Data":"752fe552b8c17ef86982e87196efe9daa90c3035620e4b9c932c71cb6e722835"} Mar 13 14:00:36 crc kubenswrapper[4898]: I0313 14:00:36.186543 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-64446bcfb4-56ccg" podUID="a9fa4a89-d754-4f84-80be-a552772613dc" containerName="controller-manager" containerID="cri-o://5450739f432e17c50d2eee20e629b8170cfc52fa713a03177856d3eacd247c1a" gracePeriod=30 Mar 13 14:00:36 crc kubenswrapper[4898]: I0313 14:00:36.187166 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-controller-manager/controller-manager-64446bcfb4-56ccg" Mar 13 14:00:36 crc kubenswrapper[4898]: I0313 14:00:36.199912 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-64446bcfb4-56ccg" Mar 13 14:00:36 crc kubenswrapper[4898]: I0313 14:00:36.252512 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-64446bcfb4-56ccg" podStartSLOduration=24.252497334 podStartE2EDuration="24.252497334s" podCreationTimestamp="2026-03-13 14:00:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:00:36.24897334 +0000 UTC m=+271.250561609" watchObservedRunningTime="2026-03-13 14:00:36.252497334 +0000 UTC m=+271.254085573" Mar 13 14:00:36 crc kubenswrapper[4898]: I0313 14:00:36.275770 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-786d64999b-pd42k" podStartSLOduration=24.275749129 podStartE2EDuration="24.275749129s" podCreationTimestamp="2026-03-13 14:00:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:00:36.275284058 +0000 UTC m=+271.276872337" watchObservedRunningTime="2026-03-13 14:00:36.275749129 +0000 UTC m=+271.277337368" Mar 13 14:00:36 crc kubenswrapper[4898]: I0313 14:00:36.409103 4898 patch_prober.go:28] interesting pod/route-controller-manager-786d64999b-pd42k container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.61:8443/healthz\": read tcp 10.217.0.2:41770->10.217.0.61:8443: read: connection reset by peer" start-of-body= Mar 13 14:00:36 crc kubenswrapper[4898]: I0313 14:00:36.409152 4898 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-786d64999b-pd42k" podUID="c46150e0-fd12-4e99-8de9-82630b55487b" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.61:8443/healthz\": read tcp 10.217.0.2:41770->10.217.0.61:8443: read: connection reset by peer" Mar 13 14:00:37 crc kubenswrapper[4898]: E0313 14:00:37.144842 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 13 14:00:37 crc kubenswrapper[4898]: E0313 14:00:37.145036 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-22rm7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,Ru
nAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-hn9sl_openshift-marketplace(b8bc0c30-71e1-41d2-8991-1ce9d85d50a1): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 13 14:00:37 crc kubenswrapper[4898]: E0313 14:00:37.146643 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-hn9sl" podUID="b8bc0c30-71e1-41d2-8991-1ce9d85d50a1" Mar 13 14:00:37 crc kubenswrapper[4898]: I0313 14:00:37.192698 4898 generic.go:334] "Generic (PLEG): container finished" podID="a9fa4a89-d754-4f84-80be-a552772613dc" containerID="5450739f432e17c50d2eee20e629b8170cfc52fa713a03177856d3eacd247c1a" exitCode=0 Mar 13 14:00:37 crc kubenswrapper[4898]: I0313 14:00:37.192755 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64446bcfb4-56ccg" event={"ID":"a9fa4a89-d754-4f84-80be-a552772613dc","Type":"ContainerDied","Data":"5450739f432e17c50d2eee20e629b8170cfc52fa713a03177856d3eacd247c1a"} Mar 13 14:00:37 crc kubenswrapper[4898]: I0313 14:00:37.193970 4898 generic.go:334] "Generic (PLEG): container finished" podID="189d7154-fefa-48d1-b98f-5f86a30682b2" containerID="e650d82e1f53899ecfc7c509cbbcdd64f6c94d3e168b7097ea78182782e3bef2" exitCode=0 Mar 13 14:00:37 crc kubenswrapper[4898]: I0313 14:00:37.194029 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" 
event={"ID":"189d7154-fefa-48d1-b98f-5f86a30682b2","Type":"ContainerDied","Data":"e650d82e1f53899ecfc7c509cbbcdd64f6c94d3e168b7097ea78182782e3bef2"} Mar 13 14:00:37 crc kubenswrapper[4898]: I0313 14:00:37.196282 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-786d64999b-pd42k_c46150e0-fd12-4e99-8de9-82630b55487b/route-controller-manager/0.log" Mar 13 14:00:37 crc kubenswrapper[4898]: I0313 14:00:37.196410 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-786d64999b-pd42k" event={"ID":"c46150e0-fd12-4e99-8de9-82630b55487b","Type":"ContainerDied","Data":"f5ee2f06681290cd6019466610f90492ac1d27b61c3786e195c3276f8b9bc87f"} Mar 13 14:00:37 crc kubenswrapper[4898]: I0313 14:00:37.196389 4898 generic.go:334] "Generic (PLEG): container finished" podID="c46150e0-fd12-4e99-8de9-82630b55487b" containerID="f5ee2f06681290cd6019466610f90492ac1d27b61c3786e195c3276f8b9bc87f" exitCode=255 Mar 13 14:00:37 crc kubenswrapper[4898]: I0313 14:00:37.198942 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fwrwc" event={"ID":"9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869","Type":"ContainerStarted","Data":"055dbff8a6f94d2747c73a8e2c33b0297f8dcaa3ef6f92b25fd67ee7af230e94"} Mar 13 14:00:37 crc kubenswrapper[4898]: I0313 14:00:37.200072 4898 generic.go:334] "Generic (PLEG): container finished" podID="111e79bc-00ab-488b-8d9d-862ce8581fa9" containerID="b6436c5b6d8f152e7a64185eb0822e10182ef56f7e00b5c3cd7e12ec9274c737" exitCode=0 Mar 13 14:00:37 crc kubenswrapper[4898]: I0313 14:00:37.200172 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"111e79bc-00ab-488b-8d9d-862ce8581fa9","Type":"ContainerDied","Data":"b6436c5b6d8f152e7a64185eb0822e10182ef56f7e00b5c3cd7e12ec9274c737"} Mar 13 14:00:37 crc kubenswrapper[4898]: I0313 14:00:37.200368 4898 
patch_prober.go:28] interesting pod/controller-manager-64446bcfb4-56ccg container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.60:8443/healthz\": dial tcp 10.217.0.60:8443: connect: connection refused" start-of-body=
Mar 13 14:00:37 crc kubenswrapper[4898]: I0313 14:00:37.200447 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-64446bcfb4-56ccg" podUID="a9fa4a89-d754-4f84-80be-a552772613dc" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.60:8443/healthz\": dial tcp 10.217.0.60:8443: connect: connection refused"
Mar 13 14:00:37 crc kubenswrapper[4898]: I0313 14:00:37.795596 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Mar 13 14:00:37 crc kubenswrapper[4898]: I0313 14:00:37.797049 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Mar 13 14:00:37 crc kubenswrapper[4898]: I0313 14:00:37.811517 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Mar 13 14:00:37 crc kubenswrapper[4898]: I0313 14:00:37.836990 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/77480be5-9488-434e-8105-0fc9237cae46-var-lock\") pod \"installer-9-crc\" (UID: \"77480be5-9488-434e-8105-0fc9237cae46\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 13 14:00:37 crc kubenswrapper[4898]: I0313 14:00:37.837129 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/77480be5-9488-434e-8105-0fc9237cae46-kube-api-access\") pod \"installer-9-crc\" (UID: \"77480be5-9488-434e-8105-0fc9237cae46\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 13 14:00:37 crc kubenswrapper[4898]: I0313 14:00:37.837176 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/77480be5-9488-434e-8105-0fc9237cae46-kubelet-dir\") pod \"installer-9-crc\" (UID: \"77480be5-9488-434e-8105-0fc9237cae46\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 13 14:00:37 crc kubenswrapper[4898]: I0313 14:00:37.938876 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/77480be5-9488-434e-8105-0fc9237cae46-kube-api-access\") pod \"installer-9-crc\" (UID: \"77480be5-9488-434e-8105-0fc9237cae46\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 13 14:00:37 crc kubenswrapper[4898]: I0313 14:00:37.939072 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/77480be5-9488-434e-8105-0fc9237cae46-kubelet-dir\") pod \"installer-9-crc\" (UID: \"77480be5-9488-434e-8105-0fc9237cae46\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 13 14:00:37 crc kubenswrapper[4898]: I0313 14:00:37.939245 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/77480be5-9488-434e-8105-0fc9237cae46-var-lock\") pod \"installer-9-crc\" (UID: \"77480be5-9488-434e-8105-0fc9237cae46\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 13 14:00:37 crc kubenswrapper[4898]: I0313 14:00:37.939285 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/77480be5-9488-434e-8105-0fc9237cae46-kubelet-dir\") pod \"installer-9-crc\" (UID: \"77480be5-9488-434e-8105-0fc9237cae46\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 13 14:00:37 crc kubenswrapper[4898]: I0313 14:00:37.939390 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/77480be5-9488-434e-8105-0fc9237cae46-var-lock\") pod \"installer-9-crc\" (UID: \"77480be5-9488-434e-8105-0fc9237cae46\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 13 14:00:37 crc kubenswrapper[4898]: I0313 14:00:37.969747 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/77480be5-9488-434e-8105-0fc9237cae46-kube-api-access\") pod \"installer-9-crc\" (UID: \"77480be5-9488-434e-8105-0fc9237cae46\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 13 14:00:38 crc kubenswrapper[4898]: I0313 14:00:38.130632 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Mar 13 14:00:38 crc kubenswrapper[4898]: I0313 14:00:38.227391 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-fwrwc" podStartSLOduration=207.22737001 podStartE2EDuration="3m27.22737001s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:00:38.224066711 +0000 UTC m=+273.225654950" watchObservedRunningTime="2026-03-13 14:00:38.22737001 +0000 UTC m=+273.228958249"
Mar 13 14:00:39 crc kubenswrapper[4898]: E0313 14:00:39.107208 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-hn9sl" podUID="b8bc0c30-71e1-41d2-8991-1ce9d85d50a1"
Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.155148 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.161661 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556840-sfz5h"
Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.168236 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.220273 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556840-sfz5h" event={"ID":"c222126e-abe0-43e6-95c8-cc6946c967ae","Type":"ContainerDied","Data":"a52e531bb6ba827a038762b2905cd91f8d32e0f4df6accde072de13447061bee"}
Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.220323 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a52e531bb6ba827a038762b2905cd91f8d32e0f4df6accde072de13447061bee"
Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.220379 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556840-sfz5h"
Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.225179 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"111e79bc-00ab-488b-8d9d-862ce8581fa9","Type":"ContainerDied","Data":"f600f28d004fed11e321fdd1224ecfbcb4d5947d8d946b747518b734300fe345"}
Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.225220 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f600f28d004fed11e321fdd1224ecfbcb4d5947d8d946b747518b734300fe345"
Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.225231 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.226949 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"189d7154-fefa-48d1-b98f-5f86a30682b2","Type":"ContainerDied","Data":"687b75fd7d1c7201c0a5fe3f1524aea3bd6ecd8ac0281603109401585ffb43a0"}
Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.226967 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="687b75fd7d1c7201c0a5fe3f1524aea3bd6ecd8ac0281603109401585ffb43a0"
Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.227016 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.259822 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c222126e-abe0-43e6-95c8-cc6946c967ae-secret-volume\") pod \"c222126e-abe0-43e6-95c8-cc6946c967ae\" (UID: \"c222126e-abe0-43e6-95c8-cc6946c967ae\") "
Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.259874 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/189d7154-fefa-48d1-b98f-5f86a30682b2-kube-api-access\") pod \"189d7154-fefa-48d1-b98f-5f86a30682b2\" (UID: \"189d7154-fefa-48d1-b98f-5f86a30682b2\") "
Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.259939 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/111e79bc-00ab-488b-8d9d-862ce8581fa9-kube-api-access\") pod \"111e79bc-00ab-488b-8d9d-862ce8581fa9\" (UID: \"111e79bc-00ab-488b-8d9d-862ce8581fa9\") "
Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.259989 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/189d7154-fefa-48d1-b98f-5f86a30682b2-kubelet-dir\") pod \"189d7154-fefa-48d1-b98f-5f86a30682b2\" (UID: \"189d7154-fefa-48d1-b98f-5f86a30682b2\") "
Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.260033 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/111e79bc-00ab-488b-8d9d-862ce8581fa9-kubelet-dir\") pod \"111e79bc-00ab-488b-8d9d-862ce8581fa9\" (UID: \"111e79bc-00ab-488b-8d9d-862ce8581fa9\") "
Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.260064 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xtpqf\" (UniqueName: \"kubernetes.io/projected/c222126e-abe0-43e6-95c8-cc6946c967ae-kube-api-access-xtpqf\") pod \"c222126e-abe0-43e6-95c8-cc6946c967ae\" (UID: \"c222126e-abe0-43e6-95c8-cc6946c967ae\") "
Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.260091 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c222126e-abe0-43e6-95c8-cc6946c967ae-config-volume\") pod \"c222126e-abe0-43e6-95c8-cc6946c967ae\" (UID: \"c222126e-abe0-43e6-95c8-cc6946c967ae\") "
Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.260142 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/111e79bc-00ab-488b-8d9d-862ce8581fa9-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "111e79bc-00ab-488b-8d9d-862ce8581fa9" (UID: "111e79bc-00ab-488b-8d9d-862ce8581fa9"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.260211 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/189d7154-fefa-48d1-b98f-5f86a30682b2-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "189d7154-fefa-48d1-b98f-5f86a30682b2" (UID: "189d7154-fefa-48d1-b98f-5f86a30682b2"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.260469 4898 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/111e79bc-00ab-488b-8d9d-862ce8581fa9-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.260494 4898 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/189d7154-fefa-48d1-b98f-5f86a30682b2-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.260794 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c222126e-abe0-43e6-95c8-cc6946c967ae-config-volume" (OuterVolumeSpecName: "config-volume") pod "c222126e-abe0-43e6-95c8-cc6946c967ae" (UID: "c222126e-abe0-43e6-95c8-cc6946c967ae"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.265094 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/189d7154-fefa-48d1-b98f-5f86a30682b2-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "189d7154-fefa-48d1-b98f-5f86a30682b2" (UID: "189d7154-fefa-48d1-b98f-5f86a30682b2"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.265873 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c222126e-abe0-43e6-95c8-cc6946c967ae-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c222126e-abe0-43e6-95c8-cc6946c967ae" (UID: "c222126e-abe0-43e6-95c8-cc6946c967ae"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.267649 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/111e79bc-00ab-488b-8d9d-862ce8581fa9-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "111e79bc-00ab-488b-8d9d-862ce8581fa9" (UID: "111e79bc-00ab-488b-8d9d-862ce8581fa9"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.277087 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c222126e-abe0-43e6-95c8-cc6946c967ae-kube-api-access-xtpqf" (OuterVolumeSpecName: "kube-api-access-xtpqf") pod "c222126e-abe0-43e6-95c8-cc6946c967ae" (UID: "c222126e-abe0-43e6-95c8-cc6946c967ae"). InnerVolumeSpecName "kube-api-access-xtpqf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.362085 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xtpqf\" (UniqueName: \"kubernetes.io/projected/c222126e-abe0-43e6-95c8-cc6946c967ae-kube-api-access-xtpqf\") on node \"crc\" DevicePath \"\""
Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.362119 4898 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c222126e-abe0-43e6-95c8-cc6946c967ae-config-volume\") on node \"crc\" DevicePath \"\""
Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.362128 4898 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c222126e-abe0-43e6-95c8-cc6946c967ae-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.362137 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/189d7154-fefa-48d1-b98f-5f86a30682b2-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.362146 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/111e79bc-00ab-488b-8d9d-862ce8581fa9-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.603163 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-786d64999b-pd42k_c46150e0-fd12-4e99-8de9-82630b55487b/route-controller-manager/0.log"
Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.603400 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-786d64999b-pd42k"
Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.613455 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-64446bcfb4-56ccg"
Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.650770 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.665616 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9lszm\" (UniqueName: \"kubernetes.io/projected/c46150e0-fd12-4e99-8de9-82630b55487b-kube-api-access-9lszm\") pod \"c46150e0-fd12-4e99-8de9-82630b55487b\" (UID: \"c46150e0-fd12-4e99-8de9-82630b55487b\") "
Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.665677 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a9fa4a89-d754-4f84-80be-a552772613dc-client-ca\") pod \"a9fa4a89-d754-4f84-80be-a552772613dc\" (UID: \"a9fa4a89-d754-4f84-80be-a552772613dc\") "
Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.665703 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9fa4a89-d754-4f84-80be-a552772613dc-config\") pod \"a9fa4a89-d754-4f84-80be-a552772613dc\" (UID: \"a9fa4a89-d754-4f84-80be-a552772613dc\") "
Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.665722 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c46150e0-fd12-4e99-8de9-82630b55487b-serving-cert\") pod \"c46150e0-fd12-4e99-8de9-82630b55487b\" (UID: \"c46150e0-fd12-4e99-8de9-82630b55487b\") "
Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.665767 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c46150e0-fd12-4e99-8de9-82630b55487b-config\") pod \"c46150e0-fd12-4e99-8de9-82630b55487b\" (UID: \"c46150e0-fd12-4e99-8de9-82630b55487b\") "
Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.665810 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jk7gg\" (UniqueName: \"kubernetes.io/projected/a9fa4a89-d754-4f84-80be-a552772613dc-kube-api-access-jk7gg\") pod \"a9fa4a89-d754-4f84-80be-a552772613dc\" (UID: \"a9fa4a89-d754-4f84-80be-a552772613dc\") "
Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.665828 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9fa4a89-d754-4f84-80be-a552772613dc-serving-cert\") pod \"a9fa4a89-d754-4f84-80be-a552772613dc\" (UID: \"a9fa4a89-d754-4f84-80be-a552772613dc\") "
Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.665843 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c46150e0-fd12-4e99-8de9-82630b55487b-client-ca\") pod \"c46150e0-fd12-4e99-8de9-82630b55487b\" (UID: \"c46150e0-fd12-4e99-8de9-82630b55487b\") "
Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.665877 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a9fa4a89-d754-4f84-80be-a552772613dc-proxy-ca-bundles\") pod \"a9fa4a89-d754-4f84-80be-a552772613dc\" (UID: \"a9fa4a89-d754-4f84-80be-a552772613dc\") "
Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.666800 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c46150e0-fd12-4e99-8de9-82630b55487b-client-ca" (OuterVolumeSpecName: "client-ca") pod "c46150e0-fd12-4e99-8de9-82630b55487b" (UID: "c46150e0-fd12-4e99-8de9-82630b55487b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.666825 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c46150e0-fd12-4e99-8de9-82630b55487b-config" (OuterVolumeSpecName: "config") pod "c46150e0-fd12-4e99-8de9-82630b55487b" (UID: "c46150e0-fd12-4e99-8de9-82630b55487b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.667278 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9fa4a89-d754-4f84-80be-a552772613dc-config" (OuterVolumeSpecName: "config") pod "a9fa4a89-d754-4f84-80be-a552772613dc" (UID: "a9fa4a89-d754-4f84-80be-a552772613dc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.667400 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9fa4a89-d754-4f84-80be-a552772613dc-client-ca" (OuterVolumeSpecName: "client-ca") pod "a9fa4a89-d754-4f84-80be-a552772613dc" (UID: "a9fa4a89-d754-4f84-80be-a552772613dc"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.667637 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c46150e0-fd12-4e99-8de9-82630b55487b-config\") on node \"crc\" DevicePath \"\""
Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.667665 4898 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c46150e0-fd12-4e99-8de9-82630b55487b-client-ca\") on node \"crc\" DevicePath \"\""
Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.667682 4898 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a9fa4a89-d754-4f84-80be-a552772613dc-client-ca\") on node \"crc\" DevicePath \"\""
Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.667695 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9fa4a89-d754-4f84-80be-a552772613dc-config\") on node \"crc\" DevicePath \"\""
Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.670620 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9fa4a89-d754-4f84-80be-a552772613dc-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "a9fa4a89-d754-4f84-80be-a552772613dc" (UID: "a9fa4a89-d754-4f84-80be-a552772613dc"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.670764 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c46150e0-fd12-4e99-8de9-82630b55487b-kube-api-access-9lszm" (OuterVolumeSpecName: "kube-api-access-9lszm") pod "c46150e0-fd12-4e99-8de9-82630b55487b" (UID: "c46150e0-fd12-4e99-8de9-82630b55487b"). InnerVolumeSpecName "kube-api-access-9lszm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.671025 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c46150e0-fd12-4e99-8de9-82630b55487b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c46150e0-fd12-4e99-8de9-82630b55487b" (UID: "c46150e0-fd12-4e99-8de9-82630b55487b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.671422 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9fa4a89-d754-4f84-80be-a552772613dc-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a9fa4a89-d754-4f84-80be-a552772613dc" (UID: "a9fa4a89-d754-4f84-80be-a552772613dc"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.672107 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9fa4a89-d754-4f84-80be-a552772613dc-kube-api-access-jk7gg" (OuterVolumeSpecName: "kube-api-access-jk7gg") pod "a9fa4a89-d754-4f84-80be-a552772613dc" (UID: "a9fa4a89-d754-4f84-80be-a552772613dc"). InnerVolumeSpecName "kube-api-access-jk7gg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.768767 4898 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a9fa4a89-d754-4f84-80be-a552772613dc-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.768790 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9lszm\" (UniqueName: \"kubernetes.io/projected/c46150e0-fd12-4e99-8de9-82630b55487b-kube-api-access-9lszm\") on node \"crc\" DevicePath \"\""
Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.768799 4898 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c46150e0-fd12-4e99-8de9-82630b55487b-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.768810 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jk7gg\" (UniqueName: \"kubernetes.io/projected/a9fa4a89-d754-4f84-80be-a552772613dc-kube-api-access-jk7gg\") on node \"crc\" DevicePath \"\""
Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.768818 4898 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9fa4a89-d754-4f84-80be-a552772613dc-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 13 14:00:40 crc kubenswrapper[4898]: I0313 14:00:40.234487 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"77480be5-9488-434e-8105-0fc9237cae46","Type":"ContainerStarted","Data":"dd7ab09c8cacba65b6094ca53d6f8610851fabd3f291b2f1e2fc1acdbaf4b50f"}
Mar 13 14:00:40 crc kubenswrapper[4898]: I0313 14:00:40.240105 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-786d64999b-pd42k_c46150e0-fd12-4e99-8de9-82630b55487b/route-controller-manager/0.log"
Mar 13 14:00:40 crc kubenswrapper[4898]: I0313 14:00:40.240183 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-786d64999b-pd42k" event={"ID":"c46150e0-fd12-4e99-8de9-82630b55487b","Type":"ContainerDied","Data":"4555d9ea560dda62fa3bb4e1189ec322eb21486dee3eb5d0e98385448c4e0a25"}
Mar 13 14:00:40 crc kubenswrapper[4898]: I0313 14:00:40.240216 4898 scope.go:117] "RemoveContainer" containerID="f5ee2f06681290cd6019466610f90492ac1d27b61c3786e195c3276f8b9bc87f"
Mar 13 14:00:40 crc kubenswrapper[4898]: I0313 14:00:40.240254 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-786d64999b-pd42k"
Mar 13 14:00:40 crc kubenswrapper[4898]: I0313 14:00:40.243229 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64446bcfb4-56ccg" event={"ID":"a9fa4a89-d754-4f84-80be-a552772613dc","Type":"ContainerDied","Data":"752fe552b8c17ef86982e87196efe9daa90c3035620e4b9c932c71cb6e722835"}
Mar 13 14:00:40 crc kubenswrapper[4898]: I0313 14:00:40.243356 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-64446bcfb4-56ccg"
Mar 13 14:00:40 crc kubenswrapper[4898]: I0313 14:00:40.249353 4898 generic.go:334] "Generic (PLEG): container finished" podID="f85f72a8-3887-4867-8a9c-649992ce23f1" containerID="8a033fa272e9b0ae10a8a39302b03fd524ffa35265e29f3d0f8c05e19edc4d0d" exitCode=0
Mar 13 14:00:40 crc kubenswrapper[4898]: I0313 14:00:40.249423 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h97c9" event={"ID":"f85f72a8-3887-4867-8a9c-649992ce23f1","Type":"ContainerDied","Data":"8a033fa272e9b0ae10a8a39302b03fd524ffa35265e29f3d0f8c05e19edc4d0d"}
Mar 13 14:00:40 crc kubenswrapper[4898]: I0313 14:00:40.283732 4898 scope.go:117] "RemoveContainer" containerID="5450739f432e17c50d2eee20e629b8170cfc52fa713a03177856d3eacd247c1a"
Mar 13 14:00:40 crc kubenswrapper[4898]: I0313 14:00:40.286275 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-786d64999b-pd42k"]
Mar 13 14:00:40 crc kubenswrapper[4898]: I0313 14:00:40.289259 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-786d64999b-pd42k"]
Mar 13 14:00:40 crc kubenswrapper[4898]: I0313 14:00:40.301535 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-64446bcfb4-56ccg"]
Mar 13 14:00:40 crc kubenswrapper[4898]: I0313 14:00:40.305068 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-64446bcfb4-56ccg"]
Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.259394 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"77480be5-9488-434e-8105-0fc9237cae46","Type":"ContainerStarted","Data":"e6def40c01eb99d1e2a262735e66a81eae49e5b4d7cd72298170f434700dfefb"}
Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.286219 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=4.286194427 podStartE2EDuration="4.286194427s" podCreationTimestamp="2026-03-13 14:00:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:00:41.284607809 +0000 UTC m=+276.286196068" watchObservedRunningTime="2026-03-13 14:00:41.286194427 +0000 UTC m=+276.287782666"
Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.750687 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9fa4a89-d754-4f84-80be-a552772613dc" path="/var/lib/kubelet/pods/a9fa4a89-d754-4f84-80be-a552772613dc/volumes"
Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.751866 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c46150e0-fd12-4e99-8de9-82630b55487b" path="/var/lib/kubelet/pods/c46150e0-fd12-4e99-8de9-82630b55487b/volumes"
Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.790747 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-67c9866bb9-9276f"]
Mar 13 14:00:41 crc kubenswrapper[4898]: E0313 14:00:41.793284 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c46150e0-fd12-4e99-8de9-82630b55487b" containerName="route-controller-manager"
Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.793310 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="c46150e0-fd12-4e99-8de9-82630b55487b" containerName="route-controller-manager"
Mar 13 14:00:41 crc kubenswrapper[4898]: E0313 14:00:41.793324 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="189d7154-fefa-48d1-b98f-5f86a30682b2" containerName="pruner"
Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.793329 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="189d7154-fefa-48d1-b98f-5f86a30682b2" containerName="pruner"
Mar 13 14:00:41 crc kubenswrapper[4898]: E0313 14:00:41.793338 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9fa4a89-d754-4f84-80be-a552772613dc" containerName="controller-manager"
Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.793344 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9fa4a89-d754-4f84-80be-a552772613dc" containerName="controller-manager"
Mar 13 14:00:41 crc kubenswrapper[4898]: E0313 14:00:41.793355 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c222126e-abe0-43e6-95c8-cc6946c967ae" containerName="collect-profiles"
Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.793361 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="c222126e-abe0-43e6-95c8-cc6946c967ae" containerName="collect-profiles"
Mar 13 14:00:41 crc kubenswrapper[4898]: E0313 14:00:41.793371 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="111e79bc-00ab-488b-8d9d-862ce8581fa9" containerName="pruner"
Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.793378 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="111e79bc-00ab-488b-8d9d-862ce8581fa9" containerName="pruner"
Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.793470 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="111e79bc-00ab-488b-8d9d-862ce8581fa9" containerName="pruner"
Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.793479 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9fa4a89-d754-4f84-80be-a552772613dc" containerName="controller-manager"
Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.793490 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="c46150e0-fd12-4e99-8de9-82630b55487b" containerName="route-controller-manager"
Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.793502 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="189d7154-fefa-48d1-b98f-5f86a30682b2" containerName="pruner"
Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.793510 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="c222126e-abe0-43e6-95c8-cc6946c967ae" containerName="collect-profiles"
Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.793859 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7bcbd84b47-vdd2b"]
Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.794350 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7bcbd84b47-vdd2b"
Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.795196 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-67c9866bb9-9276f"
Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.798314 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.798491 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.799215 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.799322 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.799435 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7bcbd84b47-vdd2b"]
Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.799709 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.799768 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.799823 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.799941 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.800037 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.800063 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.799855 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.803932 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.805442 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-67c9866bb9-9276f"]
Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.806657 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.898350 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d57ce87c-0f10-48ca-b5ee-09b139d9ebd0-client-ca\") pod 
\"controller-manager-67c9866bb9-9276f\" (UID: \"d57ce87c-0f10-48ca-b5ee-09b139d9ebd0\") " pod="openshift-controller-manager/controller-manager-67c9866bb9-9276f" Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.898383 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d57ce87c-0f10-48ca-b5ee-09b139d9ebd0-proxy-ca-bundles\") pod \"controller-manager-67c9866bb9-9276f\" (UID: \"d57ce87c-0f10-48ca-b5ee-09b139d9ebd0\") " pod="openshift-controller-manager/controller-manager-67c9866bb9-9276f" Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.898418 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnwjn\" (UniqueName: \"kubernetes.io/projected/d57ce87c-0f10-48ca-b5ee-09b139d9ebd0-kube-api-access-mnwjn\") pod \"controller-manager-67c9866bb9-9276f\" (UID: \"d57ce87c-0f10-48ca-b5ee-09b139d9ebd0\") " pod="openshift-controller-manager/controller-manager-67c9866bb9-9276f" Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.898631 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48b4d86a-9b94-4913-a35f-fd5e449ca40b-config\") pod \"route-controller-manager-7bcbd84b47-vdd2b\" (UID: \"48b4d86a-9b94-4913-a35f-fd5e449ca40b\") " pod="openshift-route-controller-manager/route-controller-manager-7bcbd84b47-vdd2b" Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.898648 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsdjp\" (UniqueName: \"kubernetes.io/projected/48b4d86a-9b94-4913-a35f-fd5e449ca40b-kube-api-access-gsdjp\") pod \"route-controller-manager-7bcbd84b47-vdd2b\" (UID: \"48b4d86a-9b94-4913-a35f-fd5e449ca40b\") " pod="openshift-route-controller-manager/route-controller-manager-7bcbd84b47-vdd2b" Mar 13 14:00:41 crc 
kubenswrapper[4898]: I0313 14:00:41.898689 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/48b4d86a-9b94-4913-a35f-fd5e449ca40b-client-ca\") pod \"route-controller-manager-7bcbd84b47-vdd2b\" (UID: \"48b4d86a-9b94-4913-a35f-fd5e449ca40b\") " pod="openshift-route-controller-manager/route-controller-manager-7bcbd84b47-vdd2b" Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.898721 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d57ce87c-0f10-48ca-b5ee-09b139d9ebd0-serving-cert\") pod \"controller-manager-67c9866bb9-9276f\" (UID: \"d57ce87c-0f10-48ca-b5ee-09b139d9ebd0\") " pod="openshift-controller-manager/controller-manager-67c9866bb9-9276f" Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.898753 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d57ce87c-0f10-48ca-b5ee-09b139d9ebd0-config\") pod \"controller-manager-67c9866bb9-9276f\" (UID: \"d57ce87c-0f10-48ca-b5ee-09b139d9ebd0\") " pod="openshift-controller-manager/controller-manager-67c9866bb9-9276f" Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.898770 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48b4d86a-9b94-4913-a35f-fd5e449ca40b-serving-cert\") pod \"route-controller-manager-7bcbd84b47-vdd2b\" (UID: \"48b4d86a-9b94-4913-a35f-fd5e449ca40b\") " pod="openshift-route-controller-manager/route-controller-manager-7bcbd84b47-vdd2b" Mar 13 14:00:42 crc kubenswrapper[4898]: I0313 14:00:42.000357 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d57ce87c-0f10-48ca-b5ee-09b139d9ebd0-config\") pod 
\"controller-manager-67c9866bb9-9276f\" (UID: \"d57ce87c-0f10-48ca-b5ee-09b139d9ebd0\") " pod="openshift-controller-manager/controller-manager-67c9866bb9-9276f" Mar 13 14:00:42 crc kubenswrapper[4898]: I0313 14:00:42.000398 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48b4d86a-9b94-4913-a35f-fd5e449ca40b-serving-cert\") pod \"route-controller-manager-7bcbd84b47-vdd2b\" (UID: \"48b4d86a-9b94-4913-a35f-fd5e449ca40b\") " pod="openshift-route-controller-manager/route-controller-manager-7bcbd84b47-vdd2b" Mar 13 14:00:42 crc kubenswrapper[4898]: I0313 14:00:42.000428 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d57ce87c-0f10-48ca-b5ee-09b139d9ebd0-proxy-ca-bundles\") pod \"controller-manager-67c9866bb9-9276f\" (UID: \"d57ce87c-0f10-48ca-b5ee-09b139d9ebd0\") " pod="openshift-controller-manager/controller-manager-67c9866bb9-9276f" Mar 13 14:00:42 crc kubenswrapper[4898]: I0313 14:00:42.000445 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d57ce87c-0f10-48ca-b5ee-09b139d9ebd0-client-ca\") pod \"controller-manager-67c9866bb9-9276f\" (UID: \"d57ce87c-0f10-48ca-b5ee-09b139d9ebd0\") " pod="openshift-controller-manager/controller-manager-67c9866bb9-9276f" Mar 13 14:00:42 crc kubenswrapper[4898]: I0313 14:00:42.000477 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnwjn\" (UniqueName: \"kubernetes.io/projected/d57ce87c-0f10-48ca-b5ee-09b139d9ebd0-kube-api-access-mnwjn\") pod \"controller-manager-67c9866bb9-9276f\" (UID: \"d57ce87c-0f10-48ca-b5ee-09b139d9ebd0\") " pod="openshift-controller-manager/controller-manager-67c9866bb9-9276f" Mar 13 14:00:42 crc kubenswrapper[4898]: I0313 14:00:42.000499 4898 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48b4d86a-9b94-4913-a35f-fd5e449ca40b-config\") pod \"route-controller-manager-7bcbd84b47-vdd2b\" (UID: \"48b4d86a-9b94-4913-a35f-fd5e449ca40b\") " pod="openshift-route-controller-manager/route-controller-manager-7bcbd84b47-vdd2b" Mar 13 14:00:42 crc kubenswrapper[4898]: I0313 14:00:42.000517 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsdjp\" (UniqueName: \"kubernetes.io/projected/48b4d86a-9b94-4913-a35f-fd5e449ca40b-kube-api-access-gsdjp\") pod \"route-controller-manager-7bcbd84b47-vdd2b\" (UID: \"48b4d86a-9b94-4913-a35f-fd5e449ca40b\") " pod="openshift-route-controller-manager/route-controller-manager-7bcbd84b47-vdd2b" Mar 13 14:00:42 crc kubenswrapper[4898]: I0313 14:00:42.000563 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/48b4d86a-9b94-4913-a35f-fd5e449ca40b-client-ca\") pod \"route-controller-manager-7bcbd84b47-vdd2b\" (UID: \"48b4d86a-9b94-4913-a35f-fd5e449ca40b\") " pod="openshift-route-controller-manager/route-controller-manager-7bcbd84b47-vdd2b" Mar 13 14:00:42 crc kubenswrapper[4898]: I0313 14:00:42.000605 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d57ce87c-0f10-48ca-b5ee-09b139d9ebd0-serving-cert\") pod \"controller-manager-67c9866bb9-9276f\" (UID: \"d57ce87c-0f10-48ca-b5ee-09b139d9ebd0\") " pod="openshift-controller-manager/controller-manager-67c9866bb9-9276f" Mar 13 14:00:42 crc kubenswrapper[4898]: I0313 14:00:42.002376 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d57ce87c-0f10-48ca-b5ee-09b139d9ebd0-client-ca\") pod \"controller-manager-67c9866bb9-9276f\" (UID: \"d57ce87c-0f10-48ca-b5ee-09b139d9ebd0\") " 
pod="openshift-controller-manager/controller-manager-67c9866bb9-9276f" Mar 13 14:00:42 crc kubenswrapper[4898]: I0313 14:00:42.002601 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48b4d86a-9b94-4913-a35f-fd5e449ca40b-config\") pod \"route-controller-manager-7bcbd84b47-vdd2b\" (UID: \"48b4d86a-9b94-4913-a35f-fd5e449ca40b\") " pod="openshift-route-controller-manager/route-controller-manager-7bcbd84b47-vdd2b" Mar 13 14:00:42 crc kubenswrapper[4898]: I0313 14:00:42.002663 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d57ce87c-0f10-48ca-b5ee-09b139d9ebd0-proxy-ca-bundles\") pod \"controller-manager-67c9866bb9-9276f\" (UID: \"d57ce87c-0f10-48ca-b5ee-09b139d9ebd0\") " pod="openshift-controller-manager/controller-manager-67c9866bb9-9276f" Mar 13 14:00:42 crc kubenswrapper[4898]: I0313 14:00:42.002957 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/48b4d86a-9b94-4913-a35f-fd5e449ca40b-client-ca\") pod \"route-controller-manager-7bcbd84b47-vdd2b\" (UID: \"48b4d86a-9b94-4913-a35f-fd5e449ca40b\") " pod="openshift-route-controller-manager/route-controller-manager-7bcbd84b47-vdd2b" Mar 13 14:00:42 crc kubenswrapper[4898]: I0313 14:00:42.003512 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d57ce87c-0f10-48ca-b5ee-09b139d9ebd0-config\") pod \"controller-manager-67c9866bb9-9276f\" (UID: \"d57ce87c-0f10-48ca-b5ee-09b139d9ebd0\") " pod="openshift-controller-manager/controller-manager-67c9866bb9-9276f" Mar 13 14:00:42 crc kubenswrapper[4898]: I0313 14:00:42.009583 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48b4d86a-9b94-4913-a35f-fd5e449ca40b-serving-cert\") pod 
\"route-controller-manager-7bcbd84b47-vdd2b\" (UID: \"48b4d86a-9b94-4913-a35f-fd5e449ca40b\") " pod="openshift-route-controller-manager/route-controller-manager-7bcbd84b47-vdd2b" Mar 13 14:00:42 crc kubenswrapper[4898]: I0313 14:00:42.015795 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d57ce87c-0f10-48ca-b5ee-09b139d9ebd0-serving-cert\") pod \"controller-manager-67c9866bb9-9276f\" (UID: \"d57ce87c-0f10-48ca-b5ee-09b139d9ebd0\") " pod="openshift-controller-manager/controller-manager-67c9866bb9-9276f" Mar 13 14:00:42 crc kubenswrapper[4898]: I0313 14:00:42.025256 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsdjp\" (UniqueName: \"kubernetes.io/projected/48b4d86a-9b94-4913-a35f-fd5e449ca40b-kube-api-access-gsdjp\") pod \"route-controller-manager-7bcbd84b47-vdd2b\" (UID: \"48b4d86a-9b94-4913-a35f-fd5e449ca40b\") " pod="openshift-route-controller-manager/route-controller-manager-7bcbd84b47-vdd2b" Mar 13 14:00:42 crc kubenswrapper[4898]: I0313 14:00:42.033930 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnwjn\" (UniqueName: \"kubernetes.io/projected/d57ce87c-0f10-48ca-b5ee-09b139d9ebd0-kube-api-access-mnwjn\") pod \"controller-manager-67c9866bb9-9276f\" (UID: \"d57ce87c-0f10-48ca-b5ee-09b139d9ebd0\") " pod="openshift-controller-manager/controller-manager-67c9866bb9-9276f" Mar 13 14:00:42 crc kubenswrapper[4898]: I0313 14:00:42.164522 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7bcbd84b47-vdd2b" Mar 13 14:00:42 crc kubenswrapper[4898]: I0313 14:00:42.168886 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-67c9866bb9-9276f" Mar 13 14:00:42 crc kubenswrapper[4898]: I0313 14:00:42.289243 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556840-vmqqn" event={"ID":"4b7eb8ef-6f92-4c29-b6ad-3cf5b6919fce","Type":"ContainerStarted","Data":"ce4b9269a12a5818cb6b78f9abcf90162aaab004f9bc6b1371639c51781f053a"} Mar 13 14:00:42 crc kubenswrapper[4898]: I0313 14:00:42.295547 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h97c9" event={"ID":"f85f72a8-3887-4867-8a9c-649992ce23f1","Type":"ContainerStarted","Data":"cd39a5b62c38cd0e4291eb452dc66dbfa73d20085df0f04f24d87193b1c4faf8"} Mar 13 14:00:42 crc kubenswrapper[4898]: I0313 14:00:42.304719 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556840-vmqqn" podStartSLOduration=35.715431971 podStartE2EDuration="42.304705776s" podCreationTimestamp="2026-03-13 14:00:00 +0000 UTC" firstStartedPulling="2026-03-13 14:00:35.039014403 +0000 UTC m=+270.040602652" lastFinishedPulling="2026-03-13 14:00:41.628288198 +0000 UTC m=+276.629876457" observedRunningTime="2026-03-13 14:00:42.301872468 +0000 UTC m=+277.303460707" watchObservedRunningTime="2026-03-13 14:00:42.304705776 +0000 UTC m=+277.306294015" Mar 13 14:00:42 crc kubenswrapper[4898]: I0313 14:00:42.328068 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-h97c9" podStartSLOduration=10.185673494 podStartE2EDuration="44.328052703s" podCreationTimestamp="2026-03-13 13:59:58 +0000 UTC" firstStartedPulling="2026-03-13 14:00:07.48136777 +0000 UTC m=+242.482956019" lastFinishedPulling="2026-03-13 14:00:41.623746969 +0000 UTC m=+276.625335228" observedRunningTime="2026-03-13 14:00:42.326932816 +0000 UTC m=+277.328521075" watchObservedRunningTime="2026-03-13 14:00:42.328052703 +0000 UTC 
m=+277.329640942" Mar 13 14:00:42 crc kubenswrapper[4898]: I0313 14:00:42.374979 4898 csr.go:261] certificate signing request csr-ps7l6 is approved, waiting to be issued Mar 13 14:00:42 crc kubenswrapper[4898]: I0313 14:00:42.378469 4898 csr.go:257] certificate signing request csr-ps7l6 is issued Mar 13 14:00:42 crc kubenswrapper[4898]: I0313 14:00:42.603021 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7bcbd84b47-vdd2b"] Mar 13 14:00:42 crc kubenswrapper[4898]: W0313 14:00:42.611222 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48b4d86a_9b94_4913_a35f_fd5e449ca40b.slice/crio-13758db88f06f5e481700996753a013ebf1c45a10dc6dbacc0a63821fd99241f WatchSource:0}: Error finding container 13758db88f06f5e481700996753a013ebf1c45a10dc6dbacc0a63821fd99241f: Status 404 returned error can't find the container with id 13758db88f06f5e481700996753a013ebf1c45a10dc6dbacc0a63821fd99241f Mar 13 14:00:42 crc kubenswrapper[4898]: I0313 14:00:42.671018 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-67c9866bb9-9276f"] Mar 13 14:00:43 crc kubenswrapper[4898]: I0313 14:00:43.302499 4898 generic.go:334] "Generic (PLEG): container finished" podID="4b7eb8ef-6f92-4c29-b6ad-3cf5b6919fce" containerID="ce4b9269a12a5818cb6b78f9abcf90162aaab004f9bc6b1371639c51781f053a" exitCode=0 Mar 13 14:00:43 crc kubenswrapper[4898]: I0313 14:00:43.302597 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556840-vmqqn" event={"ID":"4b7eb8ef-6f92-4c29-b6ad-3cf5b6919fce","Type":"ContainerDied","Data":"ce4b9269a12a5818cb6b78f9abcf90162aaab004f9bc6b1371639c51781f053a"} Mar 13 14:00:43 crc kubenswrapper[4898]: I0313 14:00:43.304869 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-7bcbd84b47-vdd2b" event={"ID":"48b4d86a-9b94-4913-a35f-fd5e449ca40b","Type":"ContainerStarted","Data":"22bb704ae349a00bd3ff54a939a69467adddc106eb993e03597abddaa460ee01"} Mar 13 14:00:43 crc kubenswrapper[4898]: I0313 14:00:43.304915 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7bcbd84b47-vdd2b" event={"ID":"48b4d86a-9b94-4913-a35f-fd5e449ca40b","Type":"ContainerStarted","Data":"13758db88f06f5e481700996753a013ebf1c45a10dc6dbacc0a63821fd99241f"} Mar 13 14:00:43 crc kubenswrapper[4898]: I0313 14:00:43.305112 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7bcbd84b47-vdd2b" Mar 13 14:00:43 crc kubenswrapper[4898]: I0313 14:00:43.306578 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-67c9866bb9-9276f" event={"ID":"d57ce87c-0f10-48ca-b5ee-09b139d9ebd0","Type":"ContainerStarted","Data":"d5ad3dcf9d418253bf096607969322ddb59d415decdc8a397e768a53284c42b7"} Mar 13 14:00:43 crc kubenswrapper[4898]: I0313 14:00:43.306633 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-67c9866bb9-9276f" event={"ID":"d57ce87c-0f10-48ca-b5ee-09b139d9ebd0","Type":"ContainerStarted","Data":"586304a4e2dd4eea4580413987b917096a6c4f3326c8482af26767ff95bd2378"} Mar 13 14:00:43 crc kubenswrapper[4898]: I0313 14:00:43.306653 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-67c9866bb9-9276f" Mar 13 14:00:43 crc kubenswrapper[4898]: I0313 14:00:43.307776 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556838-h7pkr" 
event={"ID":"aa1ed4c8-e4bd-4352-bee3-404f16244ea3","Type":"ContainerStarted","Data":"f3acfddb5fa32ce7ed2202cdd792a2b1d7de4b1d204fbdc39e6814928f1b0f60"} Mar 13 14:00:43 crc kubenswrapper[4898]: I0313 14:00:43.339073 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-67c9866bb9-9276f" Mar 13 14:00:43 crc kubenswrapper[4898]: I0313 14:00:43.343519 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7bcbd84b47-vdd2b" Mar 13 14:00:43 crc kubenswrapper[4898]: I0313 14:00:43.351000 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556838-h7pkr" podStartSLOduration=112.810475586 podStartE2EDuration="2m43.350991238s" podCreationTimestamp="2026-03-13 13:58:00 +0000 UTC" firstStartedPulling="2026-03-13 13:59:52.149855785 +0000 UTC m=+227.151444024" lastFinishedPulling="2026-03-13 14:00:42.690371437 +0000 UTC m=+277.691959676" observedRunningTime="2026-03-13 14:00:43.347859334 +0000 UTC m=+278.349447573" watchObservedRunningTime="2026-03-13 14:00:43.350991238 +0000 UTC m=+278.352579477" Mar 13 14:00:43 crc kubenswrapper[4898]: I0313 14:00:43.371267 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7bcbd84b47-vdd2b" podStartSLOduration=10.371250132 podStartE2EDuration="10.371250132s" podCreationTimestamp="2026-03-13 14:00:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:00:43.367313198 +0000 UTC m=+278.368901457" watchObservedRunningTime="2026-03-13 14:00:43.371250132 +0000 UTC m=+278.372838371" Mar 13 14:00:43 crc kubenswrapper[4898]: I0313 14:00:43.380663 4898 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 
+0000 UTC, rotation deadline is 2026-12-27 20:25:45.788217123 +0000 UTC Mar 13 14:00:43 crc kubenswrapper[4898]: I0313 14:00:43.380704 4898 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6942h25m2.407515676s for next certificate rotation Mar 13 14:00:43 crc kubenswrapper[4898]: I0313 14:00:43.390784 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-67c9866bb9-9276f" podStartSLOduration=11.390769297 podStartE2EDuration="11.390769297s" podCreationTimestamp="2026-03-13 14:00:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:00:43.3887817 +0000 UTC m=+278.390369949" watchObservedRunningTime="2026-03-13 14:00:43.390769297 +0000 UTC m=+278.392357536" Mar 13 14:00:44 crc kubenswrapper[4898]: I0313 14:00:44.318086 4898 generic.go:334] "Generic (PLEG): container finished" podID="aa1ed4c8-e4bd-4352-bee3-404f16244ea3" containerID="f3acfddb5fa32ce7ed2202cdd792a2b1d7de4b1d204fbdc39e6814928f1b0f60" exitCode=0 Mar 13 14:00:44 crc kubenswrapper[4898]: I0313 14:00:44.318186 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556838-h7pkr" event={"ID":"aa1ed4c8-e4bd-4352-bee3-404f16244ea3","Type":"ContainerDied","Data":"f3acfddb5fa32ce7ed2202cdd792a2b1d7de4b1d204fbdc39e6814928f1b0f60"} Mar 13 14:00:44 crc kubenswrapper[4898]: I0313 14:00:44.380965 4898 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-27 14:23:21.897099379 +0000 UTC Mar 13 14:00:44 crc kubenswrapper[4898]: I0313 14:00:44.381014 4898 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6216h22m37.516088167s for next certificate rotation Mar 13 14:00:44 crc kubenswrapper[4898]: I0313 14:00:44.600979 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556840-vmqqn" Mar 13 14:00:44 crc kubenswrapper[4898]: I0313 14:00:44.635169 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlbbz\" (UniqueName: \"kubernetes.io/projected/4b7eb8ef-6f92-4c29-b6ad-3cf5b6919fce-kube-api-access-dlbbz\") pod \"4b7eb8ef-6f92-4c29-b6ad-3cf5b6919fce\" (UID: \"4b7eb8ef-6f92-4c29-b6ad-3cf5b6919fce\") " Mar 13 14:00:44 crc kubenswrapper[4898]: I0313 14:00:44.643268 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b7eb8ef-6f92-4c29-b6ad-3cf5b6919fce-kube-api-access-dlbbz" (OuterVolumeSpecName: "kube-api-access-dlbbz") pod "4b7eb8ef-6f92-4c29-b6ad-3cf5b6919fce" (UID: "4b7eb8ef-6f92-4c29-b6ad-3cf5b6919fce"). InnerVolumeSpecName "kube-api-access-dlbbz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:00:44 crc kubenswrapper[4898]: I0313 14:00:44.741392 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlbbz\" (UniqueName: \"kubernetes.io/projected/4b7eb8ef-6f92-4c29-b6ad-3cf5b6919fce-kube-api-access-dlbbz\") on node \"crc\" DevicePath \"\"" Mar 13 14:00:45 crc kubenswrapper[4898]: I0313 14:00:45.340601 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556840-vmqqn" Mar 13 14:00:45 crc kubenswrapper[4898]: I0313 14:00:45.340669 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556840-vmqqn" event={"ID":"4b7eb8ef-6f92-4c29-b6ad-3cf5b6919fce","Type":"ContainerDied","Data":"4cdc004944646e848df2358e59a867264f56b2a1a5573319599edf0e300fc922"} Mar 13 14:00:45 crc kubenswrapper[4898]: I0313 14:00:45.340719 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4cdc004944646e848df2358e59a867264f56b2a1a5573319599edf0e300fc922" Mar 13 14:00:45 crc kubenswrapper[4898]: I0313 14:00:45.628062 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556838-h7pkr" Mar 13 14:00:45 crc kubenswrapper[4898]: I0313 14:00:45.755761 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7p5d\" (UniqueName: \"kubernetes.io/projected/aa1ed4c8-e4bd-4352-bee3-404f16244ea3-kube-api-access-c7p5d\") pod \"aa1ed4c8-e4bd-4352-bee3-404f16244ea3\" (UID: \"aa1ed4c8-e4bd-4352-bee3-404f16244ea3\") " Mar 13 14:00:45 crc kubenswrapper[4898]: I0313 14:00:45.765885 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa1ed4c8-e4bd-4352-bee3-404f16244ea3-kube-api-access-c7p5d" (OuterVolumeSpecName: "kube-api-access-c7p5d") pod "aa1ed4c8-e4bd-4352-bee3-404f16244ea3" (UID: "aa1ed4c8-e4bd-4352-bee3-404f16244ea3"). InnerVolumeSpecName "kube-api-access-c7p5d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:00:45 crc kubenswrapper[4898]: I0313 14:00:45.857678 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c7p5d\" (UniqueName: \"kubernetes.io/projected/aa1ed4c8-e4bd-4352-bee3-404f16244ea3-kube-api-access-c7p5d\") on node \"crc\" DevicePath \"\"" Mar 13 14:00:46 crc kubenswrapper[4898]: I0313 14:00:46.347428 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556838-h7pkr" event={"ID":"aa1ed4c8-e4bd-4352-bee3-404f16244ea3","Type":"ContainerDied","Data":"a7c95316f7425af660b27292d15441df4c43eb94ff232136664abdd1d5a272eb"} Mar 13 14:00:46 crc kubenswrapper[4898]: I0313 14:00:46.347986 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7c95316f7425af660b27292d15441df4c43eb94ff232136664abdd1d5a272eb" Mar 13 14:00:46 crc kubenswrapper[4898]: I0313 14:00:46.347622 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556838-h7pkr" Mar 13 14:00:46 crc kubenswrapper[4898]: I0313 14:00:46.349925 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xh84s" event={"ID":"4ae77efc-55ca-4eee-8817-9c21d0bafa6e","Type":"ContainerStarted","Data":"804bb12e6a7adbbf4efa1f3ec85b57e7a62babe514eec20e4355f14bea3bdf4a"} Mar 13 14:00:47 crc kubenswrapper[4898]: I0313 14:00:47.368620 4898 generic.go:334] "Generic (PLEG): container finished" podID="4ae77efc-55ca-4eee-8817-9c21d0bafa6e" containerID="804bb12e6a7adbbf4efa1f3ec85b57e7a62babe514eec20e4355f14bea3bdf4a" exitCode=0 Mar 13 14:00:47 crc kubenswrapper[4898]: I0313 14:00:47.368690 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xh84s" event={"ID":"4ae77efc-55ca-4eee-8817-9c21d0bafa6e","Type":"ContainerDied","Data":"804bb12e6a7adbbf4efa1f3ec85b57e7a62babe514eec20e4355f14bea3bdf4a"} Mar 13 14:00:47 crc 
kubenswrapper[4898]: I0313 14:00:47.368740 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xh84s" event={"ID":"4ae77efc-55ca-4eee-8817-9c21d0bafa6e","Type":"ContainerStarted","Data":"b5e3c3eb492b54ad0a87f08620309d02bb7883094244c1b703d4a244c8429e87"}
Mar 13 14:00:47 crc kubenswrapper[4898]: I0313 14:00:47.392106 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xh84s" podStartSLOduration=2.335606893 podStartE2EDuration="51.392070569s" podCreationTimestamp="2026-03-13 13:59:56 +0000 UTC" firstStartedPulling="2026-03-13 13:59:57.712988819 +0000 UTC m=+232.714577048" lastFinishedPulling="2026-03-13 14:00:46.769452485 +0000 UTC m=+281.771040724" observedRunningTime="2026-03-13 14:00:47.388279058 +0000 UTC m=+282.389867317" watchObservedRunningTime="2026-03-13 14:00:47.392070569 +0000 UTC m=+282.393658808"
Mar 13 14:00:48 crc kubenswrapper[4898]: I0313 14:00:48.384166 4898 generic.go:334] "Generic (PLEG): container finished" podID="43acaee8-efc8-4156-b28c-b493f241ac53" containerID="166addc84a00d6cd28f5d4a11eaa406e638fc978d5ab44d8f9525754ee76c77b" exitCode=0
Mar 13 14:00:48 crc kubenswrapper[4898]: I0313 14:00:48.384245 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dvvz2" event={"ID":"43acaee8-efc8-4156-b28c-b493f241ac53","Type":"ContainerDied","Data":"166addc84a00d6cd28f5d4a11eaa406e638fc978d5ab44d8f9525754ee76c77b"}
Mar 13 14:00:48 crc kubenswrapper[4898]: I0313 14:00:48.426860 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-h97c9"
Mar 13 14:00:48 crc kubenswrapper[4898]: I0313 14:00:48.426954 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-h97c9"
Mar 13 14:00:48 crc kubenswrapper[4898]: I0313 14:00:48.675688 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-h97c9"
Mar 13 14:00:49 crc kubenswrapper[4898]: I0313 14:00:49.135120 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 14:00:49 crc kubenswrapper[4898]: I0313 14:00:49.135589 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 14:00:49 crc kubenswrapper[4898]: I0313 14:00:49.135664 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj"
Mar 13 14:00:49 crc kubenswrapper[4898]: I0313 14:00:49.138201 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8568b0d4122e606a851ae23a97395b29784dd29138349c86f59191196e7b0f56"} pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 13 14:00:49 crc kubenswrapper[4898]: I0313 14:00:49.138310 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" containerID="cri-o://8568b0d4122e606a851ae23a97395b29784dd29138349c86f59191196e7b0f56" gracePeriod=600
Mar 13 14:00:49 crc kubenswrapper[4898]: I0313 14:00:49.425352 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-h97c9"
Mar 13 14:00:50 crc kubenswrapper[4898]: I0313 14:00:50.396095 4898 generic.go:334] "Generic (PLEG): container finished" podID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerID="8568b0d4122e606a851ae23a97395b29784dd29138349c86f59191196e7b0f56" exitCode=0
Mar 13 14:00:50 crc kubenswrapper[4898]: I0313 14:00:50.396170 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerDied","Data":"8568b0d4122e606a851ae23a97395b29784dd29138349c86f59191196e7b0f56"}
Mar 13 14:00:54 crc kubenswrapper[4898]: I0313 14:00:54.419157 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerStarted","Data":"ef8034867c7dd4fe3e16f610be3edcf45ba0ba5b7440cc5634ef7ce86e520b52"}
Mar 13 14:00:56 crc kubenswrapper[4898]: I0313 14:00:56.435494 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-974qp" event={"ID":"183d86e9-cd5c-45ed-a460-bb6169e07c72","Type":"ContainerStarted","Data":"4e3c2fc49e38fd08a1405311e1eced3f11241c9df7c680ba46b64e3f946fea47"}
Mar 13 14:00:56 crc kubenswrapper[4898]: I0313 14:00:56.441383 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-btkxt" event={"ID":"7794a943-5fec-485e-86bf-f104ed6ae070","Type":"ContainerStarted","Data":"67b3d784e0ee63e0bd8e175cf0b8537e1ca35f7833ddbd4c0468016c4030500b"}
Mar 13 14:00:56 crc kubenswrapper[4898]: I0313 14:00:56.452942 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dvvz2" event={"ID":"43acaee8-efc8-4156-b28c-b493f241ac53","Type":"ContainerStarted","Data":"8a40593eea81d6a95d388d6b35cd414db22d496cba0a3b511f6c3c4af3e4b8ec"}
Mar 13 14:00:56 crc kubenswrapper[4898]: I0313 14:00:56.478762 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dvvz2" podStartSLOduration=2.426755128 podStartE2EDuration="1m0.478736125s" podCreationTimestamp="2026-03-13 13:59:56 +0000 UTC" firstStartedPulling="2026-03-13 13:59:57.712081718 +0000 UTC m=+232.713669957" lastFinishedPulling="2026-03-13 14:00:55.764062715 +0000 UTC m=+290.765650954" observedRunningTime="2026-03-13 14:00:56.473200693 +0000 UTC m=+291.474788952" watchObservedRunningTime="2026-03-13 14:00:56.478736125 +0000 UTC m=+291.480324364"
Mar 13 14:00:56 crc kubenswrapper[4898]: I0313 14:00:56.804492 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xh84s"
Mar 13 14:00:56 crc kubenswrapper[4898]: I0313 14:00:56.804563 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xh84s"
Mar 13 14:00:56 crc kubenswrapper[4898]: I0313 14:00:56.838273 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xh84s"
Mar 13 14:00:57 crc kubenswrapper[4898]: I0313 14:00:57.459995 4898 generic.go:334] "Generic (PLEG): container finished" podID="183d86e9-cd5c-45ed-a460-bb6169e07c72" containerID="4e3c2fc49e38fd08a1405311e1eced3f11241c9df7c680ba46b64e3f946fea47" exitCode=0
Mar 13 14:00:57 crc kubenswrapper[4898]: I0313 14:00:57.460046 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-974qp" event={"ID":"183d86e9-cd5c-45ed-a460-bb6169e07c72","Type":"ContainerDied","Data":"4e3c2fc49e38fd08a1405311e1eced3f11241c9df7c680ba46b64e3f946fea47"}
Mar 13 14:00:57 crc kubenswrapper[4898]: I0313 14:00:57.462631 4898 generic.go:334] "Generic (PLEG): container finished" podID="7794a943-5fec-485e-86bf-f104ed6ae070" containerID="67b3d784e0ee63e0bd8e175cf0b8537e1ca35f7833ddbd4c0468016c4030500b" exitCode=0
Mar 13 14:00:57 crc kubenswrapper[4898]: I0313 14:00:57.462681 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-btkxt" event={"ID":"7794a943-5fec-485e-86bf-f104ed6ae070","Type":"ContainerDied","Data":"67b3d784e0ee63e0bd8e175cf0b8537e1ca35f7833ddbd4c0468016c4030500b"}
Mar 13 14:00:57 crc kubenswrapper[4898]: I0313 14:00:57.466189 4898 generic.go:334] "Generic (PLEG): container finished" podID="8f81bcfc-3c35-48e8-a584-961351e8c0e2" containerID="2aacfa448de7533468427cf155ae3ec5563cff1d3313d0b6259a3abd6879e336" exitCode=0
Mar 13 14:00:57 crc kubenswrapper[4898]: I0313 14:00:57.466253 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-twh8h" event={"ID":"8f81bcfc-3c35-48e8-a584-961351e8c0e2","Type":"ContainerDied","Data":"2aacfa448de7533468427cf155ae3ec5563cff1d3313d0b6259a3abd6879e336"}
Mar 13 14:00:57 crc kubenswrapper[4898]: I0313 14:00:57.469659 4898 generic.go:334] "Generic (PLEG): container finished" podID="a990881e-0caf-4096-a372-4cdad69006c1" containerID="4c76c67750c09f06873437edbd5c079a177fa11ace696be58cb2d354d275db9e" exitCode=0
Mar 13 14:00:57 crc kubenswrapper[4898]: I0313 14:00:57.469693 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ppq6v" event={"ID":"a990881e-0caf-4096-a372-4cdad69006c1","Type":"ContainerDied","Data":"4c76c67750c09f06873437edbd5c079a177fa11ace696be58cb2d354d275db9e"}
Mar 13 14:00:57 crc kubenswrapper[4898]: I0313 14:00:57.472741 4898 generic.go:334] "Generic (PLEG): container finished" podID="b8bc0c30-71e1-41d2-8991-1ce9d85d50a1" containerID="aa63312252f658ceca7c77dd0dfed144856961d5306b45331a156f811c5eef73" exitCode=0
Mar 13 14:00:57 crc kubenswrapper[4898]: I0313 14:00:57.472993 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hn9sl" event={"ID":"b8bc0c30-71e1-41d2-8991-1ce9d85d50a1","Type":"ContainerDied","Data":"aa63312252f658ceca7c77dd0dfed144856961d5306b45331a156f811c5eef73"}
Mar 13 14:00:57 crc kubenswrapper[4898]: I0313 14:00:57.530413 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xh84s"
Mar 13 14:00:57 crc kubenswrapper[4898]: E0313 14:00:57.545557 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod183d86e9_cd5c_45ed_a460_bb6169e07c72.slice/crio-conmon-4e3c2fc49e38fd08a1405311e1eced3f11241c9df7c680ba46b64e3f946fea47.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda990881e_0caf_4096_a372_4cdad69006c1.slice/crio-conmon-4c76c67750c09f06873437edbd5c079a177fa11ace696be58cb2d354d275db9e.scope\": RecentStats: unable to find data in memory cache]"
Mar 13 14:00:58 crc kubenswrapper[4898]: I0313 14:00:58.293097 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xh84s"]
Mar 13 14:00:58 crc kubenswrapper[4898]: I0313 14:00:58.812458 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-djn5q"]
Mar 13 14:00:59 crc kubenswrapper[4898]: I0313 14:00:59.483089 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xh84s" podUID="4ae77efc-55ca-4eee-8817-9c21d0bafa6e" containerName="registry-server" containerID="cri-o://b5e3c3eb492b54ad0a87f08620309d02bb7883094244c1b703d4a244c8429e87" gracePeriod=2
Mar 13 14:01:00 crc kubenswrapper[4898]: I0313 14:01:00.490978 4898 generic.go:334] "Generic (PLEG): container finished" podID="4ae77efc-55ca-4eee-8817-9c21d0bafa6e" containerID="b5e3c3eb492b54ad0a87f08620309d02bb7883094244c1b703d4a244c8429e87" exitCode=0
Mar 13 14:01:00 crc kubenswrapper[4898]: I0313 14:01:00.491036 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xh84s" event={"ID":"4ae77efc-55ca-4eee-8817-9c21d0bafa6e","Type":"ContainerDied","Data":"b5e3c3eb492b54ad0a87f08620309d02bb7883094244c1b703d4a244c8429e87"}
Mar 13 14:01:01 crc kubenswrapper[4898]: I0313 14:01:01.752366 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xh84s"
Mar 13 14:01:01 crc kubenswrapper[4898]: I0313 14:01:01.882512 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ae77efc-55ca-4eee-8817-9c21d0bafa6e-utilities\") pod \"4ae77efc-55ca-4eee-8817-9c21d0bafa6e\" (UID: \"4ae77efc-55ca-4eee-8817-9c21d0bafa6e\") "
Mar 13 14:01:01 crc kubenswrapper[4898]: I0313 14:01:01.882615 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ae77efc-55ca-4eee-8817-9c21d0bafa6e-catalog-content\") pod \"4ae77efc-55ca-4eee-8817-9c21d0bafa6e\" (UID: \"4ae77efc-55ca-4eee-8817-9c21d0bafa6e\") "
Mar 13 14:01:01 crc kubenswrapper[4898]: I0313 14:01:01.882678 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5nn9n\" (UniqueName: \"kubernetes.io/projected/4ae77efc-55ca-4eee-8817-9c21d0bafa6e-kube-api-access-5nn9n\") pod \"4ae77efc-55ca-4eee-8817-9c21d0bafa6e\" (UID: \"4ae77efc-55ca-4eee-8817-9c21d0bafa6e\") "
Mar 13 14:01:01 crc kubenswrapper[4898]: I0313 14:01:01.884285 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ae77efc-55ca-4eee-8817-9c21d0bafa6e-utilities" (OuterVolumeSpecName: "utilities") pod "4ae77efc-55ca-4eee-8817-9c21d0bafa6e" (UID: "4ae77efc-55ca-4eee-8817-9c21d0bafa6e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 14:01:01 crc kubenswrapper[4898]: I0313 14:01:01.888274 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ae77efc-55ca-4eee-8817-9c21d0bafa6e-kube-api-access-5nn9n" (OuterVolumeSpecName: "kube-api-access-5nn9n") pod "4ae77efc-55ca-4eee-8817-9c21d0bafa6e" (UID: "4ae77efc-55ca-4eee-8817-9c21d0bafa6e"). InnerVolumeSpecName "kube-api-access-5nn9n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:01:01 crc kubenswrapper[4898]: I0313 14:01:01.941113 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ae77efc-55ca-4eee-8817-9c21d0bafa6e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4ae77efc-55ca-4eee-8817-9c21d0bafa6e" (UID: "4ae77efc-55ca-4eee-8817-9c21d0bafa6e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 14:01:01 crc kubenswrapper[4898]: I0313 14:01:01.984111 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ae77efc-55ca-4eee-8817-9c21d0bafa6e-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 13 14:01:01 crc kubenswrapper[4898]: I0313 14:01:01.984154 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5nn9n\" (UniqueName: \"kubernetes.io/projected/4ae77efc-55ca-4eee-8817-9c21d0bafa6e-kube-api-access-5nn9n\") on node \"crc\" DevicePath \"\""
Mar 13 14:01:01 crc kubenswrapper[4898]: I0313 14:01:01.984171 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ae77efc-55ca-4eee-8817-9c21d0bafa6e-utilities\") on node \"crc\" DevicePath \"\""
Mar 13 14:01:02 crc kubenswrapper[4898]: I0313 14:01:02.505126 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xh84s" event={"ID":"4ae77efc-55ca-4eee-8817-9c21d0bafa6e","Type":"ContainerDied","Data":"45fc69d27eaeb1e52f659215a5860c090893736d0fa5aca134749f73422aadc9"}
Mar 13 14:01:02 crc kubenswrapper[4898]: I0313 14:01:02.505180 4898 scope.go:117] "RemoveContainer" containerID="b5e3c3eb492b54ad0a87f08620309d02bb7883094244c1b703d4a244c8429e87"
Mar 13 14:01:02 crc kubenswrapper[4898]: I0313 14:01:02.505275 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xh84s"
Mar 13 14:01:02 crc kubenswrapper[4898]: I0313 14:01:02.508547 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-974qp" event={"ID":"183d86e9-cd5c-45ed-a460-bb6169e07c72","Type":"ContainerStarted","Data":"b28ca2f5572caf9aa06fca178d1a31d55764b021494704172c96d7af68b09635"}
Mar 13 14:01:02 crc kubenswrapper[4898]: I0313 14:01:02.534023 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xh84s"]
Mar 13 14:01:02 crc kubenswrapper[4898]: I0313 14:01:02.538797 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xh84s"]
Mar 13 14:01:02 crc kubenswrapper[4898]: I0313 14:01:02.885263 4898 scope.go:117] "RemoveContainer" containerID="804bb12e6a7adbbf4efa1f3ec85b57e7a62babe514eec20e4355f14bea3bdf4a"
Mar 13 14:01:03 crc kubenswrapper[4898]: I0313 14:01:03.084060 4898 scope.go:117] "RemoveContainer" containerID="359074a54fdd2abe01e5471c8009872f5ca05eb132b157ad005435e3bc55c0f9"
Mar 13 14:01:03 crc kubenswrapper[4898]: I0313 14:01:03.531331 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-974qp" podStartSLOduration=42.014150816 podStartE2EDuration="1m4.531315054s" podCreationTimestamp="2026-03-13 13:59:59 +0000 UTC" firstStartedPulling="2026-03-13 14:00:39.103100053 +0000 UTC m=+274.104688292" lastFinishedPulling="2026-03-13 14:01:01.620264291 +0000 UTC m=+296.621852530" observedRunningTime="2026-03-13 14:01:03.52948117 +0000 UTC m=+298.531069409" watchObservedRunningTime="2026-03-13 14:01:03.531315054 +0000 UTC m=+298.532903293"
Mar 13 14:01:03 crc kubenswrapper[4898]: I0313 14:01:03.746494 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ae77efc-55ca-4eee-8817-9c21d0bafa6e" path="/var/lib/kubelet/pods/4ae77efc-55ca-4eee-8817-9c21d0bafa6e/volumes"
Mar 13 14:01:04 crc kubenswrapper[4898]: I0313 14:01:04.521886 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ppq6v" event={"ID":"a990881e-0caf-4096-a372-4cdad69006c1","Type":"ContainerStarted","Data":"34f99ef9389fe4ac9ebfa6a1f58e54f8c62e5652260b00330822919d355fffdc"}
Mar 13 14:01:04 crc kubenswrapper[4898]: I0313 14:01:04.526185 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hn9sl" event={"ID":"b8bc0c30-71e1-41d2-8991-1ce9d85d50a1","Type":"ContainerStarted","Data":"34eef6e95bd6c6e5a38318f6b3a75e9c8807c801636ab61d5466b4cf0037730b"}
Mar 13 14:01:04 crc kubenswrapper[4898]: I0313 14:01:04.528339 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-btkxt" event={"ID":"7794a943-5fec-485e-86bf-f104ed6ae070","Type":"ContainerStarted","Data":"72013575671e67bc50654bdc19c7f86358a2ef9f58c688c055736ec45a15a182"}
Mar 13 14:01:04 crc kubenswrapper[4898]: I0313 14:01:04.530329 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-twh8h" event={"ID":"8f81bcfc-3c35-48e8-a584-961351e8c0e2","Type":"ContainerStarted","Data":"c0d126f66fb80fd38ad4cce383bbe14103ead798e0605b7596e1d4e7e5d8dd4c"}
Mar 13 14:01:04 crc kubenswrapper[4898]: I0313 14:01:04.544948 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ppq6v" podStartSLOduration=2.181939927 podStartE2EDuration="1m8.544929887s" podCreationTimestamp="2026-03-13 13:59:56 +0000 UTC" firstStartedPulling="2026-03-13 13:59:57.718445189 +0000 UTC m=+232.720033428" lastFinishedPulling="2026-03-13 14:01:04.081435149 +0000 UTC m=+299.083023388" observedRunningTime="2026-03-13 14:01:04.541626728 +0000 UTC m=+299.543214987" watchObservedRunningTime="2026-03-13 14:01:04.544929887 +0000 UTC m=+299.546518136"
Mar 13 14:01:04 crc kubenswrapper[4898]: I0313 14:01:04.556974 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-btkxt" podStartSLOduration=40.991937029 podStartE2EDuration="1m5.556958474s" podCreationTimestamp="2026-03-13 13:59:59 +0000 UTC" firstStartedPulling="2026-03-13 14:00:39.103140254 +0000 UTC m=+274.104728523" lastFinishedPulling="2026-03-13 14:01:03.668161729 +0000 UTC m=+298.669749968" observedRunningTime="2026-03-13 14:01:04.555323054 +0000 UTC m=+299.556911293" watchObservedRunningTime="2026-03-13 14:01:04.556958474 +0000 UTC m=+299.558546713"
Mar 13 14:01:04 crc kubenswrapper[4898]: I0313 14:01:04.582205 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-twh8h" podStartSLOduration=3.712099006 podStartE2EDuration="1m9.582185555s" podCreationTimestamp="2026-03-13 13:59:55 +0000 UTC" firstStartedPulling="2026-03-13 13:59:57.705931321 +0000 UTC m=+232.707519560" lastFinishedPulling="2026-03-13 14:01:03.57601787 +0000 UTC m=+298.577606109" observedRunningTime="2026-03-13 14:01:04.580959376 +0000 UTC m=+299.582547635" watchObservedRunningTime="2026-03-13 14:01:04.582185555 +0000 UTC m=+299.583773784"
Mar 13 14:01:06 crc kubenswrapper[4898]: I0313 14:01:06.236960 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-twh8h"
Mar 13 14:01:06 crc kubenswrapper[4898]: I0313 14:01:06.237032 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-twh8h"
Mar 13 14:01:06 crc kubenswrapper[4898]: I0313 14:01:06.284343 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-twh8h"
Mar 13 14:01:06 crc kubenswrapper[4898]: I0313 14:01:06.310647 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hn9sl" podStartSLOduration=4.545949332 podStartE2EDuration="1m8.310625312s" podCreationTimestamp="2026-03-13 13:59:58 +0000 UTC" firstStartedPulling="2026-03-13 13:59:59.783067736 +0000 UTC m=+234.784655975" lastFinishedPulling="2026-03-13 14:01:03.547743716 +0000 UTC m=+298.549331955" observedRunningTime="2026-03-13 14:01:04.613515113 +0000 UTC m=+299.615103372" watchObservedRunningTime="2026-03-13 14:01:06.310625312 +0000 UTC m=+301.312213551"
Mar 13 14:01:06 crc kubenswrapper[4898]: I0313 14:01:06.441804 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dvvz2"
Mar 13 14:01:06 crc kubenswrapper[4898]: I0313 14:01:06.442545 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dvvz2"
Mar 13 14:01:06 crc kubenswrapper[4898]: I0313 14:01:06.489705 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dvvz2"
Mar 13 14:01:06 crc kubenswrapper[4898]: I0313 14:01:06.615086 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ppq6v"
Mar 13 14:01:06 crc kubenswrapper[4898]: I0313 14:01:06.615144 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ppq6v"
Mar 13 14:01:06 crc kubenswrapper[4898]: I0313 14:01:06.616402 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dvvz2"
Mar 13 14:01:06 crc kubenswrapper[4898]: I0313 14:01:06.663617 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ppq6v"
Mar 13 14:01:08 crc kubenswrapper[4898]: I0313 14:01:08.838974 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hn9sl"
Mar 13 14:01:08 crc kubenswrapper[4898]: I0313 14:01:08.839059 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hn9sl"
Mar 13 14:01:08 crc kubenswrapper[4898]: I0313 14:01:08.904505 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hn9sl"
Mar 13 14:01:09 crc kubenswrapper[4898]: I0313 14:01:09.408024 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-974qp"
Mar 13 14:01:09 crc kubenswrapper[4898]: I0313 14:01:09.408105 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-974qp"
Mar 13 14:01:09 crc kubenswrapper[4898]: I0313 14:01:09.620479 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hn9sl"
Mar 13 14:01:09 crc kubenswrapper[4898]: I0313 14:01:09.813512 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-btkxt"
Mar 13 14:01:09 crc kubenswrapper[4898]: I0313 14:01:09.813985 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-btkxt"
Mar 13 14:01:10 crc kubenswrapper[4898]: I0313 14:01:10.465159 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-974qp" podUID="183d86e9-cd5c-45ed-a460-bb6169e07c72" containerName="registry-server" probeResult="failure" output=<
Mar 13 14:01:10 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s
Mar 13 14:01:10 crc kubenswrapper[4898]: >
Mar 13 14:01:10 crc kubenswrapper[4898]: I0313 14:01:10.866978 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-btkxt" podUID="7794a943-5fec-485e-86bf-f104ed6ae070" containerName="registry-server" probeResult="failure" output=<
Mar 13 14:01:10 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s
Mar 13 14:01:10 crc kubenswrapper[4898]: >
Mar 13 14:01:11 crc kubenswrapper[4898]: I0313 14:01:11.896391 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hn9sl"]
Mar 13 14:01:11 crc kubenswrapper[4898]: I0313 14:01:11.898722 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hn9sl" podUID="b8bc0c30-71e1-41d2-8991-1ce9d85d50a1" containerName="registry-server" containerID="cri-o://34eef6e95bd6c6e5a38318f6b3a75e9c8807c801636ab61d5466b4cf0037730b" gracePeriod=2
Mar 13 14:01:12 crc kubenswrapper[4898]: I0313 14:01:12.925306 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-67c9866bb9-9276f"]
Mar 13 14:01:12 crc kubenswrapper[4898]: I0313 14:01:12.925693 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-67c9866bb9-9276f" podUID="d57ce87c-0f10-48ca-b5ee-09b139d9ebd0" containerName="controller-manager" containerID="cri-o://d5ad3dcf9d418253bf096607969322ddb59d415decdc8a397e768a53284c42b7" gracePeriod=30
Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.020841 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7bcbd84b47-vdd2b"]
Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.021348 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7bcbd84b47-vdd2b" podUID="48b4d86a-9b94-4913-a35f-fd5e449ca40b" containerName="route-controller-manager" containerID="cri-o://22bb704ae349a00bd3ff54a939a69467adddc106eb993e03597abddaa460ee01" gracePeriod=30
Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.433178 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7bcbd84b47-vdd2b"
Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.476065 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-67c9866bb9-9276f"
Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.518531 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hn9sl"
Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.561094 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48b4d86a-9b94-4913-a35f-fd5e449ca40b-config\") pod \"48b4d86a-9b94-4913-a35f-fd5e449ca40b\" (UID: \"48b4d86a-9b94-4913-a35f-fd5e449ca40b\") "
Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.561150 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48b4d86a-9b94-4913-a35f-fd5e449ca40b-serving-cert\") pod \"48b4d86a-9b94-4913-a35f-fd5e449ca40b\" (UID: \"48b4d86a-9b94-4913-a35f-fd5e449ca40b\") "
Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.561177 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d57ce87c-0f10-48ca-b5ee-09b139d9ebd0-serving-cert\") pod \"d57ce87c-0f10-48ca-b5ee-09b139d9ebd0\" (UID: \"d57ce87c-0f10-48ca-b5ee-09b139d9ebd0\") "
Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.561263 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d57ce87c-0f10-48ca-b5ee-09b139d9ebd0-client-ca\") pod \"d57ce87c-0f10-48ca-b5ee-09b139d9ebd0\" (UID: \"d57ce87c-0f10-48ca-b5ee-09b139d9ebd0\") "
Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.561307 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnwjn\" (UniqueName: \"kubernetes.io/projected/d57ce87c-0f10-48ca-b5ee-09b139d9ebd0-kube-api-access-mnwjn\") pod \"d57ce87c-0f10-48ca-b5ee-09b139d9ebd0\" (UID: \"d57ce87c-0f10-48ca-b5ee-09b139d9ebd0\") "
Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.561335 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gsdjp\" (UniqueName: \"kubernetes.io/projected/48b4d86a-9b94-4913-a35f-fd5e449ca40b-kube-api-access-gsdjp\") pod \"48b4d86a-9b94-4913-a35f-fd5e449ca40b\" (UID: \"48b4d86a-9b94-4913-a35f-fd5e449ca40b\") "
Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.561374 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22rm7\" (UniqueName: \"kubernetes.io/projected/b8bc0c30-71e1-41d2-8991-1ce9d85d50a1-kube-api-access-22rm7\") pod \"b8bc0c30-71e1-41d2-8991-1ce9d85d50a1\" (UID: \"b8bc0c30-71e1-41d2-8991-1ce9d85d50a1\") "
Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.561398 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d57ce87c-0f10-48ca-b5ee-09b139d9ebd0-config\") pod \"d57ce87c-0f10-48ca-b5ee-09b139d9ebd0\" (UID: \"d57ce87c-0f10-48ca-b5ee-09b139d9ebd0\") "
Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.561420 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d57ce87c-0f10-48ca-b5ee-09b139d9ebd0-proxy-ca-bundles\") pod \"d57ce87c-0f10-48ca-b5ee-09b139d9ebd0\" (UID: \"d57ce87c-0f10-48ca-b5ee-09b139d9ebd0\") "
Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.561443 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/48b4d86a-9b94-4913-a35f-fd5e449ca40b-client-ca\") pod \"48b4d86a-9b94-4913-a35f-fd5e449ca40b\" (UID: \"48b4d86a-9b94-4913-a35f-fd5e449ca40b\") "
Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.562261 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48b4d86a-9b94-4913-a35f-fd5e449ca40b-client-ca" (OuterVolumeSpecName: "client-ca") pod "48b4d86a-9b94-4913-a35f-fd5e449ca40b" (UID: "48b4d86a-9b94-4913-a35f-fd5e449ca40b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.562895 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d57ce87c-0f10-48ca-b5ee-09b139d9ebd0-config" (OuterVolumeSpecName: "config") pod "d57ce87c-0f10-48ca-b5ee-09b139d9ebd0" (UID: "d57ce87c-0f10-48ca-b5ee-09b139d9ebd0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.563041 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d57ce87c-0f10-48ca-b5ee-09b139d9ebd0-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "d57ce87c-0f10-48ca-b5ee-09b139d9ebd0" (UID: "d57ce87c-0f10-48ca-b5ee-09b139d9ebd0"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.563240 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48b4d86a-9b94-4913-a35f-fd5e449ca40b-config" (OuterVolumeSpecName: "config") pod "48b4d86a-9b94-4913-a35f-fd5e449ca40b" (UID: "48b4d86a-9b94-4913-a35f-fd5e449ca40b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.563713 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d57ce87c-0f10-48ca-b5ee-09b139d9ebd0-client-ca" (OuterVolumeSpecName: "client-ca") pod "d57ce87c-0f10-48ca-b5ee-09b139d9ebd0" (UID: "d57ce87c-0f10-48ca-b5ee-09b139d9ebd0"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.566573 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48b4d86a-9b94-4913-a35f-fd5e449ca40b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "48b4d86a-9b94-4913-a35f-fd5e449ca40b" (UID: "48b4d86a-9b94-4913-a35f-fd5e449ca40b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.566600 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d57ce87c-0f10-48ca-b5ee-09b139d9ebd0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d57ce87c-0f10-48ca-b5ee-09b139d9ebd0" (UID: "d57ce87c-0f10-48ca-b5ee-09b139d9ebd0"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.566599 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48b4d86a-9b94-4913-a35f-fd5e449ca40b-kube-api-access-gsdjp" (OuterVolumeSpecName: "kube-api-access-gsdjp") pod "48b4d86a-9b94-4913-a35f-fd5e449ca40b" (UID: "48b4d86a-9b94-4913-a35f-fd5e449ca40b"). InnerVolumeSpecName "kube-api-access-gsdjp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.566836 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8bc0c30-71e1-41d2-8991-1ce9d85d50a1-kube-api-access-22rm7" (OuterVolumeSpecName: "kube-api-access-22rm7") pod "b8bc0c30-71e1-41d2-8991-1ce9d85d50a1" (UID: "b8bc0c30-71e1-41d2-8991-1ce9d85d50a1"). InnerVolumeSpecName "kube-api-access-22rm7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.566915 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d57ce87c-0f10-48ca-b5ee-09b139d9ebd0-kube-api-access-mnwjn" (OuterVolumeSpecName: "kube-api-access-mnwjn") pod "d57ce87c-0f10-48ca-b5ee-09b139d9ebd0" (UID: "d57ce87c-0f10-48ca-b5ee-09b139d9ebd0"). InnerVolumeSpecName "kube-api-access-mnwjn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.590767 4898 generic.go:334] "Generic (PLEG): container finished" podID="48b4d86a-9b94-4913-a35f-fd5e449ca40b" containerID="22bb704ae349a00bd3ff54a939a69467adddc106eb993e03597abddaa460ee01" exitCode=0
Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.590801 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7bcbd84b47-vdd2b"
Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.590831 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7bcbd84b47-vdd2b" event={"ID":"48b4d86a-9b94-4913-a35f-fd5e449ca40b","Type":"ContainerDied","Data":"22bb704ae349a00bd3ff54a939a69467adddc106eb993e03597abddaa460ee01"}
Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.590854 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7bcbd84b47-vdd2b" event={"ID":"48b4d86a-9b94-4913-a35f-fd5e449ca40b","Type":"ContainerDied","Data":"13758db88f06f5e481700996753a013ebf1c45a10dc6dbacc0a63821fd99241f"}
Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.590870 4898 scope.go:117] "RemoveContainer" containerID="22bb704ae349a00bd3ff54a939a69467adddc106eb993e03597abddaa460ee01"
Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.594237 4898 generic.go:334] "Generic (PLEG): container finished" podID="b8bc0c30-71e1-41d2-8991-1ce9d85d50a1" containerID="34eef6e95bd6c6e5a38318f6b3a75e9c8807c801636ab61d5466b4cf0037730b" exitCode=0
Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.594276 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hn9sl"
Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.594300 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hn9sl" event={"ID":"b8bc0c30-71e1-41d2-8991-1ce9d85d50a1","Type":"ContainerDied","Data":"34eef6e95bd6c6e5a38318f6b3a75e9c8807c801636ab61d5466b4cf0037730b"}
Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.594323 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hn9sl" event={"ID":"b8bc0c30-71e1-41d2-8991-1ce9d85d50a1","Type":"ContainerDied","Data":"0b8d238e1855df1df599d5c20b2f8c47368ca041ea02bd9d799ff8595124e451"}
Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.595476 4898 generic.go:334] "Generic (PLEG): container finished" podID="d57ce87c-0f10-48ca-b5ee-09b139d9ebd0" containerID="d5ad3dcf9d418253bf096607969322ddb59d415decdc8a397e768a53284c42b7" exitCode=0
Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.595566 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-67c9866bb9-9276f" event={"ID":"d57ce87c-0f10-48ca-b5ee-09b139d9ebd0","Type":"ContainerDied","Data":"d5ad3dcf9d418253bf096607969322ddb59d415decdc8a397e768a53284c42b7"}
Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.595749 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-67c9866bb9-9276f" event={"ID":"d57ce87c-0f10-48ca-b5ee-09b139d9ebd0","Type":"ContainerDied","Data":"586304a4e2dd4eea4580413987b917096a6c4f3326c8482af26767ff95bd2378"}
Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.595662 4898 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-controller-manager/controller-manager-67c9866bb9-9276f" Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.607401 4898 scope.go:117] "RemoveContainer" containerID="22bb704ae349a00bd3ff54a939a69467adddc106eb993e03597abddaa460ee01" Mar 13 14:01:13 crc kubenswrapper[4898]: E0313 14:01:13.608130 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22bb704ae349a00bd3ff54a939a69467adddc106eb993e03597abddaa460ee01\": container with ID starting with 22bb704ae349a00bd3ff54a939a69467adddc106eb993e03597abddaa460ee01 not found: ID does not exist" containerID="22bb704ae349a00bd3ff54a939a69467adddc106eb993e03597abddaa460ee01" Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.608176 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22bb704ae349a00bd3ff54a939a69467adddc106eb993e03597abddaa460ee01"} err="failed to get container status \"22bb704ae349a00bd3ff54a939a69467adddc106eb993e03597abddaa460ee01\": rpc error: code = NotFound desc = could not find container \"22bb704ae349a00bd3ff54a939a69467adddc106eb993e03597abddaa460ee01\": container with ID starting with 22bb704ae349a00bd3ff54a939a69467adddc106eb993e03597abddaa460ee01 not found: ID does not exist" Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.608205 4898 scope.go:117] "RemoveContainer" containerID="34eef6e95bd6c6e5a38318f6b3a75e9c8807c801636ab61d5466b4cf0037730b" Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.621347 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7bcbd84b47-vdd2b"] Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.625707 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7bcbd84b47-vdd2b"] Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.626462 4898 scope.go:117] 
"RemoveContainer" containerID="aa63312252f658ceca7c77dd0dfed144856961d5306b45331a156f811c5eef73" Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.629740 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-67c9866bb9-9276f"] Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.632992 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-67c9866bb9-9276f"] Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.638912 4898 scope.go:117] "RemoveContainer" containerID="2528aa8f68ede75859cc4272c7396a63edf906831259593563c01d94a61ac7d1" Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.650672 4898 scope.go:117] "RemoveContainer" containerID="34eef6e95bd6c6e5a38318f6b3a75e9c8807c801636ab61d5466b4cf0037730b" Mar 13 14:01:13 crc kubenswrapper[4898]: E0313 14:01:13.651009 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34eef6e95bd6c6e5a38318f6b3a75e9c8807c801636ab61d5466b4cf0037730b\": container with ID starting with 34eef6e95bd6c6e5a38318f6b3a75e9c8807c801636ab61d5466b4cf0037730b not found: ID does not exist" containerID="34eef6e95bd6c6e5a38318f6b3a75e9c8807c801636ab61d5466b4cf0037730b" Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.651046 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34eef6e95bd6c6e5a38318f6b3a75e9c8807c801636ab61d5466b4cf0037730b"} err="failed to get container status \"34eef6e95bd6c6e5a38318f6b3a75e9c8807c801636ab61d5466b4cf0037730b\": rpc error: code = NotFound desc = could not find container \"34eef6e95bd6c6e5a38318f6b3a75e9c8807c801636ab61d5466b4cf0037730b\": container with ID starting with 34eef6e95bd6c6e5a38318f6b3a75e9c8807c801636ab61d5466b4cf0037730b not found: ID does not exist" Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.651078 4898 scope.go:117] "RemoveContainer" 
containerID="aa63312252f658ceca7c77dd0dfed144856961d5306b45331a156f811c5eef73" Mar 13 14:01:13 crc kubenswrapper[4898]: E0313 14:01:13.651339 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa63312252f658ceca7c77dd0dfed144856961d5306b45331a156f811c5eef73\": container with ID starting with aa63312252f658ceca7c77dd0dfed144856961d5306b45331a156f811c5eef73 not found: ID does not exist" containerID="aa63312252f658ceca7c77dd0dfed144856961d5306b45331a156f811c5eef73" Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.651370 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa63312252f658ceca7c77dd0dfed144856961d5306b45331a156f811c5eef73"} err="failed to get container status \"aa63312252f658ceca7c77dd0dfed144856961d5306b45331a156f811c5eef73\": rpc error: code = NotFound desc = could not find container \"aa63312252f658ceca7c77dd0dfed144856961d5306b45331a156f811c5eef73\": container with ID starting with aa63312252f658ceca7c77dd0dfed144856961d5306b45331a156f811c5eef73 not found: ID does not exist" Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.651388 4898 scope.go:117] "RemoveContainer" containerID="2528aa8f68ede75859cc4272c7396a63edf906831259593563c01d94a61ac7d1" Mar 13 14:01:13 crc kubenswrapper[4898]: E0313 14:01:13.651829 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2528aa8f68ede75859cc4272c7396a63edf906831259593563c01d94a61ac7d1\": container with ID starting with 2528aa8f68ede75859cc4272c7396a63edf906831259593563c01d94a61ac7d1 not found: ID does not exist" containerID="2528aa8f68ede75859cc4272c7396a63edf906831259593563c01d94a61ac7d1" Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.651863 4898 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2528aa8f68ede75859cc4272c7396a63edf906831259593563c01d94a61ac7d1"} err="failed to get container status \"2528aa8f68ede75859cc4272c7396a63edf906831259593563c01d94a61ac7d1\": rpc error: code = NotFound desc = could not find container \"2528aa8f68ede75859cc4272c7396a63edf906831259593563c01d94a61ac7d1\": container with ID starting with 2528aa8f68ede75859cc4272c7396a63edf906831259593563c01d94a61ac7d1 not found: ID does not exist" Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.651886 4898 scope.go:117] "RemoveContainer" containerID="d5ad3dcf9d418253bf096607969322ddb59d415decdc8a397e768a53284c42b7" Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.662204 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8bc0c30-71e1-41d2-8991-1ce9d85d50a1-catalog-content\") pod \"b8bc0c30-71e1-41d2-8991-1ce9d85d50a1\" (UID: \"b8bc0c30-71e1-41d2-8991-1ce9d85d50a1\") " Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.662321 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8bc0c30-71e1-41d2-8991-1ce9d85d50a1-utilities\") pod \"b8bc0c30-71e1-41d2-8991-1ce9d85d50a1\" (UID: \"b8bc0c30-71e1-41d2-8991-1ce9d85d50a1\") " Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.662510 4898 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d57ce87c-0f10-48ca-b5ee-09b139d9ebd0-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.662528 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnwjn\" (UniqueName: \"kubernetes.io/projected/d57ce87c-0f10-48ca-b5ee-09b139d9ebd0-kube-api-access-mnwjn\") on node \"crc\" DevicePath \"\"" Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.662537 4898 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-gsdjp\" (UniqueName: \"kubernetes.io/projected/48b4d86a-9b94-4913-a35f-fd5e449ca40b-kube-api-access-gsdjp\") on node \"crc\" DevicePath \"\"" Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.662547 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22rm7\" (UniqueName: \"kubernetes.io/projected/b8bc0c30-71e1-41d2-8991-1ce9d85d50a1-kube-api-access-22rm7\") on node \"crc\" DevicePath \"\"" Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.662556 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d57ce87c-0f10-48ca-b5ee-09b139d9ebd0-config\") on node \"crc\" DevicePath \"\"" Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.662563 4898 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d57ce87c-0f10-48ca-b5ee-09b139d9ebd0-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.662571 4898 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/48b4d86a-9b94-4913-a35f-fd5e449ca40b-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.662581 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48b4d86a-9b94-4913-a35f-fd5e449ca40b-config\") on node \"crc\" DevicePath \"\"" Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.662589 4898 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48b4d86a-9b94-4913-a35f-fd5e449ca40b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.662596 4898 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d57ce87c-0f10-48ca-b5ee-09b139d9ebd0-serving-cert\") on node \"crc\" DevicePath 
\"\"" Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.663314 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8bc0c30-71e1-41d2-8991-1ce9d85d50a1-utilities" (OuterVolumeSpecName: "utilities") pod "b8bc0c30-71e1-41d2-8991-1ce9d85d50a1" (UID: "b8bc0c30-71e1-41d2-8991-1ce9d85d50a1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.663366 4898 scope.go:117] "RemoveContainer" containerID="d5ad3dcf9d418253bf096607969322ddb59d415decdc8a397e768a53284c42b7" Mar 13 14:01:13 crc kubenswrapper[4898]: E0313 14:01:13.664032 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5ad3dcf9d418253bf096607969322ddb59d415decdc8a397e768a53284c42b7\": container with ID starting with d5ad3dcf9d418253bf096607969322ddb59d415decdc8a397e768a53284c42b7 not found: ID does not exist" containerID="d5ad3dcf9d418253bf096607969322ddb59d415decdc8a397e768a53284c42b7" Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.664071 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5ad3dcf9d418253bf096607969322ddb59d415decdc8a397e768a53284c42b7"} err="failed to get container status \"d5ad3dcf9d418253bf096607969322ddb59d415decdc8a397e768a53284c42b7\": rpc error: code = NotFound desc = could not find container \"d5ad3dcf9d418253bf096607969322ddb59d415decdc8a397e768a53284c42b7\": container with ID starting with d5ad3dcf9d418253bf096607969322ddb59d415decdc8a397e768a53284c42b7 not found: ID does not exist" Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.684393 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8bc0c30-71e1-41d2-8991-1ce9d85d50a1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b8bc0c30-71e1-41d2-8991-1ce9d85d50a1" (UID: 
"b8bc0c30-71e1-41d2-8991-1ce9d85d50a1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.749944 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48b4d86a-9b94-4913-a35f-fd5e449ca40b" path="/var/lib/kubelet/pods/48b4d86a-9b94-4913-a35f-fd5e449ca40b/volumes" Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.751251 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d57ce87c-0f10-48ca-b5ee-09b139d9ebd0" path="/var/lib/kubelet/pods/d57ce87c-0f10-48ca-b5ee-09b139d9ebd0/volumes" Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.764536 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8bc0c30-71e1-41d2-8991-1ce9d85d50a1-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.764599 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8bc0c30-71e1-41d2-8991-1ce9d85d50a1-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.920145 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hn9sl"] Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.927809 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hn9sl"] Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.830269 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-db7fb4ff9-qlmhf"] Mar 13 14:01:14 crc kubenswrapper[4898]: E0313 14:01:14.831287 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ae77efc-55ca-4eee-8817-9c21d0bafa6e" containerName="registry-server" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.831301 4898 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="4ae77efc-55ca-4eee-8817-9c21d0bafa6e" containerName="registry-server" Mar 13 14:01:14 crc kubenswrapper[4898]: E0313 14:01:14.831320 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b7eb8ef-6f92-4c29-b6ad-3cf5b6919fce" containerName="oc" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.831327 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b7eb8ef-6f92-4c29-b6ad-3cf5b6919fce" containerName="oc" Mar 13 14:01:14 crc kubenswrapper[4898]: E0313 14:01:14.831338 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ae77efc-55ca-4eee-8817-9c21d0bafa6e" containerName="extract-content" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.831345 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ae77efc-55ca-4eee-8817-9c21d0bafa6e" containerName="extract-content" Mar 13 14:01:14 crc kubenswrapper[4898]: E0313 14:01:14.831356 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8bc0c30-71e1-41d2-8991-1ce9d85d50a1" containerName="extract-utilities" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.831362 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8bc0c30-71e1-41d2-8991-1ce9d85d50a1" containerName="extract-utilities" Mar 13 14:01:14 crc kubenswrapper[4898]: E0313 14:01:14.831370 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d57ce87c-0f10-48ca-b5ee-09b139d9ebd0" containerName="controller-manager" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.831376 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="d57ce87c-0f10-48ca-b5ee-09b139d9ebd0" containerName="controller-manager" Mar 13 14:01:14 crc kubenswrapper[4898]: E0313 14:01:14.831383 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa1ed4c8-e4bd-4352-bee3-404f16244ea3" containerName="oc" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.831388 4898 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="aa1ed4c8-e4bd-4352-bee3-404f16244ea3" containerName="oc" Mar 13 14:01:14 crc kubenswrapper[4898]: E0313 14:01:14.831397 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ae77efc-55ca-4eee-8817-9c21d0bafa6e" containerName="extract-utilities" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.831403 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ae77efc-55ca-4eee-8817-9c21d0bafa6e" containerName="extract-utilities" Mar 13 14:01:14 crc kubenswrapper[4898]: E0313 14:01:14.831410 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8bc0c30-71e1-41d2-8991-1ce9d85d50a1" containerName="extract-content" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.831416 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8bc0c30-71e1-41d2-8991-1ce9d85d50a1" containerName="extract-content" Mar 13 14:01:14 crc kubenswrapper[4898]: E0313 14:01:14.831430 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8bc0c30-71e1-41d2-8991-1ce9d85d50a1" containerName="registry-server" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.831436 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8bc0c30-71e1-41d2-8991-1ce9d85d50a1" containerName="registry-server" Mar 13 14:01:14 crc kubenswrapper[4898]: E0313 14:01:14.831444 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48b4d86a-9b94-4913-a35f-fd5e449ca40b" containerName="route-controller-manager" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.831450 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="48b4d86a-9b94-4913-a35f-fd5e449ca40b" containerName="route-controller-manager" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.831542 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8bc0c30-71e1-41d2-8991-1ce9d85d50a1" containerName="registry-server" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.831555 4898 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="4ae77efc-55ca-4eee-8817-9c21d0bafa6e" containerName="registry-server" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.831566 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="d57ce87c-0f10-48ca-b5ee-09b139d9ebd0" containerName="controller-manager" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.831574 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="48b4d86a-9b94-4913-a35f-fd5e449ca40b" containerName="route-controller-manager" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.831582 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b7eb8ef-6f92-4c29-b6ad-3cf5b6919fce" containerName="oc" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.831590 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa1ed4c8-e4bd-4352-bee3-404f16244ea3" containerName="oc" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.832045 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-db7fb4ff9-qlmhf" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.834046 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76556fcf77-lhw2w"] Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.834649 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.835212 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76556fcf77-lhw2w" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.836357 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.836542 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.837749 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.838069 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.838316 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.838479 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.838639 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.839435 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.841269 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.841505 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-76556fcf77-lhw2w"] Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.843026 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-db7fb4ff9-qlmhf"] Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.844418 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.844695 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.845063 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.881768 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0561a31b-c67c-4410-8845-d47e4533be0a-config\") pod \"route-controller-manager-76556fcf77-lhw2w\" (UID: \"0561a31b-c67c-4410-8845-d47e4533be0a\") " pod="openshift-route-controller-manager/route-controller-manager-76556fcf77-lhw2w" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.881853 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf-config\") pod \"controller-manager-db7fb4ff9-qlmhf\" (UID: \"7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf\") " pod="openshift-controller-manager/controller-manager-db7fb4ff9-qlmhf" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.881981 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9h6km\" (UniqueName: \"kubernetes.io/projected/7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf-kube-api-access-9h6km\") pod 
\"controller-manager-db7fb4ff9-qlmhf\" (UID: \"7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf\") " pod="openshift-controller-manager/controller-manager-db7fb4ff9-qlmhf" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.882012 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0561a31b-c67c-4410-8845-d47e4533be0a-client-ca\") pod \"route-controller-manager-76556fcf77-lhw2w\" (UID: \"0561a31b-c67c-4410-8845-d47e4533be0a\") " pod="openshift-route-controller-manager/route-controller-manager-76556fcf77-lhw2w" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.882040 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6mvk\" (UniqueName: \"kubernetes.io/projected/0561a31b-c67c-4410-8845-d47e4533be0a-kube-api-access-h6mvk\") pod \"route-controller-manager-76556fcf77-lhw2w\" (UID: \"0561a31b-c67c-4410-8845-d47e4533be0a\") " pod="openshift-route-controller-manager/route-controller-manager-76556fcf77-lhw2w" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.882061 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf-serving-cert\") pod \"controller-manager-db7fb4ff9-qlmhf\" (UID: \"7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf\") " pod="openshift-controller-manager/controller-manager-db7fb4ff9-qlmhf" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.882095 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf-client-ca\") pod \"controller-manager-db7fb4ff9-qlmhf\" (UID: \"7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf\") " pod="openshift-controller-manager/controller-manager-db7fb4ff9-qlmhf" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 
14:01:14.882121    4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0561a31b-c67c-4410-8845-d47e4533be0a-serving-cert\") pod \"route-controller-manager-76556fcf77-lhw2w\" (UID: \"0561a31b-c67c-4410-8845-d47e4533be0a\") " pod="openshift-route-controller-manager/route-controller-manager-76556fcf77-lhw2w"
Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.882138    4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf-proxy-ca-bundles\") pod \"controller-manager-db7fb4ff9-qlmhf\" (UID: \"7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf\") " pod="openshift-controller-manager/controller-manager-db7fb4ff9-qlmhf"
Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.983386    4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf-config\") pod \"controller-manager-db7fb4ff9-qlmhf\" (UID: \"7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf\") " pod="openshift-controller-manager/controller-manager-db7fb4ff9-qlmhf"
Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.983439    4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9h6km\" (UniqueName: \"kubernetes.io/projected/7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf-kube-api-access-9h6km\") pod \"controller-manager-db7fb4ff9-qlmhf\" (UID: \"7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf\") " pod="openshift-controller-manager/controller-manager-db7fb4ff9-qlmhf"
Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.983461    4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0561a31b-c67c-4410-8845-d47e4533be0a-client-ca\") pod \"route-controller-manager-76556fcf77-lhw2w\" (UID: \"0561a31b-c67c-4410-8845-d47e4533be0a\") " pod="openshift-route-controller-manager/route-controller-manager-76556fcf77-lhw2w"
Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.983484    4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6mvk\" (UniqueName: \"kubernetes.io/projected/0561a31b-c67c-4410-8845-d47e4533be0a-kube-api-access-h6mvk\") pod \"route-controller-manager-76556fcf77-lhw2w\" (UID: \"0561a31b-c67c-4410-8845-d47e4533be0a\") " pod="openshift-route-controller-manager/route-controller-manager-76556fcf77-lhw2w"
Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.983504    4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf-serving-cert\") pod \"controller-manager-db7fb4ff9-qlmhf\" (UID: \"7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf\") " pod="openshift-controller-manager/controller-manager-db7fb4ff9-qlmhf"
Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.983532    4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf-client-ca\") pod \"controller-manager-db7fb4ff9-qlmhf\" (UID: \"7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf\") " pod="openshift-controller-manager/controller-manager-db7fb4ff9-qlmhf"
Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.983554    4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0561a31b-c67c-4410-8845-d47e4533be0a-serving-cert\") pod \"route-controller-manager-76556fcf77-lhw2w\" (UID: \"0561a31b-c67c-4410-8845-d47e4533be0a\") " pod="openshift-route-controller-manager/route-controller-manager-76556fcf77-lhw2w"
Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.983569    4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf-proxy-ca-bundles\") pod \"controller-manager-db7fb4ff9-qlmhf\" (UID: \"7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf\") " pod="openshift-controller-manager/controller-manager-db7fb4ff9-qlmhf"
Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.983601    4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0561a31b-c67c-4410-8845-d47e4533be0a-config\") pod \"route-controller-manager-76556fcf77-lhw2w\" (UID: \"0561a31b-c67c-4410-8845-d47e4533be0a\") " pod="openshift-route-controller-manager/route-controller-manager-76556fcf77-lhw2w"
Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.984687    4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0561a31b-c67c-4410-8845-d47e4533be0a-client-ca\") pod \"route-controller-manager-76556fcf77-lhw2w\" (UID: \"0561a31b-c67c-4410-8845-d47e4533be0a\") " pod="openshift-route-controller-manager/route-controller-manager-76556fcf77-lhw2w"
Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.985074    4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0561a31b-c67c-4410-8845-d47e4533be0a-config\") pod \"route-controller-manager-76556fcf77-lhw2w\" (UID: \"0561a31b-c67c-4410-8845-d47e4533be0a\") " pod="openshift-route-controller-manager/route-controller-manager-76556fcf77-lhw2w"
Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.985104    4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf-config\") pod \"controller-manager-db7fb4ff9-qlmhf\" (UID: \"7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf\") " pod="openshift-controller-manager/controller-manager-db7fb4ff9-qlmhf"
Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.986073    4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf-proxy-ca-bundles\") pod \"controller-manager-db7fb4ff9-qlmhf\" (UID: \"7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf\") " pod="openshift-controller-manager/controller-manager-db7fb4ff9-qlmhf"
Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.989295    4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf-client-ca\") pod \"controller-manager-db7fb4ff9-qlmhf\" (UID: \"7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf\") " pod="openshift-controller-manager/controller-manager-db7fb4ff9-qlmhf"
Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.991659    4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf-serving-cert\") pod \"controller-manager-db7fb4ff9-qlmhf\" (UID: \"7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf\") " pod="openshift-controller-manager/controller-manager-db7fb4ff9-qlmhf"
Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.996919    4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0561a31b-c67c-4410-8845-d47e4533be0a-serving-cert\") pod \"route-controller-manager-76556fcf77-lhw2w\" (UID: \"0561a31b-c67c-4410-8845-d47e4533be0a\") " pod="openshift-route-controller-manager/route-controller-manager-76556fcf77-lhw2w"
Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.999784    4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9h6km\" (UniqueName: \"kubernetes.io/projected/7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf-kube-api-access-9h6km\") pod \"controller-manager-db7fb4ff9-qlmhf\" (UID: \"7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf\") " pod="openshift-controller-manager/controller-manager-db7fb4ff9-qlmhf"
Mar 13 14:01:15 crc kubenswrapper[4898]: I0313 14:01:15.017007    4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6mvk\" (UniqueName: \"kubernetes.io/projected/0561a31b-c67c-4410-8845-d47e4533be0a-kube-api-access-h6mvk\") pod \"route-controller-manager-76556fcf77-lhw2w\" (UID: \"0561a31b-c67c-4410-8845-d47e4533be0a\") " pod="openshift-route-controller-manager/route-controller-manager-76556fcf77-lhw2w"
Mar 13 14:01:15 crc kubenswrapper[4898]: I0313 14:01:15.201828    4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-db7fb4ff9-qlmhf"
Mar 13 14:01:15 crc kubenswrapper[4898]: I0313 14:01:15.218254    4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76556fcf77-lhw2w"
Mar 13 14:01:15 crc kubenswrapper[4898]: I0313 14:01:15.466859    4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-db7fb4ff9-qlmhf"]
Mar 13 14:01:15 crc kubenswrapper[4898]: W0313 14:01:15.473069    4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a72588b_5ed5_4ab7_bbe0_c0b6e08eedbf.slice/crio-ba22e3087795c8be037e21c8d47195aba889300c63f1adc91b0790f8732ad1d9 WatchSource:0}: Error finding container ba22e3087795c8be037e21c8d47195aba889300c63f1adc91b0790f8732ad1d9: Status 404 returned error can't find the container with id ba22e3087795c8be037e21c8d47195aba889300c63f1adc91b0790f8732ad1d9
Mar 13 14:01:15 crc kubenswrapper[4898]: I0313 14:01:15.509311    4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76556fcf77-lhw2w"]
Mar 13 14:01:15 crc kubenswrapper[4898]: I0313 14:01:15.616346    4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76556fcf77-lhw2w" event={"ID":"0561a31b-c67c-4410-8845-d47e4533be0a","Type":"ContainerStarted","Data":"3f35c6846af9efc90831b47db53412c7613bf81bc4a1bb77055f1c9c4645cef8"}
Mar 13 14:01:15 crc kubenswrapper[4898]: I0313 14:01:15.618743    4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-db7fb4ff9-qlmhf" event={"ID":"7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf","Type":"ContainerStarted","Data":"ba22e3087795c8be037e21c8d47195aba889300c63f1adc91b0790f8732ad1d9"}
Mar 13 14:01:15 crc kubenswrapper[4898]: I0313 14:01:15.749411    4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8bc0c30-71e1-41d2-8991-1ce9d85d50a1" path="/var/lib/kubelet/pods/b8bc0c30-71e1-41d2-8991-1ce9d85d50a1/volumes"
Mar 13 14:01:16 crc kubenswrapper[4898]: I0313 14:01:16.286564    4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-twh8h"
Mar 13 14:01:16 crc kubenswrapper[4898]: I0313 14:01:16.629711    4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76556fcf77-lhw2w" event={"ID":"0561a31b-c67c-4410-8845-d47e4533be0a","Type":"ContainerStarted","Data":"11185c5dd36acbad20bedb23b2365654ca410b78e2e18b38a6ef92209812ef97"}
Mar 13 14:01:16 crc kubenswrapper[4898]: I0313 14:01:16.630353    4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-76556fcf77-lhw2w"
Mar 13 14:01:16 crc kubenswrapper[4898]: I0313 14:01:16.631383    4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-db7fb4ff9-qlmhf" event={"ID":"7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf","Type":"ContainerStarted","Data":"1c8b4c66a938ec22ac3bf1cf597a602d6bd12a58cd3600512849d5a7fae14989"}
Mar 13 14:01:16 crc kubenswrapper[4898]: I0313 14:01:16.631672    4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-db7fb4ff9-qlmhf"
Mar 13 14:01:16 crc kubenswrapper[4898]: I0313 14:01:16.639213    4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-76556fcf77-lhw2w"
Mar 13 14:01:16 crc kubenswrapper[4898]: I0313 14:01:16.640683    4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-db7fb4ff9-qlmhf"
Mar 13 14:01:16 crc kubenswrapper[4898]: I0313 14:01:16.655553    4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-76556fcf77-lhw2w" podStartSLOduration=3.655529929 podStartE2EDuration="3.655529929s" podCreationTimestamp="2026-03-13 14:01:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:01:16.652672141 +0000 UTC m=+311.654260390" watchObservedRunningTime="2026-03-13 14:01:16.655529929 +0000 UTC m=+311.657118178"
Mar 13 14:01:16 crc kubenswrapper[4898]: I0313 14:01:16.668525    4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ppq6v"
Mar 13 14:01:16 crc kubenswrapper[4898]: I0313 14:01:16.704093    4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-db7fb4ff9-qlmhf" podStartSLOduration=4.704073357 podStartE2EDuration="4.704073357s" podCreationTimestamp="2026-03-13 14:01:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:01:16.703292698 +0000 UTC m=+311.704880957" watchObservedRunningTime="2026-03-13 14:01:16.704073357 +0000 UTC m=+311.705661616"
Mar 13 14:01:17 crc kubenswrapper[4898]: I0313 14:01:17.863293    4898 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Mar 13 14:01:17 crc kubenswrapper[4898]: I0313 14:01:17.864336    4898 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Mar 13 14:01:17 crc kubenswrapper[4898]: I0313 14:01:17.864474    4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 13 14:01:17 crc kubenswrapper[4898]: I0313 14:01:17.864581    4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140" gracePeriod=15
Mar 13 14:01:17 crc kubenswrapper[4898]: I0313 14:01:17.864691    4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364" gracePeriod=15
Mar 13 14:01:17 crc kubenswrapper[4898]: I0313 14:01:17.864700    4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://58657703efd5d66740cb17c4bde23f3d5377f197fe751ded7d1b0674b2aac5a9" gracePeriod=15
Mar 13 14:01:17 crc kubenswrapper[4898]: I0313 14:01:17.864675    4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026" gracePeriod=15
Mar 13 14:01:17 crc kubenswrapper[4898]: I0313 14:01:17.864724    4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3" gracePeriod=15
Mar 13 14:01:17 crc kubenswrapper[4898]: I0313 14:01:17.865727    4898 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Mar 13 14:01:17 crc kubenswrapper[4898]: E0313 14:01:17.865878    4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Mar 13 14:01:17 crc kubenswrapper[4898]: I0313 14:01:17.865891    4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Mar 13 14:01:17 crc kubenswrapper[4898]: E0313 14:01:17.865941    4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 13 14:01:17 crc kubenswrapper[4898]: I0313 14:01:17.865974    4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 13 14:01:17 crc kubenswrapper[4898]: E0313 14:01:17.865986    4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup"
Mar 13 14:01:17 crc kubenswrapper[4898]: I0313 14:01:17.865994    4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup"
Mar 13 14:01:17 crc kubenswrapper[4898]: E0313 14:01:17.866004    4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 13 14:01:17 crc kubenswrapper[4898]: I0313 14:01:17.867729    4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 13 14:01:17 crc kubenswrapper[4898]: E0313 14:01:17.867789    4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Mar 13 14:01:17 crc kubenswrapper[4898]: I0313 14:01:17.867797    4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Mar 13 14:01:17 crc kubenswrapper[4898]: E0313 14:01:17.867820    4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Mar 13 14:01:17 crc kubenswrapper[4898]: I0313 14:01:17.867827    4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Mar 13 14:01:17 crc kubenswrapper[4898]: E0313 14:01:17.867838    4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Mar 13 14:01:17 crc kubenswrapper[4898]: I0313 14:01:17.867845    4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Mar 13 14:01:17 crc kubenswrapper[4898]: E0313 14:01:17.867853    4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 13 14:01:17 crc kubenswrapper[4898]: I0313 14:01:17.867859    4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 13 14:01:17 crc kubenswrapper[4898]: I0313 14:01:17.867972    4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Mar 13 14:01:17 crc kubenswrapper[4898]: I0313 14:01:17.867984    4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Mar 13 14:01:17 crc kubenswrapper[4898]: I0313 14:01:17.867994    4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Mar 13 14:01:17 crc kubenswrapper[4898]: I0313 14:01:17.868001    4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 13 14:01:17 crc kubenswrapper[4898]: I0313 14:01:17.868007    4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 13 14:01:17 crc kubenswrapper[4898]: I0313 14:01:17.868015    4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 13 14:01:17 crc kubenswrapper[4898]: I0313 14:01:17.868022    4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Mar 13 14:01:17 crc kubenswrapper[4898]: E0313 14:01:17.868110    4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 13 14:01:17 crc kubenswrapper[4898]: I0313 14:01:17.868120    4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 13 14:01:17 crc kubenswrapper[4898]: E0313 14:01:17.868133    4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 13 14:01:17 crc kubenswrapper[4898]: I0313 14:01:17.868143    4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 13 14:01:17 crc kubenswrapper[4898]: I0313 14:01:17.868224    4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 13 14:01:17 crc kubenswrapper[4898]: I0313 14:01:17.868235    4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 13 14:01:17 crc kubenswrapper[4898]: I0313 14:01:17.891552    4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ppq6v"]
Mar 13 14:01:17 crc kubenswrapper[4898]: I0313 14:01:17.892016    4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ppq6v" podUID="a990881e-0caf-4096-a372-4cdad69006c1" containerName="registry-server" containerID="cri-o://34f99ef9389fe4ac9ebfa6a1f58e54f8c62e5652260b00330822919d355fffdc" gracePeriod=2
Mar 13 14:01:17 crc kubenswrapper[4898]: I0313 14:01:17.898991    4898 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13"
Mar 13 14:01:17 crc kubenswrapper[4898]: I0313 14:01:17.924320    4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 13 14:01:17 crc kubenswrapper[4898]: I0313 14:01:17.924397    4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 13 14:01:17 crc kubenswrapper[4898]: I0313 14:01:17.924418    4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 13 14:01:17 crc kubenswrapper[4898]: I0313 14:01:17.924476    4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 13 14:01:17 crc kubenswrapper[4898]: I0313 14:01:17.924526    4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 13 14:01:17 crc kubenswrapper[4898]: I0313 14:01:17.924552    4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 13 14:01:17 crc kubenswrapper[4898]: I0313 14:01:17.924587    4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 13 14:01:17 crc kubenswrapper[4898]: I0313 14:01:17.924620    4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.026426    4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.026471    4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.026489    4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.026503    4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.026510    4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.026542    4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.026522    4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.026566    4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.026597    4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.026622    4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.026647    4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.026656    4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.026710    4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.026665    4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.026699    4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.026711    4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.320865    4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ppq6v"
Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.434738    4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a990881e-0caf-4096-a372-4cdad69006c1-utilities\") pod \"a990881e-0caf-4096-a372-4cdad69006c1\" (UID: \"a990881e-0caf-4096-a372-4cdad69006c1\") "
Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.435058    4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7vmd\" (UniqueName: \"kubernetes.io/projected/a990881e-0caf-4096-a372-4cdad69006c1-kube-api-access-x7vmd\") pod \"a990881e-0caf-4096-a372-4cdad69006c1\" (UID: \"a990881e-0caf-4096-a372-4cdad69006c1\") "
Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.435134    4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a990881e-0caf-4096-a372-4cdad69006c1-catalog-content\") pod \"a990881e-0caf-4096-a372-4cdad69006c1\" (UID: \"a990881e-0caf-4096-a372-4cdad69006c1\") "
Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.435570    4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a990881e-0caf-4096-a372-4cdad69006c1-utilities" (OuterVolumeSpecName: "utilities") pod "a990881e-0caf-4096-a372-4cdad69006c1" (UID: "a990881e-0caf-4096-a372-4cdad69006c1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.445769    4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a990881e-0caf-4096-a372-4cdad69006c1-kube-api-access-x7vmd" (OuterVolumeSpecName: "kube-api-access-x7vmd") pod "a990881e-0caf-4096-a372-4cdad69006c1" (UID: "a990881e-0caf-4096-a372-4cdad69006c1"). InnerVolumeSpecName "kube-api-access-x7vmd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.497429    4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a990881e-0caf-4096-a372-4cdad69006c1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a990881e-0caf-4096-a372-4cdad69006c1" (UID: "a990881e-0caf-4096-a372-4cdad69006c1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.537041    4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a990881e-0caf-4096-a372-4cdad69006c1-utilities\") on node \"crc\" DevicePath \"\""
Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.537286    4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7vmd\" (UniqueName: \"kubernetes.io/projected/a990881e-0caf-4096-a372-4cdad69006c1-kube-api-access-x7vmd\") on node \"crc\" DevicePath \"\""
Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.537376    4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a990881e-0caf-4096-a372-4cdad69006c1-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.644051    4898 generic.go:334] "Generic (PLEG): container finished" podID="77480be5-9488-434e-8105-0fc9237cae46" containerID="e6def40c01eb99d1e2a262735e66a81eae49e5b4d7cd72298170f434700dfefb" exitCode=0
Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.644168    4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"77480be5-9488-434e-8105-0fc9237cae46","Type":"ContainerDied","Data":"e6def40c01eb99d1e2a262735e66a81eae49e5b4d7cd72298170f434700dfefb"}
Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.647168    4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log"
Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.648803    4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.649822    4898 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="58657703efd5d66740cb17c4bde23f3d5377f197fe751ded7d1b0674b2aac5a9" exitCode=0
Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.649926    4898 scope.go:117] "RemoveContainer" containerID="a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3"
Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.649945    4898 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026" exitCode=0
Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.650113    4898 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364" exitCode=0
Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.650217    4898 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3" exitCode=2
Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.653631    4898 generic.go:334] "Generic (PLEG): container finished" podID="a990881e-0caf-4096-a372-4cdad69006c1" containerID="34f99ef9389fe4ac9ebfa6a1f58e54f8c62e5652260b00330822919d355fffdc" exitCode=0
Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.653733    4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ppq6v" event={"ID":"a990881e-0caf-4096-a372-4cdad69006c1","Type":"ContainerDied","Data":"34f99ef9389fe4ac9ebfa6a1f58e54f8c62e5652260b00330822919d355fffdc"}
Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.653809    4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ppq6v" event={"ID":"a990881e-0caf-4096-a372-4cdad69006c1","Type":"ContainerDied","Data":"3acac09dab7fc6e01d8b6bf7a368fc3881544da372e2f3a95826c1fc007510c2"}
Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.653865    4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ppq6v"
Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.705486    4898 scope.go:117] "RemoveContainer" containerID="34f99ef9389fe4ac9ebfa6a1f58e54f8c62e5652260b00330822919d355fffdc"
Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.727770    4898 scope.go:117] "RemoveContainer" containerID="4c76c67750c09f06873437edbd5c079a177fa11ace696be58cb2d354d275db9e"
Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.745674    4898 scope.go:117] "RemoveContainer" containerID="3f6d76254e697191c7e800bb760967dc1adfad9f4667e33eaf85c00c3d7a9263"
Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.766181    4898 scope.go:117] "RemoveContainer" containerID="34f99ef9389fe4ac9ebfa6a1f58e54f8c62e5652260b00330822919d355fffdc"
Mar 13 14:01:18 crc kubenswrapper[4898]: E0313 14:01:18.766828    4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34f99ef9389fe4ac9ebfa6a1f58e54f8c62e5652260b00330822919d355fffdc\": container with ID starting with 34f99ef9389fe4ac9ebfa6a1f58e54f8c62e5652260b00330822919d355fffdc not found: ID does not exist" containerID="34f99ef9389fe4ac9ebfa6a1f58e54f8c62e5652260b00330822919d355fffdc"
Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.766876    4898 pod_container_deletor.go:53] "DeleteContainer returned error"
containerID={"Type":"cri-o","ID":"34f99ef9389fe4ac9ebfa6a1f58e54f8c62e5652260b00330822919d355fffdc"} err="failed to get container status \"34f99ef9389fe4ac9ebfa6a1f58e54f8c62e5652260b00330822919d355fffdc\": rpc error: code = NotFound desc = could not find container \"34f99ef9389fe4ac9ebfa6a1f58e54f8c62e5652260b00330822919d355fffdc\": container with ID starting with 34f99ef9389fe4ac9ebfa6a1f58e54f8c62e5652260b00330822919d355fffdc not found: ID does not exist" Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.766964 4898 scope.go:117] "RemoveContainer" containerID="4c76c67750c09f06873437edbd5c079a177fa11ace696be58cb2d354d275db9e" Mar 13 14:01:18 crc kubenswrapper[4898]: E0313 14:01:18.767423 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c76c67750c09f06873437edbd5c079a177fa11ace696be58cb2d354d275db9e\": container with ID starting with 4c76c67750c09f06873437edbd5c079a177fa11ace696be58cb2d354d275db9e not found: ID does not exist" containerID="4c76c67750c09f06873437edbd5c079a177fa11ace696be58cb2d354d275db9e" Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.767506 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c76c67750c09f06873437edbd5c079a177fa11ace696be58cb2d354d275db9e"} err="failed to get container status \"4c76c67750c09f06873437edbd5c079a177fa11ace696be58cb2d354d275db9e\": rpc error: code = NotFound desc = could not find container \"4c76c67750c09f06873437edbd5c079a177fa11ace696be58cb2d354d275db9e\": container with ID starting with 4c76c67750c09f06873437edbd5c079a177fa11ace696be58cb2d354d275db9e not found: ID does not exist" Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.767555 4898 scope.go:117] "RemoveContainer" containerID="3f6d76254e697191c7e800bb760967dc1adfad9f4667e33eaf85c00c3d7a9263" Mar 13 14:01:18 crc kubenswrapper[4898]: E0313 14:01:18.768018 4898 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"3f6d76254e697191c7e800bb760967dc1adfad9f4667e33eaf85c00c3d7a9263\": container with ID starting with 3f6d76254e697191c7e800bb760967dc1adfad9f4667e33eaf85c00c3d7a9263 not found: ID does not exist" containerID="3f6d76254e697191c7e800bb760967dc1adfad9f4667e33eaf85c00c3d7a9263" Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.768052 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f6d76254e697191c7e800bb760967dc1adfad9f4667e33eaf85c00c3d7a9263"} err="failed to get container status \"3f6d76254e697191c7e800bb760967dc1adfad9f4667e33eaf85c00c3d7a9263\": rpc error: code = NotFound desc = could not find container \"3f6d76254e697191c7e800bb760967dc1adfad9f4667e33eaf85c00c3d7a9263\": container with ID starting with 3f6d76254e697191c7e800bb760967dc1adfad9f4667e33eaf85c00c3d7a9263 not found: ID does not exist" Mar 13 14:01:19 crc kubenswrapper[4898]: I0313 14:01:19.451882 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-974qp" Mar 13 14:01:19 crc kubenswrapper[4898]: I0313 14:01:19.505274 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-974qp" Mar 13 14:01:19 crc kubenswrapper[4898]: I0313 14:01:19.663846 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 13 14:01:19 crc kubenswrapper[4898]: I0313 14:01:19.860040 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-btkxt" Mar 13 14:01:19 crc kubenswrapper[4898]: I0313 14:01:19.903036 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-btkxt" Mar 13 14:01:19 crc kubenswrapper[4898]: I0313 14:01:19.983343 4898 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.057584 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/77480be5-9488-434e-8105-0fc9237cae46-kubelet-dir\") pod \"77480be5-9488-434e-8105-0fc9237cae46\" (UID: \"77480be5-9488-434e-8105-0fc9237cae46\") " Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.057703 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/77480be5-9488-434e-8105-0fc9237cae46-var-lock\") pod \"77480be5-9488-434e-8105-0fc9237cae46\" (UID: \"77480be5-9488-434e-8105-0fc9237cae46\") " Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.057714 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/77480be5-9488-434e-8105-0fc9237cae46-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "77480be5-9488-434e-8105-0fc9237cae46" (UID: "77480be5-9488-434e-8105-0fc9237cae46"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.057738 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/77480be5-9488-434e-8105-0fc9237cae46-kube-api-access\") pod \"77480be5-9488-434e-8105-0fc9237cae46\" (UID: \"77480be5-9488-434e-8105-0fc9237cae46\") " Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.057831 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/77480be5-9488-434e-8105-0fc9237cae46-var-lock" (OuterVolumeSpecName: "var-lock") pod "77480be5-9488-434e-8105-0fc9237cae46" (UID: "77480be5-9488-434e-8105-0fc9237cae46"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.058142 4898 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/77480be5-9488-434e-8105-0fc9237cae46-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.058165 4898 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/77480be5-9488-434e-8105-0fc9237cae46-var-lock\") on node \"crc\" DevicePath \"\"" Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.082209 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77480be5-9488-434e-8105-0fc9237cae46-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "77480be5-9488-434e-8105-0fc9237cae46" (UID: "77480be5-9488-434e-8105-0fc9237cae46"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.158718 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/77480be5-9488-434e-8105-0fc9237cae46-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.236124 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.237087 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.360963 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.361003 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.361034 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.361116 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.361169 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.361176 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.361330 4898 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.361348 4898 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.361357 4898 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.673684 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.674326 4898 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140" exitCode=0 Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.674412 4898 scope.go:117] "RemoveContainer" containerID="58657703efd5d66740cb17c4bde23f3d5377f197fe751ded7d1b0674b2aac5a9" Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.674510 4898 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.676519 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.676862 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"77480be5-9488-434e-8105-0fc9237cae46","Type":"ContainerDied","Data":"dd7ab09c8cacba65b6094ca53d6f8610851fabd3f291b2f1e2fc1acdbaf4b50f"} Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.676961 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd7ab09c8cacba65b6094ca53d6f8610851fabd3f291b2f1e2fc1acdbaf4b50f" Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.692643 4898 scope.go:117] "RemoveContainer" containerID="c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026" Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.709719 4898 scope.go:117] "RemoveContainer" containerID="48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364" Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.726195 4898 scope.go:117] "RemoveContainer" containerID="c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3" Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.738296 4898 scope.go:117] "RemoveContainer" containerID="b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140" Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.751977 4898 scope.go:117] "RemoveContainer" containerID="5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3" Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.772796 4898 scope.go:117] "RemoveContainer" containerID="58657703efd5d66740cb17c4bde23f3d5377f197fe751ded7d1b0674b2aac5a9" Mar 13 14:01:20 crc kubenswrapper[4898]: E0313 14:01:20.773427 4898 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58657703efd5d66740cb17c4bde23f3d5377f197fe751ded7d1b0674b2aac5a9\": container with ID starting with 58657703efd5d66740cb17c4bde23f3d5377f197fe751ded7d1b0674b2aac5a9 not found: ID does not exist" containerID="58657703efd5d66740cb17c4bde23f3d5377f197fe751ded7d1b0674b2aac5a9" Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.773514 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58657703efd5d66740cb17c4bde23f3d5377f197fe751ded7d1b0674b2aac5a9"} err="failed to get container status \"58657703efd5d66740cb17c4bde23f3d5377f197fe751ded7d1b0674b2aac5a9\": rpc error: code = NotFound desc = could not find container \"58657703efd5d66740cb17c4bde23f3d5377f197fe751ded7d1b0674b2aac5a9\": container with ID starting with 58657703efd5d66740cb17c4bde23f3d5377f197fe751ded7d1b0674b2aac5a9 not found: ID does not exist" Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.773547 4898 scope.go:117] "RemoveContainer" containerID="c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026" Mar 13 14:01:20 crc kubenswrapper[4898]: E0313 14:01:20.773981 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\": container with ID starting with c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026 not found: ID does not exist" containerID="c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026" Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.774012 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026"} err="failed to get container status \"c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\": rpc error: code = NotFound desc = could 
not find container \"c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\": container with ID starting with c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026 not found: ID does not exist" Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.774040 4898 scope.go:117] "RemoveContainer" containerID="48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364" Mar 13 14:01:20 crc kubenswrapper[4898]: E0313 14:01:20.774366 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\": container with ID starting with 48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364 not found: ID does not exist" containerID="48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364" Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.774394 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364"} err="failed to get container status \"48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\": rpc error: code = NotFound desc = could not find container \"48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\": container with ID starting with 48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364 not found: ID does not exist" Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.774411 4898 scope.go:117] "RemoveContainer" containerID="c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3" Mar 13 14:01:20 crc kubenswrapper[4898]: E0313 14:01:20.774709 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\": container with ID starting with c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3 not found: 
ID does not exist" containerID="c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3" Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.774741 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3"} err="failed to get container status \"c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\": rpc error: code = NotFound desc = could not find container \"c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\": container with ID starting with c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3 not found: ID does not exist" Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.774759 4898 scope.go:117] "RemoveContainer" containerID="b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140" Mar 13 14:01:20 crc kubenswrapper[4898]: E0313 14:01:20.775120 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\": container with ID starting with b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140 not found: ID does not exist" containerID="b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140" Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.775146 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140"} err="failed to get container status \"b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\": rpc error: code = NotFound desc = could not find container \"b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\": container with ID starting with b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140 not found: ID does not exist" Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.775163 4898 
scope.go:117] "RemoveContainer" containerID="5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3" Mar 13 14:01:20 crc kubenswrapper[4898]: E0313 14:01:20.775453 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\": container with ID starting with 5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3 not found: ID does not exist" containerID="5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3" Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.775481 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3"} err="failed to get container status \"5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\": rpc error: code = NotFound desc = could not find container \"5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\": container with ID starting with 5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3 not found: ID does not exist" Mar 13 14:01:21 crc kubenswrapper[4898]: I0313 14:01:21.752411 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 13 14:01:22 crc kubenswrapper[4898]: E0313 14:01:22.743922 4898 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.201:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-6n228" volumeName="registry-storage" Mar 13 14:01:22 crc kubenswrapper[4898]: E0313 
14:01:22.902569 4898 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.201:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 14:01:22 crc kubenswrapper[4898]: I0313 14:01:22.902467 4898 status_manager.go:851] "Failed to get status for pod" podUID="a990881e-0caf-4096-a372-4cdad69006c1" pod="openshift-marketplace/community-operators-ppq6v" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-ppq6v\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:22 crc kubenswrapper[4898]: I0313 14:01:22.903142 4898 status_manager.go:851] "Failed to get status for pod" podUID="77480be5-9488-434e-8105-0fc9237cae46" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:22 crc kubenswrapper[4898]: I0313 14:01:22.903354 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 14:01:22 crc kubenswrapper[4898]: I0313 14:01:22.903352 4898 status_manager.go:851] "Failed to get status for pod" podUID="a990881e-0caf-4096-a372-4cdad69006c1" pod="openshift-marketplace/community-operators-ppq6v" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-ppq6v\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:22 crc kubenswrapper[4898]: I0313 14:01:22.903671 4898 status_manager.go:851] "Failed to get status for pod" podUID="7794a943-5fec-485e-86bf-f104ed6ae070" pod="openshift-marketplace/redhat-operators-btkxt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-btkxt\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:22 crc kubenswrapper[4898]: I0313 14:01:22.903878 4898 status_manager.go:851] "Failed to get status for pod" podUID="183d86e9-cd5c-45ed-a460-bb6169e07c72" pod="openshift-marketplace/redhat-operators-974qp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-974qp\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:22 crc kubenswrapper[4898]: E0313 14:01:22.936572 4898 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.201:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189c6b7034563f31 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 14:01:22.935709489 +0000 UTC m=+317.937297728,LastTimestamp:2026-03-13 14:01:22.935709489 +0000 UTC m=+317.937297728,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 14:01:23 crc kubenswrapper[4898]: I0313 14:01:23.698837 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"ec7277e9dfa9beb5985dfe9d6e8c02eb3980ab021d8a008b49c5b8550fbbbb3d"} Mar 13 14:01:23 crc kubenswrapper[4898]: I0313 14:01:23.698893 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"b1b09d1dcd3f368eefe1887da4bd6eca3d4544ca16ac1188de99a9ea197675f2"} Mar 13 14:01:23 crc kubenswrapper[4898]: E0313 14:01:23.699566 4898 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.201:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 14:01:23 crc kubenswrapper[4898]: I0313 14:01:23.699555 4898 status_manager.go:851] "Failed to get status for pod" podUID="7794a943-5fec-485e-86bf-f104ed6ae070" pod="openshift-marketplace/redhat-operators-btkxt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-btkxt\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:23 crc kubenswrapper[4898]: I0313 14:01:23.700082 4898 status_manager.go:851] "Failed to 
get status for pod" podUID="183d86e9-cd5c-45ed-a460-bb6169e07c72" pod="openshift-marketplace/redhat-operators-974qp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-974qp\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:23 crc kubenswrapper[4898]: I0313 14:01:23.700408 4898 status_manager.go:851] "Failed to get status for pod" podUID="77480be5-9488-434e-8105-0fc9237cae46" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:23 crc kubenswrapper[4898]: I0313 14:01:23.700816 4898 status_manager.go:851] "Failed to get status for pod" podUID="a990881e-0caf-4096-a372-4cdad69006c1" pod="openshift-marketplace/community-operators-ppq6v" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-ppq6v\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:23 crc kubenswrapper[4898]: I0313 14:01:23.846766 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" podUID="b26a4d77-f170-467e-ad96-4741cc5a8f23" containerName="oauth-openshift" containerID="cri-o://e2d4415c83b3a1cbbccfa0bc4f10a48d58d464da90f0fa04517b4115ecd57886" gracePeriod=15 Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.412374 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.413017 4898 status_manager.go:851] "Failed to get status for pod" podUID="a990881e-0caf-4096-a372-4cdad69006c1" pod="openshift-marketplace/community-operators-ppq6v" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-ppq6v\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.413626 4898 status_manager.go:851] "Failed to get status for pod" podUID="7794a943-5fec-485e-86bf-f104ed6ae070" pod="openshift-marketplace/redhat-operators-btkxt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-btkxt\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.414381 4898 status_manager.go:851] "Failed to get status for pod" podUID="183d86e9-cd5c-45ed-a460-bb6169e07c72" pod="openshift-marketplace/redhat-operators-974qp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-974qp\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.414820 4898 status_manager.go:851] "Failed to get status for pod" podUID="b26a4d77-f170-467e-ad96-4741cc5a8f23" pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-djn5q\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.415192 4898 status_manager.go:851] "Failed to get status for pod" podUID="77480be5-9488-434e-8105-0fc9237cae46" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.557355 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-cliconfig\") pod \"b26a4d77-f170-467e-ad96-4741cc5a8f23\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.557422 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-user-template-provider-selection\") pod \"b26a4d77-f170-467e-ad96-4741cc5a8f23\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.557445 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b26a4d77-f170-467e-ad96-4741cc5a8f23-audit-policies\") pod \"b26a4d77-f170-467e-ad96-4741cc5a8f23\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.557488 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-router-certs\") pod \"b26a4d77-f170-467e-ad96-4741cc5a8f23\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.557534 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-session\") pod 
\"b26a4d77-f170-467e-ad96-4741cc5a8f23\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.557571 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-serving-cert\") pod \"b26a4d77-f170-467e-ad96-4741cc5a8f23\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.557610 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzgzk\" (UniqueName: \"kubernetes.io/projected/b26a4d77-f170-467e-ad96-4741cc5a8f23-kube-api-access-vzgzk\") pod \"b26a4d77-f170-467e-ad96-4741cc5a8f23\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.557642 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-user-template-error\") pod \"b26a4d77-f170-467e-ad96-4741cc5a8f23\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.557669 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-user-idp-0-file-data\") pod \"b26a4d77-f170-467e-ad96-4741cc5a8f23\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.557698 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-user-template-login\") pod \"b26a4d77-f170-467e-ad96-4741cc5a8f23\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") 
" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.557747 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-service-ca\") pod \"b26a4d77-f170-467e-ad96-4741cc5a8f23\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.557810 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-trusted-ca-bundle\") pod \"b26a4d77-f170-467e-ad96-4741cc5a8f23\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.557845 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-ocp-branding-template\") pod \"b26a4d77-f170-467e-ad96-4741cc5a8f23\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.557881 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b26a4d77-f170-467e-ad96-4741cc5a8f23-audit-dir\") pod \"b26a4d77-f170-467e-ad96-4741cc5a8f23\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.558232 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b26a4d77-f170-467e-ad96-4741cc5a8f23-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "b26a4d77-f170-467e-ad96-4741cc5a8f23" (UID: "b26a4d77-f170-467e-ad96-4741cc5a8f23"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.558430 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b26a4d77-f170-467e-ad96-4741cc5a8f23-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "b26a4d77-f170-467e-ad96-4741cc5a8f23" (UID: "b26a4d77-f170-467e-ad96-4741cc5a8f23"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.558638 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "b26a4d77-f170-467e-ad96-4741cc5a8f23" (UID: "b26a4d77-f170-467e-ad96-4741cc5a8f23"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.559333 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "b26a4d77-f170-467e-ad96-4741cc5a8f23" (UID: "b26a4d77-f170-467e-ad96-4741cc5a8f23"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.559432 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "b26a4d77-f170-467e-ad96-4741cc5a8f23" (UID: "b26a4d77-f170-467e-ad96-4741cc5a8f23"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.565527 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "b26a4d77-f170-467e-ad96-4741cc5a8f23" (UID: "b26a4d77-f170-467e-ad96-4741cc5a8f23"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.566178 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "b26a4d77-f170-467e-ad96-4741cc5a8f23" (UID: "b26a4d77-f170-467e-ad96-4741cc5a8f23"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.566159 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "b26a4d77-f170-467e-ad96-4741cc5a8f23" (UID: "b26a4d77-f170-467e-ad96-4741cc5a8f23"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.566737 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "b26a4d77-f170-467e-ad96-4741cc5a8f23" (UID: "b26a4d77-f170-467e-ad96-4741cc5a8f23"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.567126 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "b26a4d77-f170-467e-ad96-4741cc5a8f23" (UID: "b26a4d77-f170-467e-ad96-4741cc5a8f23"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.567234 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b26a4d77-f170-467e-ad96-4741cc5a8f23-kube-api-access-vzgzk" (OuterVolumeSpecName: "kube-api-access-vzgzk") pod "b26a4d77-f170-467e-ad96-4741cc5a8f23" (UID: "b26a4d77-f170-467e-ad96-4741cc5a8f23"). InnerVolumeSpecName "kube-api-access-vzgzk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.567533 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "b26a4d77-f170-467e-ad96-4741cc5a8f23" (UID: "b26a4d77-f170-467e-ad96-4741cc5a8f23"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.567738 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "b26a4d77-f170-467e-ad96-4741cc5a8f23" (UID: "b26a4d77-f170-467e-ad96-4741cc5a8f23"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.568244 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "b26a4d77-f170-467e-ad96-4741cc5a8f23" (UID: "b26a4d77-f170-467e-ad96-4741cc5a8f23"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.659834 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vzgzk\" (UniqueName: \"kubernetes.io/projected/b26a4d77-f170-467e-ad96-4741cc5a8f23-kube-api-access-vzgzk\") on node \"crc\" DevicePath \"\"" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.659921 4898 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.659939 4898 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.659953 4898 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.659968 4898 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.659982 4898 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.659997 4898 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.660012 4898 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b26a4d77-f170-467e-ad96-4741cc5a8f23-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.660028 4898 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.660042 4898 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.660056 4898 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b26a4d77-f170-467e-ad96-4741cc5a8f23-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.660067 4898 reconciler_common.go:293] 
"Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.660080 4898 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.660093 4898 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.708235 4898 generic.go:334] "Generic (PLEG): container finished" podID="b26a4d77-f170-467e-ad96-4741cc5a8f23" containerID="e2d4415c83b3a1cbbccfa0bc4f10a48d58d464da90f0fa04517b4115ecd57886" exitCode=0 Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.708297 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" event={"ID":"b26a4d77-f170-467e-ad96-4741cc5a8f23","Type":"ContainerDied","Data":"e2d4415c83b3a1cbbccfa0bc4f10a48d58d464da90f0fa04517b4115ecd57886"} Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.708335 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" event={"ID":"b26a4d77-f170-467e-ad96-4741cc5a8f23","Type":"ContainerDied","Data":"2d5714977afe363a0af3e9631742fe59289a173d68823730a2b889e9e03736c1"} Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.708357 4898 scope.go:117] "RemoveContainer" containerID="e2d4415c83b3a1cbbccfa0bc4f10a48d58d464da90f0fa04517b4115ecd57886" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.708482 4898 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.710229 4898 status_manager.go:851] "Failed to get status for pod" podUID="77480be5-9488-434e-8105-0fc9237cae46" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.711567 4898 status_manager.go:851] "Failed to get status for pod" podUID="a990881e-0caf-4096-a372-4cdad69006c1" pod="openshift-marketplace/community-operators-ppq6v" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-ppq6v\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.711815 4898 status_manager.go:851] "Failed to get status for pod" podUID="7794a943-5fec-485e-86bf-f104ed6ae070" pod="openshift-marketplace/redhat-operators-btkxt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-btkxt\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.712059 4898 status_manager.go:851] "Failed to get status for pod" podUID="183d86e9-cd5c-45ed-a460-bb6169e07c72" pod="openshift-marketplace/redhat-operators-974qp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-974qp\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.712341 4898 status_manager.go:851] "Failed to get status for pod" podUID="b26a4d77-f170-467e-ad96-4741cc5a8f23" pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-djn5q\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.725862 4898 status_manager.go:851] "Failed to get status for pod" podUID="a990881e-0caf-4096-a372-4cdad69006c1" pod="openshift-marketplace/community-operators-ppq6v" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-ppq6v\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.727448 4898 status_manager.go:851] "Failed to get status for pod" podUID="7794a943-5fec-485e-86bf-f104ed6ae070" pod="openshift-marketplace/redhat-operators-btkxt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-btkxt\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.728004 4898 status_manager.go:851] "Failed to get status for pod" podUID="183d86e9-cd5c-45ed-a460-bb6169e07c72" pod="openshift-marketplace/redhat-operators-974qp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-974qp\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.728235 4898 status_manager.go:851] "Failed to get status for pod" podUID="b26a4d77-f170-467e-ad96-4741cc5a8f23" pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-djn5q\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.728452 4898 status_manager.go:851] "Failed to get status for pod" podUID="77480be5-9488-434e-8105-0fc9237cae46" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.743933 4898 scope.go:117] "RemoveContainer" containerID="e2d4415c83b3a1cbbccfa0bc4f10a48d58d464da90f0fa04517b4115ecd57886" Mar 13 14:01:24 crc kubenswrapper[4898]: E0313 14:01:24.745185 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2d4415c83b3a1cbbccfa0bc4f10a48d58d464da90f0fa04517b4115ecd57886\": container with ID starting with e2d4415c83b3a1cbbccfa0bc4f10a48d58d464da90f0fa04517b4115ecd57886 not found: ID does not exist" containerID="e2d4415c83b3a1cbbccfa0bc4f10a48d58d464da90f0fa04517b4115ecd57886" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.745247 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2d4415c83b3a1cbbccfa0bc4f10a48d58d464da90f0fa04517b4115ecd57886"} err="failed to get container status \"e2d4415c83b3a1cbbccfa0bc4f10a48d58d464da90f0fa04517b4115ecd57886\": rpc error: code = NotFound desc = could not find container \"e2d4415c83b3a1cbbccfa0bc4f10a48d58d464da90f0fa04517b4115ecd57886\": container with ID starting with e2d4415c83b3a1cbbccfa0bc4f10a48d58d464da90f0fa04517b4115ecd57886 not found: ID does not exist" Mar 13 14:01:25 crc kubenswrapper[4898]: I0313 14:01:25.743183 4898 status_manager.go:851] "Failed to get status for pod" podUID="7794a943-5fec-485e-86bf-f104ed6ae070" pod="openshift-marketplace/redhat-operators-btkxt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-btkxt\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:25 crc kubenswrapper[4898]: I0313 14:01:25.744327 4898 status_manager.go:851] "Failed to get status for pod" podUID="183d86e9-cd5c-45ed-a460-bb6169e07c72" 
pod="openshift-marketplace/redhat-operators-974qp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-974qp\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:25 crc kubenswrapper[4898]: I0313 14:01:25.745026 4898 status_manager.go:851] "Failed to get status for pod" podUID="b26a4d77-f170-467e-ad96-4741cc5a8f23" pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-djn5q\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:25 crc kubenswrapper[4898]: I0313 14:01:25.745400 4898 status_manager.go:851] "Failed to get status for pod" podUID="77480be5-9488-434e-8105-0fc9237cae46" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:25 crc kubenswrapper[4898]: I0313 14:01:25.745703 4898 status_manager.go:851] "Failed to get status for pod" podUID="a990881e-0caf-4096-a372-4cdad69006c1" pod="openshift-marketplace/community-operators-ppq6v" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-ppq6v\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:27 crc kubenswrapper[4898]: E0313 14:01:27.629248 4898 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:27 crc kubenswrapper[4898]: E0313 14:01:27.629813 4898 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: 
connection refused" Mar 13 14:01:27 crc kubenswrapper[4898]: E0313 14:01:27.630556 4898 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:27 crc kubenswrapper[4898]: E0313 14:01:27.631083 4898 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:27 crc kubenswrapper[4898]: E0313 14:01:27.631529 4898 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:27 crc kubenswrapper[4898]: I0313 14:01:27.631579 4898 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 13 14:01:27 crc kubenswrapper[4898]: E0313 14:01:27.632046 4898 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused" interval="200ms" Mar 13 14:01:27 crc kubenswrapper[4898]: E0313 14:01:27.832845 4898 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused" interval="400ms" Mar 13 14:01:28 crc kubenswrapper[4898]: E0313 14:01:28.234344 4898 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused" interval="800ms" Mar 13 14:01:28 crc kubenswrapper[4898]: E0313 14:01:28.746519 4898 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.201:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189c6b7034563f31 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 14:01:22.935709489 +0000 UTC m=+317.937297728,LastTimestamp:2026-03-13 14:01:22.935709489 +0000 UTC m=+317.937297728,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 14:01:29 crc kubenswrapper[4898]: E0313 14:01:29.035610 4898 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused" interval="1.6s" Mar 13 14:01:29 crc kubenswrapper[4898]: E0313 14:01:29.335506 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T14:01:29Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T14:01:29Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T14:01:29Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T14:01:29Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:29 crc kubenswrapper[4898]: E0313 14:01:29.335833 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:29 crc kubenswrapper[4898]: E0313 14:01:29.336093 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:29 crc kubenswrapper[4898]: E0313 14:01:29.336524 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 
14:01:29 crc kubenswrapper[4898]: E0313 14:01:29.336773 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:29 crc kubenswrapper[4898]: E0313 14:01:29.336800 4898 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 14:01:29 crc kubenswrapper[4898]: I0313 14:01:29.739637 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 14:01:29 crc kubenswrapper[4898]: I0313 14:01:29.740693 4898 status_manager.go:851] "Failed to get status for pod" podUID="77480be5-9488-434e-8105-0fc9237cae46" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:29 crc kubenswrapper[4898]: I0313 14:01:29.741421 4898 status_manager.go:851] "Failed to get status for pod" podUID="a990881e-0caf-4096-a372-4cdad69006c1" pod="openshift-marketplace/community-operators-ppq6v" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-ppq6v\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:29 crc kubenswrapper[4898]: I0313 14:01:29.742342 4898 status_manager.go:851] "Failed to get status for pod" podUID="7794a943-5fec-485e-86bf-f104ed6ae070" pod="openshift-marketplace/redhat-operators-btkxt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-btkxt\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:29 crc kubenswrapper[4898]: I0313 14:01:29.742801 4898 status_manager.go:851] "Failed to get status for pod" podUID="183d86e9-cd5c-45ed-a460-bb6169e07c72" 
pod="openshift-marketplace/redhat-operators-974qp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-974qp\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:29 crc kubenswrapper[4898]: I0313 14:01:29.743194 4898 status_manager.go:851] "Failed to get status for pod" podUID="b26a4d77-f170-467e-ad96-4741cc5a8f23" pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-djn5q\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:29 crc kubenswrapper[4898]: I0313 14:01:29.760394 4898 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3dea03f2-846c-4fe2-91f3-67269c416048" Mar 13 14:01:29 crc kubenswrapper[4898]: I0313 14:01:29.760452 4898 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3dea03f2-846c-4fe2-91f3-67269c416048" Mar 13 14:01:29 crc kubenswrapper[4898]: E0313 14:01:29.761017 4898 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 14:01:29 crc kubenswrapper[4898]: I0313 14:01:29.762147 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 14:01:30 crc kubenswrapper[4898]: E0313 14:01:30.636943 4898 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused" interval="3.2s" Mar 13 14:01:30 crc kubenswrapper[4898]: I0313 14:01:30.761517 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"fb8d1075677d946e836a8100a5ff549ecdb66cbebccbe047025d2252ca4d1e96"} Mar 13 14:01:30 crc kubenswrapper[4898]: I0313 14:01:30.761593 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"ad7d9bc61eae3398db3cf7bee567629f2bad9723b89d2a1ef836f114247808c6"} Mar 13 14:01:31 crc kubenswrapper[4898]: I0313 14:01:31.770283 4898 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="fb8d1075677d946e836a8100a5ff549ecdb66cbebccbe047025d2252ca4d1e96" exitCode=0 Mar 13 14:01:31 crc kubenswrapper[4898]: I0313 14:01:31.770353 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"fb8d1075677d946e836a8100a5ff549ecdb66cbebccbe047025d2252ca4d1e96"} Mar 13 14:01:31 crc kubenswrapper[4898]: I0313 14:01:31.770533 4898 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3dea03f2-846c-4fe2-91f3-67269c416048" Mar 13 14:01:31 crc kubenswrapper[4898]: I0313 14:01:31.770555 4898 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="3dea03f2-846c-4fe2-91f3-67269c416048" Mar 13 14:01:31 crc kubenswrapper[4898]: E0313 14:01:31.770973 4898 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 14:01:31 crc kubenswrapper[4898]: I0313 14:01:31.771218 4898 status_manager.go:851] "Failed to get status for pod" podUID="77480be5-9488-434e-8105-0fc9237cae46" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:31 crc kubenswrapper[4898]: I0313 14:01:31.771609 4898 status_manager.go:851] "Failed to get status for pod" podUID="a990881e-0caf-4096-a372-4cdad69006c1" pod="openshift-marketplace/community-operators-ppq6v" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-ppq6v\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:31 crc kubenswrapper[4898]: I0313 14:01:31.771971 4898 status_manager.go:851] "Failed to get status for pod" podUID="7794a943-5fec-485e-86bf-f104ed6ae070" pod="openshift-marketplace/redhat-operators-btkxt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-btkxt\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:31 crc kubenswrapper[4898]: I0313 14:01:31.772309 4898 status_manager.go:851] "Failed to get status for pod" podUID="183d86e9-cd5c-45ed-a460-bb6169e07c72" pod="openshift-marketplace/redhat-operators-974qp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-974qp\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:31 crc kubenswrapper[4898]: 
I0313 14:01:31.772640 4898 status_manager.go:851] "Failed to get status for pod" podUID="b26a4d77-f170-467e-ad96-4741cc5a8f23" pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-djn5q\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:32 crc kubenswrapper[4898]: I0313 14:01:32.778637 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"05f1cfc7544c6b85c39b0f3a9bc14d3f94f3875fa7d3f313988de4b5619142b3"} Mar 13 14:01:32 crc kubenswrapper[4898]: I0313 14:01:32.778978 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"80ca95a7aa043adc2ed03e8f0aa22c3a939ff9e8c6eb6bf26e6933e81f9a0f76"} Mar 13 14:01:32 crc kubenswrapper[4898]: I0313 14:01:32.780798 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 13 14:01:32 crc kubenswrapper[4898]: I0313 14:01:32.782363 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 13 14:01:32 crc kubenswrapper[4898]: I0313 14:01:32.782450 4898 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="172c20ea4299e1b913b2675ac14f7ec3a4c80a0f6f2428af0d030aff64eb89ec" exitCode=1 Mar 13 14:01:32 crc kubenswrapper[4898]: I0313 14:01:32.782505 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"172c20ea4299e1b913b2675ac14f7ec3a4c80a0f6f2428af0d030aff64eb89ec"} Mar 13 14:01:32 crc kubenswrapper[4898]: I0313 14:01:32.783222 4898 scope.go:117] "RemoveContainer" containerID="172c20ea4299e1b913b2675ac14f7ec3a4c80a0f6f2428af0d030aff64eb89ec" Mar 13 14:01:33 crc kubenswrapper[4898]: I0313 14:01:33.175242 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 14:01:33 crc kubenswrapper[4898]: I0313 14:01:33.790716 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"3a20b7cd1a0d1dae57c24d0daf2f93df192226c3651a5dd5b49224cd8f650f2f"} Mar 13 14:01:33 crc kubenswrapper[4898]: I0313 14:01:33.791027 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"948d353e949566c89a7588e2afad191c300fd7166df23f30754dedf178e39312"} Mar 13 14:01:33 crc kubenswrapper[4898]: I0313 14:01:33.794765 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 13 14:01:33 crc kubenswrapper[4898]: I0313 14:01:33.797804 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 13 14:01:33 crc kubenswrapper[4898]: I0313 14:01:33.797883 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b65cd02a017de53599582fdc93495c1971ff933ecacdf6af0171bad6070bff66"} Mar 13 14:01:34 crc kubenswrapper[4898]: I0313 14:01:34.808269 4898 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3dea03f2-846c-4fe2-91f3-67269c416048" Mar 13 14:01:34 crc kubenswrapper[4898]: I0313 14:01:34.808293 4898 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3dea03f2-846c-4fe2-91f3-67269c416048" Mar 13 14:01:34 crc kubenswrapper[4898]: I0313 14:01:34.808457 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"9d6527d36e7b8f6344efc84197c8496d6674fb0cea7bd5e9fbfb3a398956bd7d"} Mar 13 14:01:34 crc kubenswrapper[4898]: I0313 14:01:34.808497 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 14:01:38 crc kubenswrapper[4898]: I0313 14:01:38.347553 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 14:01:39 crc kubenswrapper[4898]: I0313 14:01:39.231466 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 14:01:39 crc kubenswrapper[4898]: I0313 14:01:39.231687 4898 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 13 14:01:39 crc kubenswrapper[4898]: I0313 14:01:39.231744 4898 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Mar 13 14:01:39 crc kubenswrapper[4898]: I0313 14:01:39.762654 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 14:01:39 crc kubenswrapper[4898]: I0313 14:01:39.762938 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 14:01:39 crc kubenswrapper[4898]: I0313 14:01:39.778184 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 14:01:39 crc kubenswrapper[4898]: I0313 14:01:39.821650 4898 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 14:01:40 crc kubenswrapper[4898]: I0313 14:01:40.850616 4898 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3dea03f2-846c-4fe2-91f3-67269c416048" Mar 13 14:01:40 crc kubenswrapper[4898]: I0313 14:01:40.851470 4898 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3dea03f2-846c-4fe2-91f3-67269c416048" Mar 13 14:01:40 crc kubenswrapper[4898]: I0313 14:01:40.855073 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 14:01:40 crc kubenswrapper[4898]: I0313 14:01:40.857488 4898 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="09f01934-8579-467a-a76f-1bfaffab04bf" Mar 13 14:01:41 crc kubenswrapper[4898]: I0313 14:01:41.858031 4898 
kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3dea03f2-846c-4fe2-91f3-67269c416048" Mar 13 14:01:41 crc kubenswrapper[4898]: I0313 14:01:41.859175 4898 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3dea03f2-846c-4fe2-91f3-67269c416048" Mar 13 14:01:45 crc kubenswrapper[4898]: I0313 14:01:45.767859 4898 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="09f01934-8579-467a-a76f-1bfaffab04bf" Mar 13 14:01:48 crc kubenswrapper[4898]: I0313 14:01:48.694106 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 13 14:01:49 crc kubenswrapper[4898]: I0313 14:01:49.145438 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 13 14:01:49 crc kubenswrapper[4898]: I0313 14:01:49.230706 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 13 14:01:49 crc kubenswrapper[4898]: I0313 14:01:49.232290 4898 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 13 14:01:49 crc kubenswrapper[4898]: I0313 14:01:49.232364 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: 
connection refused" Mar 13 14:01:49 crc kubenswrapper[4898]: I0313 14:01:49.633662 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 13 14:01:50 crc kubenswrapper[4898]: I0313 14:01:50.148744 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 13 14:01:50 crc kubenswrapper[4898]: I0313 14:01:50.242063 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 13 14:01:50 crc kubenswrapper[4898]: I0313 14:01:50.394287 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 13 14:01:50 crc kubenswrapper[4898]: I0313 14:01:50.504417 4898 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 13 14:01:50 crc kubenswrapper[4898]: I0313 14:01:50.510492 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-djn5q","openshift-marketplace/community-operators-ppq6v","openshift-kube-apiserver/kube-apiserver-crc"] Mar 13 14:01:50 crc kubenswrapper[4898]: I0313 14:01:50.510567 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 13 14:01:50 crc kubenswrapper[4898]: I0313 14:01:50.519124 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 14:01:50 crc kubenswrapper[4898]: I0313 14:01:50.544187 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=11.544165934 podStartE2EDuration="11.544165934s" podCreationTimestamp="2026-03-13 14:01:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 
14:01:50.542572356 +0000 UTC m=+345.544160625" watchObservedRunningTime="2026-03-13 14:01:50.544165934 +0000 UTC m=+345.545754233" Mar 13 14:01:50 crc kubenswrapper[4898]: I0313 14:01:50.679019 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 13 14:01:50 crc kubenswrapper[4898]: I0313 14:01:50.791446 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 13 14:01:50 crc kubenswrapper[4898]: I0313 14:01:50.844213 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 13 14:01:50 crc kubenswrapper[4898]: I0313 14:01:50.845548 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.137361 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.141366 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.383858 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-6757584b5b-nct75"] Mar 13 14:01:51 crc kubenswrapper[4898]: E0313 14:01:51.384145 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a990881e-0caf-4096-a372-4cdad69006c1" containerName="extract-content" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.384165 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="a990881e-0caf-4096-a372-4cdad69006c1" containerName="extract-content" Mar 13 14:01:51 crc kubenswrapper[4898]: E0313 14:01:51.384201 4898 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a990881e-0caf-4096-a372-4cdad69006c1" containerName="extract-utilities" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.384208 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="a990881e-0caf-4096-a372-4cdad69006c1" containerName="extract-utilities" Mar 13 14:01:51 crc kubenswrapper[4898]: E0313 14:01:51.384222 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77480be5-9488-434e-8105-0fc9237cae46" containerName="installer" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.384230 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="77480be5-9488-434e-8105-0fc9237cae46" containerName="installer" Mar 13 14:01:51 crc kubenswrapper[4898]: E0313 14:01:51.384247 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b26a4d77-f170-467e-ad96-4741cc5a8f23" containerName="oauth-openshift" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.384254 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="b26a4d77-f170-467e-ad96-4741cc5a8f23" containerName="oauth-openshift" Mar 13 14:01:51 crc kubenswrapper[4898]: E0313 14:01:51.384265 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a990881e-0caf-4096-a372-4cdad69006c1" containerName="registry-server" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.384272 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="a990881e-0caf-4096-a372-4cdad69006c1" containerName="registry-server" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.384378 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="b26a4d77-f170-467e-ad96-4741cc5a8f23" containerName="oauth-openshift" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.384393 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="a990881e-0caf-4096-a372-4cdad69006c1" containerName="registry-server" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.384405 4898 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="77480be5-9488-434e-8105-0fc9237cae46" containerName="installer" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.384840 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.387503 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.388151 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.388565 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.388976 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.389001 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.389148 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.389085 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.389620 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.390473 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 13 14:01:51 crc 
kubenswrapper[4898]: I0313 14:01:51.391812 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.392034 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.392963 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.409378 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.410392 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.417530 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.426106 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.426157 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " 
pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.426183 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-audit-policies\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.426209 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-v4-0-config-system-service-ca\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.426271 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.426297 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-audit-dir\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.426332 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-v4-0-config-user-template-login\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.426420 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.426459 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.426510 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csv5s\" (UniqueName: \"kubernetes.io/projected/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-kube-api-access-csv5s\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.426567 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-v4-0-config-system-router-certs\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.426638 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.426684 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-v4-0-config-system-session\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.426706 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-v4-0-config-user-template-error\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.527436 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-audit-dir\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " 
pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.527484 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-v4-0-config-user-template-login\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.527508 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.527526 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.527544 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csv5s\" (UniqueName: \"kubernetes.io/projected/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-kube-api-access-csv5s\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.527565 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-v4-0-config-system-router-certs\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.527584 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.527571 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-audit-dir\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.527605 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-v4-0-config-system-session\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.527718 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-v4-0-config-user-template-error\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " 
pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.527794 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.527841 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.527886 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-audit-policies\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.527997 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-v4-0-config-system-service-ca\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.528065 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.529623 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.530082 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-v4-0-config-system-service-ca\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.530348 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-audit-policies\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.531526 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 
14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.533177 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.533312 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-v4-0-config-user-template-error\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.534025 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-v4-0-config-system-session\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.534088 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-v4-0-config-user-template-login\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.535620 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.537428 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.540634 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-v4-0-config-system-router-certs\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.541502 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.551256 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csv5s\" (UniqueName: \"kubernetes.io/projected/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-kube-api-access-csv5s\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:01:51 
crc kubenswrapper[4898]: I0313 14:01:51.679893 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.740051 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.755850 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a990881e-0caf-4096-a372-4cdad69006c1" path="/var/lib/kubelet/pods/a990881e-0caf-4096-a372-4cdad69006c1/volumes" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.757845 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b26a4d77-f170-467e-ad96-4741cc5a8f23" path="/var/lib/kubelet/pods/b26a4d77-f170-467e-ad96-4741cc5a8f23/volumes" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.841267 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.886669 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.912138 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.985317 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 13 14:01:52 crc kubenswrapper[4898]: I0313 14:01:52.178007 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 13 14:01:52 crc kubenswrapper[4898]: I0313 14:01:52.230910 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6757584b5b-nct75"] Mar 13 
14:01:52 crc kubenswrapper[4898]: I0313 14:01:52.311743 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 13 14:01:52 crc kubenswrapper[4898]: I0313 14:01:52.330890 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 13 14:01:52 crc kubenswrapper[4898]: I0313 14:01:52.357405 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 13 14:01:52 crc kubenswrapper[4898]: I0313 14:01:52.393460 4898 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 13 14:01:52 crc kubenswrapper[4898]: I0313 14:01:52.589017 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 13 14:01:52 crc kubenswrapper[4898]: I0313 14:01:52.668379 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 13 14:01:52 crc kubenswrapper[4898]: I0313 14:01:52.676769 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 13 14:01:52 crc kubenswrapper[4898]: I0313 14:01:52.745383 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 13 14:01:52 crc kubenswrapper[4898]: I0313 14:01:52.767308 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 13 14:01:52 crc kubenswrapper[4898]: I0313 14:01:52.780993 4898 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 13 14:01:52 crc kubenswrapper[4898]: I0313 14:01:52.801353 4898 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"client-ca" Mar 13 14:01:52 crc kubenswrapper[4898]: I0313 14:01:52.964595 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 13 14:01:52 crc kubenswrapper[4898]: I0313 14:01:52.980316 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 13 14:01:52 crc kubenswrapper[4898]: I0313 14:01:52.986960 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 13 14:01:53 crc kubenswrapper[4898]: I0313 14:01:53.026328 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 13 14:01:53 crc kubenswrapper[4898]: I0313 14:01:53.083327 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 13 14:01:53 crc kubenswrapper[4898]: I0313 14:01:53.123414 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 13 14:01:53 crc kubenswrapper[4898]: I0313 14:01:53.142336 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 13 14:01:53 crc kubenswrapper[4898]: I0313 14:01:53.366124 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 13 14:01:53 crc kubenswrapper[4898]: I0313 14:01:53.444396 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 13 14:01:53 crc kubenswrapper[4898]: I0313 14:01:53.540337 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 13 14:01:53 crc kubenswrapper[4898]: I0313 
14:01:53.546186 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 13 14:01:53 crc kubenswrapper[4898]: I0313 14:01:53.605941 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 13 14:01:53 crc kubenswrapper[4898]: I0313 14:01:53.606353 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 13 14:01:53 crc kubenswrapper[4898]: I0313 14:01:53.648819 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 13 14:01:53 crc kubenswrapper[4898]: I0313 14:01:53.670514 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 13 14:01:53 crc kubenswrapper[4898]: I0313 14:01:53.671644 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 13 14:01:53 crc kubenswrapper[4898]: I0313 14:01:53.900622 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 13 14:01:54 crc kubenswrapper[4898]: I0313 14:01:54.023870 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 13 14:01:54 crc kubenswrapper[4898]: I0313 14:01:54.037091 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 13 14:01:54 crc kubenswrapper[4898]: I0313 14:01:54.037095 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 13 14:01:54 crc kubenswrapper[4898]: I0313 14:01:54.044454 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 13 14:01:54 crc kubenswrapper[4898]: I0313 
14:01:54.070021 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 13 14:01:54 crc kubenswrapper[4898]: I0313 14:01:54.102720 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 13 14:01:54 crc kubenswrapper[4898]: I0313 14:01:54.118576 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 13 14:01:54 crc kubenswrapper[4898]: I0313 14:01:54.127487 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 13 14:01:54 crc kubenswrapper[4898]: I0313 14:01:54.200404 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 13 14:01:54 crc kubenswrapper[4898]: I0313 14:01:54.206166 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 13 14:01:54 crc kubenswrapper[4898]: I0313 14:01:54.211870 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 13 14:01:54 crc kubenswrapper[4898]: I0313 14:01:54.285716 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 13 14:01:54 crc kubenswrapper[4898]: I0313 14:01:54.299242 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 13 14:01:54 crc kubenswrapper[4898]: I0313 14:01:54.334490 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 13 14:01:54 crc kubenswrapper[4898]: I0313 14:01:54.355359 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 13 
14:01:54 crc kubenswrapper[4898]: I0313 14:01:54.383286 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 13 14:01:54 crc kubenswrapper[4898]: I0313 14:01:54.457368 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 13 14:01:54 crc kubenswrapper[4898]: I0313 14:01:54.504311 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 13 14:01:54 crc kubenswrapper[4898]: I0313 14:01:54.526485 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 13 14:01:54 crc kubenswrapper[4898]: I0313 14:01:54.553528 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 13 14:01:54 crc kubenswrapper[4898]: I0313 14:01:54.569212 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 13 14:01:54 crc kubenswrapper[4898]: I0313 14:01:54.596067 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 13 14:01:54 crc kubenswrapper[4898]: I0313 14:01:54.690198 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 13 14:01:54 crc kubenswrapper[4898]: I0313 14:01:54.749689 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 13 14:01:54 crc kubenswrapper[4898]: I0313 14:01:54.826822 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 13 14:01:54 crc kubenswrapper[4898]: I0313 14:01:54.856214 4898 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 13 14:01:54 crc kubenswrapper[4898]: I0313 14:01:54.868338 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 13 14:01:54 crc kubenswrapper[4898]: I0313 14:01:54.997381 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 13 14:01:54 crc kubenswrapper[4898]: E0313 14:01:54.998226 4898 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 13 14:01:54 crc kubenswrapper[4898]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-6757584b5b-nct75_openshift-authentication_3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3_0(297c85adf50243fe2bc11220e35969e23bd77b61111951b7f7ec1477e5ad2fe7): error adding pod openshift-authentication_oauth-openshift-6757584b5b-nct75 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"297c85adf50243fe2bc11220e35969e23bd77b61111951b7f7ec1477e5ad2fe7" Netns:"/var/run/netns/812cd27e-a312-4622-b089-bb187d413335" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-6757584b5b-nct75;K8S_POD_INFRA_CONTAINER_ID=297c85adf50243fe2bc11220e35969e23bd77b61111951b7f7ec1477e5ad2fe7;K8S_POD_UID=3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-6757584b5b-nct75] networking: Multus: [openshift-authentication/oauth-openshift-6757584b5b-nct75/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-6757584b5b-nct75 in out of cluster comm: pod "oauth-openshift-6757584b5b-nct75" not found Mar 13 14:01:54 crc kubenswrapper[4898]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 13 14:01:54 crc kubenswrapper[4898]: > Mar 13 14:01:54 crc kubenswrapper[4898]: E0313 14:01:54.998296 4898 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 13 14:01:54 crc kubenswrapper[4898]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-6757584b5b-nct75_openshift-authentication_3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3_0(297c85adf50243fe2bc11220e35969e23bd77b61111951b7f7ec1477e5ad2fe7): error adding pod openshift-authentication_oauth-openshift-6757584b5b-nct75 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"297c85adf50243fe2bc11220e35969e23bd77b61111951b7f7ec1477e5ad2fe7" Netns:"/var/run/netns/812cd27e-a312-4622-b089-bb187d413335" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-6757584b5b-nct75;K8S_POD_INFRA_CONTAINER_ID=297c85adf50243fe2bc11220e35969e23bd77b61111951b7f7ec1477e5ad2fe7;K8S_POD_UID=3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-6757584b5b-nct75] networking: Multus: [openshift-authentication/oauth-openshift-6757584b5b-nct75/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-6757584b5b-nct75 in out of cluster comm: pod "oauth-openshift-6757584b5b-nct75" not found Mar 13 14:01:54 crc kubenswrapper[4898]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 13 14:01:54 crc kubenswrapper[4898]: > pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:01:54 crc kubenswrapper[4898]: E0313 14:01:54.998321 4898 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 13 14:01:54 crc kubenswrapper[4898]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-6757584b5b-nct75_openshift-authentication_3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3_0(297c85adf50243fe2bc11220e35969e23bd77b61111951b7f7ec1477e5ad2fe7): error adding pod openshift-authentication_oauth-openshift-6757584b5b-nct75 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"297c85adf50243fe2bc11220e35969e23bd77b61111951b7f7ec1477e5ad2fe7" Netns:"/var/run/netns/812cd27e-a312-4622-b089-bb187d413335" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-6757584b5b-nct75;K8S_POD_INFRA_CONTAINER_ID=297c85adf50243fe2bc11220e35969e23bd77b61111951b7f7ec1477e5ad2fe7;K8S_POD_UID=3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-6757584b5b-nct75] networking: Multus: [openshift-authentication/oauth-openshift-6757584b5b-nct75/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-6757584b5b-nct75 in out of cluster comm: pod "oauth-openshift-6757584b5b-nct75" not found Mar 13 14:01:54 crc 
kubenswrapper[4898]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 13 14:01:54 crc kubenswrapper[4898]: > pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:01:54 crc kubenswrapper[4898]: E0313 14:01:54.998381 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"oauth-openshift-6757584b5b-nct75_openshift-authentication(3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"oauth-openshift-6757584b5b-nct75_openshift-authentication(3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-6757584b5b-nct75_openshift-authentication_3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3_0(297c85adf50243fe2bc11220e35969e23bd77b61111951b7f7ec1477e5ad2fe7): error adding pod openshift-authentication_oauth-openshift-6757584b5b-nct75 to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"297c85adf50243fe2bc11220e35969e23bd77b61111951b7f7ec1477e5ad2fe7\\\" Netns:\\\"/var/run/netns/812cd27e-a312-4622-b089-bb187d413335\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-6757584b5b-nct75;K8S_POD_INFRA_CONTAINER_ID=297c85adf50243fe2bc11220e35969e23bd77b61111951b7f7ec1477e5ad2fe7;K8S_POD_UID=3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-6757584b5b-nct75] networking: Multus: 
[openshift-authentication/oauth-openshift-6757584b5b-nct75/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-6757584b5b-nct75 in out of cluster comm: pod \\\"oauth-openshift-6757584b5b-nct75\\\" not found\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" podUID="3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3" Mar 13 14:01:55 crc kubenswrapper[4898]: I0313 14:01:55.063592 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 13 14:01:55 crc kubenswrapper[4898]: I0313 14:01:55.150130 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 13 14:01:55 crc kubenswrapper[4898]: I0313 14:01:55.185161 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 13 14:01:55 crc kubenswrapper[4898]: I0313 14:01:55.268515 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 13 14:01:55 crc kubenswrapper[4898]: I0313 14:01:55.297494 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 13 14:01:55 crc kubenswrapper[4898]: I0313 14:01:55.321849 4898 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 13 14:01:55 crc kubenswrapper[4898]: I0313 14:01:55.342657 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 13 14:01:55 crc kubenswrapper[4898]: I0313 14:01:55.454113 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 13 14:01:55 crc kubenswrapper[4898]: I0313 14:01:55.500241 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 13 14:01:55 crc kubenswrapper[4898]: I0313 14:01:55.537517 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 13 14:01:55 crc kubenswrapper[4898]: I0313 14:01:55.671927 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 13 14:01:55 crc kubenswrapper[4898]: I0313 14:01:55.731175 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 13 14:01:55 crc kubenswrapper[4898]: I0313 14:01:55.773942 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 13 14:01:55 crc kubenswrapper[4898]: I0313 14:01:55.788156 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 13 14:01:55 crc kubenswrapper[4898]: I0313 14:01:55.805197 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 13 14:01:55 crc kubenswrapper[4898]: I0313 14:01:55.865663 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 13 14:01:55 crc kubenswrapper[4898]: I0313 14:01:55.940504 4898 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:01:55 crc kubenswrapper[4898]: I0313 14:01:55.941073 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:01:56 crc kubenswrapper[4898]: I0313 14:01:56.088306 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 13 14:01:56 crc kubenswrapper[4898]: I0313 14:01:56.164133 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 13 14:01:56 crc kubenswrapper[4898]: I0313 14:01:56.195501 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 13 14:01:56 crc kubenswrapper[4898]: I0313 14:01:56.252855 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 13 14:01:56 crc kubenswrapper[4898]: I0313 14:01:56.300600 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 13 14:01:56 crc kubenswrapper[4898]: I0313 14:01:56.432360 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 13 14:01:56 crc kubenswrapper[4898]: I0313 14:01:56.474698 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 13 14:01:56 crc kubenswrapper[4898]: I0313 14:01:56.510105 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 13 14:01:56 crc kubenswrapper[4898]: I0313 14:01:56.556495 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 13 14:01:56 crc 
kubenswrapper[4898]: I0313 14:01:56.677670 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 13 14:01:56 crc kubenswrapper[4898]: I0313 14:01:56.759199 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 13 14:01:56 crc kubenswrapper[4898]: I0313 14:01:56.807574 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 13 14:01:56 crc kubenswrapper[4898]: I0313 14:01:56.902999 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 13 14:01:56 crc kubenswrapper[4898]: I0313 14:01:56.912227 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 13 14:01:57 crc kubenswrapper[4898]: I0313 14:01:57.102462 4898 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 13 14:01:57 crc kubenswrapper[4898]: I0313 14:01:57.165708 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 13 14:01:57 crc kubenswrapper[4898]: I0313 14:01:57.327931 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 13 14:01:57 crc kubenswrapper[4898]: I0313 14:01:57.501315 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 13 14:01:57 crc kubenswrapper[4898]: I0313 14:01:57.595126 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 13 14:01:57 crc kubenswrapper[4898]: I0313 14:01:57.620699 4898 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 13 14:01:57 crc kubenswrapper[4898]: I0313 14:01:57.716061 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 13 14:01:57 crc kubenswrapper[4898]: I0313 14:01:57.724082 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 13 14:01:57 crc kubenswrapper[4898]: I0313 14:01:57.743233 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 13 14:01:57 crc kubenswrapper[4898]: I0313 14:01:57.781840 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 13 14:01:57 crc kubenswrapper[4898]: I0313 14:01:57.858287 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 13 14:01:57 crc kubenswrapper[4898]: I0313 14:01:57.878082 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 13 14:01:57 crc kubenswrapper[4898]: I0313 14:01:57.917827 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 13 14:01:57 crc kubenswrapper[4898]: I0313 14:01:57.942682 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 13 14:01:57 crc kubenswrapper[4898]: I0313 14:01:57.956767 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 13 14:01:57 crc kubenswrapper[4898]: I0313 14:01:57.997633 4898 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 13 14:01:58 crc kubenswrapper[4898]: I0313 14:01:58.022179 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 13 14:01:58 crc kubenswrapper[4898]: I0313 14:01:58.037178 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 13 14:01:58 crc kubenswrapper[4898]: I0313 14:01:58.072705 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 13 14:01:58 crc kubenswrapper[4898]: I0313 14:01:58.184206 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 13 14:01:58 crc kubenswrapper[4898]: I0313 14:01:58.193854 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 13 14:01:58 crc kubenswrapper[4898]: I0313 14:01:58.225632 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 13 14:01:58 crc kubenswrapper[4898]: I0313 14:01:58.471167 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 13 14:01:58 crc kubenswrapper[4898]: I0313 14:01:58.560431 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 13 14:01:58 crc kubenswrapper[4898]: I0313 14:01:58.577492 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 13 14:01:58 crc kubenswrapper[4898]: I0313 14:01:58.619849 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" 
Mar 13 14:01:58 crc kubenswrapper[4898]: I0313 14:01:58.674319 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 13 14:01:58 crc kubenswrapper[4898]: I0313 14:01:58.688293 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 13 14:01:58 crc kubenswrapper[4898]: I0313 14:01:58.695959 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 13 14:01:58 crc kubenswrapper[4898]: I0313 14:01:58.803384 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 13 14:01:58 crc kubenswrapper[4898]: I0313 14:01:58.825136 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 13 14:01:58 crc kubenswrapper[4898]: I0313 14:01:58.874838 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 13 14:01:58 crc kubenswrapper[4898]: I0313 14:01:58.896450 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 13 14:01:58 crc kubenswrapper[4898]: I0313 14:01:58.978652 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 13 14:01:59 crc kubenswrapper[4898]: I0313 14:01:59.051679 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 13 14:01:59 crc kubenswrapper[4898]: I0313 14:01:59.135103 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 13 14:01:59 crc kubenswrapper[4898]: E0313 14:01:59.156948 4898 log.go:32] 
"RunPodSandbox from runtime service failed" err=< Mar 13 14:01:59 crc kubenswrapper[4898]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-6757584b5b-nct75_openshift-authentication_3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3_0(22080dad93543fe092c690ed0f72da0b48f8ea429177758465538eb0ff5dbdc0): error adding pod openshift-authentication_oauth-openshift-6757584b5b-nct75 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"22080dad93543fe092c690ed0f72da0b48f8ea429177758465538eb0ff5dbdc0" Netns:"/var/run/netns/9761714d-3bdd-4b3c-ad53-09e2640d19f7" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-6757584b5b-nct75;K8S_POD_INFRA_CONTAINER_ID=22080dad93543fe092c690ed0f72da0b48f8ea429177758465538eb0ff5dbdc0;K8S_POD_UID=3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-6757584b5b-nct75] networking: Multus: [openshift-authentication/oauth-openshift-6757584b5b-nct75/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-6757584b5b-nct75 in out of cluster comm: pod "oauth-openshift-6757584b5b-nct75" not found Mar 13 14:01:59 crc kubenswrapper[4898]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 13 14:01:59 crc kubenswrapper[4898]: > Mar 13 14:01:59 crc kubenswrapper[4898]: E0313 14:01:59.157234 4898 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" 
err=< Mar 13 14:01:59 crc kubenswrapper[4898]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-6757584b5b-nct75_openshift-authentication_3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3_0(22080dad93543fe092c690ed0f72da0b48f8ea429177758465538eb0ff5dbdc0): error adding pod openshift-authentication_oauth-openshift-6757584b5b-nct75 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"22080dad93543fe092c690ed0f72da0b48f8ea429177758465538eb0ff5dbdc0" Netns:"/var/run/netns/9761714d-3bdd-4b3c-ad53-09e2640d19f7" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-6757584b5b-nct75;K8S_POD_INFRA_CONTAINER_ID=22080dad93543fe092c690ed0f72da0b48f8ea429177758465538eb0ff5dbdc0;K8S_POD_UID=3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-6757584b5b-nct75] networking: Multus: [openshift-authentication/oauth-openshift-6757584b5b-nct75/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-6757584b5b-nct75 in out of cluster comm: pod "oauth-openshift-6757584b5b-nct75" not found Mar 13 14:01:59 crc kubenswrapper[4898]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 13 14:01:59 crc kubenswrapper[4898]: > pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:01:59 crc kubenswrapper[4898]: E0313 14:01:59.157262 4898 kuberuntime_manager.go:1170] 
"CreatePodSandbox for pod failed" err=< Mar 13 14:01:59 crc kubenswrapper[4898]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-6757584b5b-nct75_openshift-authentication_3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3_0(22080dad93543fe092c690ed0f72da0b48f8ea429177758465538eb0ff5dbdc0): error adding pod openshift-authentication_oauth-openshift-6757584b5b-nct75 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"22080dad93543fe092c690ed0f72da0b48f8ea429177758465538eb0ff5dbdc0" Netns:"/var/run/netns/9761714d-3bdd-4b3c-ad53-09e2640d19f7" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-6757584b5b-nct75;K8S_POD_INFRA_CONTAINER_ID=22080dad93543fe092c690ed0f72da0b48f8ea429177758465538eb0ff5dbdc0;K8S_POD_UID=3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-6757584b5b-nct75] networking: Multus: [openshift-authentication/oauth-openshift-6757584b5b-nct75/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-6757584b5b-nct75 in out of cluster comm: pod "oauth-openshift-6757584b5b-nct75" not found Mar 13 14:01:59 crc kubenswrapper[4898]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 13 14:01:59 crc kubenswrapper[4898]: > pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:01:59 crc kubenswrapper[4898]: E0313 14:01:59.157321 4898 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"oauth-openshift-6757584b5b-nct75_openshift-authentication(3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"oauth-openshift-6757584b5b-nct75_openshift-authentication(3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-6757584b5b-nct75_openshift-authentication_3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3_0(22080dad93543fe092c690ed0f72da0b48f8ea429177758465538eb0ff5dbdc0): error adding pod openshift-authentication_oauth-openshift-6757584b5b-nct75 to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"22080dad93543fe092c690ed0f72da0b48f8ea429177758465538eb0ff5dbdc0\\\" Netns:\\\"/var/run/netns/9761714d-3bdd-4b3c-ad53-09e2640d19f7\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-6757584b5b-nct75;K8S_POD_INFRA_CONTAINER_ID=22080dad93543fe092c690ed0f72da0b48f8ea429177758465538eb0ff5dbdc0;K8S_POD_UID=3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-6757584b5b-nct75] networking: Multus: [openshift-authentication/oauth-openshift-6757584b5b-nct75/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-6757584b5b-nct75 in out of cluster comm: pod \\\"oauth-openshift-6757584b5b-nct75\\\" not found\\n': StdinData: 
{\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" podUID="3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3" Mar 13 14:01:59 crc kubenswrapper[4898]: I0313 14:01:59.182693 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 13 14:01:59 crc kubenswrapper[4898]: I0313 14:01:59.231473 4898 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 13 14:01:59 crc kubenswrapper[4898]: I0313 14:01:59.231529 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Mar 13 14:01:59 crc kubenswrapper[4898]: I0313 14:01:59.231570 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 14:01:59 crc kubenswrapper[4898]: I0313 14:01:59.232121 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" 
containerStatusID={"Type":"cri-o","ID":"b65cd02a017de53599582fdc93495c1971ff933ecacdf6af0171bad6070bff66"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Mar 13 14:01:59 crc kubenswrapper[4898]: I0313 14:01:59.232252 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://b65cd02a017de53599582fdc93495c1971ff933ecacdf6af0171bad6070bff66" gracePeriod=30 Mar 13 14:01:59 crc kubenswrapper[4898]: I0313 14:01:59.272908 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 13 14:01:59 crc kubenswrapper[4898]: I0313 14:01:59.322067 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 13 14:01:59 crc kubenswrapper[4898]: I0313 14:01:59.341517 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 13 14:01:59 crc kubenswrapper[4898]: I0313 14:01:59.343775 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 13 14:01:59 crc kubenswrapper[4898]: I0313 14:01:59.391759 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 13 14:01:59 crc kubenswrapper[4898]: I0313 14:01:59.418045 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 13 14:01:59 crc kubenswrapper[4898]: I0313 14:01:59.431754 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 13 14:01:59 crc kubenswrapper[4898]: I0313 14:01:59.494566 
4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 13 14:01:59 crc kubenswrapper[4898]: I0313 14:01:59.523185 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 13 14:01:59 crc kubenswrapper[4898]: I0313 14:01:59.539325 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 13 14:01:59 crc kubenswrapper[4898]: I0313 14:01:59.683571 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 13 14:01:59 crc kubenswrapper[4898]: I0313 14:01:59.731316 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 13 14:01:59 crc kubenswrapper[4898]: I0313 14:01:59.762936 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 13 14:01:59 crc kubenswrapper[4898]: I0313 14:01:59.791947 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 13 14:01:59 crc kubenswrapper[4898]: I0313 14:01:59.879656 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 13 14:01:59 crc kubenswrapper[4898]: I0313 14:01:59.905194 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 13 14:01:59 crc kubenswrapper[4898]: I0313 14:01:59.912294 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 13 14:02:00 crc kubenswrapper[4898]: I0313 14:02:00.051186 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 13 
14:02:00 crc kubenswrapper[4898]: I0313 14:02:00.054236 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 13 14:02:00 crc kubenswrapper[4898]: I0313 14:02:00.128795 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 13 14:02:00 crc kubenswrapper[4898]: I0313 14:02:00.132063 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 13 14:02:00 crc kubenswrapper[4898]: I0313 14:02:00.194009 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 13 14:02:00 crc kubenswrapper[4898]: I0313 14:02:00.223130 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 13 14:02:00 crc kubenswrapper[4898]: I0313 14:02:00.250638 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 13 14:02:00 crc kubenswrapper[4898]: I0313 14:02:00.260865 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 13 14:02:00 crc kubenswrapper[4898]: I0313 14:02:00.283168 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 13 14:02:00 crc kubenswrapper[4898]: I0313 14:02:00.305933 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 13 14:02:00 crc kubenswrapper[4898]: I0313 14:02:00.385224 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 13 14:02:00 crc kubenswrapper[4898]: I0313 14:02:00.486114 4898 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 13 14:02:00 crc kubenswrapper[4898]: I0313 14:02:00.543946 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 13 14:02:00 crc kubenswrapper[4898]: I0313 14:02:00.696616 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 13 14:02:00 crc kubenswrapper[4898]: I0313 14:02:00.740479 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 13 14:02:00 crc kubenswrapper[4898]: I0313 14:02:00.789993 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 13 14:02:00 crc kubenswrapper[4898]: I0313 14:02:00.815758 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 13 14:02:00 crc kubenswrapper[4898]: I0313 14:02:00.820097 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 13 14:02:00 crc kubenswrapper[4898]: I0313 14:02:00.830963 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 13 14:02:00 crc kubenswrapper[4898]: I0313 14:02:00.845512 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 13 14:02:00 crc kubenswrapper[4898]: I0313 14:02:00.903664 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 13 14:02:00 crc kubenswrapper[4898]: I0313 14:02:00.922716 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 13 14:02:00 crc kubenswrapper[4898]: I0313 14:02:00.924979 4898 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 13 14:02:00 crc kubenswrapper[4898]: I0313 14:02:00.954287 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 13 14:02:01 crc kubenswrapper[4898]: I0313 14:02:01.000439 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 13 14:02:01 crc kubenswrapper[4898]: I0313 14:02:01.016230 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 13 14:02:01 crc kubenswrapper[4898]: I0313 14:02:01.105960 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 13 14:02:01 crc kubenswrapper[4898]: I0313 14:02:01.116165 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 13 14:02:01 crc kubenswrapper[4898]: I0313 14:02:01.127966 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 13 14:02:01 crc kubenswrapper[4898]: I0313 14:02:01.222559 4898 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 13 14:02:01 crc kubenswrapper[4898]: I0313 14:02:01.252291 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 13 14:02:01 crc kubenswrapper[4898]: I0313 14:02:01.284482 4898 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 13 14:02:01 crc kubenswrapper[4898]: I0313 14:02:01.310779 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 13 14:02:01 crc kubenswrapper[4898]: I0313 
14:02:01.318503 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 13 14:02:01 crc kubenswrapper[4898]: I0313 14:02:01.326091 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 13 14:02:01 crc kubenswrapper[4898]: I0313 14:02:01.341393 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 13 14:02:01 crc kubenswrapper[4898]: I0313 14:02:01.383989 4898 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 13 14:02:01 crc kubenswrapper[4898]: I0313 14:02:01.384343 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://ec7277e9dfa9beb5985dfe9d6e8c02eb3980ab021d8a008b49c5b8550fbbbb3d" gracePeriod=5 Mar 13 14:02:01 crc kubenswrapper[4898]: I0313 14:02:01.390368 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 13 14:02:01 crc kubenswrapper[4898]: I0313 14:02:01.540461 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 13 14:02:01 crc kubenswrapper[4898]: I0313 14:02:01.559643 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 13 14:02:01 crc kubenswrapper[4898]: I0313 14:02:01.569311 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 13 14:02:01 crc kubenswrapper[4898]: I0313 14:02:01.640802 4898 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 13 14:02:01 crc kubenswrapper[4898]: I0313 14:02:01.657443 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 13 14:02:01 crc kubenswrapper[4898]: I0313 14:02:01.710055 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 13 14:02:01 crc kubenswrapper[4898]: I0313 14:02:01.811964 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 13 14:02:01 crc kubenswrapper[4898]: I0313 14:02:01.878759 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 13 14:02:02 crc kubenswrapper[4898]: I0313 14:02:02.103282 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 13 14:02:02 crc kubenswrapper[4898]: I0313 14:02:02.171877 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 13 14:02:02 crc kubenswrapper[4898]: I0313 14:02:02.205693 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 13 14:02:02 crc kubenswrapper[4898]: I0313 14:02:02.208609 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 13 14:02:02 crc kubenswrapper[4898]: I0313 14:02:02.238471 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 13 14:02:02 crc kubenswrapper[4898]: I0313 14:02:02.251588 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" 
Mar 13 14:02:02 crc kubenswrapper[4898]: I0313 14:02:02.291526 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 13 14:02:02 crc kubenswrapper[4898]: I0313 14:02:02.379841 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 13 14:02:02 crc kubenswrapper[4898]: I0313 14:02:02.451285 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 13 14:02:02 crc kubenswrapper[4898]: I0313 14:02:02.543318 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 13 14:02:02 crc kubenswrapper[4898]: I0313 14:02:02.638400 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 13 14:02:02 crc kubenswrapper[4898]: I0313 14:02:02.689229 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 13 14:02:02 crc kubenswrapper[4898]: I0313 14:02:02.740251 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 13 14:02:02 crc kubenswrapper[4898]: I0313 14:02:02.758155 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 13 14:02:02 crc kubenswrapper[4898]: I0313 14:02:02.925416 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 13 14:02:02 crc kubenswrapper[4898]: I0313 14:02:02.974037 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 13 14:02:03 crc kubenswrapper[4898]: I0313 14:02:03.096848 4898 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-console"/"networking-console-plugin" Mar 13 14:02:03 crc kubenswrapper[4898]: I0313 14:02:03.163354 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 13 14:02:03 crc kubenswrapper[4898]: I0313 14:02:03.282214 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 13 14:02:03 crc kubenswrapper[4898]: I0313 14:02:03.482452 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 13 14:02:03 crc kubenswrapper[4898]: I0313 14:02:03.486850 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 13 14:02:03 crc kubenswrapper[4898]: I0313 14:02:03.749253 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 13 14:02:04 crc kubenswrapper[4898]: I0313 14:02:04.177120 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 13 14:02:04 crc kubenswrapper[4898]: I0313 14:02:04.278221 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 13 14:02:04 crc kubenswrapper[4898]: I0313 14:02:04.700727 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 13 14:02:04 crc kubenswrapper[4898]: I0313 14:02:04.819675 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 13 14:02:05 crc kubenswrapper[4898]: I0313 14:02:05.544393 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 13 14:02:06 crc kubenswrapper[4898]: I0313 
14:02:06.179960 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 13 14:02:06 crc kubenswrapper[4898]: I0313 14:02:06.509286 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 13 14:02:06 crc kubenswrapper[4898]: I0313 14:02:06.683288 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 13 14:02:06 crc kubenswrapper[4898]: I0313 14:02:06.970578 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 13 14:02:06 crc kubenswrapper[4898]: I0313 14:02:06.970665 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 14:02:07 crc kubenswrapper[4898]: I0313 14:02:07.000244 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 13 14:02:07 crc kubenswrapper[4898]: I0313 14:02:07.000561 4898 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="ec7277e9dfa9beb5985dfe9d6e8c02eb3980ab021d8a008b49c5b8550fbbbb3d" exitCode=137 Mar 13 14:02:07 crc kubenswrapper[4898]: I0313 14:02:07.000613 4898 scope.go:117] "RemoveContainer" containerID="ec7277e9dfa9beb5985dfe9d6e8c02eb3980ab021d8a008b49c5b8550fbbbb3d" Mar 13 14:02:07 crc kubenswrapper[4898]: I0313 14:02:07.000619 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 14:02:07 crc kubenswrapper[4898]: I0313 14:02:07.014515 4898 scope.go:117] "RemoveContainer" containerID="ec7277e9dfa9beb5985dfe9d6e8c02eb3980ab021d8a008b49c5b8550fbbbb3d" Mar 13 14:02:07 crc kubenswrapper[4898]: E0313 14:02:07.014889 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec7277e9dfa9beb5985dfe9d6e8c02eb3980ab021d8a008b49c5b8550fbbbb3d\": container with ID starting with ec7277e9dfa9beb5985dfe9d6e8c02eb3980ab021d8a008b49c5b8550fbbbb3d not found: ID does not exist" containerID="ec7277e9dfa9beb5985dfe9d6e8c02eb3980ab021d8a008b49c5b8550fbbbb3d" Mar 13 14:02:07 crc kubenswrapper[4898]: I0313 14:02:07.014931 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec7277e9dfa9beb5985dfe9d6e8c02eb3980ab021d8a008b49c5b8550fbbbb3d"} err="failed to get container status \"ec7277e9dfa9beb5985dfe9d6e8c02eb3980ab021d8a008b49c5b8550fbbbb3d\": rpc error: code = NotFound desc = could not find container \"ec7277e9dfa9beb5985dfe9d6e8c02eb3980ab021d8a008b49c5b8550fbbbb3d\": container with ID starting with ec7277e9dfa9beb5985dfe9d6e8c02eb3980ab021d8a008b49c5b8550fbbbb3d not found: ID does not exist" Mar 13 14:02:07 crc kubenswrapper[4898]: I0313 14:02:07.157646 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 13 14:02:07 crc kubenswrapper[4898]: I0313 14:02:07.157730 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 13 14:02:07 crc kubenswrapper[4898]: I0313 14:02:07.157797 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 13 14:02:07 crc kubenswrapper[4898]: I0313 14:02:07.157831 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 13 14:02:07 crc kubenswrapper[4898]: I0313 14:02:07.157854 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 13 14:02:07 crc kubenswrapper[4898]: I0313 14:02:07.157941 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 14:02:07 crc kubenswrapper[4898]: I0313 14:02:07.158004 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 14:02:07 crc kubenswrapper[4898]: I0313 14:02:07.158077 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 14:02:07 crc kubenswrapper[4898]: I0313 14:02:07.158084 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 14:02:07 crc kubenswrapper[4898]: I0313 14:02:07.158328 4898 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 13 14:02:07 crc kubenswrapper[4898]: I0313 14:02:07.158357 4898 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 13 14:02:07 crc kubenswrapper[4898]: I0313 14:02:07.158384 4898 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 13 14:02:07 crc kubenswrapper[4898]: I0313 14:02:07.158407 4898 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 13 14:02:07 crc kubenswrapper[4898]: I0313 14:02:07.168050 4898 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 14:02:07 crc kubenswrapper[4898]: I0313 14:02:07.259594 4898 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 13 14:02:07 crc kubenswrapper[4898]: I0313 14:02:07.262563 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 13 14:02:07 crc kubenswrapper[4898]: I0313 14:02:07.303640 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 13 14:02:07 crc kubenswrapper[4898]: I0313 14:02:07.746912 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 13 14:02:12 crc kubenswrapper[4898]: I0313 14:02:12.739439 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:02:12 crc kubenswrapper[4898]: I0313 14:02:12.740476 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:02:13 crc kubenswrapper[4898]: I0313 14:02:13.015787 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6757584b5b-nct75"] Mar 13 14:02:13 crc kubenswrapper[4898]: I0313 14:02:13.037060 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" event={"ID":"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3","Type":"ContainerStarted","Data":"d0641b0799632094c4d4cd21e5e97b5b85d25b38c9ca90698da38dd1d008c2ad"} Mar 13 14:02:14 crc kubenswrapper[4898]: I0313 14:02:14.046781 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" event={"ID":"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3","Type":"ContainerStarted","Data":"2d8bff9d69ce0d19fe0744cd41bbe73c255e4ee58f95dce997a9b7c3ed1203e1"} Mar 13 14:02:14 crc kubenswrapper[4898]: I0313 14:02:14.047257 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:02:14 crc kubenswrapper[4898]: I0313 14:02:14.055420 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:02:14 crc kubenswrapper[4898]: I0313 14:02:14.079448 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" podStartSLOduration=76.079429502 podStartE2EDuration="1m16.079429502s" podCreationTimestamp="2026-03-13 14:00:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:02:14.07517648 +0000 UTC m=+369.076764819" watchObservedRunningTime="2026-03-13 14:02:14.079429502 +0000 UTC m=+369.081017761" Mar 13 14:02:27 crc kubenswrapper[4898]: I0313 14:02:27.145966 4898 
generic.go:334] "Generic (PLEG): container finished" podID="0a78868f-1786-430d-8df8-18bb1c2019b3" containerID="66a8b7ab3a08de395e71354315a06c65dae8eb185d93ccdee2c35ad093ab2e67" exitCode=0 Mar 13 14:02:27 crc kubenswrapper[4898]: I0313 14:02:27.146089 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-p8r99" event={"ID":"0a78868f-1786-430d-8df8-18bb1c2019b3","Type":"ContainerDied","Data":"66a8b7ab3a08de395e71354315a06c65dae8eb185d93ccdee2c35ad093ab2e67"} Mar 13 14:02:27 crc kubenswrapper[4898]: I0313 14:02:27.147061 4898 scope.go:117] "RemoveContainer" containerID="66a8b7ab3a08de395e71354315a06c65dae8eb185d93ccdee2c35ad093ab2e67" Mar 13 14:02:28 crc kubenswrapper[4898]: I0313 14:02:28.153330 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-p8r99" event={"ID":"0a78868f-1786-430d-8df8-18bb1c2019b3","Type":"ContainerStarted","Data":"fee68a52f02b5d009748f7d22e4efe8b71e5cbe2b4ec6b512eae62294cde6d24"} Mar 13 14:02:28 crc kubenswrapper[4898]: I0313 14:02:28.153936 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-p8r99" Mar 13 14:02:28 crc kubenswrapper[4898]: I0313 14:02:28.161011 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-p8r99" Mar 13 14:02:30 crc kubenswrapper[4898]: I0313 14:02:30.173623 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Mar 13 14:02:30 crc kubenswrapper[4898]: I0313 14:02:30.175751 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 13 14:02:30 crc kubenswrapper[4898]: I0313 
14:02:30.177249 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 13 14:02:30 crc kubenswrapper[4898]: I0313 14:02:30.177329 4898 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="b65cd02a017de53599582fdc93495c1971ff933ecacdf6af0171bad6070bff66" exitCode=137 Mar 13 14:02:30 crc kubenswrapper[4898]: I0313 14:02:30.177387 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"b65cd02a017de53599582fdc93495c1971ff933ecacdf6af0171bad6070bff66"} Mar 13 14:02:30 crc kubenswrapper[4898]: I0313 14:02:30.177445 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f3f7856b47908c691e419ef35da06276497b17de1d07f34af58226c123a50471"} Mar 13 14:02:30 crc kubenswrapper[4898]: I0313 14:02:30.177475 4898 scope.go:117] "RemoveContainer" containerID="172c20ea4299e1b913b2675ac14f7ec3a4c80a0f6f2428af0d030aff64eb89ec" Mar 13 14:02:31 crc kubenswrapper[4898]: I0313 14:02:31.185117 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Mar 13 14:02:31 crc kubenswrapper[4898]: I0313 14:02:31.185953 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 13 14:02:38 crc kubenswrapper[4898]: I0313 14:02:38.347546 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 14:02:39 crc kubenswrapper[4898]: I0313 14:02:39.232946 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 14:02:39 crc kubenswrapper[4898]: I0313 14:02:39.243146 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 14:02:40 crc kubenswrapper[4898]: I0313 14:02:40.256761 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 14:02:47 crc kubenswrapper[4898]: I0313 14:02:47.578402 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556842-7h9s5"] Mar 13 14:02:47 crc kubenswrapper[4898]: E0313 14:02:47.578984 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 13 14:02:47 crc kubenswrapper[4898]: I0313 14:02:47.578995 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 13 14:02:47 crc kubenswrapper[4898]: I0313 14:02:47.579083 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 13 14:02:47 crc kubenswrapper[4898]: I0313 14:02:47.579417 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556842-7h9s5" Mar 13 14:02:47 crc kubenswrapper[4898]: I0313 14:02:47.581891 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 14:02:47 crc kubenswrapper[4898]: I0313 14:02:47.582315 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps" Mar 13 14:02:47 crc kubenswrapper[4898]: I0313 14:02:47.582352 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 14:02:47 crc kubenswrapper[4898]: I0313 14:02:47.605605 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556842-7h9s5"] Mar 13 14:02:47 crc kubenswrapper[4898]: I0313 14:02:47.640588 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76556fcf77-lhw2w"] Mar 13 14:02:47 crc kubenswrapper[4898]: I0313 14:02:47.640877 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-76556fcf77-lhw2w" podUID="0561a31b-c67c-4410-8845-d47e4533be0a" containerName="route-controller-manager" containerID="cri-o://11185c5dd36acbad20bedb23b2365654ca410b78e2e18b38a6ef92209812ef97" gracePeriod=30 Mar 13 14:02:47 crc kubenswrapper[4898]: I0313 14:02:47.643548 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-db7fb4ff9-qlmhf"] Mar 13 14:02:47 crc kubenswrapper[4898]: I0313 14:02:47.643751 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-db7fb4ff9-qlmhf" podUID="7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf" containerName="controller-manager" containerID="cri-o://1c8b4c66a938ec22ac3bf1cf597a602d6bd12a58cd3600512849d5a7fae14989" gracePeriod=30 Mar 13 14:02:47 crc 
kubenswrapper[4898]: I0313 14:02:47.756701 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkkd2\" (UniqueName: \"kubernetes.io/projected/8a9b9a59-64ad-4602-88da-91583ec126dc-kube-api-access-hkkd2\") pod \"auto-csr-approver-29556842-7h9s5\" (UID: \"8a9b9a59-64ad-4602-88da-91583ec126dc\") " pod="openshift-infra/auto-csr-approver-29556842-7h9s5" Mar 13 14:02:47 crc kubenswrapper[4898]: I0313 14:02:47.857604 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkkd2\" (UniqueName: \"kubernetes.io/projected/8a9b9a59-64ad-4602-88da-91583ec126dc-kube-api-access-hkkd2\") pod \"auto-csr-approver-29556842-7h9s5\" (UID: \"8a9b9a59-64ad-4602-88da-91583ec126dc\") " pod="openshift-infra/auto-csr-approver-29556842-7h9s5" Mar 13 14:02:47 crc kubenswrapper[4898]: I0313 14:02:47.879634 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkkd2\" (UniqueName: \"kubernetes.io/projected/8a9b9a59-64ad-4602-88da-91583ec126dc-kube-api-access-hkkd2\") pod \"auto-csr-approver-29556842-7h9s5\" (UID: \"8a9b9a59-64ad-4602-88da-91583ec126dc\") " pod="openshift-infra/auto-csr-approver-29556842-7h9s5" Mar 13 14:02:47 crc kubenswrapper[4898]: I0313 14:02:47.895880 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556842-7h9s5" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.109082 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76556fcf77-lhw2w" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.171834 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-db7fb4ff9-qlmhf" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.261058 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0561a31b-c67c-4410-8845-d47e4533be0a-config\") pod \"0561a31b-c67c-4410-8845-d47e4533be0a\" (UID: \"0561a31b-c67c-4410-8845-d47e4533be0a\") " Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.261105 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0561a31b-c67c-4410-8845-d47e4533be0a-client-ca\") pod \"0561a31b-c67c-4410-8845-d47e4533be0a\" (UID: \"0561a31b-c67c-4410-8845-d47e4533be0a\") " Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.261148 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0561a31b-c67c-4410-8845-d47e4533be0a-serving-cert\") pod \"0561a31b-c67c-4410-8845-d47e4533be0a\" (UID: \"0561a31b-c67c-4410-8845-d47e4533be0a\") " Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.261191 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6mvk\" (UniqueName: \"kubernetes.io/projected/0561a31b-c67c-4410-8845-d47e4533be0a-kube-api-access-h6mvk\") pod \"0561a31b-c67c-4410-8845-d47e4533be0a\" (UID: \"0561a31b-c67c-4410-8845-d47e4533be0a\") " Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.261835 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0561a31b-c67c-4410-8845-d47e4533be0a-client-ca" (OuterVolumeSpecName: "client-ca") pod "0561a31b-c67c-4410-8845-d47e4533be0a" (UID: "0561a31b-c67c-4410-8845-d47e4533be0a"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.262008 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0561a31b-c67c-4410-8845-d47e4533be0a-config" (OuterVolumeSpecName: "config") pod "0561a31b-c67c-4410-8845-d47e4533be0a" (UID: "0561a31b-c67c-4410-8845-d47e4533be0a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.265566 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0561a31b-c67c-4410-8845-d47e4533be0a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0561a31b-c67c-4410-8845-d47e4533be0a" (UID: "0561a31b-c67c-4410-8845-d47e4533be0a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.265586 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0561a31b-c67c-4410-8845-d47e4533be0a-kube-api-access-h6mvk" (OuterVolumeSpecName: "kube-api-access-h6mvk") pod "0561a31b-c67c-4410-8845-d47e4533be0a" (UID: "0561a31b-c67c-4410-8845-d47e4533be0a"). InnerVolumeSpecName "kube-api-access-h6mvk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.297008 4898 generic.go:334] "Generic (PLEG): container finished" podID="0561a31b-c67c-4410-8845-d47e4533be0a" containerID="11185c5dd36acbad20bedb23b2365654ca410b78e2e18b38a6ef92209812ef97" exitCode=0 Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.297090 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76556fcf77-lhw2w" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.297091 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76556fcf77-lhw2w" event={"ID":"0561a31b-c67c-4410-8845-d47e4533be0a","Type":"ContainerDied","Data":"11185c5dd36acbad20bedb23b2365654ca410b78e2e18b38a6ef92209812ef97"} Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.297209 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76556fcf77-lhw2w" event={"ID":"0561a31b-c67c-4410-8845-d47e4533be0a","Type":"ContainerDied","Data":"3f35c6846af9efc90831b47db53412c7613bf81bc4a1bb77055f1c9c4645cef8"} Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.297233 4898 scope.go:117] "RemoveContainer" containerID="11185c5dd36acbad20bedb23b2365654ca410b78e2e18b38a6ef92209812ef97" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.298356 4898 generic.go:334] "Generic (PLEG): container finished" podID="7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf" containerID="1c8b4c66a938ec22ac3bf1cf597a602d6bd12a58cd3600512849d5a7fae14989" exitCode=0 Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.298376 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-db7fb4ff9-qlmhf" event={"ID":"7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf","Type":"ContainerDied","Data":"1c8b4c66a938ec22ac3bf1cf597a602d6bd12a58cd3600512849d5a7fae14989"} Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.298432 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-db7fb4ff9-qlmhf" event={"ID":"7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf","Type":"ContainerDied","Data":"ba22e3087795c8be037e21c8d47195aba889300c63f1adc91b0790f8732ad1d9"} Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.298472 4898 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-db7fb4ff9-qlmhf" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.314620 4898 scope.go:117] "RemoveContainer" containerID="11185c5dd36acbad20bedb23b2365654ca410b78e2e18b38a6ef92209812ef97" Mar 13 14:02:48 crc kubenswrapper[4898]: E0313 14:02:48.314979 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11185c5dd36acbad20bedb23b2365654ca410b78e2e18b38a6ef92209812ef97\": container with ID starting with 11185c5dd36acbad20bedb23b2365654ca410b78e2e18b38a6ef92209812ef97 not found: ID does not exist" containerID="11185c5dd36acbad20bedb23b2365654ca410b78e2e18b38a6ef92209812ef97" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.315009 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11185c5dd36acbad20bedb23b2365654ca410b78e2e18b38a6ef92209812ef97"} err="failed to get container status \"11185c5dd36acbad20bedb23b2365654ca410b78e2e18b38a6ef92209812ef97\": rpc error: code = NotFound desc = could not find container \"11185c5dd36acbad20bedb23b2365654ca410b78e2e18b38a6ef92209812ef97\": container with ID starting with 11185c5dd36acbad20bedb23b2365654ca410b78e2e18b38a6ef92209812ef97 not found: ID does not exist" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.315029 4898 scope.go:117] "RemoveContainer" containerID="1c8b4c66a938ec22ac3bf1cf597a602d6bd12a58cd3600512849d5a7fae14989" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.322139 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76556fcf77-lhw2w"] Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.334173 4898 scope.go:117] "RemoveContainer" containerID="1c8b4c66a938ec22ac3bf1cf597a602d6bd12a58cd3600512849d5a7fae14989" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.334479 4898 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76556fcf77-lhw2w"] Mar 13 14:02:48 crc kubenswrapper[4898]: E0313 14:02:48.334805 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c8b4c66a938ec22ac3bf1cf597a602d6bd12a58cd3600512849d5a7fae14989\": container with ID starting with 1c8b4c66a938ec22ac3bf1cf597a602d6bd12a58cd3600512849d5a7fae14989 not found: ID does not exist" containerID="1c8b4c66a938ec22ac3bf1cf597a602d6bd12a58cd3600512849d5a7fae14989" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.334849 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c8b4c66a938ec22ac3bf1cf597a602d6bd12a58cd3600512849d5a7fae14989"} err="failed to get container status \"1c8b4c66a938ec22ac3bf1cf597a602d6bd12a58cd3600512849d5a7fae14989\": rpc error: code = NotFound desc = could not find container \"1c8b4c66a938ec22ac3bf1cf597a602d6bd12a58cd3600512849d5a7fae14989\": container with ID starting with 1c8b4c66a938ec22ac3bf1cf597a602d6bd12a58cd3600512849d5a7fae14989 not found: ID does not exist" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.348675 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556842-7h9s5"] Mar 13 14:02:48 crc kubenswrapper[4898]: W0313 14:02:48.354091 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a9b9a59_64ad_4602_88da_91583ec126dc.slice/crio-4d2585911a8b2947eab5790240e88ed8015c03581cb3080f1739046921122426 WatchSource:0}: Error finding container 4d2585911a8b2947eab5790240e88ed8015c03581cb3080f1739046921122426: Status 404 returned error can't find the container with id 4d2585911a8b2947eab5790240e88ed8015c03581cb3080f1739046921122426 Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.363524 4898 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf-client-ca\") pod \"7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf\" (UID: \"7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf\") " Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.364458 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf-client-ca" (OuterVolumeSpecName: "client-ca") pod "7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf" (UID: "7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.364470 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf-proxy-ca-bundles\") pod \"7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf\" (UID: \"7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf\") " Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.364574 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf-serving-cert\") pod \"7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf\" (UID: \"7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf\") " Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.364634 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9h6km\" (UniqueName: \"kubernetes.io/projected/7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf-kube-api-access-9h6km\") pod \"7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf\" (UID: \"7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf\") " Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.364682 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf-config\") pod \"7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf\" (UID: \"7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf\") " Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.365289 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0561a31b-c67c-4410-8845-d47e4533be0a-config\") on node \"crc\" DevicePath \"\"" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.365317 4898 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0561a31b-c67c-4410-8845-d47e4533be0a-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.365328 4898 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0561a31b-c67c-4410-8845-d47e4533be0a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.365340 4898 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.365351 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6mvk\" (UniqueName: \"kubernetes.io/projected/0561a31b-c67c-4410-8845-d47e4533be0a-kube-api-access-h6mvk\") on node \"crc\" DevicePath \"\"" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.365396 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf" (UID: "7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.365465 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf-config" (OuterVolumeSpecName: "config") pod "7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf" (UID: "7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.368216 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf-kube-api-access-9h6km" (OuterVolumeSpecName: "kube-api-access-9h6km") pod "7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf" (UID: "7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf"). InnerVolumeSpecName "kube-api-access-9h6km". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.368565 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf" (UID: "7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.466546 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9h6km\" (UniqueName: \"kubernetes.io/projected/7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf-kube-api-access-9h6km\") on node \"crc\" DevicePath \"\"" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.466572 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf-config\") on node \"crc\" DevicePath \"\"" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.466581 4898 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.466590 4898 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.625049 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-db7fb4ff9-qlmhf"] Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.630849 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-db7fb4ff9-qlmhf"] Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.900003 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76946b564d-4m5gm"] Mar 13 14:02:48 crc kubenswrapper[4898]: E0313 14:02:48.900589 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0561a31b-c67c-4410-8845-d47e4533be0a" containerName="route-controller-manager" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.900666 4898 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="0561a31b-c67c-4410-8845-d47e4533be0a" containerName="route-controller-manager" Mar 13 14:02:48 crc kubenswrapper[4898]: E0313 14:02:48.900747 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf" containerName="controller-manager" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.900803 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf" containerName="controller-manager" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.900997 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="0561a31b-c67c-4410-8845-d47e4533be0a" containerName="route-controller-manager" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.901130 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf" containerName="controller-manager" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.901671 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76946b564d-4m5gm" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.903252 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-d6f97d578-d974n"] Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.903470 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.903687 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.904001 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-d6f97d578-d974n" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.904153 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.908578 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.908782 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.908805 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.908632 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.909501 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.910034 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.910301 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.910522 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.911116 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 13 14:02:48 crc 
kubenswrapper[4898]: I0313 14:02:48.913595 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-d6f97d578-d974n"] Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.918422 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76946b564d-4m5gm"] Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.948448 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.971358 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/278b669f-19b0-49e8-9a35-d583ac818d86-serving-cert\") pod \"route-controller-manager-76946b564d-4m5gm\" (UID: \"278b669f-19b0-49e8-9a35-d583ac818d86\") " pod="openshift-route-controller-manager/route-controller-manager-76946b564d-4m5gm" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.971409 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/278b669f-19b0-49e8-9a35-d583ac818d86-config\") pod \"route-controller-manager-76946b564d-4m5gm\" (UID: \"278b669f-19b0-49e8-9a35-d583ac818d86\") " pod="openshift-route-controller-manager/route-controller-manager-76946b564d-4m5gm" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.971501 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3f9de415-59e6-40a1-8392-827c617e2ce8-client-ca\") pod \"controller-manager-d6f97d578-d974n\" (UID: \"3f9de415-59e6-40a1-8392-827c617e2ce8\") " pod="openshift-controller-manager/controller-manager-d6f97d578-d974n" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.971570 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24rjh\" (UniqueName: \"kubernetes.io/projected/278b669f-19b0-49e8-9a35-d583ac818d86-kube-api-access-24rjh\") pod \"route-controller-manager-76946b564d-4m5gm\" (UID: \"278b669f-19b0-49e8-9a35-d583ac818d86\") " pod="openshift-route-controller-manager/route-controller-manager-76946b564d-4m5gm" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.971600 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f9de415-59e6-40a1-8392-827c617e2ce8-serving-cert\") pod \"controller-manager-d6f97d578-d974n\" (UID: \"3f9de415-59e6-40a1-8392-827c617e2ce8\") " pod="openshift-controller-manager/controller-manager-d6f97d578-d974n" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.971628 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f9de415-59e6-40a1-8392-827c617e2ce8-config\") pod \"controller-manager-d6f97d578-d974n\" (UID: \"3f9de415-59e6-40a1-8392-827c617e2ce8\") " pod="openshift-controller-manager/controller-manager-d6f97d578-d974n" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.971728 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkj5d\" (UniqueName: \"kubernetes.io/projected/3f9de415-59e6-40a1-8392-827c617e2ce8-kube-api-access-zkj5d\") pod \"controller-manager-d6f97d578-d974n\" (UID: \"3f9de415-59e6-40a1-8392-827c617e2ce8\") " pod="openshift-controller-manager/controller-manager-d6f97d578-d974n" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.971785 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/278b669f-19b0-49e8-9a35-d583ac818d86-client-ca\") pod 
\"route-controller-manager-76946b564d-4m5gm\" (UID: \"278b669f-19b0-49e8-9a35-d583ac818d86\") " pod="openshift-route-controller-manager/route-controller-manager-76946b564d-4m5gm" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.971815 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3f9de415-59e6-40a1-8392-827c617e2ce8-proxy-ca-bundles\") pod \"controller-manager-d6f97d578-d974n\" (UID: \"3f9de415-59e6-40a1-8392-827c617e2ce8\") " pod="openshift-controller-manager/controller-manager-d6f97d578-d974n" Mar 13 14:02:49 crc kubenswrapper[4898]: I0313 14:02:49.072809 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24rjh\" (UniqueName: \"kubernetes.io/projected/278b669f-19b0-49e8-9a35-d583ac818d86-kube-api-access-24rjh\") pod \"route-controller-manager-76946b564d-4m5gm\" (UID: \"278b669f-19b0-49e8-9a35-d583ac818d86\") " pod="openshift-route-controller-manager/route-controller-manager-76946b564d-4m5gm" Mar 13 14:02:49 crc kubenswrapper[4898]: I0313 14:02:49.072860 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f9de415-59e6-40a1-8392-827c617e2ce8-serving-cert\") pod \"controller-manager-d6f97d578-d974n\" (UID: \"3f9de415-59e6-40a1-8392-827c617e2ce8\") " pod="openshift-controller-manager/controller-manager-d6f97d578-d974n" Mar 13 14:02:49 crc kubenswrapper[4898]: I0313 14:02:49.072888 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f9de415-59e6-40a1-8392-827c617e2ce8-config\") pod \"controller-manager-d6f97d578-d974n\" (UID: \"3f9de415-59e6-40a1-8392-827c617e2ce8\") " pod="openshift-controller-manager/controller-manager-d6f97d578-d974n" Mar 13 14:02:49 crc kubenswrapper[4898]: I0313 14:02:49.072930 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-zkj5d\" (UniqueName: \"kubernetes.io/projected/3f9de415-59e6-40a1-8392-827c617e2ce8-kube-api-access-zkj5d\") pod \"controller-manager-d6f97d578-d974n\" (UID: \"3f9de415-59e6-40a1-8392-827c617e2ce8\") " pod="openshift-controller-manager/controller-manager-d6f97d578-d974n" Mar 13 14:02:49 crc kubenswrapper[4898]: I0313 14:02:49.072954 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/278b669f-19b0-49e8-9a35-d583ac818d86-client-ca\") pod \"route-controller-manager-76946b564d-4m5gm\" (UID: \"278b669f-19b0-49e8-9a35-d583ac818d86\") " pod="openshift-route-controller-manager/route-controller-manager-76946b564d-4m5gm" Mar 13 14:02:49 crc kubenswrapper[4898]: I0313 14:02:49.072995 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3f9de415-59e6-40a1-8392-827c617e2ce8-proxy-ca-bundles\") pod \"controller-manager-d6f97d578-d974n\" (UID: \"3f9de415-59e6-40a1-8392-827c617e2ce8\") " pod="openshift-controller-manager/controller-manager-d6f97d578-d974n" Mar 13 14:02:49 crc kubenswrapper[4898]: I0313 14:02:49.073026 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/278b669f-19b0-49e8-9a35-d583ac818d86-serving-cert\") pod \"route-controller-manager-76946b564d-4m5gm\" (UID: \"278b669f-19b0-49e8-9a35-d583ac818d86\") " pod="openshift-route-controller-manager/route-controller-manager-76946b564d-4m5gm" Mar 13 14:02:49 crc kubenswrapper[4898]: I0313 14:02:49.073043 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/278b669f-19b0-49e8-9a35-d583ac818d86-config\") pod \"route-controller-manager-76946b564d-4m5gm\" (UID: \"278b669f-19b0-49e8-9a35-d583ac818d86\") " 
pod="openshift-route-controller-manager/route-controller-manager-76946b564d-4m5gm" Mar 13 14:02:49 crc kubenswrapper[4898]: I0313 14:02:49.073070 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3f9de415-59e6-40a1-8392-827c617e2ce8-client-ca\") pod \"controller-manager-d6f97d578-d974n\" (UID: \"3f9de415-59e6-40a1-8392-827c617e2ce8\") " pod="openshift-controller-manager/controller-manager-d6f97d578-d974n" Mar 13 14:02:49 crc kubenswrapper[4898]: I0313 14:02:49.074397 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/278b669f-19b0-49e8-9a35-d583ac818d86-client-ca\") pod \"route-controller-manager-76946b564d-4m5gm\" (UID: \"278b669f-19b0-49e8-9a35-d583ac818d86\") " pod="openshift-route-controller-manager/route-controller-manager-76946b564d-4m5gm" Mar 13 14:02:49 crc kubenswrapper[4898]: I0313 14:02:49.074883 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/278b669f-19b0-49e8-9a35-d583ac818d86-config\") pod \"route-controller-manager-76946b564d-4m5gm\" (UID: \"278b669f-19b0-49e8-9a35-d583ac818d86\") " pod="openshift-route-controller-manager/route-controller-manager-76946b564d-4m5gm" Mar 13 14:02:49 crc kubenswrapper[4898]: I0313 14:02:49.075250 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f9de415-59e6-40a1-8392-827c617e2ce8-config\") pod \"controller-manager-d6f97d578-d974n\" (UID: \"3f9de415-59e6-40a1-8392-827c617e2ce8\") " pod="openshift-controller-manager/controller-manager-d6f97d578-d974n" Mar 13 14:02:49 crc kubenswrapper[4898]: I0313 14:02:49.075986 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3f9de415-59e6-40a1-8392-827c617e2ce8-client-ca\") pod 
\"controller-manager-d6f97d578-d974n\" (UID: \"3f9de415-59e6-40a1-8392-827c617e2ce8\") " pod="openshift-controller-manager/controller-manager-d6f97d578-d974n" Mar 13 14:02:49 crc kubenswrapper[4898]: I0313 14:02:49.076009 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3f9de415-59e6-40a1-8392-827c617e2ce8-proxy-ca-bundles\") pod \"controller-manager-d6f97d578-d974n\" (UID: \"3f9de415-59e6-40a1-8392-827c617e2ce8\") " pod="openshift-controller-manager/controller-manager-d6f97d578-d974n" Mar 13 14:02:49 crc kubenswrapper[4898]: I0313 14:02:49.078529 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/278b669f-19b0-49e8-9a35-d583ac818d86-serving-cert\") pod \"route-controller-manager-76946b564d-4m5gm\" (UID: \"278b669f-19b0-49e8-9a35-d583ac818d86\") " pod="openshift-route-controller-manager/route-controller-manager-76946b564d-4m5gm" Mar 13 14:02:49 crc kubenswrapper[4898]: I0313 14:02:49.078725 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f9de415-59e6-40a1-8392-827c617e2ce8-serving-cert\") pod \"controller-manager-d6f97d578-d974n\" (UID: \"3f9de415-59e6-40a1-8392-827c617e2ce8\") " pod="openshift-controller-manager/controller-manager-d6f97d578-d974n" Mar 13 14:02:49 crc kubenswrapper[4898]: I0313 14:02:49.088137 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24rjh\" (UniqueName: \"kubernetes.io/projected/278b669f-19b0-49e8-9a35-d583ac818d86-kube-api-access-24rjh\") pod \"route-controller-manager-76946b564d-4m5gm\" (UID: \"278b669f-19b0-49e8-9a35-d583ac818d86\") " pod="openshift-route-controller-manager/route-controller-manager-76946b564d-4m5gm" Mar 13 14:02:49 crc kubenswrapper[4898]: I0313 14:02:49.103856 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-zkj5d\" (UniqueName: \"kubernetes.io/projected/3f9de415-59e6-40a1-8392-827c617e2ce8-kube-api-access-zkj5d\") pod \"controller-manager-d6f97d578-d974n\" (UID: \"3f9de415-59e6-40a1-8392-827c617e2ce8\") " pod="openshift-controller-manager/controller-manager-d6f97d578-d974n" Mar 13 14:02:49 crc kubenswrapper[4898]: I0313 14:02:49.261589 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76946b564d-4m5gm" Mar 13 14:02:49 crc kubenswrapper[4898]: I0313 14:02:49.271001 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-d6f97d578-d974n" Mar 13 14:02:49 crc kubenswrapper[4898]: I0313 14:02:49.309139 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556842-7h9s5" event={"ID":"8a9b9a59-64ad-4602-88da-91583ec126dc","Type":"ContainerStarted","Data":"4d2585911a8b2947eab5790240e88ed8015c03581cb3080f1739046921122426"} Mar 13 14:02:49 crc kubenswrapper[4898]: I0313 14:02:49.683019 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-d6f97d578-d974n"] Mar 13 14:02:49 crc kubenswrapper[4898]: W0313 14:02:49.684866 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f9de415_59e6_40a1_8392_827c617e2ce8.slice/crio-48528b351d9c8fb7ddf3057320b84878a31c1815533952151ea60cfed48b3a84 WatchSource:0}: Error finding container 48528b351d9c8fb7ddf3057320b84878a31c1815533952151ea60cfed48b3a84: Status 404 returned error can't find the container with id 48528b351d9c8fb7ddf3057320b84878a31c1815533952151ea60cfed48b3a84 Mar 13 14:02:49 crc kubenswrapper[4898]: I0313 14:02:49.747124 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0561a31b-c67c-4410-8845-d47e4533be0a" 
path="/var/lib/kubelet/pods/0561a31b-c67c-4410-8845-d47e4533be0a/volumes" Mar 13 14:02:49 crc kubenswrapper[4898]: I0313 14:02:49.747997 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf" path="/var/lib/kubelet/pods/7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf/volumes" Mar 13 14:02:49 crc kubenswrapper[4898]: I0313 14:02:49.748528 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76946b564d-4m5gm"] Mar 13 14:02:50 crc kubenswrapper[4898]: I0313 14:02:50.315378 4898 generic.go:334] "Generic (PLEG): container finished" podID="8a9b9a59-64ad-4602-88da-91583ec126dc" containerID="529194ce4ac0e19d00515e6fc6f6984803e8a03afdee3263ba4e434f1a13a57b" exitCode=0 Mar 13 14:02:50 crc kubenswrapper[4898]: I0313 14:02:50.315711 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556842-7h9s5" event={"ID":"8a9b9a59-64ad-4602-88da-91583ec126dc","Type":"ContainerDied","Data":"529194ce4ac0e19d00515e6fc6f6984803e8a03afdee3263ba4e434f1a13a57b"} Mar 13 14:02:50 crc kubenswrapper[4898]: I0313 14:02:50.317058 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-d6f97d578-d974n" event={"ID":"3f9de415-59e6-40a1-8392-827c617e2ce8","Type":"ContainerStarted","Data":"1fd3951d1aff7bf70501c814cdd5160a62e5847ba31d11cf12e32e1f23b242e1"} Mar 13 14:02:50 crc kubenswrapper[4898]: I0313 14:02:50.317093 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-d6f97d578-d974n" event={"ID":"3f9de415-59e6-40a1-8392-827c617e2ce8","Type":"ContainerStarted","Data":"48528b351d9c8fb7ddf3057320b84878a31c1815533952151ea60cfed48b3a84"} Mar 13 14:02:50 crc kubenswrapper[4898]: I0313 14:02:50.317797 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-d6f97d578-d974n" Mar 13 
14:02:50 crc kubenswrapper[4898]: I0313 14:02:50.318856 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76946b564d-4m5gm" event={"ID":"278b669f-19b0-49e8-9a35-d583ac818d86","Type":"ContainerStarted","Data":"4088ee8afc7274cc475dfb2367bd50ba012954a19435c83d57f81159d69220a9"}
Mar 13 14:02:50 crc kubenswrapper[4898]: I0313 14:02:50.318876 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76946b564d-4m5gm" event={"ID":"278b669f-19b0-49e8-9a35-d583ac818d86","Type":"ContainerStarted","Data":"92accbb2ec5b0ce2cc04a20381f279e22f42b2968b782d019c7ef8631795f69d"}
Mar 13 14:02:50 crc kubenswrapper[4898]: I0313 14:02:50.319299 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-76946b564d-4m5gm"
Mar 13 14:02:50 crc kubenswrapper[4898]: I0313 14:02:50.322033 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-d6f97d578-d974n"
Mar 13 14:02:50 crc kubenswrapper[4898]: I0313 14:02:50.347327 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-d6f97d578-d974n" podStartSLOduration=3.347307555 podStartE2EDuration="3.347307555s" podCreationTimestamp="2026-03-13 14:02:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:02:50.343141141 +0000 UTC m=+405.344729400" watchObservedRunningTime="2026-03-13 14:02:50.347307555 +0000 UTC m=+405.348895794"
Mar 13 14:02:50 crc kubenswrapper[4898]: I0313 14:02:50.370311 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-76946b564d-4m5gm" podStartSLOduration=3.370294993 podStartE2EDuration="3.370294993s" podCreationTimestamp="2026-03-13 14:02:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:02:50.368667762 +0000 UTC m=+405.370256011" watchObservedRunningTime="2026-03-13 14:02:50.370294993 +0000 UTC m=+405.371883232"
Mar 13 14:02:50 crc kubenswrapper[4898]: I0313 14:02:50.561796 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-76946b564d-4m5gm"
Mar 13 14:02:51 crc kubenswrapper[4898]: I0313 14:02:51.635140 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556842-7h9s5"
Mar 13 14:02:51 crc kubenswrapper[4898]: I0313 14:02:51.806169 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hkkd2\" (UniqueName: \"kubernetes.io/projected/8a9b9a59-64ad-4602-88da-91583ec126dc-kube-api-access-hkkd2\") pod \"8a9b9a59-64ad-4602-88da-91583ec126dc\" (UID: \"8a9b9a59-64ad-4602-88da-91583ec126dc\") "
Mar 13 14:02:51 crc kubenswrapper[4898]: I0313 14:02:51.813655 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a9b9a59-64ad-4602-88da-91583ec126dc-kube-api-access-hkkd2" (OuterVolumeSpecName: "kube-api-access-hkkd2") pod "8a9b9a59-64ad-4602-88da-91583ec126dc" (UID: "8a9b9a59-64ad-4602-88da-91583ec126dc"). InnerVolumeSpecName "kube-api-access-hkkd2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:02:51 crc kubenswrapper[4898]: I0313 14:02:51.907764 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hkkd2\" (UniqueName: \"kubernetes.io/projected/8a9b9a59-64ad-4602-88da-91583ec126dc-kube-api-access-hkkd2\") on node \"crc\" DevicePath \"\""
Mar 13 14:02:52 crc kubenswrapper[4898]: I0313 14:02:52.334959 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556842-7h9s5" event={"ID":"8a9b9a59-64ad-4602-88da-91583ec126dc","Type":"ContainerDied","Data":"4d2585911a8b2947eab5790240e88ed8015c03581cb3080f1739046921122426"}
Mar 13 14:02:52 crc kubenswrapper[4898]: I0313 14:02:52.335045 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556842-7h9s5"
Mar 13 14:02:52 crc kubenswrapper[4898]: I0313 14:02:52.335060 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d2585911a8b2947eab5790240e88ed8015c03581cb3080f1739046921122426"
Mar 13 14:03:12 crc kubenswrapper[4898]: I0313 14:03:12.913215 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-d6f97d578-d974n"]
Mar 13 14:03:12 crc kubenswrapper[4898]: I0313 14:03:12.914027 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-d6f97d578-d974n" podUID="3f9de415-59e6-40a1-8392-827c617e2ce8" containerName="controller-manager" containerID="cri-o://1fd3951d1aff7bf70501c814cdd5160a62e5847ba31d11cf12e32e1f23b242e1" gracePeriod=30
Mar 13 14:03:13 crc kubenswrapper[4898]: I0313 14:03:13.415112 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-d6f97d578-d974n"
Mar 13 14:03:13 crc kubenswrapper[4898]: I0313 14:03:13.458688 4898 generic.go:334] "Generic (PLEG): container finished" podID="3f9de415-59e6-40a1-8392-827c617e2ce8" containerID="1fd3951d1aff7bf70501c814cdd5160a62e5847ba31d11cf12e32e1f23b242e1" exitCode=0
Mar 13 14:03:13 crc kubenswrapper[4898]: I0313 14:03:13.458740 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-d6f97d578-d974n" event={"ID":"3f9de415-59e6-40a1-8392-827c617e2ce8","Type":"ContainerDied","Data":"1fd3951d1aff7bf70501c814cdd5160a62e5847ba31d11cf12e32e1f23b242e1"}
Mar 13 14:03:13 crc kubenswrapper[4898]: I0313 14:03:13.458776 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-d6f97d578-d974n" event={"ID":"3f9de415-59e6-40a1-8392-827c617e2ce8","Type":"ContainerDied","Data":"48528b351d9c8fb7ddf3057320b84878a31c1815533952151ea60cfed48b3a84"}
Mar 13 14:03:13 crc kubenswrapper[4898]: I0313 14:03:13.458786 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-d6f97d578-d974n"
Mar 13 14:03:13 crc kubenswrapper[4898]: I0313 14:03:13.458800 4898 scope.go:117] "RemoveContainer" containerID="1fd3951d1aff7bf70501c814cdd5160a62e5847ba31d11cf12e32e1f23b242e1"
Mar 13 14:03:13 crc kubenswrapper[4898]: I0313 14:03:13.476675 4898 scope.go:117] "RemoveContainer" containerID="1fd3951d1aff7bf70501c814cdd5160a62e5847ba31d11cf12e32e1f23b242e1"
Mar 13 14:03:13 crc kubenswrapper[4898]: E0313 14:03:13.477332 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fd3951d1aff7bf70501c814cdd5160a62e5847ba31d11cf12e32e1f23b242e1\": container with ID starting with 1fd3951d1aff7bf70501c814cdd5160a62e5847ba31d11cf12e32e1f23b242e1 not found: ID does not exist" containerID="1fd3951d1aff7bf70501c814cdd5160a62e5847ba31d11cf12e32e1f23b242e1"
Mar 13 14:03:13 crc kubenswrapper[4898]: I0313 14:03:13.477379 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fd3951d1aff7bf70501c814cdd5160a62e5847ba31d11cf12e32e1f23b242e1"} err="failed to get container status \"1fd3951d1aff7bf70501c814cdd5160a62e5847ba31d11cf12e32e1f23b242e1\": rpc error: code = NotFound desc = could not find container \"1fd3951d1aff7bf70501c814cdd5160a62e5847ba31d11cf12e32e1f23b242e1\": container with ID starting with 1fd3951d1aff7bf70501c814cdd5160a62e5847ba31d11cf12e32e1f23b242e1 not found: ID does not exist"
Mar 13 14:03:13 crc kubenswrapper[4898]: I0313 14:03:13.586008 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3f9de415-59e6-40a1-8392-827c617e2ce8-client-ca\") pod \"3f9de415-59e6-40a1-8392-827c617e2ce8\" (UID: \"3f9de415-59e6-40a1-8392-827c617e2ce8\") "
Mar 13 14:03:13 crc kubenswrapper[4898]: I0313 14:03:13.586093 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkj5d\" (UniqueName: \"kubernetes.io/projected/3f9de415-59e6-40a1-8392-827c617e2ce8-kube-api-access-zkj5d\") pod \"3f9de415-59e6-40a1-8392-827c617e2ce8\" (UID: \"3f9de415-59e6-40a1-8392-827c617e2ce8\") "
Mar 13 14:03:13 crc kubenswrapper[4898]: I0313 14:03:13.586152 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f9de415-59e6-40a1-8392-827c617e2ce8-config\") pod \"3f9de415-59e6-40a1-8392-827c617e2ce8\" (UID: \"3f9de415-59e6-40a1-8392-827c617e2ce8\") "
Mar 13 14:03:13 crc kubenswrapper[4898]: I0313 14:03:13.586215 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3f9de415-59e6-40a1-8392-827c617e2ce8-proxy-ca-bundles\") pod \"3f9de415-59e6-40a1-8392-827c617e2ce8\" (UID: \"3f9de415-59e6-40a1-8392-827c617e2ce8\") "
Mar 13 14:03:13 crc kubenswrapper[4898]: I0313 14:03:13.586261 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f9de415-59e6-40a1-8392-827c617e2ce8-serving-cert\") pod \"3f9de415-59e6-40a1-8392-827c617e2ce8\" (UID: \"3f9de415-59e6-40a1-8392-827c617e2ce8\") "
Mar 13 14:03:13 crc kubenswrapper[4898]: I0313 14:03:13.587015 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f9de415-59e6-40a1-8392-827c617e2ce8-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "3f9de415-59e6-40a1-8392-827c617e2ce8" (UID: "3f9de415-59e6-40a1-8392-827c617e2ce8"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 14:03:13 crc kubenswrapper[4898]: I0313 14:03:13.587045 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f9de415-59e6-40a1-8392-827c617e2ce8-config" (OuterVolumeSpecName: "config") pod "3f9de415-59e6-40a1-8392-827c617e2ce8" (UID: "3f9de415-59e6-40a1-8392-827c617e2ce8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 14:03:13 crc kubenswrapper[4898]: I0313 14:03:13.587436 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f9de415-59e6-40a1-8392-827c617e2ce8-client-ca" (OuterVolumeSpecName: "client-ca") pod "3f9de415-59e6-40a1-8392-827c617e2ce8" (UID: "3f9de415-59e6-40a1-8392-827c617e2ce8"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 14:03:13 crc kubenswrapper[4898]: I0313 14:03:13.593541 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f9de415-59e6-40a1-8392-827c617e2ce8-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3f9de415-59e6-40a1-8392-827c617e2ce8" (UID: "3f9de415-59e6-40a1-8392-827c617e2ce8"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:03:13 crc kubenswrapper[4898]: I0313 14:03:13.594112 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f9de415-59e6-40a1-8392-827c617e2ce8-kube-api-access-zkj5d" (OuterVolumeSpecName: "kube-api-access-zkj5d") pod "3f9de415-59e6-40a1-8392-827c617e2ce8" (UID: "3f9de415-59e6-40a1-8392-827c617e2ce8"). InnerVolumeSpecName "kube-api-access-zkj5d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:03:13 crc kubenswrapper[4898]: I0313 14:03:13.687262 4898 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3f9de415-59e6-40a1-8392-827c617e2ce8-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 13 14:03:13 crc kubenswrapper[4898]: I0313 14:03:13.687298 4898 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f9de415-59e6-40a1-8392-827c617e2ce8-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 13 14:03:13 crc kubenswrapper[4898]: I0313 14:03:13.687310 4898 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3f9de415-59e6-40a1-8392-827c617e2ce8-client-ca\") on node \"crc\" DevicePath \"\""
Mar 13 14:03:13 crc kubenswrapper[4898]: I0313 14:03:13.687325 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkj5d\" (UniqueName: \"kubernetes.io/projected/3f9de415-59e6-40a1-8392-827c617e2ce8-kube-api-access-zkj5d\") on node \"crc\" DevicePath \"\""
Mar 13 14:03:13 crc kubenswrapper[4898]: I0313 14:03:13.687339 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f9de415-59e6-40a1-8392-827c617e2ce8-config\") on node \"crc\" DevicePath \"\""
Mar 13 14:03:13 crc kubenswrapper[4898]: I0313 14:03:13.787157 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-d6f97d578-d974n"]
Mar 13 14:03:13 crc kubenswrapper[4898]: I0313 14:03:13.792057 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-d6f97d578-d974n"]
Mar 13 14:03:14 crc kubenswrapper[4898]: I0313 14:03:14.927600 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-797d9c85c-m5jdj"]
Mar 13 14:03:14 crc kubenswrapper[4898]: E0313 14:03:14.927875 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f9de415-59e6-40a1-8392-827c617e2ce8" containerName="controller-manager"
Mar 13 14:03:14 crc kubenswrapper[4898]: I0313 14:03:14.927915 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f9de415-59e6-40a1-8392-827c617e2ce8" containerName="controller-manager"
Mar 13 14:03:14 crc kubenswrapper[4898]: E0313 14:03:14.927938 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a9b9a59-64ad-4602-88da-91583ec126dc" containerName="oc"
Mar 13 14:03:14 crc kubenswrapper[4898]: I0313 14:03:14.927949 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a9b9a59-64ad-4602-88da-91583ec126dc" containerName="oc"
Mar 13 14:03:14 crc kubenswrapper[4898]: I0313 14:03:14.928095 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a9b9a59-64ad-4602-88da-91583ec126dc" containerName="oc"
Mar 13 14:03:14 crc kubenswrapper[4898]: I0313 14:03:14.928112 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f9de415-59e6-40a1-8392-827c617e2ce8" containerName="controller-manager"
Mar 13 14:03:14 crc kubenswrapper[4898]: I0313 14:03:14.928581 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-797d9c85c-m5jdj"
Mar 13 14:03:14 crc kubenswrapper[4898]: I0313 14:03:14.931022 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 13 14:03:14 crc kubenswrapper[4898]: I0313 14:03:14.931236 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 13 14:03:14 crc kubenswrapper[4898]: I0313 14:03:14.931459 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 13 14:03:14 crc kubenswrapper[4898]: I0313 14:03:14.934603 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 13 14:03:14 crc kubenswrapper[4898]: I0313 14:03:14.934844 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 13 14:03:14 crc kubenswrapper[4898]: I0313 14:03:14.934844 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 13 14:03:14 crc kubenswrapper[4898]: I0313 14:03:14.940886 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 13 14:03:14 crc kubenswrapper[4898]: I0313 14:03:14.963923 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-797d9c85c-m5jdj"]
Mar 13 14:03:15 crc kubenswrapper[4898]: I0313 14:03:15.004617 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2ec8c09e-475e-4c4b-86ec-38388754240f-client-ca\") pod \"controller-manager-797d9c85c-m5jdj\" (UID: \"2ec8c09e-475e-4c4b-86ec-38388754240f\") " pod="openshift-controller-manager/controller-manager-797d9c85c-m5jdj"
Mar 13 14:03:15 crc kubenswrapper[4898]: I0313 14:03:15.004679 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s75w9\" (UniqueName: \"kubernetes.io/projected/2ec8c09e-475e-4c4b-86ec-38388754240f-kube-api-access-s75w9\") pod \"controller-manager-797d9c85c-m5jdj\" (UID: \"2ec8c09e-475e-4c4b-86ec-38388754240f\") " pod="openshift-controller-manager/controller-manager-797d9c85c-m5jdj"
Mar 13 14:03:15 crc kubenswrapper[4898]: I0313 14:03:15.004944 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ec8c09e-475e-4c4b-86ec-38388754240f-config\") pod \"controller-manager-797d9c85c-m5jdj\" (UID: \"2ec8c09e-475e-4c4b-86ec-38388754240f\") " pod="openshift-controller-manager/controller-manager-797d9c85c-m5jdj"
Mar 13 14:03:15 crc kubenswrapper[4898]: I0313 14:03:15.005117 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ec8c09e-475e-4c4b-86ec-38388754240f-serving-cert\") pod \"controller-manager-797d9c85c-m5jdj\" (UID: \"2ec8c09e-475e-4c4b-86ec-38388754240f\") " pod="openshift-controller-manager/controller-manager-797d9c85c-m5jdj"
Mar 13 14:03:15 crc kubenswrapper[4898]: I0313 14:03:15.005213 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2ec8c09e-475e-4c4b-86ec-38388754240f-proxy-ca-bundles\") pod \"controller-manager-797d9c85c-m5jdj\" (UID: \"2ec8c09e-475e-4c4b-86ec-38388754240f\") " pod="openshift-controller-manager/controller-manager-797d9c85c-m5jdj"
Mar 13 14:03:15 crc kubenswrapper[4898]: I0313 14:03:15.105594 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ec8c09e-475e-4c4b-86ec-38388754240f-serving-cert\") pod \"controller-manager-797d9c85c-m5jdj\" (UID: \"2ec8c09e-475e-4c4b-86ec-38388754240f\") " pod="openshift-controller-manager/controller-manager-797d9c85c-m5jdj"
Mar 13 14:03:15 crc kubenswrapper[4898]: I0313 14:03:15.105663 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2ec8c09e-475e-4c4b-86ec-38388754240f-proxy-ca-bundles\") pod \"controller-manager-797d9c85c-m5jdj\" (UID: \"2ec8c09e-475e-4c4b-86ec-38388754240f\") " pod="openshift-controller-manager/controller-manager-797d9c85c-m5jdj"
Mar 13 14:03:15 crc kubenswrapper[4898]: I0313 14:03:15.105710 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2ec8c09e-475e-4c4b-86ec-38388754240f-client-ca\") pod \"controller-manager-797d9c85c-m5jdj\" (UID: \"2ec8c09e-475e-4c4b-86ec-38388754240f\") " pod="openshift-controller-manager/controller-manager-797d9c85c-m5jdj"
Mar 13 14:03:15 crc kubenswrapper[4898]: I0313 14:03:15.106986 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s75w9\" (UniqueName: \"kubernetes.io/projected/2ec8c09e-475e-4c4b-86ec-38388754240f-kube-api-access-s75w9\") pod \"controller-manager-797d9c85c-m5jdj\" (UID: \"2ec8c09e-475e-4c4b-86ec-38388754240f\") " pod="openshift-controller-manager/controller-manager-797d9c85c-m5jdj"
Mar 13 14:03:15 crc kubenswrapper[4898]: I0313 14:03:15.107100 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2ec8c09e-475e-4c4b-86ec-38388754240f-client-ca\") pod \"controller-manager-797d9c85c-m5jdj\" (UID: \"2ec8c09e-475e-4c4b-86ec-38388754240f\") " pod="openshift-controller-manager/controller-manager-797d9c85c-m5jdj"
Mar 13 14:03:15 crc kubenswrapper[4898]: I0313 14:03:15.107211 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ec8c09e-475e-4c4b-86ec-38388754240f-config\") pod \"controller-manager-797d9c85c-m5jdj\" (UID: \"2ec8c09e-475e-4c4b-86ec-38388754240f\") " pod="openshift-controller-manager/controller-manager-797d9c85c-m5jdj"
Mar 13 14:03:15 crc kubenswrapper[4898]: I0313 14:03:15.107275 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2ec8c09e-475e-4c4b-86ec-38388754240f-proxy-ca-bundles\") pod \"controller-manager-797d9c85c-m5jdj\" (UID: \"2ec8c09e-475e-4c4b-86ec-38388754240f\") " pod="openshift-controller-manager/controller-manager-797d9c85c-m5jdj"
Mar 13 14:03:15 crc kubenswrapper[4898]: I0313 14:03:15.108644 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ec8c09e-475e-4c4b-86ec-38388754240f-config\") pod \"controller-manager-797d9c85c-m5jdj\" (UID: \"2ec8c09e-475e-4c4b-86ec-38388754240f\") " pod="openshift-controller-manager/controller-manager-797d9c85c-m5jdj"
Mar 13 14:03:15 crc kubenswrapper[4898]: I0313 14:03:15.119054 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ec8c09e-475e-4c4b-86ec-38388754240f-serving-cert\") pod \"controller-manager-797d9c85c-m5jdj\" (UID: \"2ec8c09e-475e-4c4b-86ec-38388754240f\") " pod="openshift-controller-manager/controller-manager-797d9c85c-m5jdj"
Mar 13 14:03:15 crc kubenswrapper[4898]: I0313 14:03:15.121268 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s75w9\" (UniqueName: \"kubernetes.io/projected/2ec8c09e-475e-4c4b-86ec-38388754240f-kube-api-access-s75w9\") pod \"controller-manager-797d9c85c-m5jdj\" (UID: \"2ec8c09e-475e-4c4b-86ec-38388754240f\") " pod="openshift-controller-manager/controller-manager-797d9c85c-m5jdj"
Mar 13 14:03:15 crc kubenswrapper[4898]: I0313 14:03:15.257075 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-797d9c85c-m5jdj"
Mar 13 14:03:15 crc kubenswrapper[4898]: I0313 14:03:15.723319 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-797d9c85c-m5jdj"]
Mar 13 14:03:15 crc kubenswrapper[4898]: I0313 14:03:15.750609 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f9de415-59e6-40a1-8392-827c617e2ce8" path="/var/lib/kubelet/pods/3f9de415-59e6-40a1-8392-827c617e2ce8/volumes"
Mar 13 14:03:16 crc kubenswrapper[4898]: I0313 14:03:16.508777 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-797d9c85c-m5jdj" event={"ID":"2ec8c09e-475e-4c4b-86ec-38388754240f","Type":"ContainerStarted","Data":"ea168832255bbc86d5c5ce93816cc05501f0eb11f90b0102b5e3bb1a0fe8d967"}
Mar 13 14:03:16 crc kubenswrapper[4898]: I0313 14:03:16.509175 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-797d9c85c-m5jdj" event={"ID":"2ec8c09e-475e-4c4b-86ec-38388754240f","Type":"ContainerStarted","Data":"a5d66ae112a6b6960f8bede4d5c8a63d427f14d3ef98e541dc2b7652217eac67"}
Mar 13 14:03:16 crc kubenswrapper[4898]: I0313 14:03:16.509206 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-797d9c85c-m5jdj"
Mar 13 14:03:16 crc kubenswrapper[4898]: I0313 14:03:16.513763 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-797d9c85c-m5jdj"
Mar 13 14:03:16 crc kubenswrapper[4898]: I0313 14:03:16.534134 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-797d9c85c-m5jdj" podStartSLOduration=4.534104613 podStartE2EDuration="4.534104613s" podCreationTimestamp="2026-03-13 14:03:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:03:16.528268427 +0000 UTC m=+431.529856676" watchObservedRunningTime="2026-03-13 14:03:16.534104613 +0000 UTC m=+431.535692862"
Mar 13 14:03:19 crc kubenswrapper[4898]: I0313 14:03:19.134303 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 14:03:19 crc kubenswrapper[4898]: I0313 14:03:19.134991 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 14:03:20 crc kubenswrapper[4898]: I0313 14:03:20.121507 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-7zcdz"]
Mar 13 14:03:20 crc kubenswrapper[4898]: I0313 14:03:20.123204 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-7zcdz"
Mar 13 14:03:20 crc kubenswrapper[4898]: I0313 14:03:20.136509 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-7zcdz"]
Mar 13 14:03:20 crc kubenswrapper[4898]: I0313 14:03:20.180285 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ba480ebb-f079-4888-857b-d917e4a9b13b-registry-tls\") pod \"image-registry-66df7c8f76-7zcdz\" (UID: \"ba480ebb-f079-4888-857b-d917e4a9b13b\") " pod="openshift-image-registry/image-registry-66df7c8f76-7zcdz"
Mar 13 14:03:20 crc kubenswrapper[4898]: I0313 14:03:20.180353 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ba480ebb-f079-4888-857b-d917e4a9b13b-bound-sa-token\") pod \"image-registry-66df7c8f76-7zcdz\" (UID: \"ba480ebb-f079-4888-857b-d917e4a9b13b\") " pod="openshift-image-registry/image-registry-66df7c8f76-7zcdz"
Mar 13 14:03:20 crc kubenswrapper[4898]: I0313 14:03:20.180418 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ba480ebb-f079-4888-857b-d917e4a9b13b-registry-certificates\") pod \"image-registry-66df7c8f76-7zcdz\" (UID: \"ba480ebb-f079-4888-857b-d917e4a9b13b\") " pod="openshift-image-registry/image-registry-66df7c8f76-7zcdz"
Mar 13 14:03:20 crc kubenswrapper[4898]: I0313 14:03:20.180456 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ba480ebb-f079-4888-857b-d917e4a9b13b-trusted-ca\") pod \"image-registry-66df7c8f76-7zcdz\" (UID: \"ba480ebb-f079-4888-857b-d917e4a9b13b\") " pod="openshift-image-registry/image-registry-66df7c8f76-7zcdz"
Mar 13 14:03:20 crc kubenswrapper[4898]: I0313 14:03:20.180486 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-655ml\" (UniqueName: \"kubernetes.io/projected/ba480ebb-f079-4888-857b-d917e4a9b13b-kube-api-access-655ml\") pod \"image-registry-66df7c8f76-7zcdz\" (UID: \"ba480ebb-f079-4888-857b-d917e4a9b13b\") " pod="openshift-image-registry/image-registry-66df7c8f76-7zcdz"
Mar 13 14:03:20 crc kubenswrapper[4898]: I0313 14:03:20.180515 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ba480ebb-f079-4888-857b-d917e4a9b13b-installation-pull-secrets\") pod \"image-registry-66df7c8f76-7zcdz\" (UID: \"ba480ebb-f079-4888-857b-d917e4a9b13b\") " pod="openshift-image-registry/image-registry-66df7c8f76-7zcdz"
Mar 13 14:03:20 crc kubenswrapper[4898]: I0313 14:03:20.180549 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ba480ebb-f079-4888-857b-d917e4a9b13b-ca-trust-extracted\") pod \"image-registry-66df7c8f76-7zcdz\" (UID: \"ba480ebb-f079-4888-857b-d917e4a9b13b\") " pod="openshift-image-registry/image-registry-66df7c8f76-7zcdz"
Mar 13 14:03:20 crc kubenswrapper[4898]: I0313 14:03:20.180762 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-7zcdz\" (UID: \"ba480ebb-f079-4888-857b-d917e4a9b13b\") " pod="openshift-image-registry/image-registry-66df7c8f76-7zcdz"
Mar 13 14:03:20 crc kubenswrapper[4898]: I0313 14:03:20.212989 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-7zcdz\" (UID: \"ba480ebb-f079-4888-857b-d917e4a9b13b\") " pod="openshift-image-registry/image-registry-66df7c8f76-7zcdz"
Mar 13 14:03:20 crc kubenswrapper[4898]: I0313 14:03:20.281675 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ba480ebb-f079-4888-857b-d917e4a9b13b-registry-tls\") pod \"image-registry-66df7c8f76-7zcdz\" (UID: \"ba480ebb-f079-4888-857b-d917e4a9b13b\") " pod="openshift-image-registry/image-registry-66df7c8f76-7zcdz"
Mar 13 14:03:20 crc kubenswrapper[4898]: I0313 14:03:20.281735 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ba480ebb-f079-4888-857b-d917e4a9b13b-bound-sa-token\") pod \"image-registry-66df7c8f76-7zcdz\" (UID: \"ba480ebb-f079-4888-857b-d917e4a9b13b\") " pod="openshift-image-registry/image-registry-66df7c8f76-7zcdz"
Mar 13 14:03:20 crc kubenswrapper[4898]: I0313 14:03:20.281787 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ba480ebb-f079-4888-857b-d917e4a9b13b-registry-certificates\") pod \"image-registry-66df7c8f76-7zcdz\" (UID: \"ba480ebb-f079-4888-857b-d917e4a9b13b\") " pod="openshift-image-registry/image-registry-66df7c8f76-7zcdz"
Mar 13 14:03:20 crc kubenswrapper[4898]: I0313 14:03:20.281825 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ba480ebb-f079-4888-857b-d917e4a9b13b-trusted-ca\") pod \"image-registry-66df7c8f76-7zcdz\" (UID: \"ba480ebb-f079-4888-857b-d917e4a9b13b\") " pod="openshift-image-registry/image-registry-66df7c8f76-7zcdz"
Mar 13 14:03:20 crc kubenswrapper[4898]: I0313 14:03:20.281856 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-655ml\" (UniqueName: \"kubernetes.io/projected/ba480ebb-f079-4888-857b-d917e4a9b13b-kube-api-access-655ml\") pod \"image-registry-66df7c8f76-7zcdz\" (UID: \"ba480ebb-f079-4888-857b-d917e4a9b13b\") " pod="openshift-image-registry/image-registry-66df7c8f76-7zcdz"
Mar 13 14:03:20 crc kubenswrapper[4898]: I0313 14:03:20.281884 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ba480ebb-f079-4888-857b-d917e4a9b13b-installation-pull-secrets\") pod \"image-registry-66df7c8f76-7zcdz\" (UID: \"ba480ebb-f079-4888-857b-d917e4a9b13b\") " pod="openshift-image-registry/image-registry-66df7c8f76-7zcdz"
Mar 13 14:03:20 crc kubenswrapper[4898]: I0313 14:03:20.281949 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ba480ebb-f079-4888-857b-d917e4a9b13b-ca-trust-extracted\") pod \"image-registry-66df7c8f76-7zcdz\" (UID: \"ba480ebb-f079-4888-857b-d917e4a9b13b\") " pod="openshift-image-registry/image-registry-66df7c8f76-7zcdz"
Mar 13 14:03:20 crc kubenswrapper[4898]: I0313 14:03:20.282673 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ba480ebb-f079-4888-857b-d917e4a9b13b-ca-trust-extracted\") pod \"image-registry-66df7c8f76-7zcdz\" (UID: \"ba480ebb-f079-4888-857b-d917e4a9b13b\") " pod="openshift-image-registry/image-registry-66df7c8f76-7zcdz"
Mar 13 14:03:20 crc kubenswrapper[4898]: I0313 14:03:20.284123 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ba480ebb-f079-4888-857b-d917e4a9b13b-trusted-ca\") pod \"image-registry-66df7c8f76-7zcdz\" (UID: \"ba480ebb-f079-4888-857b-d917e4a9b13b\") " pod="openshift-image-registry/image-registry-66df7c8f76-7zcdz"
Mar 13 14:03:20 crc kubenswrapper[4898]: I0313 14:03:20.284226 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ba480ebb-f079-4888-857b-d917e4a9b13b-registry-certificates\") pod \"image-registry-66df7c8f76-7zcdz\" (UID: \"ba480ebb-f079-4888-857b-d917e4a9b13b\") " pod="openshift-image-registry/image-registry-66df7c8f76-7zcdz"
Mar 13 14:03:20 crc kubenswrapper[4898]: I0313 14:03:20.287546 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ba480ebb-f079-4888-857b-d917e4a9b13b-installation-pull-secrets\") pod \"image-registry-66df7c8f76-7zcdz\" (UID: \"ba480ebb-f079-4888-857b-d917e4a9b13b\") " pod="openshift-image-registry/image-registry-66df7c8f76-7zcdz"
Mar 13 14:03:20 crc kubenswrapper[4898]: I0313 14:03:20.290527 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ba480ebb-f079-4888-857b-d917e4a9b13b-registry-tls\") pod \"image-registry-66df7c8f76-7zcdz\" (UID: \"ba480ebb-f079-4888-857b-d917e4a9b13b\") " pod="openshift-image-registry/image-registry-66df7c8f76-7zcdz"
Mar 13 14:03:20 crc kubenswrapper[4898]: I0313 14:03:20.301350 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-655ml\" (UniqueName: \"kubernetes.io/projected/ba480ebb-f079-4888-857b-d917e4a9b13b-kube-api-access-655ml\") pod \"image-registry-66df7c8f76-7zcdz\" (UID: \"ba480ebb-f079-4888-857b-d917e4a9b13b\") " pod="openshift-image-registry/image-registry-66df7c8f76-7zcdz"
Mar 13 14:03:20 crc kubenswrapper[4898]: I0313 14:03:20.305650 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ba480ebb-f079-4888-857b-d917e4a9b13b-bound-sa-token\") pod \"image-registry-66df7c8f76-7zcdz\" (UID: \"ba480ebb-f079-4888-857b-d917e4a9b13b\") " pod="openshift-image-registry/image-registry-66df7c8f76-7zcdz"
Mar 13 14:03:20 crc kubenswrapper[4898]: I0313 14:03:20.460057 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-7zcdz"
Mar 13 14:03:20 crc kubenswrapper[4898]: I0313 14:03:20.880765 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-7zcdz"]
Mar 13 14:03:21 crc kubenswrapper[4898]: I0313 14:03:21.542251 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-7zcdz" event={"ID":"ba480ebb-f079-4888-857b-d917e4a9b13b","Type":"ContainerStarted","Data":"aa888781f304aed2b9c49d6608ce2ba009cb134b7b2da2e3ed2d684daedda363"}
Mar 13 14:03:21 crc kubenswrapper[4898]: I0313 14:03:21.542302 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-7zcdz" event={"ID":"ba480ebb-f079-4888-857b-d917e4a9b13b","Type":"ContainerStarted","Data":"7ba354dc972ae3e18545d089f815f3751645634c4a62217a7c6f39c3f9c1fc0c"}
Mar 13 14:03:21 crc kubenswrapper[4898]: I0313 14:03:21.542426 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-7zcdz"
Mar 13 14:03:21 crc kubenswrapper[4898]: I0313 14:03:21.560698 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-7zcdz" podStartSLOduration=1.560674404 podStartE2EDuration="1.560674404s" podCreationTimestamp="2026-03-13 14:03:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:03:21.558020698 +0000 UTC m=+436.559608937" watchObservedRunningTime="2026-03-13 14:03:21.560674404 +0000 UTC m=+436.562262663"
Mar 13 14:03:32 crc kubenswrapper[4898]: I0313 14:03:32.327636 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-btkxt"]
Mar 13 14:03:32 crc kubenswrapper[4898]: I0313 14:03:32.328804 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-btkxt" podUID="7794a943-5fec-485e-86bf-f104ed6ae070" containerName="registry-server" containerID="cri-o://72013575671e67bc50654bdc19c7f86358a2ef9f58c688c055736ec45a15a182" gracePeriod=2
Mar 13 14:03:32 crc kubenswrapper[4898]: I0313 14:03:32.621532 4898 generic.go:334] "Generic (PLEG): container finished" podID="7794a943-5fec-485e-86bf-f104ed6ae070" containerID="72013575671e67bc50654bdc19c7f86358a2ef9f58c688c055736ec45a15a182" exitCode=0
Mar 13 14:03:32 crc kubenswrapper[4898]: I0313 14:03:32.621632 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-btkxt" event={"ID":"7794a943-5fec-485e-86bf-f104ed6ae070","Type":"ContainerDied","Data":"72013575671e67bc50654bdc19c7f86358a2ef9f58c688c055736ec45a15a182"}
Mar 13 14:03:32 crc kubenswrapper[4898]: I0313 14:03:32.846420 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-btkxt" Mar 13 14:03:32 crc kubenswrapper[4898]: I0313 14:03:32.933589 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76946b564d-4m5gm"] Mar 13 14:03:32 crc kubenswrapper[4898]: I0313 14:03:32.934144 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-76946b564d-4m5gm" podUID="278b669f-19b0-49e8-9a35-d583ac818d86" containerName="route-controller-manager" containerID="cri-o://4088ee8afc7274cc475dfb2367bd50ba012954a19435c83d57f81159d69220a9" gracePeriod=30 Mar 13 14:03:32 crc kubenswrapper[4898]: I0313 14:03:32.973485 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66sn5\" (UniqueName: \"kubernetes.io/projected/7794a943-5fec-485e-86bf-f104ed6ae070-kube-api-access-66sn5\") pod \"7794a943-5fec-485e-86bf-f104ed6ae070\" (UID: \"7794a943-5fec-485e-86bf-f104ed6ae070\") " Mar 13 14:03:32 crc kubenswrapper[4898]: I0313 14:03:32.973538 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7794a943-5fec-485e-86bf-f104ed6ae070-catalog-content\") pod \"7794a943-5fec-485e-86bf-f104ed6ae070\" (UID: \"7794a943-5fec-485e-86bf-f104ed6ae070\") " Mar 13 14:03:32 crc kubenswrapper[4898]: I0313 14:03:32.973592 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7794a943-5fec-485e-86bf-f104ed6ae070-utilities\") pod \"7794a943-5fec-485e-86bf-f104ed6ae070\" (UID: \"7794a943-5fec-485e-86bf-f104ed6ae070\") " Mar 13 14:03:32 crc kubenswrapper[4898]: I0313 14:03:32.974817 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7794a943-5fec-485e-86bf-f104ed6ae070-utilities" (OuterVolumeSpecName: 
"utilities") pod "7794a943-5fec-485e-86bf-f104ed6ae070" (UID: "7794a943-5fec-485e-86bf-f104ed6ae070"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:03:32 crc kubenswrapper[4898]: I0313 14:03:32.982061 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7794a943-5fec-485e-86bf-f104ed6ae070-kube-api-access-66sn5" (OuterVolumeSpecName: "kube-api-access-66sn5") pod "7794a943-5fec-485e-86bf-f104ed6ae070" (UID: "7794a943-5fec-485e-86bf-f104ed6ae070"). InnerVolumeSpecName "kube-api-access-66sn5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:03:33 crc kubenswrapper[4898]: I0313 14:03:33.074638 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66sn5\" (UniqueName: \"kubernetes.io/projected/7794a943-5fec-485e-86bf-f104ed6ae070-kube-api-access-66sn5\") on node \"crc\" DevicePath \"\"" Mar 13 14:03:33 crc kubenswrapper[4898]: I0313 14:03:33.074925 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7794a943-5fec-485e-86bf-f104ed6ae070-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 14:03:33 crc kubenswrapper[4898]: I0313 14:03:33.120163 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7794a943-5fec-485e-86bf-f104ed6ae070-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7794a943-5fec-485e-86bf-f104ed6ae070" (UID: "7794a943-5fec-485e-86bf-f104ed6ae070"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:03:33 crc kubenswrapper[4898]: I0313 14:03:33.176049 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7794a943-5fec-485e-86bf-f104ed6ae070-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 14:03:33 crc kubenswrapper[4898]: I0313 14:03:33.537156 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76946b564d-4m5gm" Mar 13 14:03:33 crc kubenswrapper[4898]: I0313 14:03:33.628844 4898 generic.go:334] "Generic (PLEG): container finished" podID="278b669f-19b0-49e8-9a35-d583ac818d86" containerID="4088ee8afc7274cc475dfb2367bd50ba012954a19435c83d57f81159d69220a9" exitCode=0 Mar 13 14:03:33 crc kubenswrapper[4898]: I0313 14:03:33.628934 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76946b564d-4m5gm" Mar 13 14:03:33 crc kubenswrapper[4898]: I0313 14:03:33.628953 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76946b564d-4m5gm" event={"ID":"278b669f-19b0-49e8-9a35-d583ac818d86","Type":"ContainerDied","Data":"4088ee8afc7274cc475dfb2367bd50ba012954a19435c83d57f81159d69220a9"} Mar 13 14:03:33 crc kubenswrapper[4898]: I0313 14:03:33.629115 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76946b564d-4m5gm" event={"ID":"278b669f-19b0-49e8-9a35-d583ac818d86","Type":"ContainerDied","Data":"92accbb2ec5b0ce2cc04a20381f279e22f42b2968b782d019c7ef8631795f69d"} Mar 13 14:03:33 crc kubenswrapper[4898]: I0313 14:03:33.629146 4898 scope.go:117] "RemoveContainer" containerID="4088ee8afc7274cc475dfb2367bd50ba012954a19435c83d57f81159d69220a9" Mar 13 14:03:33 crc kubenswrapper[4898]: I0313 14:03:33.632141 4898 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-btkxt" event={"ID":"7794a943-5fec-485e-86bf-f104ed6ae070","Type":"ContainerDied","Data":"9107b24c316c7c4b1a47858e8c7bbd33ee4b48e192b020a209e38129a6fd6f89"} Mar 13 14:03:33 crc kubenswrapper[4898]: I0313 14:03:33.632248 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-btkxt" Mar 13 14:03:33 crc kubenswrapper[4898]: I0313 14:03:33.650532 4898 scope.go:117] "RemoveContainer" containerID="4088ee8afc7274cc475dfb2367bd50ba012954a19435c83d57f81159d69220a9" Mar 13 14:03:33 crc kubenswrapper[4898]: E0313 14:03:33.652468 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4088ee8afc7274cc475dfb2367bd50ba012954a19435c83d57f81159d69220a9\": container with ID starting with 4088ee8afc7274cc475dfb2367bd50ba012954a19435c83d57f81159d69220a9 not found: ID does not exist" containerID="4088ee8afc7274cc475dfb2367bd50ba012954a19435c83d57f81159d69220a9" Mar 13 14:03:33 crc kubenswrapper[4898]: I0313 14:03:33.652522 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4088ee8afc7274cc475dfb2367bd50ba012954a19435c83d57f81159d69220a9"} err="failed to get container status \"4088ee8afc7274cc475dfb2367bd50ba012954a19435c83d57f81159d69220a9\": rpc error: code = NotFound desc = could not find container \"4088ee8afc7274cc475dfb2367bd50ba012954a19435c83d57f81159d69220a9\": container with ID starting with 4088ee8afc7274cc475dfb2367bd50ba012954a19435c83d57f81159d69220a9 not found: ID does not exist" Mar 13 14:03:33 crc kubenswrapper[4898]: I0313 14:03:33.652556 4898 scope.go:117] "RemoveContainer" containerID="72013575671e67bc50654bdc19c7f86358a2ef9f58c688c055736ec45a15a182" Mar 13 14:03:33 crc kubenswrapper[4898]: I0313 14:03:33.670813 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-btkxt"] Mar 13 14:03:33 crc kubenswrapper[4898]: I0313 14:03:33.675283 4898 scope.go:117] "RemoveContainer" containerID="67b3d784e0ee63e0bd8e175cf0b8537e1ca35f7833ddbd4c0468016c4030500b" Mar 13 14:03:33 crc kubenswrapper[4898]: I0313 14:03:33.676163 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-btkxt"] Mar 13 14:03:33 crc kubenswrapper[4898]: I0313 14:03:33.681359 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/278b669f-19b0-49e8-9a35-d583ac818d86-config\") pod \"278b669f-19b0-49e8-9a35-d583ac818d86\" (UID: \"278b669f-19b0-49e8-9a35-d583ac818d86\") " Mar 13 14:03:33 crc kubenswrapper[4898]: I0313 14:03:33.681413 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/278b669f-19b0-49e8-9a35-d583ac818d86-client-ca\") pod \"278b669f-19b0-49e8-9a35-d583ac818d86\" (UID: \"278b669f-19b0-49e8-9a35-d583ac818d86\") " Mar 13 14:03:33 crc kubenswrapper[4898]: I0313 14:03:33.681470 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24rjh\" (UniqueName: \"kubernetes.io/projected/278b669f-19b0-49e8-9a35-d583ac818d86-kube-api-access-24rjh\") pod \"278b669f-19b0-49e8-9a35-d583ac818d86\" (UID: \"278b669f-19b0-49e8-9a35-d583ac818d86\") " Mar 13 14:03:33 crc kubenswrapper[4898]: I0313 14:03:33.681518 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/278b669f-19b0-49e8-9a35-d583ac818d86-serving-cert\") pod \"278b669f-19b0-49e8-9a35-d583ac818d86\" (UID: \"278b669f-19b0-49e8-9a35-d583ac818d86\") " Mar 13 14:03:33 crc kubenswrapper[4898]: I0313 14:03:33.682752 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/278b669f-19b0-49e8-9a35-d583ac818d86-client-ca" (OuterVolumeSpecName: "client-ca") pod "278b669f-19b0-49e8-9a35-d583ac818d86" (UID: "278b669f-19b0-49e8-9a35-d583ac818d86"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:03:33 crc kubenswrapper[4898]: I0313 14:03:33.682874 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/278b669f-19b0-49e8-9a35-d583ac818d86-config" (OuterVolumeSpecName: "config") pod "278b669f-19b0-49e8-9a35-d583ac818d86" (UID: "278b669f-19b0-49e8-9a35-d583ac818d86"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:03:33 crc kubenswrapper[4898]: I0313 14:03:33.689730 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/278b669f-19b0-49e8-9a35-d583ac818d86-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "278b669f-19b0-49e8-9a35-d583ac818d86" (UID: "278b669f-19b0-49e8-9a35-d583ac818d86"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:03:33 crc kubenswrapper[4898]: I0313 14:03:33.689770 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/278b669f-19b0-49e8-9a35-d583ac818d86-kube-api-access-24rjh" (OuterVolumeSpecName: "kube-api-access-24rjh") pod "278b669f-19b0-49e8-9a35-d583ac818d86" (UID: "278b669f-19b0-49e8-9a35-d583ac818d86"). InnerVolumeSpecName "kube-api-access-24rjh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:03:33 crc kubenswrapper[4898]: I0313 14:03:33.701692 4898 scope.go:117] "RemoveContainer" containerID="b05ab9f2e4156ffc544ae5bf8d297fc15f45604caf9f75b4b1a59e033d78a2fc" Mar 13 14:03:33 crc kubenswrapper[4898]: I0313 14:03:33.753112 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7794a943-5fec-485e-86bf-f104ed6ae070" path="/var/lib/kubelet/pods/7794a943-5fec-485e-86bf-f104ed6ae070/volumes" Mar 13 14:03:33 crc kubenswrapper[4898]: I0313 14:03:33.783998 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/278b669f-19b0-49e8-9a35-d583ac818d86-config\") on node \"crc\" DevicePath \"\"" Mar 13 14:03:33 crc kubenswrapper[4898]: I0313 14:03:33.784066 4898 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/278b669f-19b0-49e8-9a35-d583ac818d86-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 14:03:33 crc kubenswrapper[4898]: I0313 14:03:33.784093 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24rjh\" (UniqueName: \"kubernetes.io/projected/278b669f-19b0-49e8-9a35-d583ac818d86-kube-api-access-24rjh\") on node \"crc\" DevicePath \"\"" Mar 13 14:03:33 crc kubenswrapper[4898]: I0313 14:03:33.784120 4898 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/278b669f-19b0-49e8-9a35-d583ac818d86-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 14:03:33 crc kubenswrapper[4898]: I0313 14:03:33.954985 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76946b564d-4m5gm"] Mar 13 14:03:33 crc kubenswrapper[4898]: I0313 14:03:33.968055 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76946b564d-4m5gm"] Mar 13 14:03:34 crc kubenswrapper[4898]: I0313 
14:03:34.937649 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b756f97f-wjsf2"] Mar 13 14:03:34 crc kubenswrapper[4898]: E0313 14:03:34.937960 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="278b669f-19b0-49e8-9a35-d583ac818d86" containerName="route-controller-manager" Mar 13 14:03:34 crc kubenswrapper[4898]: I0313 14:03:34.937980 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="278b669f-19b0-49e8-9a35-d583ac818d86" containerName="route-controller-manager" Mar 13 14:03:34 crc kubenswrapper[4898]: E0313 14:03:34.938000 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7794a943-5fec-485e-86bf-f104ed6ae070" containerName="extract-utilities" Mar 13 14:03:34 crc kubenswrapper[4898]: I0313 14:03:34.938010 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="7794a943-5fec-485e-86bf-f104ed6ae070" containerName="extract-utilities" Mar 13 14:03:34 crc kubenswrapper[4898]: E0313 14:03:34.938028 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7794a943-5fec-485e-86bf-f104ed6ae070" containerName="extract-content" Mar 13 14:03:34 crc kubenswrapper[4898]: I0313 14:03:34.938040 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="7794a943-5fec-485e-86bf-f104ed6ae070" containerName="extract-content" Mar 13 14:03:34 crc kubenswrapper[4898]: E0313 14:03:34.938065 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7794a943-5fec-485e-86bf-f104ed6ae070" containerName="registry-server" Mar 13 14:03:34 crc kubenswrapper[4898]: I0313 14:03:34.938075 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="7794a943-5fec-485e-86bf-f104ed6ae070" containerName="registry-server" Mar 13 14:03:34 crc kubenswrapper[4898]: I0313 14:03:34.938220 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="278b669f-19b0-49e8-9a35-d583ac818d86" containerName="route-controller-manager" Mar 13 14:03:34 crc 
kubenswrapper[4898]: I0313 14:03:34.938245 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="7794a943-5fec-485e-86bf-f104ed6ae070" containerName="registry-server" Mar 13 14:03:34 crc kubenswrapper[4898]: I0313 14:03:34.938729 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7b756f97f-wjsf2" Mar 13 14:03:34 crc kubenswrapper[4898]: I0313 14:03:34.940693 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 13 14:03:34 crc kubenswrapper[4898]: I0313 14:03:34.940763 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 13 14:03:34 crc kubenswrapper[4898]: I0313 14:03:34.941622 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 13 14:03:34 crc kubenswrapper[4898]: I0313 14:03:34.942938 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 13 14:03:34 crc kubenswrapper[4898]: I0313 14:03:34.943128 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 13 14:03:34 crc kubenswrapper[4898]: I0313 14:03:34.943443 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 13 14:03:34 crc kubenswrapper[4898]: I0313 14:03:34.958095 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b756f97f-wjsf2"] Mar 13 14:03:35 crc kubenswrapper[4898]: I0313 14:03:35.095203 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/0376a3d3-f3a2-4674-a7f9-b06a9e62836e-client-ca\") pod \"route-controller-manager-7b756f97f-wjsf2\" (UID: \"0376a3d3-f3a2-4674-a7f9-b06a9e62836e\") " pod="openshift-route-controller-manager/route-controller-manager-7b756f97f-wjsf2" Mar 13 14:03:35 crc kubenswrapper[4898]: I0313 14:03:35.095319 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0376a3d3-f3a2-4674-a7f9-b06a9e62836e-config\") pod \"route-controller-manager-7b756f97f-wjsf2\" (UID: \"0376a3d3-f3a2-4674-a7f9-b06a9e62836e\") " pod="openshift-route-controller-manager/route-controller-manager-7b756f97f-wjsf2" Mar 13 14:03:35 crc kubenswrapper[4898]: I0313 14:03:35.095406 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0376a3d3-f3a2-4674-a7f9-b06a9e62836e-serving-cert\") pod \"route-controller-manager-7b756f97f-wjsf2\" (UID: \"0376a3d3-f3a2-4674-a7f9-b06a9e62836e\") " pod="openshift-route-controller-manager/route-controller-manager-7b756f97f-wjsf2" Mar 13 14:03:35 crc kubenswrapper[4898]: I0313 14:03:35.095447 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9m7b\" (UniqueName: \"kubernetes.io/projected/0376a3d3-f3a2-4674-a7f9-b06a9e62836e-kube-api-access-t9m7b\") pod \"route-controller-manager-7b756f97f-wjsf2\" (UID: \"0376a3d3-f3a2-4674-a7f9-b06a9e62836e\") " pod="openshift-route-controller-manager/route-controller-manager-7b756f97f-wjsf2" Mar 13 14:03:35 crc kubenswrapper[4898]: I0313 14:03:35.196117 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0376a3d3-f3a2-4674-a7f9-b06a9e62836e-serving-cert\") pod \"route-controller-manager-7b756f97f-wjsf2\" (UID: \"0376a3d3-f3a2-4674-a7f9-b06a9e62836e\") " 
pod="openshift-route-controller-manager/route-controller-manager-7b756f97f-wjsf2" Mar 13 14:03:35 crc kubenswrapper[4898]: I0313 14:03:35.196157 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9m7b\" (UniqueName: \"kubernetes.io/projected/0376a3d3-f3a2-4674-a7f9-b06a9e62836e-kube-api-access-t9m7b\") pod \"route-controller-manager-7b756f97f-wjsf2\" (UID: \"0376a3d3-f3a2-4674-a7f9-b06a9e62836e\") " pod="openshift-route-controller-manager/route-controller-manager-7b756f97f-wjsf2" Mar 13 14:03:35 crc kubenswrapper[4898]: I0313 14:03:35.196189 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0376a3d3-f3a2-4674-a7f9-b06a9e62836e-client-ca\") pod \"route-controller-manager-7b756f97f-wjsf2\" (UID: \"0376a3d3-f3a2-4674-a7f9-b06a9e62836e\") " pod="openshift-route-controller-manager/route-controller-manager-7b756f97f-wjsf2" Mar 13 14:03:35 crc kubenswrapper[4898]: I0313 14:03:35.196221 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0376a3d3-f3a2-4674-a7f9-b06a9e62836e-config\") pod \"route-controller-manager-7b756f97f-wjsf2\" (UID: \"0376a3d3-f3a2-4674-a7f9-b06a9e62836e\") " pod="openshift-route-controller-manager/route-controller-manager-7b756f97f-wjsf2" Mar 13 14:03:35 crc kubenswrapper[4898]: I0313 14:03:35.197421 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0376a3d3-f3a2-4674-a7f9-b06a9e62836e-client-ca\") pod \"route-controller-manager-7b756f97f-wjsf2\" (UID: \"0376a3d3-f3a2-4674-a7f9-b06a9e62836e\") " pod="openshift-route-controller-manager/route-controller-manager-7b756f97f-wjsf2" Mar 13 14:03:35 crc kubenswrapper[4898]: I0313 14:03:35.198302 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/0376a3d3-f3a2-4674-a7f9-b06a9e62836e-config\") pod \"route-controller-manager-7b756f97f-wjsf2\" (UID: \"0376a3d3-f3a2-4674-a7f9-b06a9e62836e\") " pod="openshift-route-controller-manager/route-controller-manager-7b756f97f-wjsf2" Mar 13 14:03:35 crc kubenswrapper[4898]: I0313 14:03:35.201685 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0376a3d3-f3a2-4674-a7f9-b06a9e62836e-serving-cert\") pod \"route-controller-manager-7b756f97f-wjsf2\" (UID: \"0376a3d3-f3a2-4674-a7f9-b06a9e62836e\") " pod="openshift-route-controller-manager/route-controller-manager-7b756f97f-wjsf2" Mar 13 14:03:35 crc kubenswrapper[4898]: I0313 14:03:35.212242 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9m7b\" (UniqueName: \"kubernetes.io/projected/0376a3d3-f3a2-4674-a7f9-b06a9e62836e-kube-api-access-t9m7b\") pod \"route-controller-manager-7b756f97f-wjsf2\" (UID: \"0376a3d3-f3a2-4674-a7f9-b06a9e62836e\") " pod="openshift-route-controller-manager/route-controller-manager-7b756f97f-wjsf2" Mar 13 14:03:35 crc kubenswrapper[4898]: I0313 14:03:35.402476 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7b756f97f-wjsf2" Mar 13 14:03:35 crc kubenswrapper[4898]: I0313 14:03:35.747278 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="278b669f-19b0-49e8-9a35-d583ac818d86" path="/var/lib/kubelet/pods/278b669f-19b0-49e8-9a35-d583ac818d86/volumes" Mar 13 14:03:35 crc kubenswrapper[4898]: I0313 14:03:35.826768 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b756f97f-wjsf2"] Mar 13 14:03:35 crc kubenswrapper[4898]: W0313 14:03:35.837577 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0376a3d3_f3a2_4674_a7f9_b06a9e62836e.slice/crio-c87f84f9f27a2d6cb574908c3031e9c0f2914a96f5af157028432a387025a4a0 WatchSource:0}: Error finding container c87f84f9f27a2d6cb574908c3031e9c0f2914a96f5af157028432a387025a4a0: Status 404 returned error can't find the container with id c87f84f9f27a2d6cb574908c3031e9c0f2914a96f5af157028432a387025a4a0 Mar 13 14:03:36 crc kubenswrapper[4898]: I0313 14:03:36.655001 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7b756f97f-wjsf2" event={"ID":"0376a3d3-f3a2-4674-a7f9-b06a9e62836e","Type":"ContainerStarted","Data":"0292f4db3076b00a4ff8a067c3c4184374170598a35c90418442c7305e929d8e"} Mar 13 14:03:36 crc kubenswrapper[4898]: I0313 14:03:36.655050 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7b756f97f-wjsf2" event={"ID":"0376a3d3-f3a2-4674-a7f9-b06a9e62836e","Type":"ContainerStarted","Data":"c87f84f9f27a2d6cb574908c3031e9c0f2914a96f5af157028432a387025a4a0"} Mar 13 14:03:36 crc kubenswrapper[4898]: I0313 14:03:36.655546 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-route-controller-manager/route-controller-manager-7b756f97f-wjsf2" Mar 13 14:03:36 crc kubenswrapper[4898]: I0313 14:03:36.663452 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7b756f97f-wjsf2" Mar 13 14:03:36 crc kubenswrapper[4898]: I0313 14:03:36.681600 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7b756f97f-wjsf2" podStartSLOduration=4.68156389 podStartE2EDuration="4.68156389s" podCreationTimestamp="2026-03-13 14:03:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:03:36.676129764 +0000 UTC m=+451.677718073" watchObservedRunningTime="2026-03-13 14:03:36.68156389 +0000 UTC m=+451.683152169" Mar 13 14:03:40 crc kubenswrapper[4898]: I0313 14:03:40.466013 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-7zcdz" Mar 13 14:03:40 crc kubenswrapper[4898]: I0313 14:03:40.531394 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6n228"] Mar 13 14:03:49 crc kubenswrapper[4898]: I0313 14:03:49.134779 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 14:03:49 crc kubenswrapper[4898]: I0313 14:03:49.135188 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused"
Mar 13 14:03:57 crc kubenswrapper[4898]: I0313 14:03:57.920634 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dvvz2"]
Mar 13 14:03:57 crc kubenswrapper[4898]: I0313 14:03:57.921740 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dvvz2" podUID="43acaee8-efc8-4156-b28c-b493f241ac53" containerName="registry-server" containerID="cri-o://8a40593eea81d6a95d388d6b35cd414db22d496cba0a3b511f6c3c4af3e4b8ec" gracePeriod=30
Mar 13 14:03:57 crc kubenswrapper[4898]: I0313 14:03:57.928735 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-twh8h"]
Mar 13 14:03:57 crc kubenswrapper[4898]: I0313 14:03:57.929053 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-twh8h" podUID="8f81bcfc-3c35-48e8-a584-961351e8c0e2" containerName="registry-server" containerID="cri-o://c0d126f66fb80fd38ad4cce383bbe14103ead798e0605b7596e1d4e7e5d8dd4c" gracePeriod=30
Mar 13 14:03:57 crc kubenswrapper[4898]: I0313 14:03:57.934683 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-p8r99"]
Mar 13 14:03:57 crc kubenswrapper[4898]: I0313 14:03:57.935065 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-p8r99" podUID="0a78868f-1786-430d-8df8-18bb1c2019b3" containerName="marketplace-operator" containerID="cri-o://fee68a52f02b5d009748f7d22e4efe8b71e5cbe2b4ec6b512eae62294cde6d24" gracePeriod=30
Mar 13 14:03:57 crc kubenswrapper[4898]: I0313 14:03:57.950522 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h97c9"]
Mar 13 14:03:57 crc kubenswrapper[4898]: I0313 14:03:57.950833 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-h97c9" podUID="f85f72a8-3887-4867-8a9c-649992ce23f1" containerName="registry-server" containerID="cri-o://cd39a5b62c38cd0e4291eb452dc66dbfa73d20085df0f04f24d87193b1c4faf8" gracePeriod=30
Mar 13 14:03:57 crc kubenswrapper[4898]: I0313 14:03:57.958673 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-z7ng7"]
Mar 13 14:03:57 crc kubenswrapper[4898]: I0313 14:03:57.959946 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-z7ng7"
Mar 13 14:03:57 crc kubenswrapper[4898]: I0313 14:03:57.963740 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-974qp"]
Mar 13 14:03:57 crc kubenswrapper[4898]: I0313 14:03:57.964088 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-974qp" podUID="183d86e9-cd5c-45ed-a460-bb6169e07c72" containerName="registry-server" containerID="cri-o://b28ca2f5572caf9aa06fca178d1a31d55764b021494704172c96d7af68b09635" gracePeriod=30
Mar 13 14:03:57 crc kubenswrapper[4898]: I0313 14:03:57.983736 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-z7ng7"]
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.034506 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9h75b\" (UniqueName: \"kubernetes.io/projected/b8942bb7-1cd2-49b9-8d98-5ba4c5f6c320-kube-api-access-9h75b\") pod \"marketplace-operator-79b997595-z7ng7\" (UID: \"b8942bb7-1cd2-49b9-8d98-5ba4c5f6c320\") " pod="openshift-marketplace/marketplace-operator-79b997595-z7ng7"
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.034593 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b8942bb7-1cd2-49b9-8d98-5ba4c5f6c320-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-z7ng7\" (UID: \"b8942bb7-1cd2-49b9-8d98-5ba4c5f6c320\") " pod="openshift-marketplace/marketplace-operator-79b997595-z7ng7"
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.034629 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b8942bb7-1cd2-49b9-8d98-5ba4c5f6c320-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-z7ng7\" (UID: \"b8942bb7-1cd2-49b9-8d98-5ba4c5f6c320\") " pod="openshift-marketplace/marketplace-operator-79b997595-z7ng7"
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.135939 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b8942bb7-1cd2-49b9-8d98-5ba4c5f6c320-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-z7ng7\" (UID: \"b8942bb7-1cd2-49b9-8d98-5ba4c5f6c320\") " pod="openshift-marketplace/marketplace-operator-79b997595-z7ng7"
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.136006 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9h75b\" (UniqueName: \"kubernetes.io/projected/b8942bb7-1cd2-49b9-8d98-5ba4c5f6c320-kube-api-access-9h75b\") pod \"marketplace-operator-79b997595-z7ng7\" (UID: \"b8942bb7-1cd2-49b9-8d98-5ba4c5f6c320\") " pod="openshift-marketplace/marketplace-operator-79b997595-z7ng7"
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.136078 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b8942bb7-1cd2-49b9-8d98-5ba4c5f6c320-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-z7ng7\" (UID: \"b8942bb7-1cd2-49b9-8d98-5ba4c5f6c320\") " pod="openshift-marketplace/marketplace-operator-79b997595-z7ng7"
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.137513 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b8942bb7-1cd2-49b9-8d98-5ba4c5f6c320-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-z7ng7\" (UID: \"b8942bb7-1cd2-49b9-8d98-5ba4c5f6c320\") " pod="openshift-marketplace/marketplace-operator-79b997595-z7ng7"
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.149142 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b8942bb7-1cd2-49b9-8d98-5ba4c5f6c320-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-z7ng7\" (UID: \"b8942bb7-1cd2-49b9-8d98-5ba4c5f6c320\") " pod="openshift-marketplace/marketplace-operator-79b997595-z7ng7"
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.153809 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9h75b\" (UniqueName: \"kubernetes.io/projected/b8942bb7-1cd2-49b9-8d98-5ba4c5f6c320-kube-api-access-9h75b\") pod \"marketplace-operator-79b997595-z7ng7\" (UID: \"b8942bb7-1cd2-49b9-8d98-5ba4c5f6c320\") " pod="openshift-marketplace/marketplace-operator-79b997595-z7ng7"
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.302496 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-z7ng7"
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.397129 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-twh8h"
Mar 13 14:03:58 crc kubenswrapper[4898]: E0313 14:03:58.430575 4898 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of cd39a5b62c38cd0e4291eb452dc66dbfa73d20085df0f04f24d87193b1c4faf8 is running failed: container process not found" containerID="cd39a5b62c38cd0e4291eb452dc66dbfa73d20085df0f04f24d87193b1c4faf8" cmd=["grpc_health_probe","-addr=:50051"]
Mar 13 14:03:58 crc kubenswrapper[4898]: E0313 14:03:58.431022 4898 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of cd39a5b62c38cd0e4291eb452dc66dbfa73d20085df0f04f24d87193b1c4faf8 is running failed: container process not found" containerID="cd39a5b62c38cd0e4291eb452dc66dbfa73d20085df0f04f24d87193b1c4faf8" cmd=["grpc_health_probe","-addr=:50051"]
Mar 13 14:03:58 crc kubenswrapper[4898]: E0313 14:03:58.431275 4898 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of cd39a5b62c38cd0e4291eb452dc66dbfa73d20085df0f04f24d87193b1c4faf8 is running failed: container process not found" containerID="cd39a5b62c38cd0e4291eb452dc66dbfa73d20085df0f04f24d87193b1c4faf8" cmd=["grpc_health_probe","-addr=:50051"]
Mar 13 14:03:58 crc kubenswrapper[4898]: E0313 14:03:58.431301 4898 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of cd39a5b62c38cd0e4291eb452dc66dbfa73d20085df0f04f24d87193b1c4faf8 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-h97c9" podUID="f85f72a8-3887-4867-8a9c-649992ce23f1" containerName="registry-server"
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.441334 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f81bcfc-3c35-48e8-a584-961351e8c0e2-utilities\") pod \"8f81bcfc-3c35-48e8-a584-961351e8c0e2\" (UID: \"8f81bcfc-3c35-48e8-a584-961351e8c0e2\") "
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.441392 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5x728\" (UniqueName: \"kubernetes.io/projected/8f81bcfc-3c35-48e8-a584-961351e8c0e2-kube-api-access-5x728\") pod \"8f81bcfc-3c35-48e8-a584-961351e8c0e2\" (UID: \"8f81bcfc-3c35-48e8-a584-961351e8c0e2\") "
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.441459 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f81bcfc-3c35-48e8-a584-961351e8c0e2-catalog-content\") pod \"8f81bcfc-3c35-48e8-a584-961351e8c0e2\" (UID: \"8f81bcfc-3c35-48e8-a584-961351e8c0e2\") "
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.442391 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f81bcfc-3c35-48e8-a584-961351e8c0e2-utilities" (OuterVolumeSpecName: "utilities") pod "8f81bcfc-3c35-48e8-a584-961351e8c0e2" (UID: "8f81bcfc-3c35-48e8-a584-961351e8c0e2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.455398 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f81bcfc-3c35-48e8-a584-961351e8c0e2-kube-api-access-5x728" (OuterVolumeSpecName: "kube-api-access-5x728") pod "8f81bcfc-3c35-48e8-a584-961351e8c0e2" (UID: "8f81bcfc-3c35-48e8-a584-961351e8c0e2"). InnerVolumeSpecName "kube-api-access-5x728". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.483358 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-974qp"
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.485702 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-p8r99"
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.488544 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h97c9"
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.490363 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dvvz2"
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.517318 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f81bcfc-3c35-48e8-a584-961351e8c0e2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8f81bcfc-3c35-48e8-a584-961351e8c0e2" (UID: "8f81bcfc-3c35-48e8-a584-961351e8c0e2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.542926 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/183d86e9-cd5c-45ed-a460-bb6169e07c72-catalog-content\") pod \"183d86e9-cd5c-45ed-a460-bb6169e07c72\" (UID: \"183d86e9-cd5c-45ed-a460-bb6169e07c72\") "
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.543363 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f81bcfc-3c35-48e8-a584-961351e8c0e2-utilities\") on node \"crc\" DevicePath \"\""
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.543380 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5x728\" (UniqueName: \"kubernetes.io/projected/8f81bcfc-3c35-48e8-a584-961351e8c0e2-kube-api-access-5x728\") on node \"crc\" DevicePath \"\""
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.543390 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f81bcfc-3c35-48e8-a584-961351e8c0e2-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.644280 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnv8v\" (UniqueName: \"kubernetes.io/projected/183d86e9-cd5c-45ed-a460-bb6169e07c72-kube-api-access-vnv8v\") pod \"183d86e9-cd5c-45ed-a460-bb6169e07c72\" (UID: \"183d86e9-cd5c-45ed-a460-bb6169e07c72\") "
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.644337 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43acaee8-efc8-4156-b28c-b493f241ac53-catalog-content\") pod \"43acaee8-efc8-4156-b28c-b493f241ac53\" (UID: \"43acaee8-efc8-4156-b28c-b493f241ac53\") "
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.644364 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43acaee8-efc8-4156-b28c-b493f241ac53-utilities\") pod \"43acaee8-efc8-4156-b28c-b493f241ac53\" (UID: \"43acaee8-efc8-4156-b28c-b493f241ac53\") "
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.644391 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhlq2\" (UniqueName: \"kubernetes.io/projected/43acaee8-efc8-4156-b28c-b493f241ac53-kube-api-access-zhlq2\") pod \"43acaee8-efc8-4156-b28c-b493f241ac53\" (UID: \"43acaee8-efc8-4156-b28c-b493f241ac53\") "
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.644473 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/183d86e9-cd5c-45ed-a460-bb6169e07c72-utilities\") pod \"183d86e9-cd5c-45ed-a460-bb6169e07c72\" (UID: \"183d86e9-cd5c-45ed-a460-bb6169e07c72\") "
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.644513 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0a78868f-1786-430d-8df8-18bb1c2019b3-marketplace-operator-metrics\") pod \"0a78868f-1786-430d-8df8-18bb1c2019b3\" (UID: \"0a78868f-1786-430d-8df8-18bb1c2019b3\") "
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.644550 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f85f72a8-3887-4867-8a9c-649992ce23f1-catalog-content\") pod \"f85f72a8-3887-4867-8a9c-649992ce23f1\" (UID: \"f85f72a8-3887-4867-8a9c-649992ce23f1\") "
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.644591 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbnpf\" (UniqueName: \"kubernetes.io/projected/0a78868f-1786-430d-8df8-18bb1c2019b3-kube-api-access-rbnpf\") pod \"0a78868f-1786-430d-8df8-18bb1c2019b3\" (UID: \"0a78868f-1786-430d-8df8-18bb1c2019b3\") "
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.644615 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f85f72a8-3887-4867-8a9c-649992ce23f1-utilities\") pod \"f85f72a8-3887-4867-8a9c-649992ce23f1\" (UID: \"f85f72a8-3887-4867-8a9c-649992ce23f1\") "
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.644643 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5s4l\" (UniqueName: \"kubernetes.io/projected/f85f72a8-3887-4867-8a9c-649992ce23f1-kube-api-access-m5s4l\") pod \"f85f72a8-3887-4867-8a9c-649992ce23f1\" (UID: \"f85f72a8-3887-4867-8a9c-649992ce23f1\") "
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.644682 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0a78868f-1786-430d-8df8-18bb1c2019b3-marketplace-trusted-ca\") pod \"0a78868f-1786-430d-8df8-18bb1c2019b3\" (UID: \"0a78868f-1786-430d-8df8-18bb1c2019b3\") "
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.645773 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43acaee8-efc8-4156-b28c-b493f241ac53-utilities" (OuterVolumeSpecName: "utilities") pod "43acaee8-efc8-4156-b28c-b493f241ac53" (UID: "43acaee8-efc8-4156-b28c-b493f241ac53"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.645808 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f85f72a8-3887-4867-8a9c-649992ce23f1-utilities" (OuterVolumeSpecName: "utilities") pod "f85f72a8-3887-4867-8a9c-649992ce23f1" (UID: "f85f72a8-3887-4867-8a9c-649992ce23f1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.646131 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/183d86e9-cd5c-45ed-a460-bb6169e07c72-utilities" (OuterVolumeSpecName: "utilities") pod "183d86e9-cd5c-45ed-a460-bb6169e07c72" (UID: "183d86e9-cd5c-45ed-a460-bb6169e07c72"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.648632 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a78868f-1786-430d-8df8-18bb1c2019b3-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "0a78868f-1786-430d-8df8-18bb1c2019b3" (UID: "0a78868f-1786-430d-8df8-18bb1c2019b3"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.649177 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f85f72a8-3887-4867-8a9c-649992ce23f1-kube-api-access-m5s4l" (OuterVolumeSpecName: "kube-api-access-m5s4l") pod "f85f72a8-3887-4867-8a9c-649992ce23f1" (UID: "f85f72a8-3887-4867-8a9c-649992ce23f1"). InnerVolumeSpecName "kube-api-access-m5s4l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.649534 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a78868f-1786-430d-8df8-18bb1c2019b3-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "0a78868f-1786-430d-8df8-18bb1c2019b3" (UID: "0a78868f-1786-430d-8df8-18bb1c2019b3"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.649199 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43acaee8-efc8-4156-b28c-b493f241ac53-kube-api-access-zhlq2" (OuterVolumeSpecName: "kube-api-access-zhlq2") pod "43acaee8-efc8-4156-b28c-b493f241ac53" (UID: "43acaee8-efc8-4156-b28c-b493f241ac53"). InnerVolumeSpecName "kube-api-access-zhlq2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.652078 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/183d86e9-cd5c-45ed-a460-bb6169e07c72-kube-api-access-vnv8v" (OuterVolumeSpecName: "kube-api-access-vnv8v") pod "183d86e9-cd5c-45ed-a460-bb6169e07c72" (UID: "183d86e9-cd5c-45ed-a460-bb6169e07c72"). InnerVolumeSpecName "kube-api-access-vnv8v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.652878 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a78868f-1786-430d-8df8-18bb1c2019b3-kube-api-access-rbnpf" (OuterVolumeSpecName: "kube-api-access-rbnpf") pod "0a78868f-1786-430d-8df8-18bb1c2019b3" (UID: "0a78868f-1786-430d-8df8-18bb1c2019b3"). InnerVolumeSpecName "kube-api-access-rbnpf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.682133 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/183d86e9-cd5c-45ed-a460-bb6169e07c72-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "183d86e9-cd5c-45ed-a460-bb6169e07c72" (UID: "183d86e9-cd5c-45ed-a460-bb6169e07c72"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.685249 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f85f72a8-3887-4867-8a9c-649992ce23f1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f85f72a8-3887-4867-8a9c-649992ce23f1" (UID: "f85f72a8-3887-4867-8a9c-649992ce23f1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.699762 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43acaee8-efc8-4156-b28c-b493f241ac53-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "43acaee8-efc8-4156-b28c-b493f241ac53" (UID: "43acaee8-efc8-4156-b28c-b493f241ac53"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.738856 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-z7ng7"]
Mar 13 14:03:58 crc kubenswrapper[4898]: W0313 14:03:58.744801 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8942bb7_1cd2_49b9_8d98_5ba4c5f6c320.slice/crio-7e9d8f7e38cb2d318f8401c5b79ee3fe1e97c72f16f88351b75ec47ffe85f807 WatchSource:0}: Error finding container 7e9d8f7e38cb2d318f8401c5b79ee3fe1e97c72f16f88351b75ec47ffe85f807: Status 404 returned error can't find the container with id 7e9d8f7e38cb2d318f8401c5b79ee3fe1e97c72f16f88351b75ec47ffe85f807
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.745590 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/183d86e9-cd5c-45ed-a460-bb6169e07c72-utilities\") on node \"crc\" DevicePath \"\""
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.745621 4898 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0a78868f-1786-430d-8df8-18bb1c2019b3-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.745633 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f85f72a8-3887-4867-8a9c-649992ce23f1-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.745642 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbnpf\" (UniqueName: \"kubernetes.io/projected/0a78868f-1786-430d-8df8-18bb1c2019b3-kube-api-access-rbnpf\") on node \"crc\" DevicePath \"\""
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.745652 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f85f72a8-3887-4867-8a9c-649992ce23f1-utilities\") on node \"crc\" DevicePath \"\""
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.745660 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5s4l\" (UniqueName: \"kubernetes.io/projected/f85f72a8-3887-4867-8a9c-649992ce23f1-kube-api-access-m5s4l\") on node \"crc\" DevicePath \"\""
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.745668 4898 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0a78868f-1786-430d-8df8-18bb1c2019b3-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.746080 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vnv8v\" (UniqueName: \"kubernetes.io/projected/183d86e9-cd5c-45ed-a460-bb6169e07c72-kube-api-access-vnv8v\") on node \"crc\" DevicePath \"\""
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.746352 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43acaee8-efc8-4156-b28c-b493f241ac53-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.746365 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43acaee8-efc8-4156-b28c-b493f241ac53-utilities\") on node \"crc\" DevicePath \"\""
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.746375 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhlq2\" (UniqueName: \"kubernetes.io/projected/43acaee8-efc8-4156-b28c-b493f241ac53-kube-api-access-zhlq2\") on node \"crc\" DevicePath \"\""
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.746383 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/183d86e9-cd5c-45ed-a460-bb6169e07c72-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.840342 4898 generic.go:334] "Generic (PLEG): container finished" podID="f85f72a8-3887-4867-8a9c-649992ce23f1" containerID="cd39a5b62c38cd0e4291eb452dc66dbfa73d20085df0f04f24d87193b1c4faf8" exitCode=0
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.840417 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h97c9" event={"ID":"f85f72a8-3887-4867-8a9c-649992ce23f1","Type":"ContainerDied","Data":"cd39a5b62c38cd0e4291eb452dc66dbfa73d20085df0f04f24d87193b1c4faf8"}
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.840447 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h97c9" event={"ID":"f85f72a8-3887-4867-8a9c-649992ce23f1","Type":"ContainerDied","Data":"da11d51940a63fb9fd52ba5896a1fe2ba45d932b66b7a36000029b7816a483fc"}
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.840466 4898 scope.go:117] "RemoveContainer" containerID="cd39a5b62c38cd0e4291eb452dc66dbfa73d20085df0f04f24d87193b1c4faf8"
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.840828 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h97c9"
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.843708 4898 generic.go:334] "Generic (PLEG): container finished" podID="43acaee8-efc8-4156-b28c-b493f241ac53" containerID="8a40593eea81d6a95d388d6b35cd414db22d496cba0a3b511f6c3c4af3e4b8ec" exitCode=0
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.843756 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dvvz2" event={"ID":"43acaee8-efc8-4156-b28c-b493f241ac53","Type":"ContainerDied","Data":"8a40593eea81d6a95d388d6b35cd414db22d496cba0a3b511f6c3c4af3e4b8ec"}
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.843775 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dvvz2" event={"ID":"43acaee8-efc8-4156-b28c-b493f241ac53","Type":"ContainerDied","Data":"6d70382f54646dad1c6a01020a09851e8f00eda076ad91d5aba2e586ae668444"}
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.843822 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dvvz2"
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.852654 4898 generic.go:334] "Generic (PLEG): container finished" podID="183d86e9-cd5c-45ed-a460-bb6169e07c72" containerID="b28ca2f5572caf9aa06fca178d1a31d55764b021494704172c96d7af68b09635" exitCode=0
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.852703 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-974qp" event={"ID":"183d86e9-cd5c-45ed-a460-bb6169e07c72","Type":"ContainerDied","Data":"b28ca2f5572caf9aa06fca178d1a31d55764b021494704172c96d7af68b09635"}
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.852725 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-974qp" event={"ID":"183d86e9-cd5c-45ed-a460-bb6169e07c72","Type":"ContainerDied","Data":"3597e8f057d81527e3da3a21c0723e2cabe95896c1c2879fe09fef6825e9aab7"}
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.852780 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-974qp"
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.855475 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-z7ng7" event={"ID":"b8942bb7-1cd2-49b9-8d98-5ba4c5f6c320","Type":"ContainerStarted","Data":"7e9d8f7e38cb2d318f8401c5b79ee3fe1e97c72f16f88351b75ec47ffe85f807"}
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.859520 4898 generic.go:334] "Generic (PLEG): container finished" podID="0a78868f-1786-430d-8df8-18bb1c2019b3" containerID="fee68a52f02b5d009748f7d22e4efe8b71e5cbe2b4ec6b512eae62294cde6d24" exitCode=0
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.859565 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-p8r99" event={"ID":"0a78868f-1786-430d-8df8-18bb1c2019b3","Type":"ContainerDied","Data":"fee68a52f02b5d009748f7d22e4efe8b71e5cbe2b4ec6b512eae62294cde6d24"}
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.859601 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-p8r99"
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.859616 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-p8r99" event={"ID":"0a78868f-1786-430d-8df8-18bb1c2019b3","Type":"ContainerDied","Data":"5ce4caec01bc9ee8df0b59f3f0251f9037b82e485a55597652071608caca296b"}
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.860952 4898 scope.go:117] "RemoveContainer" containerID="8a033fa272e9b0ae10a8a39302b03fd524ffa35265e29f3d0f8c05e19edc4d0d"
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.865426 4898 generic.go:334] "Generic (PLEG): container finished" podID="8f81bcfc-3c35-48e8-a584-961351e8c0e2" containerID="c0d126f66fb80fd38ad4cce383bbe14103ead798e0605b7596e1d4e7e5d8dd4c" exitCode=0
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.865512 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-twh8h"
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.865526 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-twh8h" event={"ID":"8f81bcfc-3c35-48e8-a584-961351e8c0e2","Type":"ContainerDied","Data":"c0d126f66fb80fd38ad4cce383bbe14103ead798e0605b7596e1d4e7e5d8dd4c"}
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.865966 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-twh8h" event={"ID":"8f81bcfc-3c35-48e8-a584-961351e8c0e2","Type":"ContainerDied","Data":"81fb34feaf2adf00d5d07da217b484c8e9d6cdeb7a039901668613864eddf170"}
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.880713 4898 scope.go:117] "RemoveContainer" containerID="6f2184df8da4b3bf69f4145756b368ff2efd7bf87ea92af146fe995c57cb7485"
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.891098 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dvvz2"]
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.901244 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dvvz2"]
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.914044 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h97c9"]
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.914557 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-h97c9"]
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.918114 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-974qp"]
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.925030 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-974qp"]
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.931736 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-twh8h"]
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.939619 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-twh8h"]
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.940874 4898 scope.go:117] "RemoveContainer" containerID="cd39a5b62c38cd0e4291eb452dc66dbfa73d20085df0f04f24d87193b1c4faf8"
Mar 13 14:03:58 crc kubenswrapper[4898]: E0313 14:03:58.941461 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd39a5b62c38cd0e4291eb452dc66dbfa73d20085df0f04f24d87193b1c4faf8\": container with ID starting with cd39a5b62c38cd0e4291eb452dc66dbfa73d20085df0f04f24d87193b1c4faf8 not found: ID does not exist" containerID="cd39a5b62c38cd0e4291eb452dc66dbfa73d20085df0f04f24d87193b1c4faf8"
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.941494 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd39a5b62c38cd0e4291eb452dc66dbfa73d20085df0f04f24d87193b1c4faf8"} err="failed to get container status \"cd39a5b62c38cd0e4291eb452dc66dbfa73d20085df0f04f24d87193b1c4faf8\": rpc error: code = NotFound desc = could not find container \"cd39a5b62c38cd0e4291eb452dc66dbfa73d20085df0f04f24d87193b1c4faf8\": container with ID starting with cd39a5b62c38cd0e4291eb452dc66dbfa73d20085df0f04f24d87193b1c4faf8 not found: ID does not exist"
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.941523 4898 scope.go:117] "RemoveContainer" containerID="8a033fa272e9b0ae10a8a39302b03fd524ffa35265e29f3d0f8c05e19edc4d0d"
Mar 13 14:03:58 crc kubenswrapper[4898]: E0313 14:03:58.941759 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a033fa272e9b0ae10a8a39302b03fd524ffa35265e29f3d0f8c05e19edc4d0d\": container with ID starting with 8a033fa272e9b0ae10a8a39302b03fd524ffa35265e29f3d0f8c05e19edc4d0d not found: ID does not exist" containerID="8a033fa272e9b0ae10a8a39302b03fd524ffa35265e29f3d0f8c05e19edc4d0d"
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.941782 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a033fa272e9b0ae10a8a39302b03fd524ffa35265e29f3d0f8c05e19edc4d0d"} err="failed to get container status \"8a033fa272e9b0ae10a8a39302b03fd524ffa35265e29f3d0f8c05e19edc4d0d\": rpc error: code = NotFound desc = could not find container \"8a033fa272e9b0ae10a8a39302b03fd524ffa35265e29f3d0f8c05e19edc4d0d\": container with ID starting with 8a033fa272e9b0ae10a8a39302b03fd524ffa35265e29f3d0f8c05e19edc4d0d not found: ID does not exist"
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.941799 4898 scope.go:117] "RemoveContainer" containerID="6f2184df8da4b3bf69f4145756b368ff2efd7bf87ea92af146fe995c57cb7485"
Mar 13 14:03:58 crc kubenswrapper[4898]: E0313 14:03:58.942061 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f2184df8da4b3bf69f4145756b368ff2efd7bf87ea92af146fe995c57cb7485\": container with ID starting with 6f2184df8da4b3bf69f4145756b368ff2efd7bf87ea92af146fe995c57cb7485 not found: ID does not exist" containerID="6f2184df8da4b3bf69f4145756b368ff2efd7bf87ea92af146fe995c57cb7485"
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.942101 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f2184df8da4b3bf69f4145756b368ff2efd7bf87ea92af146fe995c57cb7485"} err="failed to get container status \"6f2184df8da4b3bf69f4145756b368ff2efd7bf87ea92af146fe995c57cb7485\": rpc error: code = NotFound desc = could not find container \"6f2184df8da4b3bf69f4145756b368ff2efd7bf87ea92af146fe995c57cb7485\": container with ID starting with 6f2184df8da4b3bf69f4145756b368ff2efd7bf87ea92af146fe995c57cb7485 not found: ID does not exist"
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.942120 4898 scope.go:117] "RemoveContainer" containerID="8a40593eea81d6a95d388d6b35cd414db22d496cba0a3b511f6c3c4af3e4b8ec"
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.951109 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-p8r99"]
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.955235 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-p8r99"]
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.956290 4898 scope.go:117] "RemoveContainer" containerID="166addc84a00d6cd28f5d4a11eaa406e638fc978d5ab44d8f9525754ee76c77b"
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.971514 4898 scope.go:117] "RemoveContainer" containerID="2d6a6fc4d86890be4033989a74b2cd86971250c91cd72a349673fbbc352230cf"
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.991415 4898 scope.go:117] "RemoveContainer" containerID="8a40593eea81d6a95d388d6b35cd414db22d496cba0a3b511f6c3c4af3e4b8ec"
Mar 13 14:03:58 crc kubenswrapper[4898]: E0313 14:03:58.991884 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a40593eea81d6a95d388d6b35cd414db22d496cba0a3b511f6c3c4af3e4b8ec\": container with ID starting with 8a40593eea81d6a95d388d6b35cd414db22d496cba0a3b511f6c3c4af3e4b8ec not found: ID does not exist" containerID="8a40593eea81d6a95d388d6b35cd414db22d496cba0a3b511f6c3c4af3e4b8ec"
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.991952 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a40593eea81d6a95d388d6b35cd414db22d496cba0a3b511f6c3c4af3e4b8ec"} err="failed to get container status \"8a40593eea81d6a95d388d6b35cd414db22d496cba0a3b511f6c3c4af3e4b8ec\": rpc error: code = NotFound desc = could not find container \"8a40593eea81d6a95d388d6b35cd414db22d496cba0a3b511f6c3c4af3e4b8ec\": container with ID starting with 8a40593eea81d6a95d388d6b35cd414db22d496cba0a3b511f6c3c4af3e4b8ec not found: ID does not exist"
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.991982 4898 scope.go:117] "RemoveContainer" containerID="166addc84a00d6cd28f5d4a11eaa406e638fc978d5ab44d8f9525754ee76c77b"
Mar 13 14:03:58 crc kubenswrapper[4898]: E0313 14:03:58.992273 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"166addc84a00d6cd28f5d4a11eaa406e638fc978d5ab44d8f9525754ee76c77b\": container with ID starting with 166addc84a00d6cd28f5d4a11eaa406e638fc978d5ab44d8f9525754ee76c77b not found: ID does not exist" containerID="166addc84a00d6cd28f5d4a11eaa406e638fc978d5ab44d8f9525754ee76c77b"
Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.992302 4898 pod_container_deletor.go:53] "DeleteContainer returned error"
containerID={"Type":"cri-o","ID":"166addc84a00d6cd28f5d4a11eaa406e638fc978d5ab44d8f9525754ee76c77b"} err="failed to get container status \"166addc84a00d6cd28f5d4a11eaa406e638fc978d5ab44d8f9525754ee76c77b\": rpc error: code = NotFound desc = could not find container \"166addc84a00d6cd28f5d4a11eaa406e638fc978d5ab44d8f9525754ee76c77b\": container with ID starting with 166addc84a00d6cd28f5d4a11eaa406e638fc978d5ab44d8f9525754ee76c77b not found: ID does not exist" Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.992322 4898 scope.go:117] "RemoveContainer" containerID="2d6a6fc4d86890be4033989a74b2cd86971250c91cd72a349673fbbc352230cf" Mar 13 14:03:58 crc kubenswrapper[4898]: E0313 14:03:58.992566 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d6a6fc4d86890be4033989a74b2cd86971250c91cd72a349673fbbc352230cf\": container with ID starting with 2d6a6fc4d86890be4033989a74b2cd86971250c91cd72a349673fbbc352230cf not found: ID does not exist" containerID="2d6a6fc4d86890be4033989a74b2cd86971250c91cd72a349673fbbc352230cf" Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.992589 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d6a6fc4d86890be4033989a74b2cd86971250c91cd72a349673fbbc352230cf"} err="failed to get container status \"2d6a6fc4d86890be4033989a74b2cd86971250c91cd72a349673fbbc352230cf\": rpc error: code = NotFound desc = could not find container \"2d6a6fc4d86890be4033989a74b2cd86971250c91cd72a349673fbbc352230cf\": container with ID starting with 2d6a6fc4d86890be4033989a74b2cd86971250c91cd72a349673fbbc352230cf not found: ID does not exist" Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.992604 4898 scope.go:117] "RemoveContainer" containerID="b28ca2f5572caf9aa06fca178d1a31d55764b021494704172c96d7af68b09635" Mar 13 14:03:59 crc kubenswrapper[4898]: I0313 14:03:59.006448 4898 scope.go:117] "RemoveContainer" 
containerID="4e3c2fc49e38fd08a1405311e1eced3f11241c9df7c680ba46b64e3f946fea47" Mar 13 14:03:59 crc kubenswrapper[4898]: I0313 14:03:59.031327 4898 scope.go:117] "RemoveContainer" containerID="7888e10b86e2b6b4b1a521af790a300e700cb557e90ae4215808993511904248" Mar 13 14:03:59 crc kubenswrapper[4898]: I0313 14:03:59.044517 4898 scope.go:117] "RemoveContainer" containerID="b28ca2f5572caf9aa06fca178d1a31d55764b021494704172c96d7af68b09635" Mar 13 14:03:59 crc kubenswrapper[4898]: E0313 14:03:59.044882 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b28ca2f5572caf9aa06fca178d1a31d55764b021494704172c96d7af68b09635\": container with ID starting with b28ca2f5572caf9aa06fca178d1a31d55764b021494704172c96d7af68b09635 not found: ID does not exist" containerID="b28ca2f5572caf9aa06fca178d1a31d55764b021494704172c96d7af68b09635" Mar 13 14:03:59 crc kubenswrapper[4898]: I0313 14:03:59.044950 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b28ca2f5572caf9aa06fca178d1a31d55764b021494704172c96d7af68b09635"} err="failed to get container status \"b28ca2f5572caf9aa06fca178d1a31d55764b021494704172c96d7af68b09635\": rpc error: code = NotFound desc = could not find container \"b28ca2f5572caf9aa06fca178d1a31d55764b021494704172c96d7af68b09635\": container with ID starting with b28ca2f5572caf9aa06fca178d1a31d55764b021494704172c96d7af68b09635 not found: ID does not exist" Mar 13 14:03:59 crc kubenswrapper[4898]: I0313 14:03:59.044979 4898 scope.go:117] "RemoveContainer" containerID="4e3c2fc49e38fd08a1405311e1eced3f11241c9df7c680ba46b64e3f946fea47" Mar 13 14:03:59 crc kubenswrapper[4898]: E0313 14:03:59.045287 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e3c2fc49e38fd08a1405311e1eced3f11241c9df7c680ba46b64e3f946fea47\": container with ID starting with 
4e3c2fc49e38fd08a1405311e1eced3f11241c9df7c680ba46b64e3f946fea47 not found: ID does not exist" containerID="4e3c2fc49e38fd08a1405311e1eced3f11241c9df7c680ba46b64e3f946fea47" Mar 13 14:03:59 crc kubenswrapper[4898]: I0313 14:03:59.045320 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e3c2fc49e38fd08a1405311e1eced3f11241c9df7c680ba46b64e3f946fea47"} err="failed to get container status \"4e3c2fc49e38fd08a1405311e1eced3f11241c9df7c680ba46b64e3f946fea47\": rpc error: code = NotFound desc = could not find container \"4e3c2fc49e38fd08a1405311e1eced3f11241c9df7c680ba46b64e3f946fea47\": container with ID starting with 4e3c2fc49e38fd08a1405311e1eced3f11241c9df7c680ba46b64e3f946fea47 not found: ID does not exist" Mar 13 14:03:59 crc kubenswrapper[4898]: I0313 14:03:59.045346 4898 scope.go:117] "RemoveContainer" containerID="7888e10b86e2b6b4b1a521af790a300e700cb557e90ae4215808993511904248" Mar 13 14:03:59 crc kubenswrapper[4898]: E0313 14:03:59.045661 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7888e10b86e2b6b4b1a521af790a300e700cb557e90ae4215808993511904248\": container with ID starting with 7888e10b86e2b6b4b1a521af790a300e700cb557e90ae4215808993511904248 not found: ID does not exist" containerID="7888e10b86e2b6b4b1a521af790a300e700cb557e90ae4215808993511904248" Mar 13 14:03:59 crc kubenswrapper[4898]: I0313 14:03:59.045683 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7888e10b86e2b6b4b1a521af790a300e700cb557e90ae4215808993511904248"} err="failed to get container status \"7888e10b86e2b6b4b1a521af790a300e700cb557e90ae4215808993511904248\": rpc error: code = NotFound desc = could not find container \"7888e10b86e2b6b4b1a521af790a300e700cb557e90ae4215808993511904248\": container with ID starting with 7888e10b86e2b6b4b1a521af790a300e700cb557e90ae4215808993511904248 not found: ID does not 
exist" Mar 13 14:03:59 crc kubenswrapper[4898]: I0313 14:03:59.045701 4898 scope.go:117] "RemoveContainer" containerID="fee68a52f02b5d009748f7d22e4efe8b71e5cbe2b4ec6b512eae62294cde6d24" Mar 13 14:03:59 crc kubenswrapper[4898]: I0313 14:03:59.057697 4898 scope.go:117] "RemoveContainer" containerID="66a8b7ab3a08de395e71354315a06c65dae8eb185d93ccdee2c35ad093ab2e67" Mar 13 14:03:59 crc kubenswrapper[4898]: I0313 14:03:59.072438 4898 scope.go:117] "RemoveContainer" containerID="fee68a52f02b5d009748f7d22e4efe8b71e5cbe2b4ec6b512eae62294cde6d24" Mar 13 14:03:59 crc kubenswrapper[4898]: E0313 14:03:59.072765 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fee68a52f02b5d009748f7d22e4efe8b71e5cbe2b4ec6b512eae62294cde6d24\": container with ID starting with fee68a52f02b5d009748f7d22e4efe8b71e5cbe2b4ec6b512eae62294cde6d24 not found: ID does not exist" containerID="fee68a52f02b5d009748f7d22e4efe8b71e5cbe2b4ec6b512eae62294cde6d24" Mar 13 14:03:59 crc kubenswrapper[4898]: I0313 14:03:59.072797 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fee68a52f02b5d009748f7d22e4efe8b71e5cbe2b4ec6b512eae62294cde6d24"} err="failed to get container status \"fee68a52f02b5d009748f7d22e4efe8b71e5cbe2b4ec6b512eae62294cde6d24\": rpc error: code = NotFound desc = could not find container \"fee68a52f02b5d009748f7d22e4efe8b71e5cbe2b4ec6b512eae62294cde6d24\": container with ID starting with fee68a52f02b5d009748f7d22e4efe8b71e5cbe2b4ec6b512eae62294cde6d24 not found: ID does not exist" Mar 13 14:03:59 crc kubenswrapper[4898]: I0313 14:03:59.072825 4898 scope.go:117] "RemoveContainer" containerID="66a8b7ab3a08de395e71354315a06c65dae8eb185d93ccdee2c35ad093ab2e67" Mar 13 14:03:59 crc kubenswrapper[4898]: E0313 14:03:59.073049 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"66a8b7ab3a08de395e71354315a06c65dae8eb185d93ccdee2c35ad093ab2e67\": container with ID starting with 66a8b7ab3a08de395e71354315a06c65dae8eb185d93ccdee2c35ad093ab2e67 not found: ID does not exist" containerID="66a8b7ab3a08de395e71354315a06c65dae8eb185d93ccdee2c35ad093ab2e67" Mar 13 14:03:59 crc kubenswrapper[4898]: I0313 14:03:59.073072 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66a8b7ab3a08de395e71354315a06c65dae8eb185d93ccdee2c35ad093ab2e67"} err="failed to get container status \"66a8b7ab3a08de395e71354315a06c65dae8eb185d93ccdee2c35ad093ab2e67\": rpc error: code = NotFound desc = could not find container \"66a8b7ab3a08de395e71354315a06c65dae8eb185d93ccdee2c35ad093ab2e67\": container with ID starting with 66a8b7ab3a08de395e71354315a06c65dae8eb185d93ccdee2c35ad093ab2e67 not found: ID does not exist" Mar 13 14:03:59 crc kubenswrapper[4898]: I0313 14:03:59.073088 4898 scope.go:117] "RemoveContainer" containerID="c0d126f66fb80fd38ad4cce383bbe14103ead798e0605b7596e1d4e7e5d8dd4c" Mar 13 14:03:59 crc kubenswrapper[4898]: I0313 14:03:59.121927 4898 scope.go:117] "RemoveContainer" containerID="2aacfa448de7533468427cf155ae3ec5563cff1d3313d0b6259a3abd6879e336" Mar 13 14:03:59 crc kubenswrapper[4898]: I0313 14:03:59.134187 4898 scope.go:117] "RemoveContainer" containerID="af2b26d62c785f829d6f729b05b9a482b0b3ab930c91a34e55f9b679910cf380" Mar 13 14:03:59 crc kubenswrapper[4898]: I0313 14:03:59.150969 4898 scope.go:117] "RemoveContainer" containerID="c0d126f66fb80fd38ad4cce383bbe14103ead798e0605b7596e1d4e7e5d8dd4c" Mar 13 14:03:59 crc kubenswrapper[4898]: E0313 14:03:59.151366 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0d126f66fb80fd38ad4cce383bbe14103ead798e0605b7596e1d4e7e5d8dd4c\": container with ID starting with c0d126f66fb80fd38ad4cce383bbe14103ead798e0605b7596e1d4e7e5d8dd4c not found: ID does not exist" 
containerID="c0d126f66fb80fd38ad4cce383bbe14103ead798e0605b7596e1d4e7e5d8dd4c" Mar 13 14:03:59 crc kubenswrapper[4898]: I0313 14:03:59.151403 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0d126f66fb80fd38ad4cce383bbe14103ead798e0605b7596e1d4e7e5d8dd4c"} err="failed to get container status \"c0d126f66fb80fd38ad4cce383bbe14103ead798e0605b7596e1d4e7e5d8dd4c\": rpc error: code = NotFound desc = could not find container \"c0d126f66fb80fd38ad4cce383bbe14103ead798e0605b7596e1d4e7e5d8dd4c\": container with ID starting with c0d126f66fb80fd38ad4cce383bbe14103ead798e0605b7596e1d4e7e5d8dd4c not found: ID does not exist" Mar 13 14:03:59 crc kubenswrapper[4898]: I0313 14:03:59.151432 4898 scope.go:117] "RemoveContainer" containerID="2aacfa448de7533468427cf155ae3ec5563cff1d3313d0b6259a3abd6879e336" Mar 13 14:03:59 crc kubenswrapper[4898]: E0313 14:03:59.151723 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2aacfa448de7533468427cf155ae3ec5563cff1d3313d0b6259a3abd6879e336\": container with ID starting with 2aacfa448de7533468427cf155ae3ec5563cff1d3313d0b6259a3abd6879e336 not found: ID does not exist" containerID="2aacfa448de7533468427cf155ae3ec5563cff1d3313d0b6259a3abd6879e336" Mar 13 14:03:59 crc kubenswrapper[4898]: I0313 14:03:59.151749 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2aacfa448de7533468427cf155ae3ec5563cff1d3313d0b6259a3abd6879e336"} err="failed to get container status \"2aacfa448de7533468427cf155ae3ec5563cff1d3313d0b6259a3abd6879e336\": rpc error: code = NotFound desc = could not find container \"2aacfa448de7533468427cf155ae3ec5563cff1d3313d0b6259a3abd6879e336\": container with ID starting with 2aacfa448de7533468427cf155ae3ec5563cff1d3313d0b6259a3abd6879e336 not found: ID does not exist" Mar 13 14:03:59 crc kubenswrapper[4898]: I0313 14:03:59.151766 4898 scope.go:117] 
"RemoveContainer" containerID="af2b26d62c785f829d6f729b05b9a482b0b3ab930c91a34e55f9b679910cf380" Mar 13 14:03:59 crc kubenswrapper[4898]: E0313 14:03:59.152204 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af2b26d62c785f829d6f729b05b9a482b0b3ab930c91a34e55f9b679910cf380\": container with ID starting with af2b26d62c785f829d6f729b05b9a482b0b3ab930c91a34e55f9b679910cf380 not found: ID does not exist" containerID="af2b26d62c785f829d6f729b05b9a482b0b3ab930c91a34e55f9b679910cf380" Mar 13 14:03:59 crc kubenswrapper[4898]: I0313 14:03:59.152230 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af2b26d62c785f829d6f729b05b9a482b0b3ab930c91a34e55f9b679910cf380"} err="failed to get container status \"af2b26d62c785f829d6f729b05b9a482b0b3ab930c91a34e55f9b679910cf380\": rpc error: code = NotFound desc = could not find container \"af2b26d62c785f829d6f729b05b9a482b0b3ab930c91a34e55f9b679910cf380\": container with ID starting with af2b26d62c785f829d6f729b05b9a482b0b3ab930c91a34e55f9b679910cf380 not found: ID does not exist" Mar 13 14:03:59 crc kubenswrapper[4898]: I0313 14:03:59.751830 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a78868f-1786-430d-8df8-18bb1c2019b3" path="/var/lib/kubelet/pods/0a78868f-1786-430d-8df8-18bb1c2019b3/volumes" Mar 13 14:03:59 crc kubenswrapper[4898]: I0313 14:03:59.753868 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="183d86e9-cd5c-45ed-a460-bb6169e07c72" path="/var/lib/kubelet/pods/183d86e9-cd5c-45ed-a460-bb6169e07c72/volumes" Mar 13 14:03:59 crc kubenswrapper[4898]: I0313 14:03:59.755654 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43acaee8-efc8-4156-b28c-b493f241ac53" path="/var/lib/kubelet/pods/43acaee8-efc8-4156-b28c-b493f241ac53/volumes" Mar 13 14:03:59 crc kubenswrapper[4898]: I0313 14:03:59.757788 4898 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="8f81bcfc-3c35-48e8-a584-961351e8c0e2" path="/var/lib/kubelet/pods/8f81bcfc-3c35-48e8-a584-961351e8c0e2/volumes" Mar 13 14:03:59 crc kubenswrapper[4898]: I0313 14:03:59.759110 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85f72a8-3887-4867-8a9c-649992ce23f1" path="/var/lib/kubelet/pods/f85f72a8-3887-4867-8a9c-649992ce23f1/volumes" Mar 13 14:03:59 crc kubenswrapper[4898]: I0313 14:03:59.880320 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-z7ng7" event={"ID":"b8942bb7-1cd2-49b9-8d98-5ba4c5f6c320","Type":"ContainerStarted","Data":"eb2daa7a5834deb74ab00e016de96548278888d4dc2100cb3f45f5181eb8442b"} Mar 13 14:03:59 crc kubenswrapper[4898]: I0313 14:03:59.880712 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-z7ng7" Mar 13 14:03:59 crc kubenswrapper[4898]: I0313 14:03:59.885079 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-z7ng7" Mar 13 14:03:59 crc kubenswrapper[4898]: I0313 14:03:59.896579 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-z7ng7" podStartSLOduration=2.896560859 podStartE2EDuration="2.896560859s" podCreationTimestamp="2026-03-13 14:03:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:03:59.894625776 +0000 UTC m=+474.896214035" watchObservedRunningTime="2026-03-13 14:03:59.896560859 +0000 UTC m=+474.898149098" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.128234 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hkbng"] Mar 13 14:04:00 crc kubenswrapper[4898]: E0313 14:04:00.128466 4898 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="183d86e9-cd5c-45ed-a460-bb6169e07c72" containerName="registry-server" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.128479 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="183d86e9-cd5c-45ed-a460-bb6169e07c72" containerName="registry-server" Mar 13 14:04:00 crc kubenswrapper[4898]: E0313 14:04:00.128491 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85f72a8-3887-4867-8a9c-649992ce23f1" containerName="extract-utilities" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.128499 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85f72a8-3887-4867-8a9c-649992ce23f1" containerName="extract-utilities" Mar 13 14:04:00 crc kubenswrapper[4898]: E0313 14:04:00.128513 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="183d86e9-cd5c-45ed-a460-bb6169e07c72" containerName="extract-utilities" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.128520 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="183d86e9-cd5c-45ed-a460-bb6169e07c72" containerName="extract-utilities" Mar 13 14:04:00 crc kubenswrapper[4898]: E0313 14:04:00.128535 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f81bcfc-3c35-48e8-a584-961351e8c0e2" containerName="registry-server" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.128543 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f81bcfc-3c35-48e8-a584-961351e8c0e2" containerName="registry-server" Mar 13 14:04:00 crc kubenswrapper[4898]: E0313 14:04:00.128553 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a78868f-1786-430d-8df8-18bb1c2019b3" containerName="marketplace-operator" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.128560 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a78868f-1786-430d-8df8-18bb1c2019b3" containerName="marketplace-operator" Mar 13 14:04:00 crc kubenswrapper[4898]: E0313 14:04:00.128569 4898 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f81bcfc-3c35-48e8-a584-961351e8c0e2" containerName="extract-utilities" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.128577 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f81bcfc-3c35-48e8-a584-961351e8c0e2" containerName="extract-utilities" Mar 13 14:04:00 crc kubenswrapper[4898]: E0313 14:04:00.128590 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="183d86e9-cd5c-45ed-a460-bb6169e07c72" containerName="extract-content" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.128597 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="183d86e9-cd5c-45ed-a460-bb6169e07c72" containerName="extract-content" Mar 13 14:04:00 crc kubenswrapper[4898]: E0313 14:04:00.128609 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43acaee8-efc8-4156-b28c-b493f241ac53" containerName="extract-content" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.128616 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="43acaee8-efc8-4156-b28c-b493f241ac53" containerName="extract-content" Mar 13 14:04:00 crc kubenswrapper[4898]: E0313 14:04:00.128624 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f81bcfc-3c35-48e8-a584-961351e8c0e2" containerName="extract-content" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.128633 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f81bcfc-3c35-48e8-a584-961351e8c0e2" containerName="extract-content" Mar 13 14:04:00 crc kubenswrapper[4898]: E0313 14:04:00.128643 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a78868f-1786-430d-8df8-18bb1c2019b3" containerName="marketplace-operator" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.128650 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a78868f-1786-430d-8df8-18bb1c2019b3" containerName="marketplace-operator" Mar 13 14:04:00 crc kubenswrapper[4898]: E0313 14:04:00.128660 4898 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43acaee8-efc8-4156-b28c-b493f241ac53" containerName="extract-utilities" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.128668 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="43acaee8-efc8-4156-b28c-b493f241ac53" containerName="extract-utilities" Mar 13 14:04:00 crc kubenswrapper[4898]: E0313 14:04:00.128677 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43acaee8-efc8-4156-b28c-b493f241ac53" containerName="registry-server" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.128685 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="43acaee8-efc8-4156-b28c-b493f241ac53" containerName="registry-server" Mar 13 14:04:00 crc kubenswrapper[4898]: E0313 14:04:00.128696 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85f72a8-3887-4867-8a9c-649992ce23f1" containerName="extract-content" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.128704 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85f72a8-3887-4867-8a9c-649992ce23f1" containerName="extract-content" Mar 13 14:04:00 crc kubenswrapper[4898]: E0313 14:04:00.128713 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85f72a8-3887-4867-8a9c-649992ce23f1" containerName="registry-server" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.128721 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85f72a8-3887-4867-8a9c-649992ce23f1" containerName="registry-server" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.128821 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a78868f-1786-430d-8df8-18bb1c2019b3" containerName="marketplace-operator" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.128831 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="183d86e9-cd5c-45ed-a460-bb6169e07c72" containerName="registry-server" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.128842 
4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85f72a8-3887-4867-8a9c-649992ce23f1" containerName="registry-server" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.128852 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f81bcfc-3c35-48e8-a584-961351e8c0e2" containerName="registry-server" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.128862 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="43acaee8-efc8-4156-b28c-b493f241ac53" containerName="registry-server" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.128875 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a78868f-1786-430d-8df8-18bb1c2019b3" containerName="marketplace-operator" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.129670 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hkbng" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.131875 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.134380 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556844-q7h28"] Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.135747 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556844-q7h28" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.139344 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.139444 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.140065 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.145999 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hkbng"] Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.153655 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556844-q7h28"] Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.166545 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89abe4ad-dd62-4a70-a1d1-fdf97448ada5-utilities\") pod \"certified-operators-hkbng\" (UID: \"89abe4ad-dd62-4a70-a1d1-fdf97448ada5\") " pod="openshift-marketplace/certified-operators-hkbng" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.166608 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbt6s\" (UniqueName: \"kubernetes.io/projected/89abe4ad-dd62-4a70-a1d1-fdf97448ada5-kube-api-access-jbt6s\") pod \"certified-operators-hkbng\" (UID: \"89abe4ad-dd62-4a70-a1d1-fdf97448ada5\") " pod="openshift-marketplace/certified-operators-hkbng" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.166627 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/89abe4ad-dd62-4a70-a1d1-fdf97448ada5-catalog-content\") pod \"certified-operators-hkbng\" (UID: \"89abe4ad-dd62-4a70-a1d1-fdf97448ada5\") " pod="openshift-marketplace/certified-operators-hkbng"
Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.166658 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dccvw\" (UniqueName: \"kubernetes.io/projected/cd30282f-65c8-45d8-89f3-c6e2f16662d4-kube-api-access-dccvw\") pod \"auto-csr-approver-29556844-q7h28\" (UID: \"cd30282f-65c8-45d8-89f3-c6e2f16662d4\") " pod="openshift-infra/auto-csr-approver-29556844-q7h28"
Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.267340 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbt6s\" (UniqueName: \"kubernetes.io/projected/89abe4ad-dd62-4a70-a1d1-fdf97448ada5-kube-api-access-jbt6s\") pod \"certified-operators-hkbng\" (UID: \"89abe4ad-dd62-4a70-a1d1-fdf97448ada5\") " pod="openshift-marketplace/certified-operators-hkbng"
Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.267391 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89abe4ad-dd62-4a70-a1d1-fdf97448ada5-catalog-content\") pod \"certified-operators-hkbng\" (UID: \"89abe4ad-dd62-4a70-a1d1-fdf97448ada5\") " pod="openshift-marketplace/certified-operators-hkbng"
Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.267435 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dccvw\" (UniqueName: \"kubernetes.io/projected/cd30282f-65c8-45d8-89f3-c6e2f16662d4-kube-api-access-dccvw\") pod \"auto-csr-approver-29556844-q7h28\" (UID: \"cd30282f-65c8-45d8-89f3-c6e2f16662d4\") " pod="openshift-infra/auto-csr-approver-29556844-q7h28"
Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.267881 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89abe4ad-dd62-4a70-a1d1-fdf97448ada5-utilities\") pod \"certified-operators-hkbng\" (UID: \"89abe4ad-dd62-4a70-a1d1-fdf97448ada5\") " pod="openshift-marketplace/certified-operators-hkbng"
Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.268371 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89abe4ad-dd62-4a70-a1d1-fdf97448ada5-catalog-content\") pod \"certified-operators-hkbng\" (UID: \"89abe4ad-dd62-4a70-a1d1-fdf97448ada5\") " pod="openshift-marketplace/certified-operators-hkbng"
Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.268381 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89abe4ad-dd62-4a70-a1d1-fdf97448ada5-utilities\") pod \"certified-operators-hkbng\" (UID: \"89abe4ad-dd62-4a70-a1d1-fdf97448ada5\") " pod="openshift-marketplace/certified-operators-hkbng"
Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.287877 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dccvw\" (UniqueName: \"kubernetes.io/projected/cd30282f-65c8-45d8-89f3-c6e2f16662d4-kube-api-access-dccvw\") pod \"auto-csr-approver-29556844-q7h28\" (UID: \"cd30282f-65c8-45d8-89f3-c6e2f16662d4\") " pod="openshift-infra/auto-csr-approver-29556844-q7h28"
Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.290761 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbt6s\" (UniqueName: \"kubernetes.io/projected/89abe4ad-dd62-4a70-a1d1-fdf97448ada5-kube-api-access-jbt6s\") pod \"certified-operators-hkbng\" (UID: \"89abe4ad-dd62-4a70-a1d1-fdf97448ada5\") " pod="openshift-marketplace/certified-operators-hkbng"
Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.322551 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zs42q"]
Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.323459 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zs42q"
Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.326242 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.338465 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zs42q"]
Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.468990 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hkbng"
Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.470444 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0182307e-bc7f-415e-a0f9-0eff9902384c-catalog-content\") pod \"redhat-marketplace-zs42q\" (UID: \"0182307e-bc7f-415e-a0f9-0eff9902384c\") " pod="openshift-marketplace/redhat-marketplace-zs42q"
Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.470555 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktr7f\" (UniqueName: \"kubernetes.io/projected/0182307e-bc7f-415e-a0f9-0eff9902384c-kube-api-access-ktr7f\") pod \"redhat-marketplace-zs42q\" (UID: \"0182307e-bc7f-415e-a0f9-0eff9902384c\") " pod="openshift-marketplace/redhat-marketplace-zs42q"
Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.470630 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0182307e-bc7f-415e-a0f9-0eff9902384c-utilities\") pod \"redhat-marketplace-zs42q\" (UID: \"0182307e-bc7f-415e-a0f9-0eff9902384c\") " pod="openshift-marketplace/redhat-marketplace-zs42q"
Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.475241 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556844-q7h28"
Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.571236 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktr7f\" (UniqueName: \"kubernetes.io/projected/0182307e-bc7f-415e-a0f9-0eff9902384c-kube-api-access-ktr7f\") pod \"redhat-marketplace-zs42q\" (UID: \"0182307e-bc7f-415e-a0f9-0eff9902384c\") " pod="openshift-marketplace/redhat-marketplace-zs42q"
Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.571288 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0182307e-bc7f-415e-a0f9-0eff9902384c-utilities\") pod \"redhat-marketplace-zs42q\" (UID: \"0182307e-bc7f-415e-a0f9-0eff9902384c\") " pod="openshift-marketplace/redhat-marketplace-zs42q"
Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.571325 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0182307e-bc7f-415e-a0f9-0eff9902384c-catalog-content\") pod \"redhat-marketplace-zs42q\" (UID: \"0182307e-bc7f-415e-a0f9-0eff9902384c\") " pod="openshift-marketplace/redhat-marketplace-zs42q"
Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.571979 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0182307e-bc7f-415e-a0f9-0eff9902384c-catalog-content\") pod \"redhat-marketplace-zs42q\" (UID: \"0182307e-bc7f-415e-a0f9-0eff9902384c\") " pod="openshift-marketplace/redhat-marketplace-zs42q"
Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.572349 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0182307e-bc7f-415e-a0f9-0eff9902384c-utilities\") pod \"redhat-marketplace-zs42q\" (UID: \"0182307e-bc7f-415e-a0f9-0eff9902384c\") " pod="openshift-marketplace/redhat-marketplace-zs42q"
Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.597391 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktr7f\" (UniqueName: \"kubernetes.io/projected/0182307e-bc7f-415e-a0f9-0eff9902384c-kube-api-access-ktr7f\") pod \"redhat-marketplace-zs42q\" (UID: \"0182307e-bc7f-415e-a0f9-0eff9902384c\") " pod="openshift-marketplace/redhat-marketplace-zs42q"
Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.648294 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zs42q"
Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.929299 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556844-q7h28"]
Mar 13 14:04:00 crc kubenswrapper[4898]: W0313 14:04:00.935879 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd30282f_65c8_45d8_89f3_c6e2f16662d4.slice/crio-752319ba8672dad76676eb802295413482abd90351347d3efb3c54acb0646542 WatchSource:0}: Error finding container 752319ba8672dad76676eb802295413482abd90351347d3efb3c54acb0646542: Status 404 returned error can't find the container with id 752319ba8672dad76676eb802295413482abd90351347d3efb3c54acb0646542
Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.980443 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hkbng"]
Mar 13 14:04:00 crc kubenswrapper[4898]: W0313 14:04:00.982828 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89abe4ad_dd62_4a70_a1d1_fdf97448ada5.slice/crio-3c34c6f1ba5d87a1c1ee78207d34fa953dad0e25b7fe710e85f481b7516936ec WatchSource:0}: Error finding container 3c34c6f1ba5d87a1c1ee78207d34fa953dad0e25b7fe710e85f481b7516936ec: Status 404 returned error can't find the container with id 3c34c6f1ba5d87a1c1ee78207d34fa953dad0e25b7fe710e85f481b7516936ec
Mar 13 14:04:01 crc kubenswrapper[4898]: I0313 14:04:01.067791 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zs42q"]
Mar 13 14:04:01 crc kubenswrapper[4898]: W0313 14:04:01.067946 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0182307e_bc7f_415e_a0f9_0eff9902384c.slice/crio-129becc9925ab90f29d2c0f42b27f347625f74a5adb7cf39ed0042115e4b864d WatchSource:0}: Error finding container 129becc9925ab90f29d2c0f42b27f347625f74a5adb7cf39ed0042115e4b864d: Status 404 returned error can't find the container with id 129becc9925ab90f29d2c0f42b27f347625f74a5adb7cf39ed0042115e4b864d
Mar 13 14:04:01 crc kubenswrapper[4898]: I0313 14:04:01.896644 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556844-q7h28" event={"ID":"cd30282f-65c8-45d8-89f3-c6e2f16662d4","Type":"ContainerStarted","Data":"752319ba8672dad76676eb802295413482abd90351347d3efb3c54acb0646542"}
Mar 13 14:04:01 crc kubenswrapper[4898]: I0313 14:04:01.898509 4898 generic.go:334] "Generic (PLEG): container finished" podID="89abe4ad-dd62-4a70-a1d1-fdf97448ada5" containerID="60bb08d44387c7849286247a4451083d86802422c22c69c6d73ce6c5a8459355" exitCode=0
Mar 13 14:04:01 crc kubenswrapper[4898]: I0313 14:04:01.898598 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hkbng" event={"ID":"89abe4ad-dd62-4a70-a1d1-fdf97448ada5","Type":"ContainerDied","Data":"60bb08d44387c7849286247a4451083d86802422c22c69c6d73ce6c5a8459355"}
Mar 13 14:04:01 crc kubenswrapper[4898]: I0313 14:04:01.898629 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hkbng" event={"ID":"89abe4ad-dd62-4a70-a1d1-fdf97448ada5","Type":"ContainerStarted","Data":"3c34c6f1ba5d87a1c1ee78207d34fa953dad0e25b7fe710e85f481b7516936ec"}
Mar 13 14:04:01 crc kubenswrapper[4898]: I0313 14:04:01.903978 4898 generic.go:334] "Generic (PLEG): container finished" podID="0182307e-bc7f-415e-a0f9-0eff9902384c" containerID="69485b46bd03239510d094d6d7c5c20008e9439faab5356d9bceb41ec96e8a78" exitCode=0
Mar 13 14:04:01 crc kubenswrapper[4898]: I0313 14:04:01.904041 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zs42q" event={"ID":"0182307e-bc7f-415e-a0f9-0eff9902384c","Type":"ContainerDied","Data":"69485b46bd03239510d094d6d7c5c20008e9439faab5356d9bceb41ec96e8a78"}
Mar 13 14:04:01 crc kubenswrapper[4898]: I0313 14:04:01.904108 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zs42q" event={"ID":"0182307e-bc7f-415e-a0f9-0eff9902384c","Type":"ContainerStarted","Data":"129becc9925ab90f29d2c0f42b27f347625f74a5adb7cf39ed0042115e4b864d"}
Mar 13 14:04:02 crc kubenswrapper[4898]: I0313 14:04:02.518581 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zgjzn"]
Mar 13 14:04:02 crc kubenswrapper[4898]: I0313 14:04:02.519763 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zgjzn"
Mar 13 14:04:02 crc kubenswrapper[4898]: I0313 14:04:02.524724 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Mar 13 14:04:02 crc kubenswrapper[4898]: I0313 14:04:02.535690 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zgjzn"]
Mar 13 14:04:02 crc kubenswrapper[4898]: I0313 14:04:02.603281 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbb51f06-0778-4b18-82b5-c5ce91e0a613-utilities\") pod \"redhat-operators-zgjzn\" (UID: \"cbb51f06-0778-4b18-82b5-c5ce91e0a613\") " pod="openshift-marketplace/redhat-operators-zgjzn"
Mar 13 14:04:02 crc kubenswrapper[4898]: I0313 14:04:02.603334 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbb51f06-0778-4b18-82b5-c5ce91e0a613-catalog-content\") pod \"redhat-operators-zgjzn\" (UID: \"cbb51f06-0778-4b18-82b5-c5ce91e0a613\") " pod="openshift-marketplace/redhat-operators-zgjzn"
Mar 13 14:04:02 crc kubenswrapper[4898]: I0313 14:04:02.603392 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdvtn\" (UniqueName: \"kubernetes.io/projected/cbb51f06-0778-4b18-82b5-c5ce91e0a613-kube-api-access-gdvtn\") pod \"redhat-operators-zgjzn\" (UID: \"cbb51f06-0778-4b18-82b5-c5ce91e0a613\") " pod="openshift-marketplace/redhat-operators-zgjzn"
Mar 13 14:04:02 crc kubenswrapper[4898]: I0313 14:04:02.704129 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbb51f06-0778-4b18-82b5-c5ce91e0a613-utilities\") pod \"redhat-operators-zgjzn\" (UID: \"cbb51f06-0778-4b18-82b5-c5ce91e0a613\") " pod="openshift-marketplace/redhat-operators-zgjzn"
Mar 13 14:04:02 crc kubenswrapper[4898]: I0313 14:04:02.704191 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbb51f06-0778-4b18-82b5-c5ce91e0a613-catalog-content\") pod \"redhat-operators-zgjzn\" (UID: \"cbb51f06-0778-4b18-82b5-c5ce91e0a613\") " pod="openshift-marketplace/redhat-operators-zgjzn"
Mar 13 14:04:02 crc kubenswrapper[4898]: I0313 14:04:02.704245 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdvtn\" (UniqueName: \"kubernetes.io/projected/cbb51f06-0778-4b18-82b5-c5ce91e0a613-kube-api-access-gdvtn\") pod \"redhat-operators-zgjzn\" (UID: \"cbb51f06-0778-4b18-82b5-c5ce91e0a613\") " pod="openshift-marketplace/redhat-operators-zgjzn"
Mar 13 14:04:02 crc kubenswrapper[4898]: I0313 14:04:02.706652 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbb51f06-0778-4b18-82b5-c5ce91e0a613-utilities\") pod \"redhat-operators-zgjzn\" (UID: \"cbb51f06-0778-4b18-82b5-c5ce91e0a613\") " pod="openshift-marketplace/redhat-operators-zgjzn"
Mar 13 14:04:02 crc kubenswrapper[4898]: I0313 14:04:02.706594 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbb51f06-0778-4b18-82b5-c5ce91e0a613-catalog-content\") pod \"redhat-operators-zgjzn\" (UID: \"cbb51f06-0778-4b18-82b5-c5ce91e0a613\") " pod="openshift-marketplace/redhat-operators-zgjzn"
Mar 13 14:04:02 crc kubenswrapper[4898]: I0313 14:04:02.728498 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nf9mj"]
Mar 13 14:04:02 crc kubenswrapper[4898]: I0313 14:04:02.730364 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nf9mj"
Mar 13 14:04:02 crc kubenswrapper[4898]: I0313 14:04:02.733721 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Mar 13 14:04:02 crc kubenswrapper[4898]: I0313 14:04:02.737108 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdvtn\" (UniqueName: \"kubernetes.io/projected/cbb51f06-0778-4b18-82b5-c5ce91e0a613-kube-api-access-gdvtn\") pod \"redhat-operators-zgjzn\" (UID: \"cbb51f06-0778-4b18-82b5-c5ce91e0a613\") " pod="openshift-marketplace/redhat-operators-zgjzn"
Mar 13 14:04:02 crc kubenswrapper[4898]: I0313 14:04:02.743364 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nf9mj"]
Mar 13 14:04:02 crc kubenswrapper[4898]: I0313 14:04:02.806225 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/112ac477-caf1-4778-9161-737e393633b6-catalog-content\") pod \"community-operators-nf9mj\" (UID: \"112ac477-caf1-4778-9161-737e393633b6\") " pod="openshift-marketplace/community-operators-nf9mj"
Mar 13 14:04:02 crc kubenswrapper[4898]: I0313 14:04:02.806295 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z75w9\" (UniqueName: \"kubernetes.io/projected/112ac477-caf1-4778-9161-737e393633b6-kube-api-access-z75w9\") pod \"community-operators-nf9mj\" (UID: \"112ac477-caf1-4778-9161-737e393633b6\") " pod="openshift-marketplace/community-operators-nf9mj"
Mar 13 14:04:02 crc kubenswrapper[4898]: I0313 14:04:02.806377 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/112ac477-caf1-4778-9161-737e393633b6-utilities\") pod \"community-operators-nf9mj\" (UID: \"112ac477-caf1-4778-9161-737e393633b6\") " pod="openshift-marketplace/community-operators-nf9mj"
Mar 13 14:04:02 crc kubenswrapper[4898]: I0313 14:04:02.838035 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zgjzn"
Mar 13 14:04:02 crc kubenswrapper[4898]: I0313 14:04:02.907402 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/112ac477-caf1-4778-9161-737e393633b6-catalog-content\") pod \"community-operators-nf9mj\" (UID: \"112ac477-caf1-4778-9161-737e393633b6\") " pod="openshift-marketplace/community-operators-nf9mj"
Mar 13 14:04:02 crc kubenswrapper[4898]: I0313 14:04:02.907449 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z75w9\" (UniqueName: \"kubernetes.io/projected/112ac477-caf1-4778-9161-737e393633b6-kube-api-access-z75w9\") pod \"community-operators-nf9mj\" (UID: \"112ac477-caf1-4778-9161-737e393633b6\") " pod="openshift-marketplace/community-operators-nf9mj"
Mar 13 14:04:02 crc kubenswrapper[4898]: I0313 14:04:02.907483 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/112ac477-caf1-4778-9161-737e393633b6-utilities\") pod \"community-operators-nf9mj\" (UID: \"112ac477-caf1-4778-9161-737e393633b6\") " pod="openshift-marketplace/community-operators-nf9mj"
Mar 13 14:04:02 crc kubenswrapper[4898]: I0313 14:04:02.912174 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/112ac477-caf1-4778-9161-737e393633b6-catalog-content\") pod \"community-operators-nf9mj\" (UID: \"112ac477-caf1-4778-9161-737e393633b6\") " pod="openshift-marketplace/community-operators-nf9mj"
Mar 13 14:04:02 crc kubenswrapper[4898]: I0313 14:04:02.912196 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/112ac477-caf1-4778-9161-737e393633b6-utilities\") pod \"community-operators-nf9mj\" (UID: \"112ac477-caf1-4778-9161-737e393633b6\") " pod="openshift-marketplace/community-operators-nf9mj"
Mar 13 14:04:02 crc kubenswrapper[4898]: I0313 14:04:02.932816 4898 generic.go:334] "Generic (PLEG): container finished" podID="cd30282f-65c8-45d8-89f3-c6e2f16662d4" containerID="59d89d033eeab55992b3ec00208c3b5cec577e8de3d6a36471e4a08df49334b0" exitCode=0
Mar 13 14:04:02 crc kubenswrapper[4898]: I0313 14:04:02.932884 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556844-q7h28" event={"ID":"cd30282f-65c8-45d8-89f3-c6e2f16662d4","Type":"ContainerDied","Data":"59d89d033eeab55992b3ec00208c3b5cec577e8de3d6a36471e4a08df49334b0"}
Mar 13 14:04:02 crc kubenswrapper[4898]: I0313 14:04:02.943845 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hkbng" event={"ID":"89abe4ad-dd62-4a70-a1d1-fdf97448ada5","Type":"ContainerStarted","Data":"e7e84b8c62fe1cfd24a053035a9551f9c1c3bd118f328f277a4430d9e847e65e"}
Mar 13 14:04:02 crc kubenswrapper[4898]: I0313 14:04:02.950396 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z75w9\" (UniqueName: \"kubernetes.io/projected/112ac477-caf1-4778-9161-737e393633b6-kube-api-access-z75w9\") pod \"community-operators-nf9mj\" (UID: \"112ac477-caf1-4778-9161-737e393633b6\") " pod="openshift-marketplace/community-operators-nf9mj"
Mar 13 14:04:03 crc kubenswrapper[4898]: I0313 14:04:03.073830 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nf9mj"
Mar 13 14:04:03 crc kubenswrapper[4898]: I0313 14:04:03.299129 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zgjzn"]
Mar 13 14:04:03 crc kubenswrapper[4898]: W0313 14:04:03.308948 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbb51f06_0778_4b18_82b5_c5ce91e0a613.slice/crio-baffb80482a04d378c0c02b555c50178f36cf386c59b550dce65769fa88f6740 WatchSource:0}: Error finding container baffb80482a04d378c0c02b555c50178f36cf386c59b550dce65769fa88f6740: Status 404 returned error can't find the container with id baffb80482a04d378c0c02b555c50178f36cf386c59b550dce65769fa88f6740
Mar 13 14:04:03 crc kubenswrapper[4898]: I0313 14:04:03.480640 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nf9mj"]
Mar 13 14:04:03 crc kubenswrapper[4898]: W0313 14:04:03.525770 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod112ac477_caf1_4778_9161_737e393633b6.slice/crio-cd6fb5a64e7e71f880fc652eb99feafe0af0462b20f85dff2e92b55a99558f6f WatchSource:0}: Error finding container cd6fb5a64e7e71f880fc652eb99feafe0af0462b20f85dff2e92b55a99558f6f: Status 404 returned error can't find the container with id cd6fb5a64e7e71f880fc652eb99feafe0af0462b20f85dff2e92b55a99558f6f
Mar 13 14:04:03 crc kubenswrapper[4898]: I0313 14:04:03.953303 4898 generic.go:334] "Generic (PLEG): container finished" podID="cbb51f06-0778-4b18-82b5-c5ce91e0a613" containerID="f784e80c2b12666e1f35ec79baad6f9df6ad51fb651197a806468000ac6a2641" exitCode=0
Mar 13 14:04:03 crc kubenswrapper[4898]: I0313 14:04:03.953415 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zgjzn" event={"ID":"cbb51f06-0778-4b18-82b5-c5ce91e0a613","Type":"ContainerDied","Data":"f784e80c2b12666e1f35ec79baad6f9df6ad51fb651197a806468000ac6a2641"}
Mar 13 14:04:03 crc kubenswrapper[4898]: I0313 14:04:03.953743 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zgjzn" event={"ID":"cbb51f06-0778-4b18-82b5-c5ce91e0a613","Type":"ContainerStarted","Data":"baffb80482a04d378c0c02b555c50178f36cf386c59b550dce65769fa88f6740"}
Mar 13 14:04:03 crc kubenswrapper[4898]: I0313 14:04:03.956575 4898 generic.go:334] "Generic (PLEG): container finished" podID="89abe4ad-dd62-4a70-a1d1-fdf97448ada5" containerID="e7e84b8c62fe1cfd24a053035a9551f9c1c3bd118f328f277a4430d9e847e65e" exitCode=0
Mar 13 14:04:03 crc kubenswrapper[4898]: I0313 14:04:03.957020 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hkbng" event={"ID":"89abe4ad-dd62-4a70-a1d1-fdf97448ada5","Type":"ContainerDied","Data":"e7e84b8c62fe1cfd24a053035a9551f9c1c3bd118f328f277a4430d9e847e65e"}
Mar 13 14:04:03 crc kubenswrapper[4898]: I0313 14:04:03.966225 4898 generic.go:334] "Generic (PLEG): container finished" podID="0182307e-bc7f-415e-a0f9-0eff9902384c" containerID="3c62fb6c0c0a5d7610c81eb3bac9a0463b266c6fe606fcf0f6eaa3106384f862" exitCode=0
Mar 13 14:04:03 crc kubenswrapper[4898]: I0313 14:04:03.966346 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zs42q" event={"ID":"0182307e-bc7f-415e-a0f9-0eff9902384c","Type":"ContainerDied","Data":"3c62fb6c0c0a5d7610c81eb3bac9a0463b266c6fe606fcf0f6eaa3106384f862"}
Mar 13 14:04:03 crc kubenswrapper[4898]: I0313 14:04:03.968844 4898 generic.go:334] "Generic (PLEG): container finished" podID="112ac477-caf1-4778-9161-737e393633b6" containerID="2316a3e4d2fc964fff3bdea961936abc15de57cb9446d0c3ca366fa8840b5460" exitCode=0
Mar 13 14:04:03 crc kubenswrapper[4898]: I0313 14:04:03.968890 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nf9mj" event={"ID":"112ac477-caf1-4778-9161-737e393633b6","Type":"ContainerDied","Data":"2316a3e4d2fc964fff3bdea961936abc15de57cb9446d0c3ca366fa8840b5460"}
Mar 13 14:04:03 crc kubenswrapper[4898]: I0313 14:04:03.968979 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nf9mj" event={"ID":"112ac477-caf1-4778-9161-737e393633b6","Type":"ContainerStarted","Data":"cd6fb5a64e7e71f880fc652eb99feafe0af0462b20f85dff2e92b55a99558f6f"}
Mar 13 14:04:04 crc kubenswrapper[4898]: I0313 14:04:04.339576 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556844-q7h28"
Mar 13 14:04:04 crc kubenswrapper[4898]: I0313 14:04:04.535006 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dccvw\" (UniqueName: \"kubernetes.io/projected/cd30282f-65c8-45d8-89f3-c6e2f16662d4-kube-api-access-dccvw\") pod \"cd30282f-65c8-45d8-89f3-c6e2f16662d4\" (UID: \"cd30282f-65c8-45d8-89f3-c6e2f16662d4\") "
Mar 13 14:04:04 crc kubenswrapper[4898]: I0313 14:04:04.540517 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd30282f-65c8-45d8-89f3-c6e2f16662d4-kube-api-access-dccvw" (OuterVolumeSpecName: "kube-api-access-dccvw") pod "cd30282f-65c8-45d8-89f3-c6e2f16662d4" (UID: "cd30282f-65c8-45d8-89f3-c6e2f16662d4"). InnerVolumeSpecName "kube-api-access-dccvw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:04:04 crc kubenswrapper[4898]: I0313 14:04:04.637435 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dccvw\" (UniqueName: \"kubernetes.io/projected/cd30282f-65c8-45d8-89f3-c6e2f16662d4-kube-api-access-dccvw\") on node \"crc\" DevicePath \"\""
Mar 13 14:04:04 crc kubenswrapper[4898]: I0313 14:04:04.980760 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zs42q" event={"ID":"0182307e-bc7f-415e-a0f9-0eff9902384c","Type":"ContainerStarted","Data":"77d9766c8fecf5c86dafd5750df2ef49509322338b053c35ad79ac796cac5820"}
Mar 13 14:04:04 crc kubenswrapper[4898]: I0313 14:04:04.982934 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556844-q7h28" event={"ID":"cd30282f-65c8-45d8-89f3-c6e2f16662d4","Type":"ContainerDied","Data":"752319ba8672dad76676eb802295413482abd90351347d3efb3c54acb0646542"}
Mar 13 14:04:04 crc kubenswrapper[4898]: I0313 14:04:04.982984 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="752319ba8672dad76676eb802295413482abd90351347d3efb3c54acb0646542"
Mar 13 14:04:04 crc kubenswrapper[4898]: I0313 14:04:04.983065 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556844-q7h28"
Mar 13 14:04:04 crc kubenswrapper[4898]: I0313 14:04:04.987410 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nf9mj" event={"ID":"112ac477-caf1-4778-9161-737e393633b6","Type":"ContainerStarted","Data":"2e43cc9675fd8cbd58a911f4205da76807b5b902a7fc3bd0c4cb735298e250c5"}
Mar 13 14:04:04 crc kubenswrapper[4898]: I0313 14:04:04.990408 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zgjzn" event={"ID":"cbb51f06-0778-4b18-82b5-c5ce91e0a613","Type":"ContainerStarted","Data":"779306da39107e22f5c6e9064f03a6b5cd0fac4a5dac5550d00ae8a28dc13c8f"}
Mar 13 14:04:04 crc kubenswrapper[4898]: I0313 14:04:04.992529 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hkbng" event={"ID":"89abe4ad-dd62-4a70-a1d1-fdf97448ada5","Type":"ContainerStarted","Data":"7d332847a2571b0892a4d2d051dd5994998dc75188735a88d9bfbfb47991cdd2"}
Mar 13 14:04:05 crc kubenswrapper[4898]: I0313 14:04:05.001535 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zs42q" podStartSLOduration=2.375612707 podStartE2EDuration="5.001514509s" podCreationTimestamp="2026-03-13 14:04:00 +0000 UTC" firstStartedPulling="2026-03-13 14:04:01.906016087 +0000 UTC m=+476.907604336" lastFinishedPulling="2026-03-13 14:04:04.531917899 +0000 UTC m=+479.533506138" observedRunningTime="2026-03-13 14:04:04.9982953 +0000 UTC m=+479.999883579" watchObservedRunningTime="2026-03-13 14:04:05.001514509 +0000 UTC m=+480.003102748"
Mar 13 14:04:05 crc kubenswrapper[4898]: I0313 14:04:05.022021 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hkbng" podStartSLOduration=2.347983617 podStartE2EDuration="5.021995613s" podCreationTimestamp="2026-03-13 14:04:00 +0000 UTC" firstStartedPulling="2026-03-13 14:04:01.900522246 +0000 UTC m=+476.902110485" lastFinishedPulling="2026-03-13 14:04:04.574534242 +0000 UTC m=+479.576122481" observedRunningTime="2026-03-13 14:04:05.017041626 +0000 UTC m=+480.018629875" watchObservedRunningTime="2026-03-13 14:04:05.021995613 +0000 UTC m=+480.023583852"
Mar 13 14:04:05 crc kubenswrapper[4898]: I0313 14:04:05.407248 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556838-h7pkr"]
Mar 13 14:04:05 crc kubenswrapper[4898]: I0313 14:04:05.410527 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556838-h7pkr"]
Mar 13 14:04:05 crc kubenswrapper[4898]: I0313 14:04:05.579340 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-6n228" podUID="b08c305d-b9fc-4c5c-85c1-8281b9608bcf" containerName="registry" containerID="cri-o://f91957737e0395e724d38a589c69e42d1186dc2a931994cc08b0d0b2fc46b2b2" gracePeriod=30
Mar 13 14:04:05 crc kubenswrapper[4898]: I0313 14:04:05.745909 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa1ed4c8-e4bd-4352-bee3-404f16244ea3" path="/var/lib/kubelet/pods/aa1ed4c8-e4bd-4352-bee3-404f16244ea3/volumes"
Mar 13 14:04:05 crc kubenswrapper[4898]: I0313 14:04:05.989150 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-6n228"
Mar 13 14:04:06 crc kubenswrapper[4898]: I0313 14:04:06.000337 4898 generic.go:334] "Generic (PLEG): container finished" podID="112ac477-caf1-4778-9161-737e393633b6" containerID="2e43cc9675fd8cbd58a911f4205da76807b5b902a7fc3bd0c4cb735298e250c5" exitCode=0
Mar 13 14:04:06 crc kubenswrapper[4898]: I0313 14:04:06.000410 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nf9mj" event={"ID":"112ac477-caf1-4778-9161-737e393633b6","Type":"ContainerDied","Data":"2e43cc9675fd8cbd58a911f4205da76807b5b902a7fc3bd0c4cb735298e250c5"}
Mar 13 14:04:06 crc kubenswrapper[4898]: I0313 14:04:06.000461 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nf9mj" event={"ID":"112ac477-caf1-4778-9161-737e393633b6","Type":"ContainerStarted","Data":"fc60b3fabeb85294acc6203495a9c722283813bdd45f914886621b87378fb993"}
Mar 13 14:04:06 crc kubenswrapper[4898]: I0313 14:04:06.002018 4898 generic.go:334] "Generic (PLEG): container finished" podID="cbb51f06-0778-4b18-82b5-c5ce91e0a613" containerID="779306da39107e22f5c6e9064f03a6b5cd0fac4a5dac5550d00ae8a28dc13c8f" exitCode=0
Mar 13 14:04:06 crc kubenswrapper[4898]: I0313 14:04:06.002075 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zgjzn" event={"ID":"cbb51f06-0778-4b18-82b5-c5ce91e0a613","Type":"ContainerDied","Data":"779306da39107e22f5c6e9064f03a6b5cd0fac4a5dac5550d00ae8a28dc13c8f"}
Mar 13 14:04:06 crc kubenswrapper[4898]: I0313 14:04:06.003188 4898 generic.go:334] "Generic (PLEG): container finished" podID="b08c305d-b9fc-4c5c-85c1-8281b9608bcf" containerID="f91957737e0395e724d38a589c69e42d1186dc2a931994cc08b0d0b2fc46b2b2" exitCode=0
Mar 13 14:04:06 crc kubenswrapper[4898]: I0313 14:04:06.003224 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-6n228"
Mar 13 14:04:06 crc kubenswrapper[4898]: I0313 14:04:06.003244 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-6n228" event={"ID":"b08c305d-b9fc-4c5c-85c1-8281b9608bcf","Type":"ContainerDied","Data":"f91957737e0395e724d38a589c69e42d1186dc2a931994cc08b0d0b2fc46b2b2"}
Mar 13 14:04:06 crc kubenswrapper[4898]: I0313 14:04:06.003263 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-6n228" event={"ID":"b08c305d-b9fc-4c5c-85c1-8281b9608bcf","Type":"ContainerDied","Data":"cb30f09f65c6668eae49d8e2a5f1518ff1c19e2eb8fcc21bf1f743165319e716"}
Mar 13 14:04:06 crc kubenswrapper[4898]: I0313 14:04:06.003280 4898 scope.go:117] "RemoveContainer" containerID="f91957737e0395e724d38a589c69e42d1186dc2a931994cc08b0d0b2fc46b2b2"
Mar 13 14:04:06 crc kubenswrapper[4898]: I0313 14:04:06.016930 4898 scope.go:117] "RemoveContainer" containerID="f91957737e0395e724d38a589c69e42d1186dc2a931994cc08b0d0b2fc46b2b2"
Mar 13 14:04:06 crc kubenswrapper[4898]: E0313 14:04:06.017450 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f91957737e0395e724d38a589c69e42d1186dc2a931994cc08b0d0b2fc46b2b2\": container with ID starting with f91957737e0395e724d38a589c69e42d1186dc2a931994cc08b0d0b2fc46b2b2 not found: ID does not exist" containerID="f91957737e0395e724d38a589c69e42d1186dc2a931994cc08b0d0b2fc46b2b2"
Mar 13 14:04:06 crc kubenswrapper[4898]: I0313 14:04:06.017485 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f91957737e0395e724d38a589c69e42d1186dc2a931994cc08b0d0b2fc46b2b2"} err="failed to get container status \"f91957737e0395e724d38a589c69e42d1186dc2a931994cc08b0d0b2fc46b2b2\": rpc error: code = NotFound desc = could not find container \"f91957737e0395e724d38a589c69e42d1186dc2a931994cc08b0d0b2fc46b2b2\": container with ID starting with f91957737e0395e724d38a589c69e42d1186dc2a931994cc08b0d0b2fc46b2b2 not found: ID does not exist"
Mar 13 14:04:06 crc kubenswrapper[4898]: I0313 14:04:06.034822 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nf9mj" podStartSLOduration=2.560395544 podStartE2EDuration="4.034806359s" podCreationTimestamp="2026-03-13 14:04:02 +0000 UTC" firstStartedPulling="2026-03-13 14:04:03.970938483 +0000 UTC m=+478.972526722" lastFinishedPulling="2026-03-13 14:04:05.445349298 +0000 UTC m=+480.446937537" observedRunningTime="2026-03-13 14:04:06.034230463 +0000 UTC m=+481.035818722" watchObservedRunningTime="2026-03-13 14:04:06.034806359 +0000 UTC m=+481.036394598"
Mar 13 14:04:06 crc kubenswrapper[4898]: I0313 14:04:06.154360 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-bound-sa-token\") pod \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") "
Mar 13 14:04:06 crc kubenswrapper[4898]: I0313 14:04:06.154408 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-ca-trust-extracted\") pod \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") "
Mar 13 14:04:06 crc kubenswrapper[4898]: I0313 14:04:06.154427 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-registry-tls\") pod \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") "
Mar 13 14:04:06 crc kubenswrapper[4898]: I0313 14:04:06.154462 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-installation-pull-secrets\") pod \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") "
Mar 13 14:04:06 crc kubenswrapper[4898]: I0313 14:04:06.154491 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-trusted-ca\") pod \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") "
Mar 13 14:04:06 crc kubenswrapper[4898]: I0313 14:04:06.154515 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xt5s8\" (UniqueName: \"kubernetes.io/projected/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-kube-api-access-xt5s8\") pod \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") "
Mar 13 14:04:06 crc kubenswrapper[4898]: I0313 14:04:06.154564 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-registry-certificates\") pod \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") "
Mar 13 14:04:06 crc kubenswrapper[4898]: I0313 14:04:06.155084 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "b08c305d-b9fc-4c5c-85c1-8281b9608bcf" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf"). InnerVolumeSpecName "trusted-ca".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:04:06 crc kubenswrapper[4898]: I0313 14:04:06.155220 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "b08c305d-b9fc-4c5c-85c1-8281b9608bcf" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:04:06 crc kubenswrapper[4898]: I0313 14:04:06.155400 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " Mar 13 14:04:06 crc kubenswrapper[4898]: I0313 14:04:06.156159 4898 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 13 14:04:06 crc kubenswrapper[4898]: I0313 14:04:06.156727 4898 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 13 14:04:06 crc kubenswrapper[4898]: I0313 14:04:06.166763 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "b08c305d-b9fc-4c5c-85c1-8281b9608bcf" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:04:06 crc kubenswrapper[4898]: I0313 14:04:06.171097 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "b08c305d-b9fc-4c5c-85c1-8281b9608bcf" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:04:06 crc kubenswrapper[4898]: I0313 14:04:06.171730 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-kube-api-access-xt5s8" (OuterVolumeSpecName: "kube-api-access-xt5s8") pod "b08c305d-b9fc-4c5c-85c1-8281b9608bcf" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf"). InnerVolumeSpecName "kube-api-access-xt5s8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:04:06 crc kubenswrapper[4898]: I0313 14:04:06.172362 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "b08c305d-b9fc-4c5c-85c1-8281b9608bcf" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:04:06 crc kubenswrapper[4898]: I0313 14:04:06.178732 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "b08c305d-b9fc-4c5c-85c1-8281b9608bcf" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 13 14:04:06 crc kubenswrapper[4898]: I0313 14:04:06.179966 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "b08c305d-b9fc-4c5c-85c1-8281b9608bcf" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:04:06 crc kubenswrapper[4898]: I0313 14:04:06.257703 4898 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 13 14:04:06 crc kubenswrapper[4898]: I0313 14:04:06.257744 4898 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 13 14:04:06 crc kubenswrapper[4898]: I0313 14:04:06.257758 4898 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 13 14:04:06 crc kubenswrapper[4898]: I0313 14:04:06.257770 4898 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 13 14:04:06 crc kubenswrapper[4898]: I0313 14:04:06.257783 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xt5s8\" (UniqueName: \"kubernetes.io/projected/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-kube-api-access-xt5s8\") on node \"crc\" DevicePath \"\"" Mar 13 14:04:06 crc kubenswrapper[4898]: I0313 14:04:06.352446 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-image-registry/image-registry-697d97f7c8-6n228"] Mar 13 14:04:06 crc kubenswrapper[4898]: I0313 14:04:06.356757 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6n228"] Mar 13 14:04:07 crc kubenswrapper[4898]: I0313 14:04:07.012795 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zgjzn" event={"ID":"cbb51f06-0778-4b18-82b5-c5ce91e0a613","Type":"ContainerStarted","Data":"0bfe2329bd880746bbd5197bafddd64a03425f52194e76f04bc937b6e425402a"} Mar 13 14:04:07 crc kubenswrapper[4898]: I0313 14:04:07.036490 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zgjzn" podStartSLOduration=2.566317916 podStartE2EDuration="5.036471029s" podCreationTimestamp="2026-03-13 14:04:02 +0000 UTC" firstStartedPulling="2026-03-13 14:04:03.955626621 +0000 UTC m=+478.957214880" lastFinishedPulling="2026-03-13 14:04:06.425779754 +0000 UTC m=+481.427367993" observedRunningTime="2026-03-13 14:04:07.035362788 +0000 UTC m=+482.036951047" watchObservedRunningTime="2026-03-13 14:04:07.036471029 +0000 UTC m=+482.038059268" Mar 13 14:04:07 crc kubenswrapper[4898]: I0313 14:04:07.751224 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b08c305d-b9fc-4c5c-85c1-8281b9608bcf" path="/var/lib/kubelet/pods/b08c305d-b9fc-4c5c-85c1-8281b9608bcf/volumes" Mar 13 14:04:10 crc kubenswrapper[4898]: I0313 14:04:10.469303 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hkbng" Mar 13 14:04:10 crc kubenswrapper[4898]: I0313 14:04:10.469760 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hkbng" Mar 13 14:04:10 crc kubenswrapper[4898]: I0313 14:04:10.521310 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-hkbng" Mar 13 14:04:10 crc kubenswrapper[4898]: I0313 14:04:10.649071 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zs42q" Mar 13 14:04:10 crc kubenswrapper[4898]: I0313 14:04:10.649143 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zs42q" Mar 13 14:04:10 crc kubenswrapper[4898]: I0313 14:04:10.685608 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zs42q" Mar 13 14:04:11 crc kubenswrapper[4898]: I0313 14:04:11.072689 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hkbng" Mar 13 14:04:11 crc kubenswrapper[4898]: I0313 14:04:11.082289 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zs42q" Mar 13 14:04:12 crc kubenswrapper[4898]: I0313 14:04:12.839341 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zgjzn" Mar 13 14:04:12 crc kubenswrapper[4898]: I0313 14:04:12.839425 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zgjzn" Mar 13 14:04:13 crc kubenswrapper[4898]: I0313 14:04:13.074174 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nf9mj" Mar 13 14:04:13 crc kubenswrapper[4898]: I0313 14:04:13.074257 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nf9mj" Mar 13 14:04:13 crc kubenswrapper[4898]: I0313 14:04:13.112384 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nf9mj" Mar 13 14:04:13 crc kubenswrapper[4898]: I0313 
14:04:13.878602 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zgjzn" podUID="cbb51f06-0778-4b18-82b5-c5ce91e0a613" containerName="registry-server" probeResult="failure" output=< Mar 13 14:04:13 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 14:04:13 crc kubenswrapper[4898]: > Mar 13 14:04:14 crc kubenswrapper[4898]: I0313 14:04:14.108830 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nf9mj" Mar 13 14:04:19 crc kubenswrapper[4898]: I0313 14:04:19.134412 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 14:04:19 crc kubenswrapper[4898]: I0313 14:04:19.134733 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 14:04:19 crc kubenswrapper[4898]: I0313 14:04:19.134780 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" Mar 13 14:04:19 crc kubenswrapper[4898]: I0313 14:04:19.135364 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ef8034867c7dd4fe3e16f610be3edcf45ba0ba5b7440cc5634ef7ce86e520b52"} pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 14:04:19 crc kubenswrapper[4898]: I0313 
14:04:19.135424 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" containerID="cri-o://ef8034867c7dd4fe3e16f610be3edcf45ba0ba5b7440cc5634ef7ce86e520b52" gracePeriod=600 Mar 13 14:04:20 crc kubenswrapper[4898]: I0313 14:04:20.084578 4898 generic.go:334] "Generic (PLEG): container finished" podID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerID="ef8034867c7dd4fe3e16f610be3edcf45ba0ba5b7440cc5634ef7ce86e520b52" exitCode=0 Mar 13 14:04:20 crc kubenswrapper[4898]: I0313 14:04:20.084624 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerDied","Data":"ef8034867c7dd4fe3e16f610be3edcf45ba0ba5b7440cc5634ef7ce86e520b52"} Mar 13 14:04:20 crc kubenswrapper[4898]: I0313 14:04:20.084863 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerStarted","Data":"87afe240e3b86dba51997a01c599db519fabde9560e41dee3b537bab350f3092"} Mar 13 14:04:20 crc kubenswrapper[4898]: I0313 14:04:20.084890 4898 scope.go:117] "RemoveContainer" containerID="8568b0d4122e606a851ae23a97395b29784dd29138349c86f59191196e7b0f56" Mar 13 14:04:22 crc kubenswrapper[4898]: I0313 14:04:22.907945 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zgjzn" Mar 13 14:04:22 crc kubenswrapper[4898]: I0313 14:04:22.969123 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zgjzn" Mar 13 14:04:52 crc kubenswrapper[4898]: I0313 14:04:52.023388 4898 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-monitoring/cluster-monitoring-operator-6d5b84845-zxddg"] Mar 13 14:04:52 crc kubenswrapper[4898]: E0313 14:04:52.024257 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd30282f-65c8-45d8-89f3-c6e2f16662d4" containerName="oc" Mar 13 14:04:52 crc kubenswrapper[4898]: I0313 14:04:52.024278 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd30282f-65c8-45d8-89f3-c6e2f16662d4" containerName="oc" Mar 13 14:04:52 crc kubenswrapper[4898]: E0313 14:04:52.024293 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b08c305d-b9fc-4c5c-85c1-8281b9608bcf" containerName="registry" Mar 13 14:04:52 crc kubenswrapper[4898]: I0313 14:04:52.024304 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="b08c305d-b9fc-4c5c-85c1-8281b9608bcf" containerName="registry" Mar 13 14:04:52 crc kubenswrapper[4898]: I0313 14:04:52.024480 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="b08c305d-b9fc-4c5c-85c1-8281b9608bcf" containerName="registry" Mar 13 14:04:52 crc kubenswrapper[4898]: I0313 14:04:52.024502 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd30282f-65c8-45d8-89f3-c6e2f16662d4" containerName="oc" Mar 13 14:04:52 crc kubenswrapper[4898]: I0313 14:04:52.025090 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-zxddg" Mar 13 14:04:52 crc kubenswrapper[4898]: I0313 14:04:52.027992 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt" Mar 13 14:04:52 crc kubenswrapper[4898]: I0313 14:04:52.028945 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt" Mar 13 14:04:52 crc kubenswrapper[4898]: I0313 14:04:52.030409 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-dockercfg-wwt9l" Mar 13 14:04:52 crc kubenswrapper[4898]: I0313 14:04:52.030798 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config" Mar 13 14:04:52 crc kubenswrapper[4898]: I0313 14:04:52.038939 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls" Mar 13 14:04:52 crc kubenswrapper[4898]: I0313 14:04:52.052690 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6d5b84845-zxddg"] Mar 13 14:04:52 crc kubenswrapper[4898]: I0313 14:04:52.101958 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/2220cab0-84f3-4922-b02c-5d8f12977964-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6d5b84845-zxddg\" (UID: \"2220cab0-84f3-4922-b02c-5d8f12977964\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-zxddg" Mar 13 14:04:52 crc kubenswrapper[4898]: I0313 14:04:52.102009 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/2220cab0-84f3-4922-b02c-5d8f12977964-telemetry-config\") pod 
\"cluster-monitoring-operator-6d5b84845-zxddg\" (UID: \"2220cab0-84f3-4922-b02c-5d8f12977964\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-zxddg" Mar 13 14:04:52 crc kubenswrapper[4898]: I0313 14:04:52.102120 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpgbv\" (UniqueName: \"kubernetes.io/projected/2220cab0-84f3-4922-b02c-5d8f12977964-kube-api-access-mpgbv\") pod \"cluster-monitoring-operator-6d5b84845-zxddg\" (UID: \"2220cab0-84f3-4922-b02c-5d8f12977964\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-zxddg" Mar 13 14:04:52 crc kubenswrapper[4898]: I0313 14:04:52.203092 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpgbv\" (UniqueName: \"kubernetes.io/projected/2220cab0-84f3-4922-b02c-5d8f12977964-kube-api-access-mpgbv\") pod \"cluster-monitoring-operator-6d5b84845-zxddg\" (UID: \"2220cab0-84f3-4922-b02c-5d8f12977964\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-zxddg" Mar 13 14:04:52 crc kubenswrapper[4898]: I0313 14:04:52.203166 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/2220cab0-84f3-4922-b02c-5d8f12977964-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6d5b84845-zxddg\" (UID: \"2220cab0-84f3-4922-b02c-5d8f12977964\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-zxddg" Mar 13 14:04:52 crc kubenswrapper[4898]: I0313 14:04:52.203201 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/2220cab0-84f3-4922-b02c-5d8f12977964-telemetry-config\") pod \"cluster-monitoring-operator-6d5b84845-zxddg\" (UID: \"2220cab0-84f3-4922-b02c-5d8f12977964\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-zxddg" Mar 13 14:04:52 crc 
kubenswrapper[4898]: I0313 14:04:52.205323 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/2220cab0-84f3-4922-b02c-5d8f12977964-telemetry-config\") pod \"cluster-monitoring-operator-6d5b84845-zxddg\" (UID: \"2220cab0-84f3-4922-b02c-5d8f12977964\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-zxddg" Mar 13 14:04:52 crc kubenswrapper[4898]: I0313 14:04:52.212122 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/2220cab0-84f3-4922-b02c-5d8f12977964-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6d5b84845-zxddg\" (UID: \"2220cab0-84f3-4922-b02c-5d8f12977964\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-zxddg" Mar 13 14:04:52 crc kubenswrapper[4898]: I0313 14:04:52.227044 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpgbv\" (UniqueName: \"kubernetes.io/projected/2220cab0-84f3-4922-b02c-5d8f12977964-kube-api-access-mpgbv\") pod \"cluster-monitoring-operator-6d5b84845-zxddg\" (UID: \"2220cab0-84f3-4922-b02c-5d8f12977964\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-zxddg" Mar 13 14:04:52 crc kubenswrapper[4898]: I0313 14:04:52.364035 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-zxddg" Mar 13 14:04:52 crc kubenswrapper[4898]: I0313 14:04:52.710814 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6d5b84845-zxddg"] Mar 13 14:04:52 crc kubenswrapper[4898]: I0313 14:04:52.721966 4898 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 14:04:53 crc kubenswrapper[4898]: I0313 14:04:53.269215 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-zxddg" event={"ID":"2220cab0-84f3-4922-b02c-5d8f12977964","Type":"ContainerStarted","Data":"b7f230cc972a4c7d51b20e2fd5028954f8c1a42b11ede3bd192ab358d61ebcd3"} Mar 13 14:04:55 crc kubenswrapper[4898]: I0313 14:04:55.002215 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-4p7wt"] Mar 13 14:04:55 crc kubenswrapper[4898]: I0313 14:04:55.003086 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-4p7wt" Mar 13 14:04:55 crc kubenswrapper[4898]: I0313 14:04:55.005437 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls" Mar 13 14:04:55 crc kubenswrapper[4898]: I0313 14:04:55.006005 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-dockercfg-r4ldw" Mar 13 14:04:55 crc kubenswrapper[4898]: I0313 14:04:55.016749 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-4p7wt"] Mar 13 14:04:55 crc kubenswrapper[4898]: I0313 14:04:55.137986 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/802396a8-633d-4f86-b77b-c25e9c76cc7a-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-4p7wt\" (UID: \"802396a8-633d-4f86-b77b-c25e9c76cc7a\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-4p7wt" Mar 13 14:04:55 crc kubenswrapper[4898]: I0313 14:04:55.239365 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/802396a8-633d-4f86-b77b-c25e9c76cc7a-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-4p7wt\" (UID: \"802396a8-633d-4f86-b77b-c25e9c76cc7a\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-4p7wt" Mar 13 14:04:55 crc kubenswrapper[4898]: I0313 14:04:55.244168 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/802396a8-633d-4f86-b77b-c25e9c76cc7a-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-4p7wt\" (UID: \"802396a8-633d-4f86-b77b-c25e9c76cc7a\") " 
pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-4p7wt" Mar 13 14:04:55 crc kubenswrapper[4898]: I0313 14:04:55.280150 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-zxddg" event={"ID":"2220cab0-84f3-4922-b02c-5d8f12977964","Type":"ContainerStarted","Data":"b188a8d6bb2352db628138d15008095ba61d97259fa5815b5f6bd57c86bea0b0"} Mar 13 14:04:55 crc kubenswrapper[4898]: I0313 14:04:55.296265 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-zxddg" podStartSLOduration=2.609727623 podStartE2EDuration="4.296251043s" podCreationTimestamp="2026-03-13 14:04:51 +0000 UTC" firstStartedPulling="2026-03-13 14:04:52.721671384 +0000 UTC m=+527.723259643" lastFinishedPulling="2026-03-13 14:04:54.408194814 +0000 UTC m=+529.409783063" observedRunningTime="2026-03-13 14:04:55.294368304 +0000 UTC m=+530.295956553" watchObservedRunningTime="2026-03-13 14:04:55.296251043 +0000 UTC m=+530.297839282" Mar 13 14:04:55 crc kubenswrapper[4898]: I0313 14:04:55.316136 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-4p7wt" Mar 13 14:04:55 crc kubenswrapper[4898]: I0313 14:04:55.521683 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-4p7wt"] Mar 13 14:04:56 crc kubenswrapper[4898]: I0313 14:04:56.294838 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-4p7wt" event={"ID":"802396a8-633d-4f86-b77b-c25e9c76cc7a","Type":"ContainerStarted","Data":"9c6bbceaced98e6e3b8200c68801f3fceccd91684cb9fbe4d7870b1d45ee089b"} Mar 13 14:04:57 crc kubenswrapper[4898]: I0313 14:04:57.303180 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-4p7wt" event={"ID":"802396a8-633d-4f86-b77b-c25e9c76cc7a","Type":"ContainerStarted","Data":"67fef4a54132e1a45429506eabb64bc2f0135c568ba56a79d6372049a23edbc8"} Mar 13 14:04:57 crc kubenswrapper[4898]: I0313 14:04:57.303645 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-4p7wt" Mar 13 14:04:57 crc kubenswrapper[4898]: I0313 14:04:57.314710 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-4p7wt" Mar 13 14:04:57 crc kubenswrapper[4898]: I0313 14:04:57.324425 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-4p7wt" podStartSLOduration=2.013533207 podStartE2EDuration="3.324401316s" podCreationTimestamp="2026-03-13 14:04:54 +0000 UTC" firstStartedPulling="2026-03-13 14:04:55.531158458 +0000 UTC m=+530.532746697" lastFinishedPulling="2026-03-13 14:04:56.842026567 +0000 UTC m=+531.843614806" observedRunningTime="2026-03-13 14:04:57.323957954 +0000 UTC 
m=+532.325546223" watchObservedRunningTime="2026-03-13 14:04:57.324401316 +0000 UTC m=+532.325989595" Mar 13 14:04:58 crc kubenswrapper[4898]: I0313 14:04:58.083779 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-db54df47d-tqhcl"] Mar 13 14:04:58 crc kubenswrapper[4898]: I0313 14:04:58.085854 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-db54df47d-tqhcl" Mar 13 14:04:58 crc kubenswrapper[4898]: I0313 14:04:58.090355 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-dockercfg-q4z7r" Mar 13 14:04:58 crc kubenswrapper[4898]: I0313 14:04:58.090655 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls" Mar 13 14:04:58 crc kubenswrapper[4898]: I0313 14:04:58.090808 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config" Mar 13 14:04:58 crc kubenswrapper[4898]: I0313 14:04:58.091549 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca" Mar 13 14:04:58 crc kubenswrapper[4898]: I0313 14:04:58.092666 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-db54df47d-tqhcl"] Mar 13 14:04:58 crc kubenswrapper[4898]: I0313 14:04:58.183630 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7b49\" (UniqueName: \"kubernetes.io/projected/64542ec8-7d20-45ec-8e4f-8f5adcfb2c2b-kube-api-access-h7b49\") pod \"prometheus-operator-db54df47d-tqhcl\" (UID: \"64542ec8-7d20-45ec-8e4f-8f5adcfb2c2b\") " pod="openshift-monitoring/prometheus-operator-db54df47d-tqhcl" Mar 13 14:04:58 crc kubenswrapper[4898]: I0313 14:04:58.183725 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/64542ec8-7d20-45ec-8e4f-8f5adcfb2c2b-prometheus-operator-tls\") pod \"prometheus-operator-db54df47d-tqhcl\" (UID: \"64542ec8-7d20-45ec-8e4f-8f5adcfb2c2b\") " pod="openshift-monitoring/prometheus-operator-db54df47d-tqhcl" Mar 13 14:04:58 crc kubenswrapper[4898]: I0313 14:04:58.183754 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/64542ec8-7d20-45ec-8e4f-8f5adcfb2c2b-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-db54df47d-tqhcl\" (UID: \"64542ec8-7d20-45ec-8e4f-8f5adcfb2c2b\") " pod="openshift-monitoring/prometheus-operator-db54df47d-tqhcl" Mar 13 14:04:58 crc kubenswrapper[4898]: I0313 14:04:58.183787 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/64542ec8-7d20-45ec-8e4f-8f5adcfb2c2b-metrics-client-ca\") pod \"prometheus-operator-db54df47d-tqhcl\" (UID: \"64542ec8-7d20-45ec-8e4f-8f5adcfb2c2b\") " pod="openshift-monitoring/prometheus-operator-db54df47d-tqhcl" Mar 13 14:04:58 crc kubenswrapper[4898]: I0313 14:04:58.285072 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7b49\" (UniqueName: \"kubernetes.io/projected/64542ec8-7d20-45ec-8e4f-8f5adcfb2c2b-kube-api-access-h7b49\") pod \"prometheus-operator-db54df47d-tqhcl\" (UID: \"64542ec8-7d20-45ec-8e4f-8f5adcfb2c2b\") " pod="openshift-monitoring/prometheus-operator-db54df47d-tqhcl" Mar 13 14:04:58 crc kubenswrapper[4898]: I0313 14:04:58.285547 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/64542ec8-7d20-45ec-8e4f-8f5adcfb2c2b-prometheus-operator-tls\") pod \"prometheus-operator-db54df47d-tqhcl\" (UID: 
\"64542ec8-7d20-45ec-8e4f-8f5adcfb2c2b\") " pod="openshift-monitoring/prometheus-operator-db54df47d-tqhcl" Mar 13 14:04:58 crc kubenswrapper[4898]: I0313 14:04:58.285798 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/64542ec8-7d20-45ec-8e4f-8f5adcfb2c2b-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-db54df47d-tqhcl\" (UID: \"64542ec8-7d20-45ec-8e4f-8f5adcfb2c2b\") " pod="openshift-monitoring/prometheus-operator-db54df47d-tqhcl" Mar 13 14:04:58 crc kubenswrapper[4898]: I0313 14:04:58.286065 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/64542ec8-7d20-45ec-8e4f-8f5adcfb2c2b-metrics-client-ca\") pod \"prometheus-operator-db54df47d-tqhcl\" (UID: \"64542ec8-7d20-45ec-8e4f-8f5adcfb2c2b\") " pod="openshift-monitoring/prometheus-operator-db54df47d-tqhcl" Mar 13 14:04:58 crc kubenswrapper[4898]: I0313 14:04:58.286955 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/64542ec8-7d20-45ec-8e4f-8f5adcfb2c2b-metrics-client-ca\") pod \"prometheus-operator-db54df47d-tqhcl\" (UID: \"64542ec8-7d20-45ec-8e4f-8f5adcfb2c2b\") " pod="openshift-monitoring/prometheus-operator-db54df47d-tqhcl" Mar 13 14:04:58 crc kubenswrapper[4898]: I0313 14:04:58.292238 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/64542ec8-7d20-45ec-8e4f-8f5adcfb2c2b-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-db54df47d-tqhcl\" (UID: \"64542ec8-7d20-45ec-8e4f-8f5adcfb2c2b\") " pod="openshift-monitoring/prometheus-operator-db54df47d-tqhcl" Mar 13 14:04:58 crc kubenswrapper[4898]: I0313 14:04:58.296653 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/64542ec8-7d20-45ec-8e4f-8f5adcfb2c2b-prometheus-operator-tls\") pod \"prometheus-operator-db54df47d-tqhcl\" (UID: \"64542ec8-7d20-45ec-8e4f-8f5adcfb2c2b\") " pod="openshift-monitoring/prometheus-operator-db54df47d-tqhcl" Mar 13 14:04:58 crc kubenswrapper[4898]: I0313 14:04:58.307655 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7b49\" (UniqueName: \"kubernetes.io/projected/64542ec8-7d20-45ec-8e4f-8f5adcfb2c2b-kube-api-access-h7b49\") pod \"prometheus-operator-db54df47d-tqhcl\" (UID: \"64542ec8-7d20-45ec-8e4f-8f5adcfb2c2b\") " pod="openshift-monitoring/prometheus-operator-db54df47d-tqhcl" Mar 13 14:04:58 crc kubenswrapper[4898]: I0313 14:04:58.402060 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-db54df47d-tqhcl" Mar 13 14:04:58 crc kubenswrapper[4898]: I0313 14:04:58.842074 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-db54df47d-tqhcl"] Mar 13 14:04:58 crc kubenswrapper[4898]: W0313 14:04:58.848415 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64542ec8_7d20_45ec_8e4f_8f5adcfb2c2b.slice/crio-f08a4b189e50f165c2ac13cee27546aac8fe8a1d5a6f40a3aec2567da98e4d3c WatchSource:0}: Error finding container f08a4b189e50f165c2ac13cee27546aac8fe8a1d5a6f40a3aec2567da98e4d3c: Status 404 returned error can't find the container with id f08a4b189e50f165c2ac13cee27546aac8fe8a1d5a6f40a3aec2567da98e4d3c Mar 13 14:04:59 crc kubenswrapper[4898]: I0313 14:04:59.318754 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-db54df47d-tqhcl" event={"ID":"64542ec8-7d20-45ec-8e4f-8f5adcfb2c2b","Type":"ContainerStarted","Data":"f08a4b189e50f165c2ac13cee27546aac8fe8a1d5a6f40a3aec2567da98e4d3c"} Mar 13 14:05:01 crc 
kubenswrapper[4898]: I0313 14:05:01.335416 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-db54df47d-tqhcl" event={"ID":"64542ec8-7d20-45ec-8e4f-8f5adcfb2c2b","Type":"ContainerStarted","Data":"db2a83a4c10efc2d1a302b2a613742214dea9a2f0f4a6b9987aca613fdbaad98"} Mar 13 14:05:01 crc kubenswrapper[4898]: I0313 14:05:01.338201 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-db54df47d-tqhcl" event={"ID":"64542ec8-7d20-45ec-8e4f-8f5adcfb2c2b","Type":"ContainerStarted","Data":"aeba8c5df189e23876e9ca8668ce79efbe2d30b2a56230f3ccaf934da7a40cba"} Mar 13 14:05:01 crc kubenswrapper[4898]: I0313 14:05:01.355953 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-db54df47d-tqhcl" podStartSLOduration=1.8778979 podStartE2EDuration="3.35592583s" podCreationTimestamp="2026-03-13 14:04:58 +0000 UTC" firstStartedPulling="2026-03-13 14:04:58.851113039 +0000 UTC m=+533.852701288" lastFinishedPulling="2026-03-13 14:05:00.329140969 +0000 UTC m=+535.330729218" observedRunningTime="2026-03-13 14:05:01.354832982 +0000 UTC m=+536.356421241" watchObservedRunningTime="2026-03-13 14:05:01.35592583 +0000 UTC m=+536.357514109" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.446510 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-566fddb674-q6xsd"] Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.448106 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-566fddb674-q6xsd" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.450168 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-tls" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.451083 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-kube-rbac-proxy-config" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.451429 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-dockercfg-xnn9s" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.465960 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-566fddb674-q6xsd"] Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.468982 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-777cb5bd5d-qnh2f"] Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.469920 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-qnh2f" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.472776 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.473497 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.473628 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-dockercfg-zk4th" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.474345 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.478181 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/063cd9dd-e128-4dd5-af7b-a3b79b93c61a-metrics-client-ca\") pod \"kube-state-metrics-777cb5bd5d-qnh2f\" (UID: \"063cd9dd-e128-4dd5-af7b-a3b79b93c61a\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-qnh2f" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.478227 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/063cd9dd-e128-4dd5-af7b-a3b79b93c61a-volume-directive-shadow\") pod \"kube-state-metrics-777cb5bd5d-qnh2f\" (UID: \"063cd9dd-e128-4dd5-af7b-a3b79b93c61a\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-qnh2f" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.478264 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/05679ba1-ef84-46c5-803d-22379bb824dd-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-566fddb674-q6xsd\" (UID: \"05679ba1-ef84-46c5-803d-22379bb824dd\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-q6xsd" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.478289 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/063cd9dd-e128-4dd5-af7b-a3b79b93c61a-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-qnh2f\" (UID: \"063cd9dd-e128-4dd5-af7b-a3b79b93c61a\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-qnh2f" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.478312 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkj4b\" (UniqueName: \"kubernetes.io/projected/063cd9dd-e128-4dd5-af7b-a3b79b93c61a-kube-api-access-wkj4b\") pod \"kube-state-metrics-777cb5bd5d-qnh2f\" (UID: \"063cd9dd-e128-4dd5-af7b-a3b79b93c61a\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-qnh2f" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.478330 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/063cd9dd-e128-4dd5-af7b-a3b79b93c61a-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-777cb5bd5d-qnh2f\" (UID: \"063cd9dd-e128-4dd5-af7b-a3b79b93c61a\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-qnh2f" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.478349 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/05679ba1-ef84-46c5-803d-22379bb824dd-metrics-client-ca\") pod 
\"openshift-state-metrics-566fddb674-q6xsd\" (UID: \"05679ba1-ef84-46c5-803d-22379bb824dd\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-q6xsd" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.478373 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/05679ba1-ef84-46c5-803d-22379bb824dd-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-q6xsd\" (UID: \"05679ba1-ef84-46c5-803d-22379bb824dd\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-q6xsd" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.478396 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/063cd9dd-e128-4dd5-af7b-a3b79b93c61a-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-777cb5bd5d-qnh2f\" (UID: \"063cd9dd-e128-4dd5-af7b-a3b79b93c61a\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-qnh2f" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.478425 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcd22\" (UniqueName: \"kubernetes.io/projected/05679ba1-ef84-46c5-803d-22379bb824dd-kube-api-access-tcd22\") pod \"openshift-state-metrics-566fddb674-q6xsd\" (UID: \"05679ba1-ef84-46c5-803d-22379bb824dd\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-q6xsd" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.490891 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-h4spr"] Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.491823 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-h4spr" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.496212 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-tls" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.496608 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.500077 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-dockercfg-xx4nn" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.508550 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-777cb5bd5d-qnh2f"] Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.579394 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/063cd9dd-e128-4dd5-af7b-a3b79b93c61a-volume-directive-shadow\") pod \"kube-state-metrics-777cb5bd5d-qnh2f\" (UID: \"063cd9dd-e128-4dd5-af7b-a3b79b93c61a\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-qnh2f" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.579461 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/05679ba1-ef84-46c5-803d-22379bb824dd-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-566fddb674-q6xsd\" (UID: \"05679ba1-ef84-46c5-803d-22379bb824dd\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-q6xsd" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.579508 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/063cd9dd-e128-4dd5-af7b-a3b79b93c61a-kube-state-metrics-tls\") pod 
\"kube-state-metrics-777cb5bd5d-qnh2f\" (UID: \"063cd9dd-e128-4dd5-af7b-a3b79b93c61a\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-qnh2f" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.579547 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d3220aa6-97e3-4ea7-8959-fd0d11002f32-metrics-client-ca\") pod \"node-exporter-h4spr\" (UID: \"d3220aa6-97e3-4ea7-8959-fd0d11002f32\") " pod="openshift-monitoring/node-exporter-h4spr" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.579580 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkj4b\" (UniqueName: \"kubernetes.io/projected/063cd9dd-e128-4dd5-af7b-a3b79b93c61a-kube-api-access-wkj4b\") pod \"kube-state-metrics-777cb5bd5d-qnh2f\" (UID: \"063cd9dd-e128-4dd5-af7b-a3b79b93c61a\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-qnh2f" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.579605 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/063cd9dd-e128-4dd5-af7b-a3b79b93c61a-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-777cb5bd5d-qnh2f\" (UID: \"063cd9dd-e128-4dd5-af7b-a3b79b93c61a\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-qnh2f" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.579626 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/05679ba1-ef84-46c5-803d-22379bb824dd-metrics-client-ca\") pod \"openshift-state-metrics-566fddb674-q6xsd\" (UID: \"05679ba1-ef84-46c5-803d-22379bb824dd\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-q6xsd" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.579652 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d3220aa6-97e3-4ea7-8959-fd0d11002f32-root\") pod \"node-exporter-h4spr\" (UID: \"d3220aa6-97e3-4ea7-8959-fd0d11002f32\") " pod="openshift-monitoring/node-exporter-h4spr" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.579686 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/05679ba1-ef84-46c5-803d-22379bb824dd-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-q6xsd\" (UID: \"05679ba1-ef84-46c5-803d-22379bb824dd\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-q6xsd" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.579714 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d3220aa6-97e3-4ea7-8959-fd0d11002f32-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-h4spr\" (UID: \"d3220aa6-97e3-4ea7-8959-fd0d11002f32\") " pod="openshift-monitoring/node-exporter-h4spr" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.579735 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/063cd9dd-e128-4dd5-af7b-a3b79b93c61a-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-777cb5bd5d-qnh2f\" (UID: \"063cd9dd-e128-4dd5-af7b-a3b79b93c61a\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-qnh2f" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.579757 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klp29\" (UniqueName: \"kubernetes.io/projected/d3220aa6-97e3-4ea7-8959-fd0d11002f32-kube-api-access-klp29\") pod \"node-exporter-h4spr\" (UID: 
\"d3220aa6-97e3-4ea7-8959-fd0d11002f32\") " pod="openshift-monitoring/node-exporter-h4spr" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.579782 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d3220aa6-97e3-4ea7-8959-fd0d11002f32-sys\") pod \"node-exporter-h4spr\" (UID: \"d3220aa6-97e3-4ea7-8959-fd0d11002f32\") " pod="openshift-monitoring/node-exporter-h4spr" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.579803 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcd22\" (UniqueName: \"kubernetes.io/projected/05679ba1-ef84-46c5-803d-22379bb824dd-kube-api-access-tcd22\") pod \"openshift-state-metrics-566fddb674-q6xsd\" (UID: \"05679ba1-ef84-46c5-803d-22379bb824dd\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-q6xsd" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.579831 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d3220aa6-97e3-4ea7-8959-fd0d11002f32-node-exporter-textfile\") pod \"node-exporter-h4spr\" (UID: \"d3220aa6-97e3-4ea7-8959-fd0d11002f32\") " pod="openshift-monitoring/node-exporter-h4spr" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.579862 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d3220aa6-97e3-4ea7-8959-fd0d11002f32-node-exporter-tls\") pod \"node-exporter-h4spr\" (UID: \"d3220aa6-97e3-4ea7-8959-fd0d11002f32\") " pod="openshift-monitoring/node-exporter-h4spr" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.579884 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/063cd9dd-e128-4dd5-af7b-a3b79b93c61a-metrics-client-ca\") pod 
\"kube-state-metrics-777cb5bd5d-qnh2f\" (UID: \"063cd9dd-e128-4dd5-af7b-a3b79b93c61a\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-qnh2f" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.579929 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d3220aa6-97e3-4ea7-8959-fd0d11002f32-node-exporter-wtmp\") pod \"node-exporter-h4spr\" (UID: \"d3220aa6-97e3-4ea7-8959-fd0d11002f32\") " pod="openshift-monitoring/node-exporter-h4spr" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.580046 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/063cd9dd-e128-4dd5-af7b-a3b79b93c61a-volume-directive-shadow\") pod \"kube-state-metrics-777cb5bd5d-qnh2f\" (UID: \"063cd9dd-e128-4dd5-af7b-a3b79b93c61a\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-qnh2f" Mar 13 14:05:03 crc kubenswrapper[4898]: E0313 14:05:03.580140 4898 secret.go:188] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Mar 13 14:05:03 crc kubenswrapper[4898]: E0313 14:05:03.580210 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/063cd9dd-e128-4dd5-af7b-a3b79b93c61a-kube-state-metrics-tls podName:063cd9dd-e128-4dd5-af7b-a3b79b93c61a nodeName:}" failed. No retries permitted until 2026-03-13 14:05:04.080187546 +0000 UTC m=+539.081775785 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/063cd9dd-e128-4dd5-af7b-a3b79b93c61a-kube-state-metrics-tls") pod "kube-state-metrics-777cb5bd5d-qnh2f" (UID: "063cd9dd-e128-4dd5-af7b-a3b79b93c61a") : secret "kube-state-metrics-tls" not found Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.581013 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/063cd9dd-e128-4dd5-af7b-a3b79b93c61a-metrics-client-ca\") pod \"kube-state-metrics-777cb5bd5d-qnh2f\" (UID: \"063cd9dd-e128-4dd5-af7b-a3b79b93c61a\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-qnh2f" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.581019 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/063cd9dd-e128-4dd5-af7b-a3b79b93c61a-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-777cb5bd5d-qnh2f\" (UID: \"063cd9dd-e128-4dd5-af7b-a3b79b93c61a\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-qnh2f" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.581027 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/05679ba1-ef84-46c5-803d-22379bb824dd-metrics-client-ca\") pod \"openshift-state-metrics-566fddb674-q6xsd\" (UID: \"05679ba1-ef84-46c5-803d-22379bb824dd\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-q6xsd" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.587808 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/05679ba1-ef84-46c5-803d-22379bb824dd-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-566fddb674-q6xsd\" (UID: 
\"05679ba1-ef84-46c5-803d-22379bb824dd\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-q6xsd" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.591546 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/05679ba1-ef84-46c5-803d-22379bb824dd-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-q6xsd\" (UID: \"05679ba1-ef84-46c5-803d-22379bb824dd\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-q6xsd" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.594355 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/063cd9dd-e128-4dd5-af7b-a3b79b93c61a-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-777cb5bd5d-qnh2f\" (UID: \"063cd9dd-e128-4dd5-af7b-a3b79b93c61a\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-qnh2f" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.601219 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkj4b\" (UniqueName: \"kubernetes.io/projected/063cd9dd-e128-4dd5-af7b-a3b79b93c61a-kube-api-access-wkj4b\") pod \"kube-state-metrics-777cb5bd5d-qnh2f\" (UID: \"063cd9dd-e128-4dd5-af7b-a3b79b93c61a\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-qnh2f" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.603495 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcd22\" (UniqueName: \"kubernetes.io/projected/05679ba1-ef84-46c5-803d-22379bb824dd-kube-api-access-tcd22\") pod \"openshift-state-metrics-566fddb674-q6xsd\" (UID: \"05679ba1-ef84-46c5-803d-22379bb824dd\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-q6xsd" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.681364 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d3220aa6-97e3-4ea7-8959-fd0d11002f32-metrics-client-ca\") pod \"node-exporter-h4spr\" (UID: \"d3220aa6-97e3-4ea7-8959-fd0d11002f32\") " pod="openshift-monitoring/node-exporter-h4spr" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.681425 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d3220aa6-97e3-4ea7-8959-fd0d11002f32-root\") pod \"node-exporter-h4spr\" (UID: \"d3220aa6-97e3-4ea7-8959-fd0d11002f32\") " pod="openshift-monitoring/node-exporter-h4spr" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.681452 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d3220aa6-97e3-4ea7-8959-fd0d11002f32-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-h4spr\" (UID: \"d3220aa6-97e3-4ea7-8959-fd0d11002f32\") " pod="openshift-monitoring/node-exporter-h4spr" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.681474 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klp29\" (UniqueName: \"kubernetes.io/projected/d3220aa6-97e3-4ea7-8959-fd0d11002f32-kube-api-access-klp29\") pod \"node-exporter-h4spr\" (UID: \"d3220aa6-97e3-4ea7-8959-fd0d11002f32\") " pod="openshift-monitoring/node-exporter-h4spr" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.681492 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d3220aa6-97e3-4ea7-8959-fd0d11002f32-sys\") pod \"node-exporter-h4spr\" (UID: \"d3220aa6-97e3-4ea7-8959-fd0d11002f32\") " pod="openshift-monitoring/node-exporter-h4spr" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.681511 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: 
\"kubernetes.io/empty-dir/d3220aa6-97e3-4ea7-8959-fd0d11002f32-node-exporter-textfile\") pod \"node-exporter-h4spr\" (UID: \"d3220aa6-97e3-4ea7-8959-fd0d11002f32\") " pod="openshift-monitoring/node-exporter-h4spr"
Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.681536 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d3220aa6-97e3-4ea7-8959-fd0d11002f32-node-exporter-tls\") pod \"node-exporter-h4spr\" (UID: \"d3220aa6-97e3-4ea7-8959-fd0d11002f32\") " pod="openshift-monitoring/node-exporter-h4spr"
Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.681554 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d3220aa6-97e3-4ea7-8959-fd0d11002f32-node-exporter-wtmp\") pod \"node-exporter-h4spr\" (UID: \"d3220aa6-97e3-4ea7-8959-fd0d11002f32\") " pod="openshift-monitoring/node-exporter-h4spr"
Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.681680 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d3220aa6-97e3-4ea7-8959-fd0d11002f32-root\") pod \"node-exporter-h4spr\" (UID: \"d3220aa6-97e3-4ea7-8959-fd0d11002f32\") " pod="openshift-monitoring/node-exporter-h4spr"
Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.681760 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d3220aa6-97e3-4ea7-8959-fd0d11002f32-node-exporter-wtmp\") pod \"node-exporter-h4spr\" (UID: \"d3220aa6-97e3-4ea7-8959-fd0d11002f32\") " pod="openshift-monitoring/node-exporter-h4spr"
Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.681813 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d3220aa6-97e3-4ea7-8959-fd0d11002f32-sys\") pod \"node-exporter-h4spr\" (UID: \"d3220aa6-97e3-4ea7-8959-fd0d11002f32\") " pod="openshift-monitoring/node-exporter-h4spr"
Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.682034 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d3220aa6-97e3-4ea7-8959-fd0d11002f32-node-exporter-textfile\") pod \"node-exporter-h4spr\" (UID: \"d3220aa6-97e3-4ea7-8959-fd0d11002f32\") " pod="openshift-monitoring/node-exporter-h4spr"
Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.682095 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d3220aa6-97e3-4ea7-8959-fd0d11002f32-metrics-client-ca\") pod \"node-exporter-h4spr\" (UID: \"d3220aa6-97e3-4ea7-8959-fd0d11002f32\") " pod="openshift-monitoring/node-exporter-h4spr"
Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.684337 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d3220aa6-97e3-4ea7-8959-fd0d11002f32-node-exporter-tls\") pod \"node-exporter-h4spr\" (UID: \"d3220aa6-97e3-4ea7-8959-fd0d11002f32\") " pod="openshift-monitoring/node-exporter-h4spr"
Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.698258 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klp29\" (UniqueName: \"kubernetes.io/projected/d3220aa6-97e3-4ea7-8959-fd0d11002f32-kube-api-access-klp29\") pod \"node-exporter-h4spr\" (UID: \"d3220aa6-97e3-4ea7-8959-fd0d11002f32\") " pod="openshift-monitoring/node-exporter-h4spr"
Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.699584 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d3220aa6-97e3-4ea7-8959-fd0d11002f32-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-h4spr\" (UID: \"d3220aa6-97e3-4ea7-8959-fd0d11002f32\") " pod="openshift-monitoring/node-exporter-h4spr"
Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.769051 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-566fddb674-q6xsd"
Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.805410 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-h4spr"
Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.085259 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/063cd9dd-e128-4dd5-af7b-a3b79b93c61a-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-qnh2f\" (UID: \"063cd9dd-e128-4dd5-af7b-a3b79b93c61a\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-qnh2f"
Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.090472 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/063cd9dd-e128-4dd5-af7b-a3b79b93c61a-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-qnh2f\" (UID: \"063cd9dd-e128-4dd5-af7b-a3b79b93c61a\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-qnh2f"
Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.151348 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-566fddb674-q6xsd"]
Mar 13 14:05:04 crc kubenswrapper[4898]: W0313 14:05:04.157408 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05679ba1_ef84_46c5_803d_22379bb824dd.slice/crio-2f52a456328bec51af8db2165a61503468dfc52b489c55e0b7e01be76c95eb2e WatchSource:0}: Error finding container 2f52a456328bec51af8db2165a61503468dfc52b489c55e0b7e01be76c95eb2e: Status 404 returned error can't find the container with id 2f52a456328bec51af8db2165a61503468dfc52b489c55e0b7e01be76c95eb2e
Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.351460 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-h4spr" event={"ID":"d3220aa6-97e3-4ea7-8959-fd0d11002f32","Type":"ContainerStarted","Data":"21c8b6e4b700e648f1fdd42150db9b3759c42657a6c27a2795f3cd6529da1c09"}
Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.352921 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-q6xsd" event={"ID":"05679ba1-ef84-46c5-803d-22379bb824dd","Type":"ContainerStarted","Data":"2586619c7d0dd6fbc1322feadf7055f3590cb5b1d46371a9eb057bf0a29befc8"}
Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.352963 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-q6xsd" event={"ID":"05679ba1-ef84-46c5-803d-22379bb824dd","Type":"ContainerStarted","Data":"2f52a456328bec51af8db2165a61503468dfc52b489c55e0b7e01be76c95eb2e"}
Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.383456 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-qnh2f"
Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.561089 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.562815 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.565815 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-web"
Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.566910 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy"
Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.567101 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-metric"
Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.567669 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-web-config"
Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.593030 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/7b661f3a-62af-4aba-b8b3-e73b32d3da2d-config-volume\") pod \"alertmanager-main-0\" (UID: \"7b661f3a-62af-4aba-b8b3-e73b32d3da2d\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.593065 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7b661f3a-62af-4aba-b8b3-e73b32d3da2d-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"7b661f3a-62af-4aba-b8b3-e73b32d3da2d\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.593090 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b661f3a-62af-4aba-b8b3-e73b32d3da2d-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"7b661f3a-62af-4aba-b8b3-e73b32d3da2d\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.593108 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7b661f3a-62af-4aba-b8b3-e73b32d3da2d-config-out\") pod \"alertmanager-main-0\" (UID: \"7b661f3a-62af-4aba-b8b3-e73b32d3da2d\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.593127 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7b661f3a-62af-4aba-b8b3-e73b32d3da2d-tls-assets\") pod \"alertmanager-main-0\" (UID: \"7b661f3a-62af-4aba-b8b3-e73b32d3da2d\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.593141 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/7b661f3a-62af-4aba-b8b3-e73b32d3da2d-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"7b661f3a-62af-4aba-b8b3-e73b32d3da2d\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.593161 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7b661f3a-62af-4aba-b8b3-e73b32d3da2d-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"7b661f3a-62af-4aba-b8b3-e73b32d3da2d\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.593180 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/7b661f3a-62af-4aba-b8b3-e73b32d3da2d-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"7b661f3a-62af-4aba-b8b3-e73b32d3da2d\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.593205 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tknc\" (UniqueName: \"kubernetes.io/projected/7b661f3a-62af-4aba-b8b3-e73b32d3da2d-kube-api-access-7tknc\") pod \"alertmanager-main-0\" (UID: \"7b661f3a-62af-4aba-b8b3-e73b32d3da2d\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.593221 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/7b661f3a-62af-4aba-b8b3-e73b32d3da2d-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"7b661f3a-62af-4aba-b8b3-e73b32d3da2d\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.593239 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7b661f3a-62af-4aba-b8b3-e73b32d3da2d-web-config\") pod \"alertmanager-main-0\" (UID: \"7b661f3a-62af-4aba-b8b3-e73b32d3da2d\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.593256 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7b661f3a-62af-4aba-b8b3-e73b32d3da2d-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"7b661f3a-62af-4aba-b8b3-e73b32d3da2d\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.600383 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls"
Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.601316 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls-assets-0"
Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.602911 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-dockercfg-gkpzt"
Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.603061 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-generated"
Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.620001 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"alertmanager-trusted-ca-bundle"
Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.623295 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.694991 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tknc\" (UniqueName: \"kubernetes.io/projected/7b661f3a-62af-4aba-b8b3-e73b32d3da2d-kube-api-access-7tknc\") pod \"alertmanager-main-0\" (UID: \"7b661f3a-62af-4aba-b8b3-e73b32d3da2d\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.695045 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/7b661f3a-62af-4aba-b8b3-e73b32d3da2d-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"7b661f3a-62af-4aba-b8b3-e73b32d3da2d\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.695078 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7b661f3a-62af-4aba-b8b3-e73b32d3da2d-web-config\") pod \"alertmanager-main-0\" (UID: \"7b661f3a-62af-4aba-b8b3-e73b32d3da2d\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.695109 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7b661f3a-62af-4aba-b8b3-e73b32d3da2d-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"7b661f3a-62af-4aba-b8b3-e73b32d3da2d\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.695152 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/7b661f3a-62af-4aba-b8b3-e73b32d3da2d-config-volume\") pod \"alertmanager-main-0\" (UID: \"7b661f3a-62af-4aba-b8b3-e73b32d3da2d\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.695169 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7b661f3a-62af-4aba-b8b3-e73b32d3da2d-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"7b661f3a-62af-4aba-b8b3-e73b32d3da2d\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.695190 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b661f3a-62af-4aba-b8b3-e73b32d3da2d-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"7b661f3a-62af-4aba-b8b3-e73b32d3da2d\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.695210 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7b661f3a-62af-4aba-b8b3-e73b32d3da2d-config-out\") pod \"alertmanager-main-0\" (UID: \"7b661f3a-62af-4aba-b8b3-e73b32d3da2d\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.695236 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7b661f3a-62af-4aba-b8b3-e73b32d3da2d-tls-assets\") pod \"alertmanager-main-0\" (UID: \"7b661f3a-62af-4aba-b8b3-e73b32d3da2d\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.695259 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/7b661f3a-62af-4aba-b8b3-e73b32d3da2d-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"7b661f3a-62af-4aba-b8b3-e73b32d3da2d\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.695291 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7b661f3a-62af-4aba-b8b3-e73b32d3da2d-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"7b661f3a-62af-4aba-b8b3-e73b32d3da2d\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.695321 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/7b661f3a-62af-4aba-b8b3-e73b32d3da2d-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"7b661f3a-62af-4aba-b8b3-e73b32d3da2d\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.696573 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/7b661f3a-62af-4aba-b8b3-e73b32d3da2d-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"7b661f3a-62af-4aba-b8b3-e73b32d3da2d\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.697060 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7b661f3a-62af-4aba-b8b3-e73b32d3da2d-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"7b661f3a-62af-4aba-b8b3-e73b32d3da2d\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.697275 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b661f3a-62af-4aba-b8b3-e73b32d3da2d-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"7b661f3a-62af-4aba-b8b3-e73b32d3da2d\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.701570 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7b661f3a-62af-4aba-b8b3-e73b32d3da2d-web-config\") pod \"alertmanager-main-0\" (UID: \"7b661f3a-62af-4aba-b8b3-e73b32d3da2d\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.701624 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/7b661f3a-62af-4aba-b8b3-e73b32d3da2d-config-volume\") pod \"alertmanager-main-0\" (UID: \"7b661f3a-62af-4aba-b8b3-e73b32d3da2d\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.701726 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7b661f3a-62af-4aba-b8b3-e73b32d3da2d-tls-assets\") pod \"alertmanager-main-0\" (UID: \"7b661f3a-62af-4aba-b8b3-e73b32d3da2d\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.702114 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7b661f3a-62af-4aba-b8b3-e73b32d3da2d-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"7b661f3a-62af-4aba-b8b3-e73b32d3da2d\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.702959 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/7b661f3a-62af-4aba-b8b3-e73b32d3da2d-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"7b661f3a-62af-4aba-b8b3-e73b32d3da2d\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.703674 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7b661f3a-62af-4aba-b8b3-e73b32d3da2d-config-out\") pod \"alertmanager-main-0\" (UID: \"7b661f3a-62af-4aba-b8b3-e73b32d3da2d\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.721875 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/7b661f3a-62af-4aba-b8b3-e73b32d3da2d-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"7b661f3a-62af-4aba-b8b3-e73b32d3da2d\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.731458 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7b661f3a-62af-4aba-b8b3-e73b32d3da2d-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"7b661f3a-62af-4aba-b8b3-e73b32d3da2d\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.732130 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tknc\" (UniqueName: \"kubernetes.io/projected/7b661f3a-62af-4aba-b8b3-e73b32d3da2d-kube-api-access-7tknc\") pod \"alertmanager-main-0\" (UID: \"7b661f3a-62af-4aba-b8b3-e73b32d3da2d\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.824176 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-777cb5bd5d-qnh2f"]
Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.929389 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.358083 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-q6xsd" event={"ID":"05679ba1-ef84-46c5-803d-22379bb824dd","Type":"ContainerStarted","Data":"75ab1ac97a281c9bd3ebcc8290291a51649968261363be4992a9a961f9786e53"}
Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.360015 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-h4spr" event={"ID":"d3220aa6-97e3-4ea7-8959-fd0d11002f32","Type":"ContainerStarted","Data":"25b29f01adfa51a37e38274268b70502cdbef39be471bd5cb6aa12fb2ed45f71"}
Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.361875 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-qnh2f" event={"ID":"063cd9dd-e128-4dd5-af7b-a3b79b93c61a","Type":"ContainerStarted","Data":"59238ce32ef31532c8d7e5cf21f8a8d4e41be0cc85ab7ad230443bcd6792c483"}
Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.469055 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.514019 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp"]
Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.516164 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp"
Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.521143 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-rules"
Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.521631 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-metrics"
Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.522943 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-grpc-tls-6939h072j9qpn"
Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.522952 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-dockercfg-hpxqn"
Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.526375 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy"
Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.526604 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-web"
Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.531533 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-tls"
Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.531989 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp"]
Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.609000 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/05b901e7-b9fc-4403-bcc2-8eeb2731c66f-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7467c7fcf7-hsxhp\" (UID: \"05b901e7-b9fc-4403-bcc2-8eeb2731c66f\") " pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp"
Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.609049 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/05b901e7-b9fc-4403-bcc2-8eeb2731c66f-secret-thanos-querier-tls\") pod \"thanos-querier-7467c7fcf7-hsxhp\" (UID: \"05b901e7-b9fc-4403-bcc2-8eeb2731c66f\") " pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp"
Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.609072 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/05b901e7-b9fc-4403-bcc2-8eeb2731c66f-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7467c7fcf7-hsxhp\" (UID: \"05b901e7-b9fc-4403-bcc2-8eeb2731c66f\") " pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp"
Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.609097 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/05b901e7-b9fc-4403-bcc2-8eeb2731c66f-metrics-client-ca\") pod \"thanos-querier-7467c7fcf7-hsxhp\" (UID: \"05b901e7-b9fc-4403-bcc2-8eeb2731c66f\") " pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp"
Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.609135 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/05b901e7-b9fc-4403-bcc2-8eeb2731c66f-secret-grpc-tls\") pod \"thanos-querier-7467c7fcf7-hsxhp\" (UID: \"05b901e7-b9fc-4403-bcc2-8eeb2731c66f\") " pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp"
Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.609205 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stmhr\" (UniqueName: \"kubernetes.io/projected/05b901e7-b9fc-4403-bcc2-8eeb2731c66f-kube-api-access-stmhr\") pod \"thanos-querier-7467c7fcf7-hsxhp\" (UID: \"05b901e7-b9fc-4403-bcc2-8eeb2731c66f\") " pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp"
Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.609233 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/05b901e7-b9fc-4403-bcc2-8eeb2731c66f-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7467c7fcf7-hsxhp\" (UID: \"05b901e7-b9fc-4403-bcc2-8eeb2731c66f\") " pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp"
Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.609265 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/05b901e7-b9fc-4403-bcc2-8eeb2731c66f-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7467c7fcf7-hsxhp\" (UID: \"05b901e7-b9fc-4403-bcc2-8eeb2731c66f\") " pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp"
Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.710499 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stmhr\" (UniqueName: \"kubernetes.io/projected/05b901e7-b9fc-4403-bcc2-8eeb2731c66f-kube-api-access-stmhr\") pod \"thanos-querier-7467c7fcf7-hsxhp\" (UID: \"05b901e7-b9fc-4403-bcc2-8eeb2731c66f\") " pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp"
Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.710553 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/05b901e7-b9fc-4403-bcc2-8eeb2731c66f-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7467c7fcf7-hsxhp\" (UID: \"05b901e7-b9fc-4403-bcc2-8eeb2731c66f\") " pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp"
Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.710617 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/05b901e7-b9fc-4403-bcc2-8eeb2731c66f-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7467c7fcf7-hsxhp\" (UID: \"05b901e7-b9fc-4403-bcc2-8eeb2731c66f\") " pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp"
Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.710635 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/05b901e7-b9fc-4403-bcc2-8eeb2731c66f-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7467c7fcf7-hsxhp\" (UID: \"05b901e7-b9fc-4403-bcc2-8eeb2731c66f\") " pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp"
Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.711122 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/05b901e7-b9fc-4403-bcc2-8eeb2731c66f-secret-thanos-querier-tls\") pod \"thanos-querier-7467c7fcf7-hsxhp\" (UID: \"05b901e7-b9fc-4403-bcc2-8eeb2731c66f\") " pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp"
Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.711267 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/05b901e7-b9fc-4403-bcc2-8eeb2731c66f-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7467c7fcf7-hsxhp\" (UID: \"05b901e7-b9fc-4403-bcc2-8eeb2731c66f\") " pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp"
Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.711294 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/05b901e7-b9fc-4403-bcc2-8eeb2731c66f-metrics-client-ca\") pod \"thanos-querier-7467c7fcf7-hsxhp\" (UID: \"05b901e7-b9fc-4403-bcc2-8eeb2731c66f\") " pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp"
Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.711325 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/05b901e7-b9fc-4403-bcc2-8eeb2731c66f-secret-grpc-tls\") pod \"thanos-querier-7467c7fcf7-hsxhp\" (UID: \"05b901e7-b9fc-4403-bcc2-8eeb2731c66f\") " pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp"
Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.712346 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-web"
Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.712408 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/05b901e7-b9fc-4403-bcc2-8eeb2731c66f-metrics-client-ca\") pod \"thanos-querier-7467c7fcf7-hsxhp\" (UID: \"05b901e7-b9fc-4403-bcc2-8eeb2731c66f\") " pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp"
Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.712549 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-rules"
Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.712740 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-metrics"
Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.713141 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-tls"
Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.713342 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-grpc-tls-6939h072j9qpn"
Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.713376 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy"
Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.726961 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/05b901e7-b9fc-4403-bcc2-8eeb2731c66f-secret-thanos-querier-tls\") pod \"thanos-querier-7467c7fcf7-hsxhp\" (UID: \"05b901e7-b9fc-4403-bcc2-8eeb2731c66f\") " pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp"
Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.727347 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/05b901e7-b9fc-4403-bcc2-8eeb2731c66f-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7467c7fcf7-hsxhp\" (UID: \"05b901e7-b9fc-4403-bcc2-8eeb2731c66f\") " pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp"
Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.734100 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/05b901e7-b9fc-4403-bcc2-8eeb2731c66f-secret-grpc-tls\") pod \"thanos-querier-7467c7fcf7-hsxhp\" (UID: \"05b901e7-b9fc-4403-bcc2-8eeb2731c66f\") " pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp"
Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.734460 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/05b901e7-b9fc-4403-bcc2-8eeb2731c66f-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7467c7fcf7-hsxhp\" (UID: \"05b901e7-b9fc-4403-bcc2-8eeb2731c66f\") " pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp"
Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.734586 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/05b901e7-b9fc-4403-bcc2-8eeb2731c66f-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7467c7fcf7-hsxhp\" (UID: \"05b901e7-b9fc-4403-bcc2-8eeb2731c66f\") " pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp"
Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.735510 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/05b901e7-b9fc-4403-bcc2-8eeb2731c66f-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7467c7fcf7-hsxhp\" (UID: \"05b901e7-b9fc-4403-bcc2-8eeb2731c66f\") " pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp"
Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.741781 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stmhr\" (UniqueName: \"kubernetes.io/projected/05b901e7-b9fc-4403-bcc2-8eeb2731c66f-kube-api-access-stmhr\") pod \"thanos-querier-7467c7fcf7-hsxhp\" (UID: \"05b901e7-b9fc-4403-bcc2-8eeb2731c66f\") " pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp"
Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.848190 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-dockercfg-hpxqn"
Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.856886 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp"
Mar 13 14:05:06 crc kubenswrapper[4898]: I0313 14:05:06.370472 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-q6xsd" event={"ID":"05679ba1-ef84-46c5-803d-22379bb824dd","Type":"ContainerStarted","Data":"1abd3ecd125e08e504ab669fb6513e625bd9e4c7c236caa9ce8c1e0c8c7c46f3"}
Mar 13 14:05:06 crc kubenswrapper[4898]: I0313 14:05:06.373157 4898 generic.go:334] "Generic (PLEG): container finished" podID="d3220aa6-97e3-4ea7-8959-fd0d11002f32" containerID="25b29f01adfa51a37e38274268b70502cdbef39be471bd5cb6aa12fb2ed45f71" exitCode=0
Mar 13 14:05:06 crc kubenswrapper[4898]: I0313 14:05:06.373231 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-h4spr" event={"ID":"d3220aa6-97e3-4ea7-8959-fd0d11002f32","Type":"ContainerDied","Data":"25b29f01adfa51a37e38274268b70502cdbef39be471bd5cb6aa12fb2ed45f71"}
Mar 13 14:05:06 crc kubenswrapper[4898]: I0313 14:05:06.375344 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7b661f3a-62af-4aba-b8b3-e73b32d3da2d","Type":"ContainerStarted","Data":"44449a25c83109f4bd64c058609de98b6c854b7d955da87087986a77139e839c"}
Mar 13 14:05:06 crc kubenswrapper[4898]: I0313 14:05:06.401326 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-566fddb674-q6xsd" podStartSLOduration=2.023513292 podStartE2EDuration="3.401307515s" podCreationTimestamp="2026-03-13 14:05:03 +0000 UTC" firstStartedPulling="2026-03-13 14:05:04.38487198 +0000 UTC m=+539.386460219" lastFinishedPulling="2026-03-13 14:05:05.762666203 +0000 UTC m=+540.764254442" observedRunningTime="2026-03-13
14:05:06.393784677 +0000 UTC m=+541.395372926" watchObservedRunningTime="2026-03-13 14:05:06.401307515 +0000 UTC m=+541.402895754" Mar 13 14:05:06 crc kubenswrapper[4898]: I0313 14:05:06.821519 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp"] Mar 13 14:05:07 crc kubenswrapper[4898]: I0313 14:05:07.383892 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-h4spr" event={"ID":"d3220aa6-97e3-4ea7-8959-fd0d11002f32","Type":"ContainerStarted","Data":"05f0ba9a6501a12eb8926db452c6f724fbe7fef7b0c2645a67533c1d1046786a"} Mar 13 14:05:07 crc kubenswrapper[4898]: W0313 14:05:07.509421 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05b901e7_b9fc_4403_bcc2_8eeb2731c66f.slice/crio-12ab36caa7395831a23950389b9e637e57a337dae672f023cd9cda5e86131898 WatchSource:0}: Error finding container 12ab36caa7395831a23950389b9e637e57a337dae672f023cd9cda5e86131898: Status 404 returned error can't find the container with id 12ab36caa7395831a23950389b9e637e57a337dae672f023cd9cda5e86131898 Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.259204 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-875645f9-l5trk"] Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.260462 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-875645f9-l5trk" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.274753 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-875645f9-l5trk"] Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.350385 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/441598c2-1b20-4109-8e38-46d414df93d7-service-ca\") pod \"console-875645f9-l5trk\" (UID: \"441598c2-1b20-4109-8e38-46d414df93d7\") " pod="openshift-console/console-875645f9-l5trk" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.350438 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/441598c2-1b20-4109-8e38-46d414df93d7-oauth-serving-cert\") pod \"console-875645f9-l5trk\" (UID: \"441598c2-1b20-4109-8e38-46d414df93d7\") " pod="openshift-console/console-875645f9-l5trk" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.350507 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/441598c2-1b20-4109-8e38-46d414df93d7-console-serving-cert\") pod \"console-875645f9-l5trk\" (UID: \"441598c2-1b20-4109-8e38-46d414df93d7\") " pod="openshift-console/console-875645f9-l5trk" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.350547 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/441598c2-1b20-4109-8e38-46d414df93d7-trusted-ca-bundle\") pod \"console-875645f9-l5trk\" (UID: \"441598c2-1b20-4109-8e38-46d414df93d7\") " pod="openshift-console/console-875645f9-l5trk" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.350568 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fgxp\" (UniqueName: \"kubernetes.io/projected/441598c2-1b20-4109-8e38-46d414df93d7-kube-api-access-4fgxp\") pod \"console-875645f9-l5trk\" (UID: \"441598c2-1b20-4109-8e38-46d414df93d7\") " pod="openshift-console/console-875645f9-l5trk" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.350589 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/441598c2-1b20-4109-8e38-46d414df93d7-console-oauth-config\") pod \"console-875645f9-l5trk\" (UID: \"441598c2-1b20-4109-8e38-46d414df93d7\") " pod="openshift-console/console-875645f9-l5trk" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.350643 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/441598c2-1b20-4109-8e38-46d414df93d7-console-config\") pod \"console-875645f9-l5trk\" (UID: \"441598c2-1b20-4109-8e38-46d414df93d7\") " pod="openshift-console/console-875645f9-l5trk" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.391994 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-h4spr" event={"ID":"d3220aa6-97e3-4ea7-8959-fd0d11002f32","Type":"ContainerStarted","Data":"50c79a9d2b1c0f1e8be47aff36b9b1de95ecd73a11e9868c136f82fd89dbd95e"} Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.393228 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp" event={"ID":"05b901e7-b9fc-4403-bcc2-8eeb2731c66f","Type":"ContainerStarted","Data":"12ab36caa7395831a23950389b9e637e57a337dae672f023cd9cda5e86131898"} Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.395454 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-qnh2f" 
event={"ID":"063cd9dd-e128-4dd5-af7b-a3b79b93c61a","Type":"ContainerStarted","Data":"92fe7f4e343c990e5dc068a2e3be1a740ffd10d64c2fff90fe3a3d5fd69afbca"} Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.395492 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-qnh2f" event={"ID":"063cd9dd-e128-4dd5-af7b-a3b79b93c61a","Type":"ContainerStarted","Data":"c65e95ec1ee4a5f8e453b99eca36b2d1d4006376063fd026a37c70230a16e549"} Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.395501 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-qnh2f" event={"ID":"063cd9dd-e128-4dd5-af7b-a3b79b93c61a","Type":"ContainerStarted","Data":"649d5b47768b12df54e2db464a178d257ba693270f5c4a3a9639d65496664fd1"} Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.403995 4898 generic.go:334] "Generic (PLEG): container finished" podID="7b661f3a-62af-4aba-b8b3-e73b32d3da2d" containerID="dbd117fda1cffb0b08da30191bbf5415043dc7562e59f0b60385d92842e64d53" exitCode=0 Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.404058 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7b661f3a-62af-4aba-b8b3-e73b32d3da2d","Type":"ContainerDied","Data":"dbd117fda1cffb0b08da30191bbf5415043dc7562e59f0b60385d92842e64d53"} Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.413750 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-h4spr" podStartSLOduration=4.138837281 podStartE2EDuration="5.413725534s" podCreationTimestamp="2026-03-13 14:05:03 +0000 UTC" firstStartedPulling="2026-03-13 14:05:03.846746124 +0000 UTC m=+538.848334363" lastFinishedPulling="2026-03-13 14:05:05.121634377 +0000 UTC m=+540.123222616" observedRunningTime="2026-03-13 14:05:08.407184062 +0000 UTC m=+543.408772311" watchObservedRunningTime="2026-03-13 14:05:08.413725534 +0000 UTC 
m=+543.415313773" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.434154 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-qnh2f" podStartSLOduration=2.751333705 podStartE2EDuration="5.434136892s" podCreationTimestamp="2026-03-13 14:05:03 +0000 UTC" firstStartedPulling="2026-03-13 14:05:04.832620508 +0000 UTC m=+539.834208747" lastFinishedPulling="2026-03-13 14:05:07.515423695 +0000 UTC m=+542.517011934" observedRunningTime="2026-03-13 14:05:08.4238026 +0000 UTC m=+543.425390859" watchObservedRunningTime="2026-03-13 14:05:08.434136892 +0000 UTC m=+543.435725131" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.452306 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fgxp\" (UniqueName: \"kubernetes.io/projected/441598c2-1b20-4109-8e38-46d414df93d7-kube-api-access-4fgxp\") pod \"console-875645f9-l5trk\" (UID: \"441598c2-1b20-4109-8e38-46d414df93d7\") " pod="openshift-console/console-875645f9-l5trk" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.452376 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/441598c2-1b20-4109-8e38-46d414df93d7-console-oauth-config\") pod \"console-875645f9-l5trk\" (UID: \"441598c2-1b20-4109-8e38-46d414df93d7\") " pod="openshift-console/console-875645f9-l5trk" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.452406 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/441598c2-1b20-4109-8e38-46d414df93d7-console-config\") pod \"console-875645f9-l5trk\" (UID: \"441598c2-1b20-4109-8e38-46d414df93d7\") " pod="openshift-console/console-875645f9-l5trk" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.452458 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" 
(UniqueName: \"kubernetes.io/configmap/441598c2-1b20-4109-8e38-46d414df93d7-service-ca\") pod \"console-875645f9-l5trk\" (UID: \"441598c2-1b20-4109-8e38-46d414df93d7\") " pod="openshift-console/console-875645f9-l5trk" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.452495 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/441598c2-1b20-4109-8e38-46d414df93d7-oauth-serving-cert\") pod \"console-875645f9-l5trk\" (UID: \"441598c2-1b20-4109-8e38-46d414df93d7\") " pod="openshift-console/console-875645f9-l5trk" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.452584 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/441598c2-1b20-4109-8e38-46d414df93d7-console-serving-cert\") pod \"console-875645f9-l5trk\" (UID: \"441598c2-1b20-4109-8e38-46d414df93d7\") " pod="openshift-console/console-875645f9-l5trk" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.452666 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/441598c2-1b20-4109-8e38-46d414df93d7-trusted-ca-bundle\") pod \"console-875645f9-l5trk\" (UID: \"441598c2-1b20-4109-8e38-46d414df93d7\") " pod="openshift-console/console-875645f9-l5trk" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.453990 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/441598c2-1b20-4109-8e38-46d414df93d7-console-config\") pod \"console-875645f9-l5trk\" (UID: \"441598c2-1b20-4109-8e38-46d414df93d7\") " pod="openshift-console/console-875645f9-l5trk" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.454028 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/441598c2-1b20-4109-8e38-46d414df93d7-oauth-serving-cert\") pod \"console-875645f9-l5trk\" (UID: \"441598c2-1b20-4109-8e38-46d414df93d7\") " pod="openshift-console/console-875645f9-l5trk" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.454821 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/441598c2-1b20-4109-8e38-46d414df93d7-trusted-ca-bundle\") pod \"console-875645f9-l5trk\" (UID: \"441598c2-1b20-4109-8e38-46d414df93d7\") " pod="openshift-console/console-875645f9-l5trk" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.455262 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/441598c2-1b20-4109-8e38-46d414df93d7-service-ca\") pod \"console-875645f9-l5trk\" (UID: \"441598c2-1b20-4109-8e38-46d414df93d7\") " pod="openshift-console/console-875645f9-l5trk" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.459035 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/441598c2-1b20-4109-8e38-46d414df93d7-console-serving-cert\") pod \"console-875645f9-l5trk\" (UID: \"441598c2-1b20-4109-8e38-46d414df93d7\") " pod="openshift-console/console-875645f9-l5trk" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.468674 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/441598c2-1b20-4109-8e38-46d414df93d7-console-oauth-config\") pod \"console-875645f9-l5trk\" (UID: \"441598c2-1b20-4109-8e38-46d414df93d7\") " pod="openshift-console/console-875645f9-l5trk" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.490821 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fgxp\" (UniqueName: \"kubernetes.io/projected/441598c2-1b20-4109-8e38-46d414df93d7-kube-api-access-4fgxp\") 
pod \"console-875645f9-l5trk\" (UID: \"441598c2-1b20-4109-8e38-46d414df93d7\") " pod="openshift-console/console-875645f9-l5trk" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.579045 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-875645f9-l5trk" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.757350 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr"] Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.758116 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.760652 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-client-certs" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.760771 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.760789 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kubelet-serving-ca-bundle" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.760882 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-dockercfg-r5574" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.761208 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-412985lg3l7cn" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.761300 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-tls" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.762036 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr"] Mar 13 14:05:08 crc kubenswrapper[4898]: 
I0313 14:05:08.856274 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/823ccfb8-89eb-409e-9c6c-579bacb35ea1-audit-log\") pod \"metrics-server-7b77fdd7dd-vwwfr\" (UID: \"823ccfb8-89eb-409e-9c6c-579bacb35ea1\") " pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.856326 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/823ccfb8-89eb-409e-9c6c-579bacb35ea1-secret-metrics-server-tls\") pod \"metrics-server-7b77fdd7dd-vwwfr\" (UID: \"823ccfb8-89eb-409e-9c6c-579bacb35ea1\") " pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.856406 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/823ccfb8-89eb-409e-9c6c-579bacb35ea1-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7b77fdd7dd-vwwfr\" (UID: \"823ccfb8-89eb-409e-9c6c-579bacb35ea1\") " pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.856438 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/823ccfb8-89eb-409e-9c6c-579bacb35ea1-secret-metrics-client-certs\") pod \"metrics-server-7b77fdd7dd-vwwfr\" (UID: \"823ccfb8-89eb-409e-9c6c-579bacb35ea1\") " pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.856491 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: 
\"kubernetes.io/configmap/823ccfb8-89eb-409e-9c6c-579bacb35ea1-metrics-server-audit-profiles\") pod \"metrics-server-7b77fdd7dd-vwwfr\" (UID: \"823ccfb8-89eb-409e-9c6c-579bacb35ea1\") " pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.856539 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrmjw\" (UniqueName: \"kubernetes.io/projected/823ccfb8-89eb-409e-9c6c-579bacb35ea1-kube-api-access-hrmjw\") pod \"metrics-server-7b77fdd7dd-vwwfr\" (UID: \"823ccfb8-89eb-409e-9c6c-579bacb35ea1\") " pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.856555 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/823ccfb8-89eb-409e-9c6c-579bacb35ea1-client-ca-bundle\") pod \"metrics-server-7b77fdd7dd-vwwfr\" (UID: \"823ccfb8-89eb-409e-9c6c-579bacb35ea1\") " pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.951090 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-875645f9-l5trk"] Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.958449 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/823ccfb8-89eb-409e-9c6c-579bacb35ea1-audit-log\") pod \"metrics-server-7b77fdd7dd-vwwfr\" (UID: \"823ccfb8-89eb-409e-9c6c-579bacb35ea1\") " pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.958519 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/823ccfb8-89eb-409e-9c6c-579bacb35ea1-secret-metrics-server-tls\") pod \"metrics-server-7b77fdd7dd-vwwfr\" (UID: 
\"823ccfb8-89eb-409e-9c6c-579bacb35ea1\") " pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.958569 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/823ccfb8-89eb-409e-9c6c-579bacb35ea1-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7b77fdd7dd-vwwfr\" (UID: \"823ccfb8-89eb-409e-9c6c-579bacb35ea1\") " pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.958594 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/823ccfb8-89eb-409e-9c6c-579bacb35ea1-secret-metrics-client-certs\") pod \"metrics-server-7b77fdd7dd-vwwfr\" (UID: \"823ccfb8-89eb-409e-9c6c-579bacb35ea1\") " pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.958679 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/823ccfb8-89eb-409e-9c6c-579bacb35ea1-metrics-server-audit-profiles\") pod \"metrics-server-7b77fdd7dd-vwwfr\" (UID: \"823ccfb8-89eb-409e-9c6c-579bacb35ea1\") " pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.958727 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrmjw\" (UniqueName: \"kubernetes.io/projected/823ccfb8-89eb-409e-9c6c-579bacb35ea1-kube-api-access-hrmjw\") pod \"metrics-server-7b77fdd7dd-vwwfr\" (UID: \"823ccfb8-89eb-409e-9c6c-579bacb35ea1\") " pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.958750 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/823ccfb8-89eb-409e-9c6c-579bacb35ea1-client-ca-bundle\") pod \"metrics-server-7b77fdd7dd-vwwfr\" (UID: \"823ccfb8-89eb-409e-9c6c-579bacb35ea1\") " pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.959486 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/823ccfb8-89eb-409e-9c6c-579bacb35ea1-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7b77fdd7dd-vwwfr\" (UID: \"823ccfb8-89eb-409e-9c6c-579bacb35ea1\") " pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.959835 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/823ccfb8-89eb-409e-9c6c-579bacb35ea1-audit-log\") pod \"metrics-server-7b77fdd7dd-vwwfr\" (UID: \"823ccfb8-89eb-409e-9c6c-579bacb35ea1\") " pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.961320 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/823ccfb8-89eb-409e-9c6c-579bacb35ea1-metrics-server-audit-profiles\") pod \"metrics-server-7b77fdd7dd-vwwfr\" (UID: \"823ccfb8-89eb-409e-9c6c-579bacb35ea1\") " pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.965029 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/823ccfb8-89eb-409e-9c6c-579bacb35ea1-client-ca-bundle\") pod \"metrics-server-7b77fdd7dd-vwwfr\" (UID: \"823ccfb8-89eb-409e-9c6c-579bacb35ea1\") " pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.965715 4898 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/823ccfb8-89eb-409e-9c6c-579bacb35ea1-secret-metrics-client-certs\") pod \"metrics-server-7b77fdd7dd-vwwfr\" (UID: \"823ccfb8-89eb-409e-9c6c-579bacb35ea1\") " pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.965805 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/823ccfb8-89eb-409e-9c6c-579bacb35ea1-secret-metrics-server-tls\") pod \"metrics-server-7b77fdd7dd-vwwfr\" (UID: \"823ccfb8-89eb-409e-9c6c-579bacb35ea1\") " pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.975610 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrmjw\" (UniqueName: \"kubernetes.io/projected/823ccfb8-89eb-409e-9c6c-579bacb35ea1-kube-api-access-hrmjw\") pod \"metrics-server-7b77fdd7dd-vwwfr\" (UID: \"823ccfb8-89eb-409e-9c6c-579bacb35ea1\") " pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.081237 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.291623 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-595dc77696-pft4c"] Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.292472 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-595dc77696-pft4c" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.294099 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"default-dockercfg-6tstp" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.294506 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"monitoring-plugin-cert" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.305473 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-595dc77696-pft4c"] Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.365753 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/10c7ab08-2341-4e85-ad67-8495e038afa2-monitoring-plugin-cert\") pod \"monitoring-plugin-595dc77696-pft4c\" (UID: \"10c7ab08-2341-4e85-ad67-8495e038afa2\") " pod="openshift-monitoring/monitoring-plugin-595dc77696-pft4c" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.467034 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/10c7ab08-2341-4e85-ad67-8495e038afa2-monitoring-plugin-cert\") pod \"monitoring-plugin-595dc77696-pft4c\" (UID: \"10c7ab08-2341-4e85-ad67-8495e038afa2\") " pod="openshift-monitoring/monitoring-plugin-595dc77696-pft4c" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.471347 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/10c7ab08-2341-4e85-ad67-8495e038afa2-monitoring-plugin-cert\") pod \"monitoring-plugin-595dc77696-pft4c\" (UID: \"10c7ab08-2341-4e85-ad67-8495e038afa2\") " pod="openshift-monitoring/monitoring-plugin-595dc77696-pft4c" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.616961 4898 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-595dc77696-pft4c" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.718751 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.720740 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.726313 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-prometheus-http-client-file" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.726342 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-rbac-proxy" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.726380 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-grpc-tls-bje79e1t761so" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.726741 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-kube-rbac-proxy-web" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.727505 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.727659 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-sidecar-tls" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.728052 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"serving-certs-ca-bundle" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.729508 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-dockercfg-q952q" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 
14:05:09.730550 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.730724 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls-assets-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.732934 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-web-config" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.736093 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-k8s-rulefiles-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.740089 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-trusted-ca-bundle" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.762243 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.771228 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1b691eb6-70f2-4fce-b18a-1d7712fddcac-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.771271 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1b691eb6-70f2-4fce-b18a-1d7712fddcac-web-config\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.771298 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1b691eb6-70f2-4fce-b18a-1d7712fddcac-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.771319 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b691eb6-70f2-4fce-b18a-1d7712fddcac-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.771362 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1b691eb6-70f2-4fce-b18a-1d7712fddcac-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.771405 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b691eb6-70f2-4fce-b18a-1d7712fddcac-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.771423 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1b691eb6-70f2-4fce-b18a-1d7712fddcac-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc 
kubenswrapper[4898]: I0313 14:05:09.771473 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b691eb6-70f2-4fce-b18a-1d7712fddcac-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.771497 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/1b691eb6-70f2-4fce-b18a-1d7712fddcac-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.771525 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/1b691eb6-70f2-4fce-b18a-1d7712fddcac-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.771547 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1b691eb6-70f2-4fce-b18a-1d7712fddcac-config-out\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.771566 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1b691eb6-70f2-4fce-b18a-1d7712fddcac-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: 
\"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.771587 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1b691eb6-70f2-4fce-b18a-1d7712fddcac-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.771606 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmss4\" (UniqueName: \"kubernetes.io/projected/1b691eb6-70f2-4fce-b18a-1d7712fddcac-kube-api-access-mmss4\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.771628 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1b691eb6-70f2-4fce-b18a-1d7712fddcac-config\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.771657 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/1b691eb6-70f2-4fce-b18a-1d7712fddcac-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.771695 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: 
\"kubernetes.io/secret/1b691eb6-70f2-4fce-b18a-1d7712fddcac-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.771739 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1b691eb6-70f2-4fce-b18a-1d7712fddcac-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.872611 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1b691eb6-70f2-4fce-b18a-1d7712fddcac-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.872656 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b691eb6-70f2-4fce-b18a-1d7712fddcac-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.872687 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1b691eb6-70f2-4fce-b18a-1d7712fddcac-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.872713 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b691eb6-70f2-4fce-b18a-1d7712fddcac-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.872728 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1b691eb6-70f2-4fce-b18a-1d7712fddcac-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.872760 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b691eb6-70f2-4fce-b18a-1d7712fddcac-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.872791 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/1b691eb6-70f2-4fce-b18a-1d7712fddcac-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.872808 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/1b691eb6-70f2-4fce-b18a-1d7712fddcac-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.872828 4898 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1b691eb6-70f2-4fce-b18a-1d7712fddcac-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.872844 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1b691eb6-70f2-4fce-b18a-1d7712fddcac-config-out\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.872871 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1b691eb6-70f2-4fce-b18a-1d7712fddcac-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.872891 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmss4\" (UniqueName: \"kubernetes.io/projected/1b691eb6-70f2-4fce-b18a-1d7712fddcac-kube-api-access-mmss4\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.872931 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1b691eb6-70f2-4fce-b18a-1d7712fddcac-config\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.872959 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: 
\"kubernetes.io/secret/1b691eb6-70f2-4fce-b18a-1d7712fddcac-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.872982 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/1b691eb6-70f2-4fce-b18a-1d7712fddcac-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.873022 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1b691eb6-70f2-4fce-b18a-1d7712fddcac-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.873047 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1b691eb6-70f2-4fce-b18a-1d7712fddcac-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.873068 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1b691eb6-70f2-4fce-b18a-1d7712fddcac-web-config\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.874507 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/1b691eb6-70f2-4fce-b18a-1d7712fddcac-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.881284 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b691eb6-70f2-4fce-b18a-1d7712fddcac-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.881371 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1b691eb6-70f2-4fce-b18a-1d7712fddcac-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.881552 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b691eb6-70f2-4fce-b18a-1d7712fddcac-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.886331 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/1b691eb6-70f2-4fce-b18a-1d7712fddcac-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.891910 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/1b691eb6-70f2-4fce-b18a-1d7712fddcac-config\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.895408 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/1b691eb6-70f2-4fce-b18a-1d7712fddcac-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.895433 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1b691eb6-70f2-4fce-b18a-1d7712fddcac-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.895561 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/1b691eb6-70f2-4fce-b18a-1d7712fddcac-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.895881 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1b691eb6-70f2-4fce-b18a-1d7712fddcac-web-config\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.896175 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/1b691eb6-70f2-4fce-b18a-1d7712fddcac-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.896309 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1b691eb6-70f2-4fce-b18a-1d7712fddcac-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.896471 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1b691eb6-70f2-4fce-b18a-1d7712fddcac-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.896490 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1b691eb6-70f2-4fce-b18a-1d7712fddcac-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.899114 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1b691eb6-70f2-4fce-b18a-1d7712fddcac-config-out\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.899428 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmss4\" (UniqueName: \"kubernetes.io/projected/1b691eb6-70f2-4fce-b18a-1d7712fddcac-kube-api-access-mmss4\") pod \"prometheus-k8s-0\" (UID: 
\"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.899832 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/1b691eb6-70f2-4fce-b18a-1d7712fddcac-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.903578 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1b691eb6-70f2-4fce-b18a-1d7712fddcac-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:10 crc kubenswrapper[4898]: I0313 14:05:10.039484 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:10 crc kubenswrapper[4898]: I0313 14:05:10.416018 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-875645f9-l5trk" event={"ID":"441598c2-1b20-4109-8e38-46d414df93d7","Type":"ContainerStarted","Data":"a20af1488f994221a262c4ea1f9370dd06f8b682113715c773da210306924f29"} Mar 13 14:05:10 crc kubenswrapper[4898]: I0313 14:05:10.917743 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-595dc77696-pft4c"] Mar 13 14:05:10 crc kubenswrapper[4898]: W0313 14:05:10.924870 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10c7ab08_2341_4e85_ad67_8495e038afa2.slice/crio-62604b3b51de429492ff009f68bec3915d8f3984dc1a0a393b90594d41dfa2e4 WatchSource:0}: Error finding container 62604b3b51de429492ff009f68bec3915d8f3984dc1a0a393b90594d41dfa2e4: Status 404 returned error can't find the container with id 62604b3b51de429492ff009f68bec3915d8f3984dc1a0a393b90594d41dfa2e4 Mar 13 14:05:10 crc kubenswrapper[4898]: I0313 14:05:10.983720 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr"] Mar 13 14:05:10 crc kubenswrapper[4898]: W0313 14:05:10.985196 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod823ccfb8_89eb_409e_9c6c_579bacb35ea1.slice/crio-5e57710f6a55b2fa278ab74e2f83d33634aa2d785f588334ad38b0e4d9e493b0 WatchSource:0}: Error finding container 5e57710f6a55b2fa278ab74e2f83d33634aa2d785f588334ad38b0e4d9e493b0: Status 404 returned error can't find the container with id 5e57710f6a55b2fa278ab74e2f83d33634aa2d785f588334ad38b0e4d9e493b0 Mar 13 14:05:11 crc kubenswrapper[4898]: I0313 14:05:11.003671 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 13 
14:05:11 crc kubenswrapper[4898]: W0313 14:05:11.009764 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b691eb6_70f2_4fce_b18a_1d7712fddcac.slice/crio-fdbab1687a275cab6cb662d706cffe7544a58fc3fce7b082056bc90b056297a3 WatchSource:0}: Error finding container fdbab1687a275cab6cb662d706cffe7544a58fc3fce7b082056bc90b056297a3: Status 404 returned error can't find the container with id fdbab1687a275cab6cb662d706cffe7544a58fc3fce7b082056bc90b056297a3 Mar 13 14:05:11 crc kubenswrapper[4898]: I0313 14:05:11.429801 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp" event={"ID":"05b901e7-b9fc-4403-bcc2-8eeb2731c66f","Type":"ContainerStarted","Data":"2354e03dcd58942ac6f93cec12224a2608329553c2b2c49b8f91b03bc615fdbe"} Mar 13 14:05:11 crc kubenswrapper[4898]: I0313 14:05:11.429862 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp" event={"ID":"05b901e7-b9fc-4403-bcc2-8eeb2731c66f","Type":"ContainerStarted","Data":"e5a2b5f33a3f742aa6d7bab79a5aa9b38ee1d6d22ff47e353e2a8fb4d1aace32"} Mar 13 14:05:11 crc kubenswrapper[4898]: I0313 14:05:11.429882 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp" event={"ID":"05b901e7-b9fc-4403-bcc2-8eeb2731c66f","Type":"ContainerStarted","Data":"8185fba5582cf9528e5bb879c85b04febd0357bd5767c9de1b452c97fe9bed77"} Mar 13 14:05:11 crc kubenswrapper[4898]: I0313 14:05:11.431042 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" event={"ID":"823ccfb8-89eb-409e-9c6c-579bacb35ea1","Type":"ContainerStarted","Data":"5e57710f6a55b2fa278ab74e2f83d33634aa2d785f588334ad38b0e4d9e493b0"} Mar 13 14:05:11 crc kubenswrapper[4898]: I0313 14:05:11.432693 4898 generic.go:334] "Generic (PLEG): container finished" 
podID="1b691eb6-70f2-4fce-b18a-1d7712fddcac" containerID="b90c3c464c21a8906faeff48a497518f1d7a734ad42f83941bb7d88e97c809e2" exitCode=0 Mar 13 14:05:11 crc kubenswrapper[4898]: I0313 14:05:11.432785 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1b691eb6-70f2-4fce-b18a-1d7712fddcac","Type":"ContainerDied","Data":"b90c3c464c21a8906faeff48a497518f1d7a734ad42f83941bb7d88e97c809e2"} Mar 13 14:05:11 crc kubenswrapper[4898]: I0313 14:05:11.432814 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1b691eb6-70f2-4fce-b18a-1d7712fddcac","Type":"ContainerStarted","Data":"fdbab1687a275cab6cb662d706cffe7544a58fc3fce7b082056bc90b056297a3"} Mar 13 14:05:11 crc kubenswrapper[4898]: I0313 14:05:11.438644 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7b661f3a-62af-4aba-b8b3-e73b32d3da2d","Type":"ContainerStarted","Data":"daed6cdf9dd863bd1d3b98928facf8c9e86c39b1f5150904deeaed4090d9f3e4"} Mar 13 14:05:11 crc kubenswrapper[4898]: I0313 14:05:11.438816 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7b661f3a-62af-4aba-b8b3-e73b32d3da2d","Type":"ContainerStarted","Data":"60a2fd79d3195519ab633d3a18a4153d52732eda93d09a994a6ee8458c9762e0"} Mar 13 14:05:11 crc kubenswrapper[4898]: I0313 14:05:11.438978 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7b661f3a-62af-4aba-b8b3-e73b32d3da2d","Type":"ContainerStarted","Data":"9ada23225913ee8e09566c2ff4c3b32ea83ff60825c4cbf648ccd7132d877700"} Mar 13 14:05:11 crc kubenswrapper[4898]: I0313 14:05:11.439849 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"7b661f3a-62af-4aba-b8b3-e73b32d3da2d","Type":"ContainerStarted","Data":"91f77999fa41e31ae8d1ff8333545f16403f40662037b6574a30f4fad2c0d692"} Mar 13 14:05:11 crc kubenswrapper[4898]: I0313 14:05:11.440000 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7b661f3a-62af-4aba-b8b3-e73b32d3da2d","Type":"ContainerStarted","Data":"bc05756d607a017a68de11fdeaf501352b3479c54a1c0a39cdf9f5fdc1fce145"} Mar 13 14:05:11 crc kubenswrapper[4898]: I0313 14:05:11.442749 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-595dc77696-pft4c" event={"ID":"10c7ab08-2341-4e85-ad67-8495e038afa2","Type":"ContainerStarted","Data":"62604b3b51de429492ff009f68bec3915d8f3984dc1a0a393b90594d41dfa2e4"} Mar 13 14:05:11 crc kubenswrapper[4898]: I0313 14:05:11.447167 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-875645f9-l5trk" event={"ID":"441598c2-1b20-4109-8e38-46d414df93d7","Type":"ContainerStarted","Data":"5b6df02b2bb76e83e9982096d58a04d7e81453e98562dcd5520b3ecccb032575"} Mar 13 14:05:11 crc kubenswrapper[4898]: I0313 14:05:11.507002 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-875645f9-l5trk" podStartSLOduration=3.5069833470000003 podStartE2EDuration="3.506983347s" podCreationTimestamp="2026-03-13 14:05:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:05:11.502619662 +0000 UTC m=+546.504207901" watchObservedRunningTime="2026-03-13 14:05:11.506983347 +0000 UTC m=+546.508571596" Mar 13 14:05:12 crc kubenswrapper[4898]: I0313 14:05:12.458517 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"7b661f3a-62af-4aba-b8b3-e73b32d3da2d","Type":"ContainerStarted","Data":"dc23bdd0fbdab75d6e63ce7f7091dcecdeb9c29ac1cf9f0f6e51ce70f5a235a1"} Mar 13 14:05:12 crc kubenswrapper[4898]: I0313 14:05:12.464356 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp" event={"ID":"05b901e7-b9fc-4403-bcc2-8eeb2731c66f","Type":"ContainerStarted","Data":"c65dc9b0d95e85171bc80fa0627031c341d8ffdf6c67b5d8c11c5a260352e7e4"} Mar 13 14:05:12 crc kubenswrapper[4898]: I0313 14:05:12.498155 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.275682339 podStartE2EDuration="8.498133321s" podCreationTimestamp="2026-03-13 14:05:04 +0000 UTC" firstStartedPulling="2026-03-13 14:05:05.728464872 +0000 UTC m=+540.730053131" lastFinishedPulling="2026-03-13 14:05:11.950915854 +0000 UTC m=+546.952504113" observedRunningTime="2026-03-13 14:05:12.490193372 +0000 UTC m=+547.491781621" watchObservedRunningTime="2026-03-13 14:05:12.498133321 +0000 UTC m=+547.499721560" Mar 13 14:05:13 crc kubenswrapper[4898]: I0313 14:05:13.471172 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" event={"ID":"823ccfb8-89eb-409e-9c6c-579bacb35ea1","Type":"ContainerStarted","Data":"7046ebdd84c06a54a3ad07946403d79aec9442bfc8bcd95a1021d0c00e48bec6"} Mar 13 14:05:13 crc kubenswrapper[4898]: I0313 14:05:13.474675 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-595dc77696-pft4c" event={"ID":"10c7ab08-2341-4e85-ad67-8495e038afa2","Type":"ContainerStarted","Data":"3e8dba0dbf5089e4ed621f82ae8e6f8700c7b6faa9bab00d5ecd90cc41243753"} Mar 13 14:05:13 crc kubenswrapper[4898]: I0313 14:05:13.474876 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/monitoring-plugin-595dc77696-pft4c" Mar 13 14:05:13 crc 
kubenswrapper[4898]: I0313 14:05:13.477457 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp" event={"ID":"05b901e7-b9fc-4403-bcc2-8eeb2731c66f","Type":"ContainerStarted","Data":"7da0f284cffc4722d8a79cecda6ca0f2acc68edb2d757e6ca774bd68cf6bc831"} Mar 13 14:05:13 crc kubenswrapper[4898]: I0313 14:05:13.477485 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp" event={"ID":"05b901e7-b9fc-4403-bcc2-8eeb2731c66f","Type":"ContainerStarted","Data":"933b9b5cdfa995bc5a46b6a48dccbe6d92620d14da5454b07037495ad604a71c"} Mar 13 14:05:13 crc kubenswrapper[4898]: I0313 14:05:13.477840 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp" Mar 13 14:05:13 crc kubenswrapper[4898]: I0313 14:05:13.482431 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-595dc77696-pft4c" Mar 13 14:05:13 crc kubenswrapper[4898]: I0313 14:05:13.492652 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" podStartSLOduration=3.488273644 podStartE2EDuration="5.492627211s" podCreationTimestamp="2026-03-13 14:05:08 +0000 UTC" firstStartedPulling="2026-03-13 14:05:10.987376068 +0000 UTC m=+545.988964307" lastFinishedPulling="2026-03-13 14:05:12.991729605 +0000 UTC m=+547.993317874" observedRunningTime="2026-03-13 14:05:13.486799818 +0000 UTC m=+548.488388097" watchObservedRunningTime="2026-03-13 14:05:13.492627211 +0000 UTC m=+548.494215450" Mar 13 14:05:13 crc kubenswrapper[4898]: I0313 14:05:13.507461 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-595dc77696-pft4c" podStartSLOduration=2.444549513 podStartE2EDuration="4.507442141s" podCreationTimestamp="2026-03-13 14:05:09 +0000 UTC" 
firstStartedPulling="2026-03-13 14:05:10.927663516 +0000 UTC m=+545.929251755" lastFinishedPulling="2026-03-13 14:05:12.990556144 +0000 UTC m=+547.992144383" observedRunningTime="2026-03-13 14:05:13.501764822 +0000 UTC m=+548.503353091" watchObservedRunningTime="2026-03-13 14:05:13.507442141 +0000 UTC m=+548.509030390" Mar 13 14:05:13 crc kubenswrapper[4898]: I0313 14:05:13.538079 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp" podStartSLOduration=4.10000212 podStartE2EDuration="8.538058297s" podCreationTimestamp="2026-03-13 14:05:05 +0000 UTC" firstStartedPulling="2026-03-13 14:05:07.511859391 +0000 UTC m=+542.513447630" lastFinishedPulling="2026-03-13 14:05:11.949915568 +0000 UTC m=+546.951503807" observedRunningTime="2026-03-13 14:05:13.535623193 +0000 UTC m=+548.537211442" watchObservedRunningTime="2026-03-13 14:05:13.538058297 +0000 UTC m=+548.539646546" Mar 13 14:05:15 crc kubenswrapper[4898]: I0313 14:05:15.875764 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp" Mar 13 14:05:16 crc kubenswrapper[4898]: I0313 14:05:16.504702 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1b691eb6-70f2-4fce-b18a-1d7712fddcac","Type":"ContainerStarted","Data":"4619c06b3bd1e0f3a3a4c4b63644f93b8683cdc6ac65f1b6cf7811be33750093"} Mar 13 14:05:16 crc kubenswrapper[4898]: I0313 14:05:16.504753 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1b691eb6-70f2-4fce-b18a-1d7712fddcac","Type":"ContainerStarted","Data":"926ae0fab9f011fca5b7f408eeceb6d4898d2953e99a3f2832173a81b2a65936"} Mar 13 14:05:16 crc kubenswrapper[4898]: I0313 14:05:16.504772 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"1b691eb6-70f2-4fce-b18a-1d7712fddcac","Type":"ContainerStarted","Data":"99ab08618d4bde097a16a7acf871a9e1d7b1680c779f6ce94a962af2fd6e6422"} Mar 13 14:05:16 crc kubenswrapper[4898]: I0313 14:05:16.504784 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1b691eb6-70f2-4fce-b18a-1d7712fddcac","Type":"ContainerStarted","Data":"ab8d8a67f9595d2bac345225aeb8ba86283c0942913cc66f6c73cce5aec24cbc"} Mar 13 14:05:16 crc kubenswrapper[4898]: I0313 14:05:16.504796 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1b691eb6-70f2-4fce-b18a-1d7712fddcac","Type":"ContainerStarted","Data":"44b263eb743483388a1167571e8517c9e29fe86f92b9750ec93ac830fad9f8bb"} Mar 13 14:05:16 crc kubenswrapper[4898]: I0313 14:05:16.504808 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1b691eb6-70f2-4fce-b18a-1d7712fddcac","Type":"ContainerStarted","Data":"ab590589ca17f1a027dfd0298e28f244a54c674de69f470ae27e2318f3e3e907"} Mar 13 14:05:16 crc kubenswrapper[4898]: I0313 14:05:16.556258 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=3.452908459 podStartE2EDuration="7.556230474s" podCreationTimestamp="2026-03-13 14:05:09 +0000 UTC" firstStartedPulling="2026-03-13 14:05:11.435456674 +0000 UTC m=+546.437044953" lastFinishedPulling="2026-03-13 14:05:15.538778729 +0000 UTC m=+550.540366968" observedRunningTime="2026-03-13 14:05:16.553453821 +0000 UTC m=+551.555042150" watchObservedRunningTime="2026-03-13 14:05:16.556230474 +0000 UTC m=+551.557818753" Mar 13 14:05:18 crc kubenswrapper[4898]: I0313 14:05:18.579946 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-875645f9-l5trk" Mar 13 14:05:18 crc kubenswrapper[4898]: I0313 14:05:18.580505 4898 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-console/console-875645f9-l5trk" Mar 13 14:05:18 crc kubenswrapper[4898]: I0313 14:05:18.588143 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-875645f9-l5trk" Mar 13 14:05:19 crc kubenswrapper[4898]: I0313 14:05:19.536604 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-875645f9-l5trk" Mar 13 14:05:19 crc kubenswrapper[4898]: I0313 14:05:19.602645 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-7l2pm"] Mar 13 14:05:20 crc kubenswrapper[4898]: I0313 14:05:20.040604 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:29 crc kubenswrapper[4898]: I0313 14:05:29.081727 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" Mar 13 14:05:29 crc kubenswrapper[4898]: I0313 14:05:29.084852 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" Mar 13 14:05:44 crc kubenswrapper[4898]: I0313 14:05:44.655205 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-7l2pm" podUID="0ea2e803-34d0-429b-b943-ece0b9e38b63" containerName="console" containerID="cri-o://5f0a71c3382b8e97b2f21cf59a246a72cf36bc90c37659a0655800a7772d93ae" gracePeriod=15 Mar 13 14:05:47 crc kubenswrapper[4898]: I0313 14:05:45.742484 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-7l2pm_0ea2e803-34d0-429b-b943-ece0b9e38b63/console/0.log" Mar 13 14:05:47 crc kubenswrapper[4898]: I0313 14:05:45.742862 4898 generic.go:334] "Generic (PLEG): container finished" podID="0ea2e803-34d0-429b-b943-ece0b9e38b63" 
containerID="5f0a71c3382b8e97b2f21cf59a246a72cf36bc90c37659a0655800a7772d93ae" exitCode=2 Mar 13 14:05:47 crc kubenswrapper[4898]: I0313 14:05:45.751454 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-7l2pm" event={"ID":"0ea2e803-34d0-429b-b943-ece0b9e38b63","Type":"ContainerDied","Data":"5f0a71c3382b8e97b2f21cf59a246a72cf36bc90c37659a0655800a7772d93ae"} Mar 13 14:05:47 crc kubenswrapper[4898]: I0313 14:05:46.786870 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-7l2pm_0ea2e803-34d0-429b-b943-ece0b9e38b63/console/0.log" Mar 13 14:05:47 crc kubenswrapper[4898]: I0313 14:05:46.787434 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-7l2pm" Mar 13 14:05:47 crc kubenswrapper[4898]: I0313 14:05:46.959494 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0ea2e803-34d0-429b-b943-ece0b9e38b63-oauth-serving-cert\") pod \"0ea2e803-34d0-429b-b943-ece0b9e38b63\" (UID: \"0ea2e803-34d0-429b-b943-ece0b9e38b63\") " Mar 13 14:05:47 crc kubenswrapper[4898]: I0313 14:05:46.959615 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gq4w8\" (UniqueName: \"kubernetes.io/projected/0ea2e803-34d0-429b-b943-ece0b9e38b63-kube-api-access-gq4w8\") pod \"0ea2e803-34d0-429b-b943-ece0b9e38b63\" (UID: \"0ea2e803-34d0-429b-b943-ece0b9e38b63\") " Mar 13 14:05:47 crc kubenswrapper[4898]: I0313 14:05:46.959652 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0ea2e803-34d0-429b-b943-ece0b9e38b63-trusted-ca-bundle\") pod \"0ea2e803-34d0-429b-b943-ece0b9e38b63\" (UID: \"0ea2e803-34d0-429b-b943-ece0b9e38b63\") " Mar 13 14:05:47 crc kubenswrapper[4898]: I0313 14:05:46.959689 4898 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0ea2e803-34d0-429b-b943-ece0b9e38b63-service-ca\") pod \"0ea2e803-34d0-429b-b943-ece0b9e38b63\" (UID: \"0ea2e803-34d0-429b-b943-ece0b9e38b63\") " Mar 13 14:05:47 crc kubenswrapper[4898]: I0313 14:05:46.959796 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0ea2e803-34d0-429b-b943-ece0b9e38b63-console-oauth-config\") pod \"0ea2e803-34d0-429b-b943-ece0b9e38b63\" (UID: \"0ea2e803-34d0-429b-b943-ece0b9e38b63\") " Mar 13 14:05:47 crc kubenswrapper[4898]: I0313 14:05:46.959842 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0ea2e803-34d0-429b-b943-ece0b9e38b63-console-serving-cert\") pod \"0ea2e803-34d0-429b-b943-ece0b9e38b63\" (UID: \"0ea2e803-34d0-429b-b943-ece0b9e38b63\") " Mar 13 14:05:47 crc kubenswrapper[4898]: I0313 14:05:46.959937 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0ea2e803-34d0-429b-b943-ece0b9e38b63-console-config\") pod \"0ea2e803-34d0-429b-b943-ece0b9e38b63\" (UID: \"0ea2e803-34d0-429b-b943-ece0b9e38b63\") " Mar 13 14:05:47 crc kubenswrapper[4898]: I0313 14:05:46.960553 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ea2e803-34d0-429b-b943-ece0b9e38b63-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "0ea2e803-34d0-429b-b943-ece0b9e38b63" (UID: "0ea2e803-34d0-429b-b943-ece0b9e38b63"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:05:47 crc kubenswrapper[4898]: I0313 14:05:46.961016 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ea2e803-34d0-429b-b943-ece0b9e38b63-console-config" (OuterVolumeSpecName: "console-config") pod "0ea2e803-34d0-429b-b943-ece0b9e38b63" (UID: "0ea2e803-34d0-429b-b943-ece0b9e38b63"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:05:47 crc kubenswrapper[4898]: I0313 14:05:46.961264 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ea2e803-34d0-429b-b943-ece0b9e38b63-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "0ea2e803-34d0-429b-b943-ece0b9e38b63" (UID: "0ea2e803-34d0-429b-b943-ece0b9e38b63"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:05:47 crc kubenswrapper[4898]: I0313 14:05:46.961309 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ea2e803-34d0-429b-b943-ece0b9e38b63-service-ca" (OuterVolumeSpecName: "service-ca") pod "0ea2e803-34d0-429b-b943-ece0b9e38b63" (UID: "0ea2e803-34d0-429b-b943-ece0b9e38b63"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:05:47 crc kubenswrapper[4898]: I0313 14:05:46.966178 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ea2e803-34d0-429b-b943-ece0b9e38b63-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "0ea2e803-34d0-429b-b943-ece0b9e38b63" (UID: "0ea2e803-34d0-429b-b943-ece0b9e38b63"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:05:47 crc kubenswrapper[4898]: I0313 14:05:46.967013 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ea2e803-34d0-429b-b943-ece0b9e38b63-kube-api-access-gq4w8" (OuterVolumeSpecName: "kube-api-access-gq4w8") pod "0ea2e803-34d0-429b-b943-ece0b9e38b63" (UID: "0ea2e803-34d0-429b-b943-ece0b9e38b63"). InnerVolumeSpecName "kube-api-access-gq4w8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:05:47 crc kubenswrapper[4898]: I0313 14:05:46.968098 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ea2e803-34d0-429b-b943-ece0b9e38b63-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "0ea2e803-34d0-429b-b943-ece0b9e38b63" (UID: "0ea2e803-34d0-429b-b943-ece0b9e38b63"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:05:47 crc kubenswrapper[4898]: I0313 14:05:47.067226 4898 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0ea2e803-34d0-429b-b943-ece0b9e38b63-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 13 14:05:47 crc kubenswrapper[4898]: I0313 14:05:47.067301 4898 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0ea2e803-34d0-429b-b943-ece0b9e38b63-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 14:05:47 crc kubenswrapper[4898]: I0313 14:05:47.067331 4898 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0ea2e803-34d0-429b-b943-ece0b9e38b63-console-config\") on node \"crc\" DevicePath \"\"" Mar 13 14:05:47 crc kubenswrapper[4898]: I0313 14:05:47.067358 4898 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/0ea2e803-34d0-429b-b943-ece0b9e38b63-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 14:05:47 crc kubenswrapper[4898]: I0313 14:05:47.067394 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gq4w8\" (UniqueName: \"kubernetes.io/projected/0ea2e803-34d0-429b-b943-ece0b9e38b63-kube-api-access-gq4w8\") on node \"crc\" DevicePath \"\"" Mar 13 14:05:47 crc kubenswrapper[4898]: I0313 14:05:47.067422 4898 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0ea2e803-34d0-429b-b943-ece0b9e38b63-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:05:47 crc kubenswrapper[4898]: I0313 14:05:47.067448 4898 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0ea2e803-34d0-429b-b943-ece0b9e38b63-service-ca\") on node \"crc\" DevicePath \"\"" Mar 13 14:05:47 crc kubenswrapper[4898]: I0313 14:05:47.753207 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-7l2pm_0ea2e803-34d0-429b-b943-ece0b9e38b63/console/0.log" Mar 13 14:05:47 crc kubenswrapper[4898]: I0313 14:05:47.753254 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-7l2pm" event={"ID":"0ea2e803-34d0-429b-b943-ece0b9e38b63","Type":"ContainerDied","Data":"f80f9d0a69e3b6c8de8df5e105815c2ea6a5c4fed2a8e106511494e31c10c8bf"} Mar 13 14:05:47 crc kubenswrapper[4898]: I0313 14:05:47.753288 4898 scope.go:117] "RemoveContainer" containerID="5f0a71c3382b8e97b2f21cf59a246a72cf36bc90c37659a0655800a7772d93ae" Mar 13 14:05:47 crc kubenswrapper[4898]: I0313 14:05:47.753330 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-7l2pm" Mar 13 14:05:47 crc kubenswrapper[4898]: I0313 14:05:47.788506 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-7l2pm"] Mar 13 14:05:47 crc kubenswrapper[4898]: I0313 14:05:47.792092 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-7l2pm"] Mar 13 14:05:49 crc kubenswrapper[4898]: I0313 14:05:49.089778 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" Mar 13 14:05:49 crc kubenswrapper[4898]: I0313 14:05:49.093668 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" Mar 13 14:05:49 crc kubenswrapper[4898]: I0313 14:05:49.748894 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ea2e803-34d0-429b-b943-ece0b9e38b63" path="/var/lib/kubelet/pods/0ea2e803-34d0-429b-b943-ece0b9e38b63/volumes" Mar 13 14:06:00 crc kubenswrapper[4898]: I0313 14:06:00.144954 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556846-d6zpp"] Mar 13 14:06:00 crc kubenswrapper[4898]: E0313 14:06:00.146212 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ea2e803-34d0-429b-b943-ece0b9e38b63" containerName="console" Mar 13 14:06:00 crc kubenswrapper[4898]: I0313 14:06:00.146245 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ea2e803-34d0-429b-b943-ece0b9e38b63" containerName="console" Mar 13 14:06:00 crc kubenswrapper[4898]: I0313 14:06:00.146543 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ea2e803-34d0-429b-b943-ece0b9e38b63" containerName="console" Mar 13 14:06:00 crc kubenswrapper[4898]: I0313 14:06:00.147417 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556846-d6zpp" Mar 13 14:06:00 crc kubenswrapper[4898]: I0313 14:06:00.150135 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 14:06:00 crc kubenswrapper[4898]: I0313 14:06:00.150553 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps" Mar 13 14:06:00 crc kubenswrapper[4898]: I0313 14:06:00.152458 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 14:06:00 crc kubenswrapper[4898]: I0313 14:06:00.155705 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556846-d6zpp"] Mar 13 14:06:00 crc kubenswrapper[4898]: I0313 14:06:00.265571 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhppp\" (UniqueName: \"kubernetes.io/projected/666e4c5d-e464-4b8a-b167-bc7624fc3e10-kube-api-access-dhppp\") pod \"auto-csr-approver-29556846-d6zpp\" (UID: \"666e4c5d-e464-4b8a-b167-bc7624fc3e10\") " pod="openshift-infra/auto-csr-approver-29556846-d6zpp" Mar 13 14:06:00 crc kubenswrapper[4898]: I0313 14:06:00.367535 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhppp\" (UniqueName: \"kubernetes.io/projected/666e4c5d-e464-4b8a-b167-bc7624fc3e10-kube-api-access-dhppp\") pod \"auto-csr-approver-29556846-d6zpp\" (UID: \"666e4c5d-e464-4b8a-b167-bc7624fc3e10\") " pod="openshift-infra/auto-csr-approver-29556846-d6zpp" Mar 13 14:06:00 crc kubenswrapper[4898]: I0313 14:06:00.402109 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhppp\" (UniqueName: \"kubernetes.io/projected/666e4c5d-e464-4b8a-b167-bc7624fc3e10-kube-api-access-dhppp\") pod \"auto-csr-approver-29556846-d6zpp\" (UID: \"666e4c5d-e464-4b8a-b167-bc7624fc3e10\") " 
pod="openshift-infra/auto-csr-approver-29556846-d6zpp" Mar 13 14:06:00 crc kubenswrapper[4898]: I0313 14:06:00.485481 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556846-d6zpp" Mar 13 14:06:00 crc kubenswrapper[4898]: I0313 14:06:00.988184 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556846-d6zpp"] Mar 13 14:06:01 crc kubenswrapper[4898]: I0313 14:06:01.869815 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556846-d6zpp" event={"ID":"666e4c5d-e464-4b8a-b167-bc7624fc3e10","Type":"ContainerStarted","Data":"b0dd93a1baf74a5036475b03d3ad28cf7c8996d6700490a4a760588232120ad7"} Mar 13 14:06:02 crc kubenswrapper[4898]: I0313 14:06:02.885257 4898 generic.go:334] "Generic (PLEG): container finished" podID="666e4c5d-e464-4b8a-b167-bc7624fc3e10" containerID="9c70e0bed8678da48508773f6b5163cca47cd975b196edd773fb1f955ef9672b" exitCode=0 Mar 13 14:06:02 crc kubenswrapper[4898]: I0313 14:06:02.885612 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556846-d6zpp" event={"ID":"666e4c5d-e464-4b8a-b167-bc7624fc3e10","Type":"ContainerDied","Data":"9c70e0bed8678da48508773f6b5163cca47cd975b196edd773fb1f955ef9672b"} Mar 13 14:06:04 crc kubenswrapper[4898]: I0313 14:06:04.211881 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556846-d6zpp" Mar 13 14:06:04 crc kubenswrapper[4898]: I0313 14:06:04.347603 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhppp\" (UniqueName: \"kubernetes.io/projected/666e4c5d-e464-4b8a-b167-bc7624fc3e10-kube-api-access-dhppp\") pod \"666e4c5d-e464-4b8a-b167-bc7624fc3e10\" (UID: \"666e4c5d-e464-4b8a-b167-bc7624fc3e10\") " Mar 13 14:06:04 crc kubenswrapper[4898]: I0313 14:06:04.353027 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/666e4c5d-e464-4b8a-b167-bc7624fc3e10-kube-api-access-dhppp" (OuterVolumeSpecName: "kube-api-access-dhppp") pod "666e4c5d-e464-4b8a-b167-bc7624fc3e10" (UID: "666e4c5d-e464-4b8a-b167-bc7624fc3e10"). InnerVolumeSpecName "kube-api-access-dhppp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:06:04 crc kubenswrapper[4898]: I0313 14:06:04.449531 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhppp\" (UniqueName: \"kubernetes.io/projected/666e4c5d-e464-4b8a-b167-bc7624fc3e10-kube-api-access-dhppp\") on node \"crc\" DevicePath \"\"" Mar 13 14:06:04 crc kubenswrapper[4898]: I0313 14:06:04.900092 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556846-d6zpp" event={"ID":"666e4c5d-e464-4b8a-b167-bc7624fc3e10","Type":"ContainerDied","Data":"b0dd93a1baf74a5036475b03d3ad28cf7c8996d6700490a4a760588232120ad7"} Mar 13 14:06:04 crc kubenswrapper[4898]: I0313 14:06:04.900148 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b0dd93a1baf74a5036475b03d3ad28cf7c8996d6700490a4a760588232120ad7" Mar 13 14:06:04 crc kubenswrapper[4898]: I0313 14:06:04.900160 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556846-d6zpp" Mar 13 14:06:05 crc kubenswrapper[4898]: I0313 14:06:05.270652 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556840-vmqqn"] Mar 13 14:06:05 crc kubenswrapper[4898]: I0313 14:06:05.274935 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556840-vmqqn"] Mar 13 14:06:05 crc kubenswrapper[4898]: I0313 14:06:05.756185 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b7eb8ef-6f92-4c29-b6ad-3cf5b6919fce" path="/var/lib/kubelet/pods/4b7eb8ef-6f92-4c29-b6ad-3cf5b6919fce/volumes" Mar 13 14:06:10 crc kubenswrapper[4898]: I0313 14:06:10.040747 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:06:10 crc kubenswrapper[4898]: I0313 14:06:10.089550 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:06:10 crc kubenswrapper[4898]: I0313 14:06:10.984864 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:06:19 crc kubenswrapper[4898]: I0313 14:06:19.134802 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 14:06:19 crc kubenswrapper[4898]: I0313 14:06:19.135243 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 14:06:49 crc 
kubenswrapper[4898]: I0313 14:06:49.134795 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 14:06:49 crc kubenswrapper[4898]: I0313 14:06:49.135633 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 14:07:03 crc kubenswrapper[4898]: I0313 14:07:03.371212 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-758c8fb5b-pxts9"] Mar 13 14:07:03 crc kubenswrapper[4898]: E0313 14:07:03.371851 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="666e4c5d-e464-4b8a-b167-bc7624fc3e10" containerName="oc" Mar 13 14:07:03 crc kubenswrapper[4898]: I0313 14:07:03.371863 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="666e4c5d-e464-4b8a-b167-bc7624fc3e10" containerName="oc" Mar 13 14:07:03 crc kubenswrapper[4898]: I0313 14:07:03.371991 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="666e4c5d-e464-4b8a-b167-bc7624fc3e10" containerName="oc" Mar 13 14:07:03 crc kubenswrapper[4898]: I0313 14:07:03.372411 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-758c8fb5b-pxts9" Mar 13 14:07:03 crc kubenswrapper[4898]: I0313 14:07:03.380048 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/571e1a76-1585-4c39-887c-d9c3f735a908-service-ca\") pod \"console-758c8fb5b-pxts9\" (UID: \"571e1a76-1585-4c39-887c-d9c3f735a908\") " pod="openshift-console/console-758c8fb5b-pxts9" Mar 13 14:07:03 crc kubenswrapper[4898]: I0313 14:07:03.380381 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/571e1a76-1585-4c39-887c-d9c3f735a908-trusted-ca-bundle\") pod \"console-758c8fb5b-pxts9\" (UID: \"571e1a76-1585-4c39-887c-d9c3f735a908\") " pod="openshift-console/console-758c8fb5b-pxts9" Mar 13 14:07:03 crc kubenswrapper[4898]: I0313 14:07:03.380510 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/571e1a76-1585-4c39-887c-d9c3f735a908-console-serving-cert\") pod \"console-758c8fb5b-pxts9\" (UID: \"571e1a76-1585-4c39-887c-d9c3f735a908\") " pod="openshift-console/console-758c8fb5b-pxts9" Mar 13 14:07:03 crc kubenswrapper[4898]: I0313 14:07:03.380683 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/571e1a76-1585-4c39-887c-d9c3f735a908-oauth-serving-cert\") pod \"console-758c8fb5b-pxts9\" (UID: \"571e1a76-1585-4c39-887c-d9c3f735a908\") " pod="openshift-console/console-758c8fb5b-pxts9" Mar 13 14:07:03 crc kubenswrapper[4898]: I0313 14:07:03.380837 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twb6p\" (UniqueName: 
\"kubernetes.io/projected/571e1a76-1585-4c39-887c-d9c3f735a908-kube-api-access-twb6p\") pod \"console-758c8fb5b-pxts9\" (UID: \"571e1a76-1585-4c39-887c-d9c3f735a908\") " pod="openshift-console/console-758c8fb5b-pxts9" Mar 13 14:07:03 crc kubenswrapper[4898]: I0313 14:07:03.380994 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/571e1a76-1585-4c39-887c-d9c3f735a908-console-oauth-config\") pod \"console-758c8fb5b-pxts9\" (UID: \"571e1a76-1585-4c39-887c-d9c3f735a908\") " pod="openshift-console/console-758c8fb5b-pxts9" Mar 13 14:07:03 crc kubenswrapper[4898]: I0313 14:07:03.381126 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/571e1a76-1585-4c39-887c-d9c3f735a908-console-config\") pod \"console-758c8fb5b-pxts9\" (UID: \"571e1a76-1585-4c39-887c-d9c3f735a908\") " pod="openshift-console/console-758c8fb5b-pxts9" Mar 13 14:07:03 crc kubenswrapper[4898]: I0313 14:07:03.392338 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-758c8fb5b-pxts9"] Mar 13 14:07:03 crc kubenswrapper[4898]: I0313 14:07:03.482540 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/571e1a76-1585-4c39-887c-d9c3f735a908-oauth-serving-cert\") pod \"console-758c8fb5b-pxts9\" (UID: \"571e1a76-1585-4c39-887c-d9c3f735a908\") " pod="openshift-console/console-758c8fb5b-pxts9" Mar 13 14:07:03 crc kubenswrapper[4898]: I0313 14:07:03.482650 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twb6p\" (UniqueName: \"kubernetes.io/projected/571e1a76-1585-4c39-887c-d9c3f735a908-kube-api-access-twb6p\") pod \"console-758c8fb5b-pxts9\" (UID: \"571e1a76-1585-4c39-887c-d9c3f735a908\") " 
pod="openshift-console/console-758c8fb5b-pxts9"
Mar 13 14:07:03 crc kubenswrapper[4898]: I0313 14:07:03.482714 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/571e1a76-1585-4c39-887c-d9c3f735a908-console-oauth-config\") pod \"console-758c8fb5b-pxts9\" (UID: \"571e1a76-1585-4c39-887c-d9c3f735a908\") " pod="openshift-console/console-758c8fb5b-pxts9"
Mar 13 14:07:03 crc kubenswrapper[4898]: I0313 14:07:03.482756 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/571e1a76-1585-4c39-887c-d9c3f735a908-console-config\") pod \"console-758c8fb5b-pxts9\" (UID: \"571e1a76-1585-4c39-887c-d9c3f735a908\") " pod="openshift-console/console-758c8fb5b-pxts9"
Mar 13 14:07:03 crc kubenswrapper[4898]: I0313 14:07:03.482792 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/571e1a76-1585-4c39-887c-d9c3f735a908-service-ca\") pod \"console-758c8fb5b-pxts9\" (UID: \"571e1a76-1585-4c39-887c-d9c3f735a908\") " pod="openshift-console/console-758c8fb5b-pxts9"
Mar 13 14:07:03 crc kubenswrapper[4898]: I0313 14:07:03.482855 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/571e1a76-1585-4c39-887c-d9c3f735a908-trusted-ca-bundle\") pod \"console-758c8fb5b-pxts9\" (UID: \"571e1a76-1585-4c39-887c-d9c3f735a908\") " pod="openshift-console/console-758c8fb5b-pxts9"
Mar 13 14:07:03 crc kubenswrapper[4898]: I0313 14:07:03.482923 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/571e1a76-1585-4c39-887c-d9c3f735a908-console-serving-cert\") pod \"console-758c8fb5b-pxts9\" (UID: \"571e1a76-1585-4c39-887c-d9c3f735a908\") " pod="openshift-console/console-758c8fb5b-pxts9"
Mar 13 14:07:03 crc kubenswrapper[4898]: I0313 14:07:03.484380 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/571e1a76-1585-4c39-887c-d9c3f735a908-console-config\") pod \"console-758c8fb5b-pxts9\" (UID: \"571e1a76-1585-4c39-887c-d9c3f735a908\") " pod="openshift-console/console-758c8fb5b-pxts9"
Mar 13 14:07:03 crc kubenswrapper[4898]: I0313 14:07:03.484674 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/571e1a76-1585-4c39-887c-d9c3f735a908-oauth-serving-cert\") pod \"console-758c8fb5b-pxts9\" (UID: \"571e1a76-1585-4c39-887c-d9c3f735a908\") " pod="openshift-console/console-758c8fb5b-pxts9"
Mar 13 14:07:03 crc kubenswrapper[4898]: I0313 14:07:03.484681 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/571e1a76-1585-4c39-887c-d9c3f735a908-trusted-ca-bundle\") pod \"console-758c8fb5b-pxts9\" (UID: \"571e1a76-1585-4c39-887c-d9c3f735a908\") " pod="openshift-console/console-758c8fb5b-pxts9"
Mar 13 14:07:03 crc kubenswrapper[4898]: I0313 14:07:03.497098 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/571e1a76-1585-4c39-887c-d9c3f735a908-service-ca\") pod \"console-758c8fb5b-pxts9\" (UID: \"571e1a76-1585-4c39-887c-d9c3f735a908\") " pod="openshift-console/console-758c8fb5b-pxts9"
Mar 13 14:07:03 crc kubenswrapper[4898]: I0313 14:07:03.500449 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/571e1a76-1585-4c39-887c-d9c3f735a908-console-serving-cert\") pod \"console-758c8fb5b-pxts9\" (UID: \"571e1a76-1585-4c39-887c-d9c3f735a908\") " pod="openshift-console/console-758c8fb5b-pxts9"
Mar 13 14:07:03 crc kubenswrapper[4898]: I0313 14:07:03.501330 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/571e1a76-1585-4c39-887c-d9c3f735a908-console-oauth-config\") pod \"console-758c8fb5b-pxts9\" (UID: \"571e1a76-1585-4c39-887c-d9c3f735a908\") " pod="openshift-console/console-758c8fb5b-pxts9"
Mar 13 14:07:03 crc kubenswrapper[4898]: I0313 14:07:03.525017 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twb6p\" (UniqueName: \"kubernetes.io/projected/571e1a76-1585-4c39-887c-d9c3f735a908-kube-api-access-twb6p\") pod \"console-758c8fb5b-pxts9\" (UID: \"571e1a76-1585-4c39-887c-d9c3f735a908\") " pod="openshift-console/console-758c8fb5b-pxts9"
Mar 13 14:07:03 crc kubenswrapper[4898]: I0313 14:07:03.728870 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-758c8fb5b-pxts9"
Mar 13 14:07:03 crc kubenswrapper[4898]: I0313 14:07:03.995604 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-758c8fb5b-pxts9"]
Mar 13 14:07:04 crc kubenswrapper[4898]: I0313 14:07:04.328505 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-758c8fb5b-pxts9" event={"ID":"571e1a76-1585-4c39-887c-d9c3f735a908","Type":"ContainerStarted","Data":"4cfca99c86a53c5141c727b6fd37b0d688489277e5d5aa3a145d30faadc4d08d"}
Mar 13 14:07:04 crc kubenswrapper[4898]: I0313 14:07:04.328599 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-758c8fb5b-pxts9" event={"ID":"571e1a76-1585-4c39-887c-d9c3f735a908","Type":"ContainerStarted","Data":"4b09f73c7fa831fe94f3a344d5bf8593ff107c618a4ee0a2a0be061afa612208"}
Mar 13 14:07:04 crc kubenswrapper[4898]: I0313 14:07:04.349647 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-758c8fb5b-pxts9" podStartSLOduration=1.349621098 podStartE2EDuration="1.349621098s" podCreationTimestamp="2026-03-13 14:07:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:07:04.346856936 +0000 UTC m=+659.348445205" watchObservedRunningTime="2026-03-13 14:07:04.349621098 +0000 UTC m=+659.351209367"
Mar 13 14:07:13 crc kubenswrapper[4898]: I0313 14:07:13.730009 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-758c8fb5b-pxts9"
Mar 13 14:07:13 crc kubenswrapper[4898]: I0313 14:07:13.730690 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-758c8fb5b-pxts9"
Mar 13 14:07:13 crc kubenswrapper[4898]: I0313 14:07:13.738067 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-758c8fb5b-pxts9"
Mar 13 14:07:14 crc kubenswrapper[4898]: I0313 14:07:14.415765 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-758c8fb5b-pxts9"
Mar 13 14:07:14 crc kubenswrapper[4898]: I0313 14:07:14.497507 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-875645f9-l5trk"]
Mar 13 14:07:19 crc kubenswrapper[4898]: I0313 14:07:19.134531 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 14:07:19 crc kubenswrapper[4898]: I0313 14:07:19.134874 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 14:07:19 crc kubenswrapper[4898]: I0313 14:07:19.134959 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj"
Mar 13 14:07:19 crc kubenswrapper[4898]: I0313 14:07:19.135467 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"87afe240e3b86dba51997a01c599db519fabde9560e41dee3b537bab350f3092"} pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 13 14:07:19 crc kubenswrapper[4898]: I0313 14:07:19.135532 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" containerID="cri-o://87afe240e3b86dba51997a01c599db519fabde9560e41dee3b537bab350f3092" gracePeriod=600
Mar 13 14:07:19 crc kubenswrapper[4898]: I0313 14:07:19.450498 4898 generic.go:334] "Generic (PLEG): container finished" podID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerID="87afe240e3b86dba51997a01c599db519fabde9560e41dee3b537bab350f3092" exitCode=0
Mar 13 14:07:19 crc kubenswrapper[4898]: I0313 14:07:19.450567 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerDied","Data":"87afe240e3b86dba51997a01c599db519fabde9560e41dee3b537bab350f3092"}
Mar 13 14:07:19 crc kubenswrapper[4898]: I0313 14:07:19.450973 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerStarted","Data":"5a348cbe99f8e01e53545f65e722853afafc6c3cafe54ec4136fd0f288299e87"}
Mar 13 14:07:19 crc kubenswrapper[4898]: I0313 14:07:19.451007 4898 scope.go:117] "RemoveContainer" containerID="ef8034867c7dd4fe3e16f610be3edcf45ba0ba5b7440cc5634ef7ce86e520b52"
Mar 13 14:07:32 crc kubenswrapper[4898]: I0313 14:07:32.123208 4898 scope.go:117] "RemoveContainer" containerID="f3acfddb5fa32ce7ed2202cdd792a2b1d7de4b1d204fbdc39e6814928f1b0f60"
Mar 13 14:07:32 crc kubenswrapper[4898]: I0313 14:07:32.180361 4898 scope.go:117] "RemoveContainer" containerID="ce4b9269a12a5818cb6b78f9abcf90162aaab004f9bc6b1371639c51781f053a"
Mar 13 14:07:39 crc kubenswrapper[4898]: I0313 14:07:39.545363 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-875645f9-l5trk" podUID="441598c2-1b20-4109-8e38-46d414df93d7" containerName="console" containerID="cri-o://5b6df02b2bb76e83e9982096d58a04d7e81453e98562dcd5520b3ecccb032575" gracePeriod=15
Mar 13 14:07:39 crc kubenswrapper[4898]: I0313 14:07:39.869741 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-875645f9-l5trk_441598c2-1b20-4109-8e38-46d414df93d7/console/0.log"
Mar 13 14:07:39 crc kubenswrapper[4898]: I0313 14:07:39.869823 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-875645f9-l5trk"
Mar 13 14:07:39 crc kubenswrapper[4898]: I0313 14:07:39.939239 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4fgxp\" (UniqueName: \"kubernetes.io/projected/441598c2-1b20-4109-8e38-46d414df93d7-kube-api-access-4fgxp\") pod \"441598c2-1b20-4109-8e38-46d414df93d7\" (UID: \"441598c2-1b20-4109-8e38-46d414df93d7\") "
Mar 13 14:07:39 crc kubenswrapper[4898]: I0313 14:07:39.939399 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/441598c2-1b20-4109-8e38-46d414df93d7-trusted-ca-bundle\") pod \"441598c2-1b20-4109-8e38-46d414df93d7\" (UID: \"441598c2-1b20-4109-8e38-46d414df93d7\") "
Mar 13 14:07:39 crc kubenswrapper[4898]: I0313 14:07:39.939443 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/441598c2-1b20-4109-8e38-46d414df93d7-service-ca\") pod \"441598c2-1b20-4109-8e38-46d414df93d7\" (UID: \"441598c2-1b20-4109-8e38-46d414df93d7\") "
Mar 13 14:07:39 crc kubenswrapper[4898]: I0313 14:07:39.939473 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/441598c2-1b20-4109-8e38-46d414df93d7-console-config\") pod \"441598c2-1b20-4109-8e38-46d414df93d7\" (UID: \"441598c2-1b20-4109-8e38-46d414df93d7\") "
Mar 13 14:07:39 crc kubenswrapper[4898]: I0313 14:07:39.939507 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/441598c2-1b20-4109-8e38-46d414df93d7-console-serving-cert\") pod \"441598c2-1b20-4109-8e38-46d414df93d7\" (UID: \"441598c2-1b20-4109-8e38-46d414df93d7\") "
Mar 13 14:07:39 crc kubenswrapper[4898]: I0313 14:07:39.939548 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/441598c2-1b20-4109-8e38-46d414df93d7-oauth-serving-cert\") pod \"441598c2-1b20-4109-8e38-46d414df93d7\" (UID: \"441598c2-1b20-4109-8e38-46d414df93d7\") "
Mar 13 14:07:39 crc kubenswrapper[4898]: I0313 14:07:39.939572 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/441598c2-1b20-4109-8e38-46d414df93d7-console-oauth-config\") pod \"441598c2-1b20-4109-8e38-46d414df93d7\" (UID: \"441598c2-1b20-4109-8e38-46d414df93d7\") "
Mar 13 14:07:39 crc kubenswrapper[4898]: I0313 14:07:39.940189 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/441598c2-1b20-4109-8e38-46d414df93d7-service-ca" (OuterVolumeSpecName: "service-ca") pod "441598c2-1b20-4109-8e38-46d414df93d7" (UID: "441598c2-1b20-4109-8e38-46d414df93d7"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 14:07:39 crc kubenswrapper[4898]: I0313 14:07:39.940204 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/441598c2-1b20-4109-8e38-46d414df93d7-console-config" (OuterVolumeSpecName: "console-config") pod "441598c2-1b20-4109-8e38-46d414df93d7" (UID: "441598c2-1b20-4109-8e38-46d414df93d7"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 14:07:39 crc kubenswrapper[4898]: I0313 14:07:39.940274 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/441598c2-1b20-4109-8e38-46d414df93d7-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "441598c2-1b20-4109-8e38-46d414df93d7" (UID: "441598c2-1b20-4109-8e38-46d414df93d7"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 14:07:39 crc kubenswrapper[4898]: I0313 14:07:39.940603 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/441598c2-1b20-4109-8e38-46d414df93d7-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "441598c2-1b20-4109-8e38-46d414df93d7" (UID: "441598c2-1b20-4109-8e38-46d414df93d7"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 14:07:39 crc kubenswrapper[4898]: I0313 14:07:39.945673 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/441598c2-1b20-4109-8e38-46d414df93d7-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "441598c2-1b20-4109-8e38-46d414df93d7" (UID: "441598c2-1b20-4109-8e38-46d414df93d7"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:07:39 crc kubenswrapper[4898]: I0313 14:07:39.946174 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/441598c2-1b20-4109-8e38-46d414df93d7-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "441598c2-1b20-4109-8e38-46d414df93d7" (UID: "441598c2-1b20-4109-8e38-46d414df93d7"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:07:39 crc kubenswrapper[4898]: I0313 14:07:39.947102 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/441598c2-1b20-4109-8e38-46d414df93d7-kube-api-access-4fgxp" (OuterVolumeSpecName: "kube-api-access-4fgxp") pod "441598c2-1b20-4109-8e38-46d414df93d7" (UID: "441598c2-1b20-4109-8e38-46d414df93d7"). InnerVolumeSpecName "kube-api-access-4fgxp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:07:40 crc kubenswrapper[4898]: I0313 14:07:40.041403 4898 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/441598c2-1b20-4109-8e38-46d414df93d7-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 13 14:07:40 crc kubenswrapper[4898]: I0313 14:07:40.041802 4898 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/441598c2-1b20-4109-8e38-46d414df93d7-console-oauth-config\") on node \"crc\" DevicePath \"\""
Mar 13 14:07:40 crc kubenswrapper[4898]: I0313 14:07:40.041820 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4fgxp\" (UniqueName: \"kubernetes.io/projected/441598c2-1b20-4109-8e38-46d414df93d7-kube-api-access-4fgxp\") on node \"crc\" DevicePath \"\""
Mar 13 14:07:40 crc kubenswrapper[4898]: I0313 14:07:40.041838 4898 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/441598c2-1b20-4109-8e38-46d414df93d7-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 14:07:40 crc kubenswrapper[4898]: I0313 14:07:40.041855 4898 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/441598c2-1b20-4109-8e38-46d414df93d7-service-ca\") on node \"crc\" DevicePath \"\""
Mar 13 14:07:40 crc kubenswrapper[4898]: I0313 14:07:40.041871 4898 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/441598c2-1b20-4109-8e38-46d414df93d7-console-config\") on node \"crc\" DevicePath \"\""
Mar 13 14:07:40 crc kubenswrapper[4898]: I0313 14:07:40.041886 4898 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/441598c2-1b20-4109-8e38-46d414df93d7-console-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 13 14:07:40 crc kubenswrapper[4898]: I0313 14:07:40.594479 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-875645f9-l5trk_441598c2-1b20-4109-8e38-46d414df93d7/console/0.log"
Mar 13 14:07:40 crc kubenswrapper[4898]: I0313 14:07:40.594534 4898 generic.go:334] "Generic (PLEG): container finished" podID="441598c2-1b20-4109-8e38-46d414df93d7" containerID="5b6df02b2bb76e83e9982096d58a04d7e81453e98562dcd5520b3ecccb032575" exitCode=2
Mar 13 14:07:40 crc kubenswrapper[4898]: I0313 14:07:40.594562 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-875645f9-l5trk" event={"ID":"441598c2-1b20-4109-8e38-46d414df93d7","Type":"ContainerDied","Data":"5b6df02b2bb76e83e9982096d58a04d7e81453e98562dcd5520b3ecccb032575"}
Mar 13 14:07:40 crc kubenswrapper[4898]: I0313 14:07:40.594588 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-875645f9-l5trk" event={"ID":"441598c2-1b20-4109-8e38-46d414df93d7","Type":"ContainerDied","Data":"a20af1488f994221a262c4ea1f9370dd06f8b682113715c773da210306924f29"}
Mar 13 14:07:40 crc kubenswrapper[4898]: I0313 14:07:40.594605 4898 scope.go:117] "RemoveContainer" containerID="5b6df02b2bb76e83e9982096d58a04d7e81453e98562dcd5520b3ecccb032575"
Mar 13 14:07:40 crc kubenswrapper[4898]: I0313 14:07:40.594684 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-875645f9-l5trk"
Mar 13 14:07:40 crc kubenswrapper[4898]: I0313 14:07:40.619019 4898 scope.go:117] "RemoveContainer" containerID="5b6df02b2bb76e83e9982096d58a04d7e81453e98562dcd5520b3ecccb032575"
Mar 13 14:07:40 crc kubenswrapper[4898]: E0313 14:07:40.619523 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b6df02b2bb76e83e9982096d58a04d7e81453e98562dcd5520b3ecccb032575\": container with ID starting with 5b6df02b2bb76e83e9982096d58a04d7e81453e98562dcd5520b3ecccb032575 not found: ID does not exist" containerID="5b6df02b2bb76e83e9982096d58a04d7e81453e98562dcd5520b3ecccb032575"
Mar 13 14:07:40 crc kubenswrapper[4898]: I0313 14:07:40.619563 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b6df02b2bb76e83e9982096d58a04d7e81453e98562dcd5520b3ecccb032575"} err="failed to get container status \"5b6df02b2bb76e83e9982096d58a04d7e81453e98562dcd5520b3ecccb032575\": rpc error: code = NotFound desc = could not find container \"5b6df02b2bb76e83e9982096d58a04d7e81453e98562dcd5520b3ecccb032575\": container with ID starting with 5b6df02b2bb76e83e9982096d58a04d7e81453e98562dcd5520b3ecccb032575 not found: ID does not exist"
Mar 13 14:07:40 crc kubenswrapper[4898]: I0313 14:07:40.634801 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-875645f9-l5trk"]
Mar 13 14:07:40 crc kubenswrapper[4898]: I0313 14:07:40.647017 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-875645f9-l5trk"]
Mar 13 14:07:41 crc kubenswrapper[4898]: I0313 14:07:41.751271 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="441598c2-1b20-4109-8e38-46d414df93d7" path="/var/lib/kubelet/pods/441598c2-1b20-4109-8e38-46d414df93d7/volumes"
Mar 13 14:08:00 crc kubenswrapper[4898]: I0313 14:08:00.146372 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556848-wlplx"]
Mar 13 14:08:00 crc kubenswrapper[4898]: E0313 14:08:00.148552 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="441598c2-1b20-4109-8e38-46d414df93d7" containerName="console"
Mar 13 14:08:00 crc kubenswrapper[4898]: I0313 14:08:00.148605 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="441598c2-1b20-4109-8e38-46d414df93d7" containerName="console"
Mar 13 14:08:00 crc kubenswrapper[4898]: I0313 14:08:00.148890 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="441598c2-1b20-4109-8e38-46d414df93d7" containerName="console"
Mar 13 14:08:00 crc kubenswrapper[4898]: I0313 14:08:00.149595 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556848-wlplx"
Mar 13 14:08:00 crc kubenswrapper[4898]: I0313 14:08:00.156471 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 13 14:08:00 crc kubenswrapper[4898]: I0313 14:08:00.157658 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 13 14:08:00 crc kubenswrapper[4898]: I0313 14:08:00.157815 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps"
Mar 13 14:08:00 crc kubenswrapper[4898]: I0313 14:08:00.158774 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556848-wlplx"]
Mar 13 14:08:00 crc kubenswrapper[4898]: I0313 14:08:00.267503 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcz6d\" (UniqueName: \"kubernetes.io/projected/fe4a848e-c06e-4205-a1a6-8b14b620096c-kube-api-access-kcz6d\") pod \"auto-csr-approver-29556848-wlplx\" (UID: \"fe4a848e-c06e-4205-a1a6-8b14b620096c\") " pod="openshift-infra/auto-csr-approver-29556848-wlplx"
Mar 13 14:08:00 crc kubenswrapper[4898]: I0313 14:08:00.369199 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcz6d\" (UniqueName: \"kubernetes.io/projected/fe4a848e-c06e-4205-a1a6-8b14b620096c-kube-api-access-kcz6d\") pod \"auto-csr-approver-29556848-wlplx\" (UID: \"fe4a848e-c06e-4205-a1a6-8b14b620096c\") " pod="openshift-infra/auto-csr-approver-29556848-wlplx"
Mar 13 14:08:00 crc kubenswrapper[4898]: I0313 14:08:00.395007 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcz6d\" (UniqueName: \"kubernetes.io/projected/fe4a848e-c06e-4205-a1a6-8b14b620096c-kube-api-access-kcz6d\") pod \"auto-csr-approver-29556848-wlplx\" (UID: \"fe4a848e-c06e-4205-a1a6-8b14b620096c\") " pod="openshift-infra/auto-csr-approver-29556848-wlplx"
Mar 13 14:08:00 crc kubenswrapper[4898]: I0313 14:08:00.470208 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556848-wlplx"
Mar 13 14:08:00 crc kubenswrapper[4898]: I0313 14:08:00.744975 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556848-wlplx"]
Mar 13 14:08:01 crc kubenswrapper[4898]: I0313 14:08:01.760101 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556848-wlplx" event={"ID":"fe4a848e-c06e-4205-a1a6-8b14b620096c","Type":"ContainerStarted","Data":"de2f8c2990bed5b76782f89bf8e53ea37a9fd9e636b3c223ad6b21a6bc7324ef"}
Mar 13 14:08:02 crc kubenswrapper[4898]: I0313 14:08:02.775705 4898 generic.go:334] "Generic (PLEG): container finished" podID="fe4a848e-c06e-4205-a1a6-8b14b620096c" containerID="e5c3875fd4b0ad4fd5d4afba4c88238837f0d8b510bd53eb8f51d4cd510b00e3" exitCode=0
Mar 13 14:08:02 crc kubenswrapper[4898]: I0313 14:08:02.776012 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556848-wlplx" event={"ID":"fe4a848e-c06e-4205-a1a6-8b14b620096c","Type":"ContainerDied","Data":"e5c3875fd4b0ad4fd5d4afba4c88238837f0d8b510bd53eb8f51d4cd510b00e3"}
Mar 13 14:08:04 crc kubenswrapper[4898]: I0313 14:08:04.029666 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556848-wlplx"
Mar 13 14:08:04 crc kubenswrapper[4898]: I0313 14:08:04.225689 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kcz6d\" (UniqueName: \"kubernetes.io/projected/fe4a848e-c06e-4205-a1a6-8b14b620096c-kube-api-access-kcz6d\") pod \"fe4a848e-c06e-4205-a1a6-8b14b620096c\" (UID: \"fe4a848e-c06e-4205-a1a6-8b14b620096c\") "
Mar 13 14:08:04 crc kubenswrapper[4898]: I0313 14:08:04.235319 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe4a848e-c06e-4205-a1a6-8b14b620096c-kube-api-access-kcz6d" (OuterVolumeSpecName: "kube-api-access-kcz6d") pod "fe4a848e-c06e-4205-a1a6-8b14b620096c" (UID: "fe4a848e-c06e-4205-a1a6-8b14b620096c"). InnerVolumeSpecName "kube-api-access-kcz6d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:08:04 crc kubenswrapper[4898]: I0313 14:08:04.328633 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kcz6d\" (UniqueName: \"kubernetes.io/projected/fe4a848e-c06e-4205-a1a6-8b14b620096c-kube-api-access-kcz6d\") on node \"crc\" DevicePath \"\""
Mar 13 14:08:04 crc kubenswrapper[4898]: I0313 14:08:04.795434 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556848-wlplx" event={"ID":"fe4a848e-c06e-4205-a1a6-8b14b620096c","Type":"ContainerDied","Data":"de2f8c2990bed5b76782f89bf8e53ea37a9fd9e636b3c223ad6b21a6bc7324ef"}
Mar 13 14:08:04 crc kubenswrapper[4898]: I0313 14:08:04.795494 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de2f8c2990bed5b76782f89bf8e53ea37a9fd9e636b3c223ad6b21a6bc7324ef"
Mar 13 14:08:04 crc kubenswrapper[4898]: I0313 14:08:04.795518 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556848-wlplx"
Mar 13 14:08:05 crc kubenswrapper[4898]: I0313 14:08:05.117818 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556842-7h9s5"]
Mar 13 14:08:05 crc kubenswrapper[4898]: I0313 14:08:05.125196 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556842-7h9s5"]
Mar 13 14:08:05 crc kubenswrapper[4898]: I0313 14:08:05.751123 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a9b9a59-64ad-4602-88da-91583ec126dc" path="/var/lib/kubelet/pods/8a9b9a59-64ad-4602-88da-91583ec126dc/volumes"
Mar 13 14:09:19 crc kubenswrapper[4898]: I0313 14:09:19.133872 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 14:09:19 crc kubenswrapper[4898]: I0313 14:09:19.134407 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 14:09:20 crc kubenswrapper[4898]: I0313 14:09:20.819224 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0859mb4"]
Mar 13 14:09:20 crc kubenswrapper[4898]: E0313 14:09:20.819988 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe4a848e-c06e-4205-a1a6-8b14b620096c" containerName="oc"
Mar 13 14:09:20 crc kubenswrapper[4898]: I0313 14:09:20.820006 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe4a848e-c06e-4205-a1a6-8b14b620096c" containerName="oc"
Mar 13 14:09:20 crc kubenswrapper[4898]: I0313 14:09:20.820139 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe4a848e-c06e-4205-a1a6-8b14b620096c" containerName="oc"
Mar 13 14:09:20 crc kubenswrapper[4898]: I0313 14:09:20.821181 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0859mb4"
Mar 13 14:09:20 crc kubenswrapper[4898]: I0313 14:09:20.823484 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Mar 13 14:09:20 crc kubenswrapper[4898]: I0313 14:09:20.838583 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0859mb4"]
Mar 13 14:09:20 crc kubenswrapper[4898]: I0313 14:09:20.882030 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2dcr\" (UniqueName: \"kubernetes.io/projected/dd46f989-e694-47a9-9b46-e96b7b47e403-kube-api-access-c2dcr\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0859mb4\" (UID: \"dd46f989-e694-47a9-9b46-e96b7b47e403\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0859mb4"
Mar 13 14:09:20 crc kubenswrapper[4898]: I0313 14:09:20.882110 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dd46f989-e694-47a9-9b46-e96b7b47e403-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0859mb4\" (UID: \"dd46f989-e694-47a9-9b46-e96b7b47e403\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0859mb4"
Mar 13 14:09:20 crc kubenswrapper[4898]: I0313 14:09:20.882138 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dd46f989-e694-47a9-9b46-e96b7b47e403-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0859mb4\" (UID: \"dd46f989-e694-47a9-9b46-e96b7b47e403\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0859mb4"
Mar 13 14:09:20 crc kubenswrapper[4898]: I0313 14:09:20.983654 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dd46f989-e694-47a9-9b46-e96b7b47e403-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0859mb4\" (UID: \"dd46f989-e694-47a9-9b46-e96b7b47e403\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0859mb4"
Mar 13 14:09:20 crc kubenswrapper[4898]: I0313 14:09:20.983715 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dd46f989-e694-47a9-9b46-e96b7b47e403-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0859mb4\" (UID: \"dd46f989-e694-47a9-9b46-e96b7b47e403\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0859mb4"
Mar 13 14:09:20 crc kubenswrapper[4898]: I0313 14:09:20.983848 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2dcr\" (UniqueName: \"kubernetes.io/projected/dd46f989-e694-47a9-9b46-e96b7b47e403-kube-api-access-c2dcr\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0859mb4\" (UID: \"dd46f989-e694-47a9-9b46-e96b7b47e403\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0859mb4"
Mar 13 14:09:20 crc kubenswrapper[4898]: I0313 14:09:20.984407 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dd46f989-e694-47a9-9b46-e96b7b47e403-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0859mb4\" (UID: \"dd46f989-e694-47a9-9b46-e96b7b47e403\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0859mb4"
Mar 13 14:09:20 crc kubenswrapper[4898]: I0313 14:09:20.985927 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dd46f989-e694-47a9-9b46-e96b7b47e403-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0859mb4\" (UID: \"dd46f989-e694-47a9-9b46-e96b7b47e403\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0859mb4"
Mar 13 14:09:21 crc kubenswrapper[4898]: I0313 14:09:21.005406 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2dcr\" (UniqueName: \"kubernetes.io/projected/dd46f989-e694-47a9-9b46-e96b7b47e403-kube-api-access-c2dcr\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0859mb4\" (UID: \"dd46f989-e694-47a9-9b46-e96b7b47e403\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0859mb4"
Mar 13 14:09:21 crc kubenswrapper[4898]: I0313 14:09:21.146269 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0859mb4"
Mar 13 14:09:21 crc kubenswrapper[4898]: I0313 14:09:21.397306 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0859mb4"]
Mar 13 14:09:22 crc kubenswrapper[4898]: I0313 14:09:22.347335 4898 generic.go:334] "Generic (PLEG): container finished" podID="dd46f989-e694-47a9-9b46-e96b7b47e403" containerID="6c588cd1db4036ae62f40d138136f3753bd0c24b2ce756684df9f722bb5a24c3" exitCode=0
Mar 13 14:09:22 crc kubenswrapper[4898]: I0313 14:09:22.347393 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0859mb4" event={"ID":"dd46f989-e694-47a9-9b46-e96b7b47e403","Type":"ContainerDied","Data":"6c588cd1db4036ae62f40d138136f3753bd0c24b2ce756684df9f722bb5a24c3"}
Mar 13 14:09:22 crc kubenswrapper[4898]: I0313 14:09:22.347453 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0859mb4" event={"ID":"dd46f989-e694-47a9-9b46-e96b7b47e403","Type":"ContainerStarted","Data":"258e535120369cca7ef1d8dfbf0d7de2fe131e5eaa0fa710b99b16cb66c77d8d"}
Mar 13 14:09:23 crc kubenswrapper[4898]: I0313 14:09:23.354521 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0859mb4" event={"ID":"dd46f989-e694-47a9-9b46-e96b7b47e403","Type":"ContainerStarted","Data":"658051b6bb84e6ddac50ed2fb6e62204dec9baecebe51934fc668d4dd52a472d"}
Mar 13 14:09:24 crc kubenswrapper[4898]: I0313 14:09:24.363633 4898 generic.go:334] "Generic (PLEG): container finished" podID="dd46f989-e694-47a9-9b46-e96b7b47e403" containerID="658051b6bb84e6ddac50ed2fb6e62204dec9baecebe51934fc668d4dd52a472d" exitCode=0
Mar 13 14:09:24 crc kubenswrapper[4898]: I0313 14:09:24.363992 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0859mb4" event={"ID":"dd46f989-e694-47a9-9b46-e96b7b47e403","Type":"ContainerDied","Data":"658051b6bb84e6ddac50ed2fb6e62204dec9baecebe51934fc668d4dd52a472d"}
Mar 13 14:09:25 crc kubenswrapper[4898]: I0313 14:09:25.383992 4898 generic.go:334] "Generic (PLEG): container finished" podID="dd46f989-e694-47a9-9b46-e96b7b47e403" containerID="b8f285efaaf0b4a74b49a55cc2187ddf19137e8651515ab85ed1254b531d18b9" exitCode=0
Mar 13 14:09:25 crc kubenswrapper[4898]: I0313 14:09:25.384047 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0859mb4" event={"ID":"dd46f989-e694-47a9-9b46-e96b7b47e403","Type":"ContainerDied","Data":"b8f285efaaf0b4a74b49a55cc2187ddf19137e8651515ab85ed1254b531d18b9"}
Mar 13 14:09:26 crc kubenswrapper[4898]: I0313 14:09:26.671747 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0859mb4" Mar 13 14:09:26 crc kubenswrapper[4898]: I0313 14:09:26.811400 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dd46f989-e694-47a9-9b46-e96b7b47e403-util\") pod \"dd46f989-e694-47a9-9b46-e96b7b47e403\" (UID: \"dd46f989-e694-47a9-9b46-e96b7b47e403\") " Mar 13 14:09:26 crc kubenswrapper[4898]: I0313 14:09:26.811674 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dd46f989-e694-47a9-9b46-e96b7b47e403-bundle\") pod \"dd46f989-e694-47a9-9b46-e96b7b47e403\" (UID: \"dd46f989-e694-47a9-9b46-e96b7b47e403\") " Mar 13 14:09:26 crc kubenswrapper[4898]: I0313 14:09:26.811720 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2dcr\" (UniqueName: \"kubernetes.io/projected/dd46f989-e694-47a9-9b46-e96b7b47e403-kube-api-access-c2dcr\") pod \"dd46f989-e694-47a9-9b46-e96b7b47e403\" (UID: \"dd46f989-e694-47a9-9b46-e96b7b47e403\") " Mar 13 14:09:26 crc kubenswrapper[4898]: I0313 14:09:26.815738 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd46f989-e694-47a9-9b46-e96b7b47e403-bundle" (OuterVolumeSpecName: "bundle") pod "dd46f989-e694-47a9-9b46-e96b7b47e403" (UID: "dd46f989-e694-47a9-9b46-e96b7b47e403"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:09:26 crc kubenswrapper[4898]: I0313 14:09:26.820998 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd46f989-e694-47a9-9b46-e96b7b47e403-kube-api-access-c2dcr" (OuterVolumeSpecName: "kube-api-access-c2dcr") pod "dd46f989-e694-47a9-9b46-e96b7b47e403" (UID: "dd46f989-e694-47a9-9b46-e96b7b47e403"). InnerVolumeSpecName "kube-api-access-c2dcr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:09:26 crc kubenswrapper[4898]: I0313 14:09:26.847082 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd46f989-e694-47a9-9b46-e96b7b47e403-util" (OuterVolumeSpecName: "util") pod "dd46f989-e694-47a9-9b46-e96b7b47e403" (UID: "dd46f989-e694-47a9-9b46-e96b7b47e403"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:09:26 crc kubenswrapper[4898]: I0313 14:09:26.913007 4898 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dd46f989-e694-47a9-9b46-e96b7b47e403-util\") on node \"crc\" DevicePath \"\"" Mar 13 14:09:26 crc kubenswrapper[4898]: I0313 14:09:26.913040 4898 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dd46f989-e694-47a9-9b46-e96b7b47e403-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:09:26 crc kubenswrapper[4898]: I0313 14:09:26.913157 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2dcr\" (UniqueName: \"kubernetes.io/projected/dd46f989-e694-47a9-9b46-e96b7b47e403-kube-api-access-c2dcr\") on node \"crc\" DevicePath \"\"" Mar 13 14:09:27 crc kubenswrapper[4898]: I0313 14:09:27.402865 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0859mb4" event={"ID":"dd46f989-e694-47a9-9b46-e96b7b47e403","Type":"ContainerDied","Data":"258e535120369cca7ef1d8dfbf0d7de2fe131e5eaa0fa710b99b16cb66c77d8d"} Mar 13 14:09:27 crc kubenswrapper[4898]: I0313 14:09:27.403229 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="258e535120369cca7ef1d8dfbf0d7de2fe131e5eaa0fa710b99b16cb66c77d8d" Mar 13 14:09:27 crc kubenswrapper[4898]: I0313 14:09:27.402939 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0859mb4" Mar 13 14:09:31 crc kubenswrapper[4898]: I0313 14:09:31.890408 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-qqqs5"] Mar 13 14:09:31 crc kubenswrapper[4898]: I0313 14:09:31.892094 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="ovn-controller" containerID="cri-o://14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453" gracePeriod=30 Mar 13 14:09:31 crc kubenswrapper[4898]: I0313 14:09:31.892189 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="nbdb" containerID="cri-o://d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2" gracePeriod=30 Mar 13 14:09:31 crc kubenswrapper[4898]: I0313 14:09:31.892308 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="sbdb" containerID="cri-o://86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0" gracePeriod=30 Mar 13 14:09:31 crc kubenswrapper[4898]: I0313 14:09:31.892348 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="northd" containerID="cri-o://d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23" gracePeriod=30 Mar 13 14:09:31 crc kubenswrapper[4898]: I0313 14:09:31.892315 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" 
containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61" gracePeriod=30 Mar 13 14:09:31 crc kubenswrapper[4898]: I0313 14:09:31.892429 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="ovn-acl-logging" containerID="cri-o://7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345" gracePeriod=30 Mar 13 14:09:31 crc kubenswrapper[4898]: I0313 14:09:31.892461 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="kube-rbac-proxy-node" containerID="cri-o://0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb" gracePeriod=30 Mar 13 14:09:31 crc kubenswrapper[4898]: I0313 14:09:31.961440 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="ovnkube-controller" containerID="cri-o://16ce106ae8a28f129efde86037be3a1f3a7bbf53e4df0b306b61a11eff910aed" gracePeriod=30 Mar 13 14:09:32 crc kubenswrapper[4898]: I0313 14:09:32.272540 4898 scope.go:117] "RemoveContainer" containerID="529194ce4ac0e19d00515e6fc6f6984803e8a03afdee3263ba4e434f1a13a57b" Mar 13 14:09:32 crc kubenswrapper[4898]: I0313 14:09:32.298621 4898 scope.go:117] "RemoveContainer" containerID="5ca8f8a8a536aca56f73dd6928361e5dd5f98f66d3bc35762461d5d87c0c3022" Mar 13 14:09:32 crc kubenswrapper[4898]: I0313 14:09:32.322962 4898 scope.go:117] "RemoveContainer" containerID="04f48cdfeeb82223cb0cab3fb50d3338225f39b1d78eadc3c18a46350ae28770" Mar 13 14:09:32 crc kubenswrapper[4898]: I0313 14:09:32.439744 4898 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qqqs5_e7d6afc0-d9b5-41b2-a55f-57621c300cbb/ovn-acl-logging/0.log" Mar 13 14:09:32 crc kubenswrapper[4898]: I0313 14:09:32.440658 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qqqs5_e7d6afc0-d9b5-41b2-a55f-57621c300cbb/ovn-controller/0.log" Mar 13 14:09:32 crc kubenswrapper[4898]: I0313 14:09:32.441366 4898 generic.go:334] "Generic (PLEG): container finished" podID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerID="16ce106ae8a28f129efde86037be3a1f3a7bbf53e4df0b306b61a11eff910aed" exitCode=0 Mar 13 14:09:32 crc kubenswrapper[4898]: I0313 14:09:32.441570 4898 generic.go:334] "Generic (PLEG): container finished" podID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerID="86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0" exitCode=0 Mar 13 14:09:32 crc kubenswrapper[4898]: I0313 14:09:32.441731 4898 generic.go:334] "Generic (PLEG): container finished" podID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerID="d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2" exitCode=0 Mar 13 14:09:32 crc kubenswrapper[4898]: I0313 14:09:32.441937 4898 generic.go:334] "Generic (PLEG): container finished" podID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerID="d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23" exitCode=0 Mar 13 14:09:32 crc kubenswrapper[4898]: I0313 14:09:32.442102 4898 generic.go:334] "Generic (PLEG): container finished" podID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerID="7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345" exitCode=143 Mar 13 14:09:32 crc kubenswrapper[4898]: I0313 14:09:32.442255 4898 generic.go:334] "Generic (PLEG): container finished" podID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerID="14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453" exitCode=143 Mar 13 14:09:32 crc kubenswrapper[4898]: I0313 14:09:32.442515 4898 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" event={"ID":"e7d6afc0-d9b5-41b2-a55f-57621c300cbb","Type":"ContainerDied","Data":"16ce106ae8a28f129efde86037be3a1f3a7bbf53e4df0b306b61a11eff910aed"} Mar 13 14:09:32 crc kubenswrapper[4898]: I0313 14:09:32.442749 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" event={"ID":"e7d6afc0-d9b5-41b2-a55f-57621c300cbb","Type":"ContainerDied","Data":"86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0"} Mar 13 14:09:32 crc kubenswrapper[4898]: I0313 14:09:32.442959 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" event={"ID":"e7d6afc0-d9b5-41b2-a55f-57621c300cbb","Type":"ContainerDied","Data":"d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2"} Mar 13 14:09:32 crc kubenswrapper[4898]: I0313 14:09:32.443126 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" event={"ID":"e7d6afc0-d9b5-41b2-a55f-57621c300cbb","Type":"ContainerDied","Data":"d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23"} Mar 13 14:09:32 crc kubenswrapper[4898]: I0313 14:09:32.443266 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" event={"ID":"e7d6afc0-d9b5-41b2-a55f-57621c300cbb","Type":"ContainerDied","Data":"7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345"} Mar 13 14:09:32 crc kubenswrapper[4898]: I0313 14:09:32.443529 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" event={"ID":"e7d6afc0-d9b5-41b2-a55f-57621c300cbb","Type":"ContainerDied","Data":"14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453"} Mar 13 14:09:32 crc kubenswrapper[4898]: I0313 14:09:32.445952 4898 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-6llfs_e521c857-9711-4f68-886f-38b233d7b05b/kube-multus/2.log" Mar 13 14:09:32 crc kubenswrapper[4898]: I0313 14:09:32.446132 4898 generic.go:334] "Generic (PLEG): container finished" podID="e521c857-9711-4f68-886f-38b233d7b05b" containerID="725f30c48676665ebc628a8b35e81161dc13d717e27cad14806022f5ad267e0e" exitCode=2 Mar 13 14:09:32 crc kubenswrapper[4898]: I0313 14:09:32.446285 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6llfs" event={"ID":"e521c857-9711-4f68-886f-38b233d7b05b","Type":"ContainerDied","Data":"725f30c48676665ebc628a8b35e81161dc13d717e27cad14806022f5ad267e0e"} Mar 13 14:09:32 crc kubenswrapper[4898]: I0313 14:09:32.447189 4898 scope.go:117] "RemoveContainer" containerID="725f30c48676665ebc628a8b35e81161dc13d717e27cad14806022f5ad267e0e" Mar 13 14:09:32 crc kubenswrapper[4898]: E0313 14:09:32.447689 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-6llfs_openshift-multus(e521c857-9711-4f68-886f-38b233d7b05b)\"" pod="openshift-multus/multus-6llfs" podUID="e521c857-9711-4f68-886f-38b233d7b05b" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.188367 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qqqs5_e7d6afc0-d9b5-41b2-a55f-57621c300cbb/ovn-acl-logging/0.log" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.188781 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qqqs5_e7d6afc0-d9b5-41b2-a55f-57621c300cbb/ovn-controller/0.log" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.189199 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.207818 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-run-ovn-kubernetes\") pod \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.208181 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-ovnkube-config\") pod \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.208279 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-run-systemd\") pod \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.208359 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-run-netns\") pod \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.208439 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-cni-netd\") pod \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.208523 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-ovnkube-script-lib\") pod \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.208613 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-node-log\") pod \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.208716 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-var-lib-cni-networks-ovn-kubernetes\") pod \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.207969 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "e7d6afc0-d9b5-41b2-a55f-57621c300cbb" (UID: "e7d6afc0-d9b5-41b2-a55f-57621c300cbb"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.208651 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "e7d6afc0-d9b5-41b2-a55f-57621c300cbb" (UID: "e7d6afc0-d9b5-41b2-a55f-57621c300cbb"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.208723 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "e7d6afc0-d9b5-41b2-a55f-57621c300cbb" (UID: "e7d6afc0-d9b5-41b2-a55f-57621c300cbb"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.208777 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-node-log" (OuterVolumeSpecName: "node-log") pod "e7d6afc0-d9b5-41b2-a55f-57621c300cbb" (UID: "e7d6afc0-d9b5-41b2-a55f-57621c300cbb"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.208806 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-systemd-units\") pod \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.209055 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-ovn-node-metrics-cert\") pod \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.209105 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-var-lib-openvswitch\") pod \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\" (UID: 
\"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.209131 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-etc-openvswitch\") pod \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.209167 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-run-openvswitch\") pod \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.209203 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tc944\" (UniqueName: \"kubernetes.io/projected/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-kube-api-access-tc944\") pod \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.209240 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-cni-bin\") pod \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.209254 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "e7d6afc0-d9b5-41b2-a55f-57621c300cbb" (UID: "e7d6afc0-d9b5-41b2-a55f-57621c300cbb"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.209254 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "e7d6afc0-d9b5-41b2-a55f-57621c300cbb" (UID: "e7d6afc0-d9b5-41b2-a55f-57621c300cbb"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.209297 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "e7d6afc0-d9b5-41b2-a55f-57621c300cbb" (UID: "e7d6afc0-d9b5-41b2-a55f-57621c300cbb"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.209343 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "e7d6afc0-d9b5-41b2-a55f-57621c300cbb" (UID: "e7d6afc0-d9b5-41b2-a55f-57621c300cbb"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.209315 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "e7d6afc0-d9b5-41b2-a55f-57621c300cbb" (UID: "e7d6afc0-d9b5-41b2-a55f-57621c300cbb"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.209325 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "e7d6afc0-d9b5-41b2-a55f-57621c300cbb" (UID: "e7d6afc0-d9b5-41b2-a55f-57621c300cbb"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.209305 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-run-ovn\") pod \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.209440 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-env-overrides\") pod \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.209471 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-slash\") pod \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.209545 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-kubelet\") pod \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.209596 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-log-socket\") pod \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") "
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.209868 4898 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-etc-openvswitch\") on node \"crc\" DevicePath \"\""
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.209917 4898 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-run-openvswitch\") on node \"crc\" DevicePath \"\""
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.209929 4898 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-run-ovn\") on node \"crc\" DevicePath \"\""
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.209939 4898 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.209949 4898 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-ovnkube-config\") on node \"crc\" DevicePath \"\""
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.209957 4898 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-run-netns\") on node \"crc\" DevicePath \"\""
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.209965 4898 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-cni-netd\") on node \"crc\" DevicePath \"\""
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.209982 4898 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.209992 4898 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-node-log\") on node \"crc\" DevicePath \"\""
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.210001 4898 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.210022 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "e7d6afc0-d9b5-41b2-a55f-57621c300cbb" (UID: "e7d6afc0-d9b5-41b2-a55f-57621c300cbb"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.210031 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-log-socket" (OuterVolumeSpecName: "log-socket") pod "e7d6afc0-d9b5-41b2-a55f-57621c300cbb" (UID: "e7d6afc0-d9b5-41b2-a55f-57621c300cbb"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.210052 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "e7d6afc0-d9b5-41b2-a55f-57621c300cbb" (UID: "e7d6afc0-d9b5-41b2-a55f-57621c300cbb"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.210133 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-slash" (OuterVolumeSpecName: "host-slash") pod "e7d6afc0-d9b5-41b2-a55f-57621c300cbb" (UID: "e7d6afc0-d9b5-41b2-a55f-57621c300cbb"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.210474 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "e7d6afc0-d9b5-41b2-a55f-57621c300cbb" (UID: "e7d6afc0-d9b5-41b2-a55f-57621c300cbb"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.210496 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "e7d6afc0-d9b5-41b2-a55f-57621c300cbb" (UID: "e7d6afc0-d9b5-41b2-a55f-57621c300cbb"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.210693 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "e7d6afc0-d9b5-41b2-a55f-57621c300cbb" (UID: "e7d6afc0-d9b5-41b2-a55f-57621c300cbb"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.215556 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "e7d6afc0-d9b5-41b2-a55f-57621c300cbb" (UID: "e7d6afc0-d9b5-41b2-a55f-57621c300cbb"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.215681 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-kube-api-access-tc944" (OuterVolumeSpecName: "kube-api-access-tc944") pod "e7d6afc0-d9b5-41b2-a55f-57621c300cbb" (UID: "e7d6afc0-d9b5-41b2-a55f-57621c300cbb"). InnerVolumeSpecName "kube-api-access-tc944". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.237500 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "e7d6afc0-d9b5-41b2-a55f-57621c300cbb" (UID: "e7d6afc0-d9b5-41b2-a55f-57621c300cbb"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.243365 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-l82wk"]
Mar 13 14:09:33 crc kubenswrapper[4898]: E0313 14:09:33.243637 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="kubecfg-setup"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.243653 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="kubecfg-setup"
Mar 13 14:09:33 crc kubenswrapper[4898]: E0313 14:09:33.243664 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="kube-rbac-proxy-ovn-metrics"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.243671 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="kube-rbac-proxy-ovn-metrics"
Mar 13 14:09:33 crc kubenswrapper[4898]: E0313 14:09:33.243683 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="nbdb"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.243717 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="nbdb"
Mar 13 14:09:33 crc kubenswrapper[4898]: E0313 14:09:33.243726 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="ovnkube-controller"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.243733 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="ovnkube-controller"
Mar 13 14:09:33 crc kubenswrapper[4898]: E0313 14:09:33.243741 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="ovn-controller"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.243748 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="ovn-controller"
Mar 13 14:09:33 crc kubenswrapper[4898]: E0313 14:09:33.243759 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="kube-rbac-proxy-node"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.243766 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="kube-rbac-proxy-node"
Mar 13 14:09:33 crc kubenswrapper[4898]: E0313 14:09:33.243779 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="ovn-acl-logging"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.243786 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="ovn-acl-logging"
Mar 13 14:09:33 crc kubenswrapper[4898]: E0313 14:09:33.243804 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd46f989-e694-47a9-9b46-e96b7b47e403" containerName="util"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.243811 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd46f989-e694-47a9-9b46-e96b7b47e403" containerName="util"
Mar 13 14:09:33 crc kubenswrapper[4898]: E0313 14:09:33.243818 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="ovnkube-controller"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.243825 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="ovnkube-controller"
Mar 13 14:09:33 crc kubenswrapper[4898]: E0313 14:09:33.243838 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="sbdb"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.243845 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="sbdb"
Mar 13 14:09:33 crc kubenswrapper[4898]: E0313 14:09:33.243856 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd46f989-e694-47a9-9b46-e96b7b47e403" containerName="extract"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.243862 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd46f989-e694-47a9-9b46-e96b7b47e403" containerName="extract"
Mar 13 14:09:33 crc kubenswrapper[4898]: E0313 14:09:33.243871 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="northd"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.243877 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="northd"
Mar 13 14:09:33 crc kubenswrapper[4898]: E0313 14:09:33.243887 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd46f989-e694-47a9-9b46-e96b7b47e403" containerName="pull"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.243893 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd46f989-e694-47a9-9b46-e96b7b47e403" containerName="pull"
Mar 13 14:09:33 crc kubenswrapper[4898]: E0313 14:09:33.243920 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="ovnkube-controller"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.243927 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="ovnkube-controller"
Mar 13 14:09:33 crc kubenswrapper[4898]: E0313 14:09:33.243939 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="ovnkube-controller"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.243946 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="ovnkube-controller"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.244072 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="ovnkube-controller"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.244098 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="ovn-controller"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.244111 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="ovnkube-controller"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.244122 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd46f989-e694-47a9-9b46-e96b7b47e403" containerName="extract"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.244132 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="ovnkube-controller"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.244156 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="ovn-acl-logging"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.244163 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="ovnkube-controller"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.244174 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="sbdb"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.244183 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="northd"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.244197 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="kube-rbac-proxy-node"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.244209 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="kube-rbac-proxy-ovn-metrics"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.244224 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="nbdb"
Mar 13 14:09:33 crc kubenswrapper[4898]: E0313 14:09:33.244380 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="ovnkube-controller"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.244392 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="ovnkube-controller"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.244552 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="ovnkube-controller"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.247408 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-l82wk"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.310771 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-etc-openvswitch\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.310819 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-run-openvswitch\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.310852 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-var-lib-openvswitch\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.310871 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-host-cni-bin\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.310914 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-host-slash\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.310933 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4a0b9ad6-156f-418b-8eae-1d762f8161dd-ovnkube-script-lib\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.310948 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-run-systemd\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.310964 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-run-ovn\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.310982 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-systemd-units\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.311005 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4a0b9ad6-156f-418b-8eae-1d762f8161dd-env-overrides\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.311030 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-host-cni-netd\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.311048 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-host-run-netns\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.311065 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-host-run-ovn-kubernetes\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.311086 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.311103 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4a0b9ad6-156f-418b-8eae-1d762f8161dd-ovn-node-metrics-cert\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.311117 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrwpw\" (UniqueName: \"kubernetes.io/projected/4a0b9ad6-156f-418b-8eae-1d762f8161dd-kube-api-access-xrwpw\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.311135 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-host-kubelet\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.311150 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-log-socket\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.311165 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4a0b9ad6-156f-418b-8eae-1d762f8161dd-ovnkube-config\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.311182 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-node-log\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.311217 4898 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-run-systemd\") on node \"crc\" DevicePath \"\""
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.311241 4898 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-systemd-units\") on node \"crc\" DevicePath \"\""
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.311251 4898 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\""
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.311259 4898 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-var-lib-openvswitch\") on node \"crc\" DevicePath \"\""
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.311268 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tc944\" (UniqueName: \"kubernetes.io/projected/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-kube-api-access-tc944\") on node \"crc\" DevicePath \"\""
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.311277 4898 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-cni-bin\") on node \"crc\" DevicePath \"\""
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.311285 4898 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-env-overrides\") on node \"crc\" DevicePath \"\""
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.311293 4898 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-slash\") on node \"crc\" DevicePath \"\""
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.311301 4898 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-kubelet\") on node \"crc\" DevicePath \"\""
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.311308 4898 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-log-socket\") on node \"crc\" DevicePath \"\""
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.412982 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4a0b9ad6-156f-418b-8eae-1d762f8161dd-env-overrides\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.413341 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-host-cni-netd\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.413379 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-host-run-netns\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.413406 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-host-run-ovn-kubernetes\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.413439 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.413462 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4a0b9ad6-156f-418b-8eae-1d762f8161dd-ovn-node-metrics-cert\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.413507 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrwpw\" (UniqueName: \"kubernetes.io/projected/4a0b9ad6-156f-418b-8eae-1d762f8161dd-kube-api-access-xrwpw\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.413531 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-host-kubelet\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.413563 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-log-socket\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.413585 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4a0b9ad6-156f-418b-8eae-1d762f8161dd-ovnkube-config\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.413613 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-node-log\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.413645 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-etc-openvswitch\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.413670 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-run-openvswitch\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.413711 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-var-lib-openvswitch\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.413735 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-host-cni-bin\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.413749 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4a0b9ad6-156f-418b-8eae-1d762f8161dd-env-overrides\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.413766 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-host-slash\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.413791 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4a0b9ad6-156f-418b-8eae-1d762f8161dd-ovnkube-script-lib\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.413813 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-run-systemd\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.413837 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-run-ovn\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.413853 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-host-run-netns\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.413863 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-systemd-units\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.413881 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-host-run-ovn-kubernetes\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.414207 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-host-kubelet\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.414236 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-host-slash\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.413816 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-log-socket\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.413836 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-host-cni-netd\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.414612 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.414640 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-run-systemd\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.414660 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-run-ovn\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.414663 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-var-lib-openvswitch\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.414684 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-systemd-units\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.414699 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-node-log\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk"
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.414709 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-run-openvswitch\") pod \"ovnkube-node-l82wk\" (UID: 
\"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.414708 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4a0b9ad6-156f-418b-8eae-1d762f8161dd-ovnkube-config\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.414720 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-etc-openvswitch\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.414741 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-host-cni-bin\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.415191 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4a0b9ad6-156f-418b-8eae-1d762f8161dd-ovnkube-script-lib\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.416990 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4a0b9ad6-156f-418b-8eae-1d762f8161dd-ovn-node-metrics-cert\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" Mar 13 
14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.433304 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrwpw\" (UniqueName: \"kubernetes.io/projected/4a0b9ad6-156f-418b-8eae-1d762f8161dd-kube-api-access-xrwpw\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.455257 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qqqs5_e7d6afc0-d9b5-41b2-a55f-57621c300cbb/ovn-acl-logging/0.log" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.455854 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qqqs5_e7d6afc0-d9b5-41b2-a55f-57621c300cbb/ovn-controller/0.log" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.456169 4898 generic.go:334] "Generic (PLEG): container finished" podID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerID="3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61" exitCode=0 Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.456194 4898 generic.go:334] "Generic (PLEG): container finished" podID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerID="0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb" exitCode=0 Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.456214 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" event={"ID":"e7d6afc0-d9b5-41b2-a55f-57621c300cbb","Type":"ContainerDied","Data":"3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61"} Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.456238 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" event={"ID":"e7d6afc0-d9b5-41b2-a55f-57621c300cbb","Type":"ContainerDied","Data":"0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb"} Mar 13 
14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.456249 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" event={"ID":"e7d6afc0-d9b5-41b2-a55f-57621c300cbb","Type":"ContainerDied","Data":"064d66ce778a8d0d979727a052c6e1249a726f86c9609bd927debcbbf5923b70"} Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.456264 4898 scope.go:117] "RemoveContainer" containerID="16ce106ae8a28f129efde86037be3a1f3a7bbf53e4df0b306b61a11eff910aed" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.456403 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.478284 4898 scope.go:117] "RemoveContainer" containerID="86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.495215 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-qqqs5"] Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.502410 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-qqqs5"] Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.502572 4898 scope.go:117] "RemoveContainer" containerID="d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.518509 4898 scope.go:117] "RemoveContainer" containerID="d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.540353 4898 scope.go:117] "RemoveContainer" containerID="3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.552811 4898 scope.go:117] "RemoveContainer" containerID="0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.563480 4898 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.573674 4898 scope.go:117] "RemoveContainer" containerID="7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.593182 4898 scope.go:117] "RemoveContainer" containerID="14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.610463 4898 scope.go:117] "RemoveContainer" containerID="dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.630521 4898 scope.go:117] "RemoveContainer" containerID="16ce106ae8a28f129efde86037be3a1f3a7bbf53e4df0b306b61a11eff910aed" Mar 13 14:09:33 crc kubenswrapper[4898]: E0313 14:09:33.634275 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16ce106ae8a28f129efde86037be3a1f3a7bbf53e4df0b306b61a11eff910aed\": container with ID starting with 16ce106ae8a28f129efde86037be3a1f3a7bbf53e4df0b306b61a11eff910aed not found: ID does not exist" containerID="16ce106ae8a28f129efde86037be3a1f3a7bbf53e4df0b306b61a11eff910aed" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.634327 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16ce106ae8a28f129efde86037be3a1f3a7bbf53e4df0b306b61a11eff910aed"} err="failed to get container status \"16ce106ae8a28f129efde86037be3a1f3a7bbf53e4df0b306b61a11eff910aed\": rpc error: code = NotFound desc = could not find container \"16ce106ae8a28f129efde86037be3a1f3a7bbf53e4df0b306b61a11eff910aed\": container with ID starting with 16ce106ae8a28f129efde86037be3a1f3a7bbf53e4df0b306b61a11eff910aed not found: ID does not exist" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.634366 4898 scope.go:117] "RemoveContainer" 
containerID="86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0" Mar 13 14:09:33 crc kubenswrapper[4898]: E0313 14:09:33.635137 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0\": container with ID starting with 86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0 not found: ID does not exist" containerID="86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.635162 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0"} err="failed to get container status \"86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0\": rpc error: code = NotFound desc = could not find container \"86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0\": container with ID starting with 86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0 not found: ID does not exist" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.635181 4898 scope.go:117] "RemoveContainer" containerID="d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2" Mar 13 14:09:33 crc kubenswrapper[4898]: E0313 14:09:33.636248 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2\": container with ID starting with d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2 not found: ID does not exist" containerID="d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.636279 4898 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2"} err="failed to get container status \"d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2\": rpc error: code = NotFound desc = could not find container \"d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2\": container with ID starting with d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2 not found: ID does not exist" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.636297 4898 scope.go:117] "RemoveContainer" containerID="d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23" Mar 13 14:09:33 crc kubenswrapper[4898]: E0313 14:09:33.636528 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23\": container with ID starting with d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23 not found: ID does not exist" containerID="d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.636555 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23"} err="failed to get container status \"d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23\": rpc error: code = NotFound desc = could not find container \"d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23\": container with ID starting with d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23 not found: ID does not exist" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.636572 4898 scope.go:117] "RemoveContainer" containerID="3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61" Mar 13 14:09:33 crc kubenswrapper[4898]: E0313 14:09:33.636948 4898 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61\": container with ID starting with 3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61 not found: ID does not exist" containerID="3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.636974 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61"} err="failed to get container status \"3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61\": rpc error: code = NotFound desc = could not find container \"3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61\": container with ID starting with 3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61 not found: ID does not exist" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.636988 4898 scope.go:117] "RemoveContainer" containerID="0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb" Mar 13 14:09:33 crc kubenswrapper[4898]: E0313 14:09:33.637346 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb\": container with ID starting with 0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb not found: ID does not exist" containerID="0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.637371 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb"} err="failed to get container status \"0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb\": rpc error: code = NotFound desc = could not find container 
\"0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb\": container with ID starting with 0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb not found: ID does not exist" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.637390 4898 scope.go:117] "RemoveContainer" containerID="7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345" Mar 13 14:09:33 crc kubenswrapper[4898]: E0313 14:09:33.637696 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345\": container with ID starting with 7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345 not found: ID does not exist" containerID="7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.637720 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345"} err="failed to get container status \"7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345\": rpc error: code = NotFound desc = could not find container \"7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345\": container with ID starting with 7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345 not found: ID does not exist" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.637735 4898 scope.go:117] "RemoveContainer" containerID="14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453" Mar 13 14:09:33 crc kubenswrapper[4898]: E0313 14:09:33.638957 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453\": container with ID starting with 14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453 not found: ID does not exist" 
containerID="14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.638991 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453"} err="failed to get container status \"14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453\": rpc error: code = NotFound desc = could not find container \"14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453\": container with ID starting with 14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453 not found: ID does not exist" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.639011 4898 scope.go:117] "RemoveContainer" containerID="dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786" Mar 13 14:09:33 crc kubenswrapper[4898]: E0313 14:09:33.639331 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\": container with ID starting with dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786 not found: ID does not exist" containerID="dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.639358 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786"} err="failed to get container status \"dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\": rpc error: code = NotFound desc = could not find container \"dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\": container with ID starting with dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786 not found: ID does not exist" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.639375 4898 scope.go:117] 
"RemoveContainer" containerID="16ce106ae8a28f129efde86037be3a1f3a7bbf53e4df0b306b61a11eff910aed" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.639610 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16ce106ae8a28f129efde86037be3a1f3a7bbf53e4df0b306b61a11eff910aed"} err="failed to get container status \"16ce106ae8a28f129efde86037be3a1f3a7bbf53e4df0b306b61a11eff910aed\": rpc error: code = NotFound desc = could not find container \"16ce106ae8a28f129efde86037be3a1f3a7bbf53e4df0b306b61a11eff910aed\": container with ID starting with 16ce106ae8a28f129efde86037be3a1f3a7bbf53e4df0b306b61a11eff910aed not found: ID does not exist" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.639634 4898 scope.go:117] "RemoveContainer" containerID="86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.639864 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0"} err="failed to get container status \"86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0\": rpc error: code = NotFound desc = could not find container \"86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0\": container with ID starting with 86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0 not found: ID does not exist" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.639888 4898 scope.go:117] "RemoveContainer" containerID="d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.640141 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2"} err="failed to get container status \"d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2\": rpc error: code = 
NotFound desc = could not find container \"d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2\": container with ID starting with d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2 not found: ID does not exist" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.640164 4898 scope.go:117] "RemoveContainer" containerID="d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.640396 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23"} err="failed to get container status \"d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23\": rpc error: code = NotFound desc = could not find container \"d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23\": container with ID starting with d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23 not found: ID does not exist" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.640418 4898 scope.go:117] "RemoveContainer" containerID="3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.640677 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61"} err="failed to get container status \"3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61\": rpc error: code = NotFound desc = could not find container \"3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61\": container with ID starting with 3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61 not found: ID does not exist" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.640700 4898 scope.go:117] "RemoveContainer" containerID="0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb" Mar 13 14:09:33 crc 
kubenswrapper[4898]: I0313 14:09:33.640980 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb"} err="failed to get container status \"0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb\": rpc error: code = NotFound desc = could not find container \"0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb\": container with ID starting with 0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb not found: ID does not exist" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.641007 4898 scope.go:117] "RemoveContainer" containerID="7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.641256 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345"} err="failed to get container status \"7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345\": rpc error: code = NotFound desc = could not find container \"7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345\": container with ID starting with 7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345 not found: ID does not exist" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.641283 4898 scope.go:117] "RemoveContainer" containerID="14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.641500 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453"} err="failed to get container status \"14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453\": rpc error: code = NotFound desc = could not find container \"14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453\": container 
with ID starting with 14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453 not found: ID does not exist" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.641525 4898 scope.go:117] "RemoveContainer" containerID="dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.641735 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786"} err="failed to get container status \"dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\": rpc error: code = NotFound desc = could not find container \"dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\": container with ID starting with dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786 not found: ID does not exist" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.751122 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" path="/var/lib/kubelet/pods/e7d6afc0-d9b5-41b2-a55f-57621c300cbb/volumes" Mar 13 14:09:34 crc kubenswrapper[4898]: I0313 14:09:34.463357 4898 generic.go:334] "Generic (PLEG): container finished" podID="4a0b9ad6-156f-418b-8eae-1d762f8161dd" containerID="d0fa021e24bbaf323086d2f0ca9344418651c6408325675373378bef04786f4d" exitCode=0 Mar 13 14:09:34 crc kubenswrapper[4898]: I0313 14:09:34.463449 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" event={"ID":"4a0b9ad6-156f-418b-8eae-1d762f8161dd","Type":"ContainerDied","Data":"d0fa021e24bbaf323086d2f0ca9344418651c6408325675373378bef04786f4d"} Mar 13 14:09:34 crc kubenswrapper[4898]: I0313 14:09:34.463781 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" 
event={"ID":"4a0b9ad6-156f-418b-8eae-1d762f8161dd","Type":"ContainerStarted","Data":"2165dbbf05ae9ac0b2d925a7a188178859abd2b34f765e588a20716431a9c72e"} Mar 13 14:09:35 crc kubenswrapper[4898]: I0313 14:09:35.474426 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" event={"ID":"4a0b9ad6-156f-418b-8eae-1d762f8161dd","Type":"ContainerStarted","Data":"34a580f53162cf26e9e4d69b61e7c5aaae36bc62a1676dc79dbaefa0ee348097"} Mar 13 14:09:35 crc kubenswrapper[4898]: I0313 14:09:35.474733 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" event={"ID":"4a0b9ad6-156f-418b-8eae-1d762f8161dd","Type":"ContainerStarted","Data":"e29c8f382fd2b10cf7fdcdc74768e35f87c892171068990ee4300be25ea3784b"} Mar 13 14:09:35 crc kubenswrapper[4898]: I0313 14:09:35.474745 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" event={"ID":"4a0b9ad6-156f-418b-8eae-1d762f8161dd","Type":"ContainerStarted","Data":"8bf965473f4dd435e52cc0ad785b214411845f1a2d4651d06b70fc0af1a48f02"} Mar 13 14:09:35 crc kubenswrapper[4898]: I0313 14:09:35.474754 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" event={"ID":"4a0b9ad6-156f-418b-8eae-1d762f8161dd","Type":"ContainerStarted","Data":"19e7ccc92da0c8ab9f95b946661f1c36140aa2c235919fa28e61d51f0b9c3944"} Mar 13 14:09:35 crc kubenswrapper[4898]: I0313 14:09:35.474765 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" event={"ID":"4a0b9ad6-156f-418b-8eae-1d762f8161dd","Type":"ContainerStarted","Data":"3d5102150d4c4f9095f479599cfdfcdc3cf96da057be9255999245de491d00dc"} Mar 13 14:09:35 crc kubenswrapper[4898]: I0313 14:09:35.474774 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" 
event={"ID":"4a0b9ad6-156f-418b-8eae-1d762f8161dd","Type":"ContainerStarted","Data":"c8044828df8054b6f2b1f1906a92ec5c8265734ed70cf3907b5975d90f9eff88"} Mar 13 14:09:38 crc kubenswrapper[4898]: I0313 14:09:38.498328 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" event={"ID":"4a0b9ad6-156f-418b-8eae-1d762f8161dd","Type":"ContainerStarted","Data":"c319424207abd7c622d86ae3eed7bd449515b03b902c401221c8282bb978ec98"} Mar 13 14:09:38 crc kubenswrapper[4898]: I0313 14:09:38.600179 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-5r9gm"] Mar 13 14:09:38 crc kubenswrapper[4898]: I0313 14:09:38.601121 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-5r9gm" Mar 13 14:09:38 crc kubenswrapper[4898]: I0313 14:09:38.603365 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Mar 13 14:09:38 crc kubenswrapper[4898]: I0313 14:09:38.603419 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-bjwfj" Mar 13 14:09:38 crc kubenswrapper[4898]: I0313 14:09:38.603770 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Mar 13 14:09:38 crc kubenswrapper[4898]: I0313 14:09:38.649268 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm"] Mar 13 14:09:38 crc kubenswrapper[4898]: I0313 14:09:38.650005 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm" Mar 13 14:09:38 crc kubenswrapper[4898]: I0313 14:09:38.651702 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Mar 13 14:09:38 crc kubenswrapper[4898]: I0313 14:09:38.651930 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-8qwjn" Mar 13 14:09:38 crc kubenswrapper[4898]: I0313 14:09:38.656720 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz"] Mar 13 14:09:38 crc kubenswrapper[4898]: I0313 14:09:38.657423 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz" Mar 13 14:09:38 crc kubenswrapper[4898]: I0313 14:09:38.677042 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fx8t5\" (UniqueName: \"kubernetes.io/projected/30c06063-b926-4f2e-b8d1-8c530cc5b0a9-kube-api-access-fx8t5\") pod \"obo-prometheus-operator-68bc856cb9-5r9gm\" (UID: \"30c06063-b926-4f2e-b8d1-8c530cc5b0a9\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-5r9gm" Mar 13 14:09:38 crc kubenswrapper[4898]: I0313 14:09:38.778198 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/951cfcfc-3a8c-410e-a3f5-f5caa10511f5-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz\" (UID: \"951cfcfc-3a8c-410e-a3f5-f5caa10511f5\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz" Mar 13 14:09:38 crc kubenswrapper[4898]: I0313 14:09:38.778498 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8c190eee-747b-4a45-905c-fa0235080305-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm\" (UID: \"8c190eee-747b-4a45-905c-fa0235080305\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm" Mar 13 14:09:38 crc kubenswrapper[4898]: I0313 14:09:38.778556 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fx8t5\" (UniqueName: \"kubernetes.io/projected/30c06063-b926-4f2e-b8d1-8c530cc5b0a9-kube-api-access-fx8t5\") pod \"obo-prometheus-operator-68bc856cb9-5r9gm\" (UID: \"30c06063-b926-4f2e-b8d1-8c530cc5b0a9\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-5r9gm" Mar 13 14:09:38 crc kubenswrapper[4898]: I0313 14:09:38.778610 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8c190eee-747b-4a45-905c-fa0235080305-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm\" (UID: \"8c190eee-747b-4a45-905c-fa0235080305\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm" Mar 13 14:09:38 crc kubenswrapper[4898]: I0313 14:09:38.778638 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/951cfcfc-3a8c-410e-a3f5-f5caa10511f5-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz\" (UID: \"951cfcfc-3a8c-410e-a3f5-f5caa10511f5\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz" Mar 13 14:09:38 crc kubenswrapper[4898]: I0313 14:09:38.807926 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fx8t5\" (UniqueName: 
\"kubernetes.io/projected/30c06063-b926-4f2e-b8d1-8c530cc5b0a9-kube-api-access-fx8t5\") pod \"obo-prometheus-operator-68bc856cb9-5r9gm\" (UID: \"30c06063-b926-4f2e-b8d1-8c530cc5b0a9\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-5r9gm" Mar 13 14:09:38 crc kubenswrapper[4898]: I0313 14:09:38.841306 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-ljrtz"] Mar 13 14:09:38 crc kubenswrapper[4898]: I0313 14:09:38.842242 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-ljrtz" Mar 13 14:09:38 crc kubenswrapper[4898]: I0313 14:09:38.844608 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Mar 13 14:09:38 crc kubenswrapper[4898]: I0313 14:09:38.844853 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-mflt7" Mar 13 14:09:38 crc kubenswrapper[4898]: I0313 14:09:38.879992 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8c190eee-747b-4a45-905c-fa0235080305-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm\" (UID: \"8c190eee-747b-4a45-905c-fa0235080305\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm" Mar 13 14:09:38 crc kubenswrapper[4898]: I0313 14:09:38.880063 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/951cfcfc-3a8c-410e-a3f5-f5caa10511f5-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz\" (UID: \"951cfcfc-3a8c-410e-a3f5-f5caa10511f5\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz" Mar 13 14:09:38 crc kubenswrapper[4898]: I0313 14:09:38.880679 4898 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/951cfcfc-3a8c-410e-a3f5-f5caa10511f5-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz\" (UID: \"951cfcfc-3a8c-410e-a3f5-f5caa10511f5\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz" Mar 13 14:09:38 crc kubenswrapper[4898]: I0313 14:09:38.880786 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8c190eee-747b-4a45-905c-fa0235080305-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm\" (UID: \"8c190eee-747b-4a45-905c-fa0235080305\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm" Mar 13 14:09:38 crc kubenswrapper[4898]: I0313 14:09:38.887859 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/951cfcfc-3a8c-410e-a3f5-f5caa10511f5-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz\" (UID: \"951cfcfc-3a8c-410e-a3f5-f5caa10511f5\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz" Mar 13 14:09:38 crc kubenswrapper[4898]: I0313 14:09:38.888943 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8c190eee-747b-4a45-905c-fa0235080305-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm\" (UID: \"8c190eee-747b-4a45-905c-fa0235080305\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm" Mar 13 14:09:38 crc kubenswrapper[4898]: I0313 14:09:38.891367 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8c190eee-747b-4a45-905c-fa0235080305-apiservice-cert\") pod 
\"obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm\" (UID: \"8c190eee-747b-4a45-905c-fa0235080305\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm" Mar 13 14:09:38 crc kubenswrapper[4898]: I0313 14:09:38.891758 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/951cfcfc-3a8c-410e-a3f5-f5caa10511f5-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz\" (UID: \"951cfcfc-3a8c-410e-a3f5-f5caa10511f5\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz" Mar 13 14:09:38 crc kubenswrapper[4898]: I0313 14:09:38.916337 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-5r9gm" Mar 13 14:09:38 crc kubenswrapper[4898]: E0313 14:09:38.944527 4898 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-5r9gm_openshift-operators_30c06063-b926-4f2e-b8d1-8c530cc5b0a9_0(fd78077a86f3997d29fd985fce1622f05819a7419fc771677e5a2e01de020bb1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 14:09:38 crc kubenswrapper[4898]: E0313 14:09:38.944603 4898 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-5r9gm_openshift-operators_30c06063-b926-4f2e-b8d1-8c530cc5b0a9_0(fd78077a86f3997d29fd985fce1622f05819a7419fc771677e5a2e01de020bb1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-68bc856cb9-5r9gm" Mar 13 14:09:38 crc kubenswrapper[4898]: E0313 14:09:38.944628 4898 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-5r9gm_openshift-operators_30c06063-b926-4f2e-b8d1-8c530cc5b0a9_0(fd78077a86f3997d29fd985fce1622f05819a7419fc771677e5a2e01de020bb1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-5r9gm" Mar 13 14:09:38 crc kubenswrapper[4898]: E0313 14:09:38.944693 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-68bc856cb9-5r9gm_openshift-operators(30c06063-b926-4f2e-b8d1-8c530cc5b0a9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-68bc856cb9-5r9gm_openshift-operators(30c06063-b926-4f2e-b8d1-8c530cc5b0a9)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-5r9gm_openshift-operators_30c06063-b926-4f2e-b8d1-8c530cc5b0a9_0(fd78077a86f3997d29fd985fce1622f05819a7419fc771677e5a2e01de020bb1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-5r9gm" podUID="30c06063-b926-4f2e-b8d1-8c530cc5b0a9" Mar 13 14:09:38 crc kubenswrapper[4898]: I0313 14:09:38.974229 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm" Mar 13 14:09:38 crc kubenswrapper[4898]: I0313 14:09:38.982780 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/3bfc0332-bb59-42bf-bb70-462efa225c81-observability-operator-tls\") pod \"observability-operator-59bdc8b94-ljrtz\" (UID: \"3bfc0332-bb59-42bf-bb70-462efa225c81\") " pod="openshift-operators/observability-operator-59bdc8b94-ljrtz" Mar 13 14:09:38 crc kubenswrapper[4898]: I0313 14:09:38.982873 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxlrb\" (UniqueName: \"kubernetes.io/projected/3bfc0332-bb59-42bf-bb70-462efa225c81-kube-api-access-qxlrb\") pod \"observability-operator-59bdc8b94-ljrtz\" (UID: \"3bfc0332-bb59-42bf-bb70-462efa225c81\") " pod="openshift-operators/observability-operator-59bdc8b94-ljrtz" Mar 13 14:09:38 crc kubenswrapper[4898]: I0313 14:09:38.993826 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-nkt76"] Mar 13 14:09:38 crc kubenswrapper[4898]: I0313 14:09:38.993889 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz" Mar 13 14:09:38 crc kubenswrapper[4898]: E0313 14:09:38.995043 4898 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm_openshift-operators_8c190eee-747b-4a45-905c-fa0235080305_0(434f5ab9b2b2b566b310d245370ebef6cfceab68a771f8cacfe211a475c0830e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 13 14:09:38 crc kubenswrapper[4898]: E0313 14:09:38.995081 4898 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm_openshift-operators_8c190eee-747b-4a45-905c-fa0235080305_0(434f5ab9b2b2b566b310d245370ebef6cfceab68a771f8cacfe211a475c0830e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm" Mar 13 14:09:38 crc kubenswrapper[4898]: E0313 14:09:38.995097 4898 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm_openshift-operators_8c190eee-747b-4a45-905c-fa0235080305_0(434f5ab9b2b2b566b310d245370ebef6cfceab68a771f8cacfe211a475c0830e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm" Mar 13 14:09:38 crc kubenswrapper[4898]: E0313 14:09:38.995132 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm_openshift-operators(8c190eee-747b-4a45-905c-fa0235080305)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm_openshift-operators(8c190eee-747b-4a45-905c-fa0235080305)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm_openshift-operators_8c190eee-747b-4a45-905c-fa0235080305_0(434f5ab9b2b2b566b310d245370ebef6cfceab68a771f8cacfe211a475c0830e): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm" podUID="8c190eee-747b-4a45-905c-fa0235080305" Mar 13 14:09:39 crc kubenswrapper[4898]: I0313 14:09:39.007673 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-nkt76" Mar 13 14:09:39 crc kubenswrapper[4898]: I0313 14:09:39.010018 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-glbkw" Mar 13 14:09:39 crc kubenswrapper[4898]: E0313 14:09:39.020048 4898 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz_openshift-operators_951cfcfc-3a8c-410e-a3f5-f5caa10511f5_0(340dfeabe1c085bb98589e56ab7efcfddb37980949c9fe6bc883f55a4c2e7dae): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 14:09:39 crc kubenswrapper[4898]: E0313 14:09:39.020094 4898 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz_openshift-operators_951cfcfc-3a8c-410e-a3f5-f5caa10511f5_0(340dfeabe1c085bb98589e56ab7efcfddb37980949c9fe6bc883f55a4c2e7dae): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz" Mar 13 14:09:39 crc kubenswrapper[4898]: E0313 14:09:39.020116 4898 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz_openshift-operators_951cfcfc-3a8c-410e-a3f5-f5caa10511f5_0(340dfeabe1c085bb98589e56ab7efcfddb37980949c9fe6bc883f55a4c2e7dae): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz" Mar 13 14:09:39 crc kubenswrapper[4898]: E0313 14:09:39.020159 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz_openshift-operators(951cfcfc-3a8c-410e-a3f5-f5caa10511f5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz_openshift-operators(951cfcfc-3a8c-410e-a3f5-f5caa10511f5)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz_openshift-operators_951cfcfc-3a8c-410e-a3f5-f5caa10511f5_0(340dfeabe1c085bb98589e56ab7efcfddb37980949c9fe6bc883f55a4c2e7dae): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz" podUID="951cfcfc-3a8c-410e-a3f5-f5caa10511f5" Mar 13 14:09:39 crc kubenswrapper[4898]: I0313 14:09:39.084756 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nv8gd\" (UniqueName: \"kubernetes.io/projected/79ead8ee-67ba-4831-b5d4-a1f128e94334-kube-api-access-nv8gd\") pod \"perses-operator-5bf474d74f-nkt76\" (UID: \"79ead8ee-67ba-4831-b5d4-a1f128e94334\") " pod="openshift-operators/perses-operator-5bf474d74f-nkt76" Mar 13 14:09:39 crc kubenswrapper[4898]: I0313 14:09:39.084828 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/3bfc0332-bb59-42bf-bb70-462efa225c81-observability-operator-tls\") pod \"observability-operator-59bdc8b94-ljrtz\" (UID: \"3bfc0332-bb59-42bf-bb70-462efa225c81\") " pod="openshift-operators/observability-operator-59bdc8b94-ljrtz" Mar 13 14:09:39 crc kubenswrapper[4898]: I0313 14:09:39.084870 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxlrb\" (UniqueName: \"kubernetes.io/projected/3bfc0332-bb59-42bf-bb70-462efa225c81-kube-api-access-qxlrb\") pod \"observability-operator-59bdc8b94-ljrtz\" (UID: \"3bfc0332-bb59-42bf-bb70-462efa225c81\") " pod="openshift-operators/observability-operator-59bdc8b94-ljrtz" Mar 13 14:09:39 crc kubenswrapper[4898]: I0313 14:09:39.084913 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/79ead8ee-67ba-4831-b5d4-a1f128e94334-openshift-service-ca\") pod \"perses-operator-5bf474d74f-nkt76\" (UID: \"79ead8ee-67ba-4831-b5d4-a1f128e94334\") " pod="openshift-operators/perses-operator-5bf474d74f-nkt76" Mar 13 14:09:39 crc kubenswrapper[4898]: I0313 14:09:39.088307 4898 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/3bfc0332-bb59-42bf-bb70-462efa225c81-observability-operator-tls\") pod \"observability-operator-59bdc8b94-ljrtz\" (UID: \"3bfc0332-bb59-42bf-bb70-462efa225c81\") " pod="openshift-operators/observability-operator-59bdc8b94-ljrtz" Mar 13 14:09:39 crc kubenswrapper[4898]: I0313 14:09:39.099064 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxlrb\" (UniqueName: \"kubernetes.io/projected/3bfc0332-bb59-42bf-bb70-462efa225c81-kube-api-access-qxlrb\") pod \"observability-operator-59bdc8b94-ljrtz\" (UID: \"3bfc0332-bb59-42bf-bb70-462efa225c81\") " pod="openshift-operators/observability-operator-59bdc8b94-ljrtz" Mar 13 14:09:39 crc kubenswrapper[4898]: I0313 14:09:39.160426 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-ljrtz" Mar 13 14:09:39 crc kubenswrapper[4898]: E0313 14:09:39.181196 4898 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-ljrtz_openshift-operators_3bfc0332-bb59-42bf-bb70-462efa225c81_0(a01d8868dd6af20cebed5d1574243b61ad97b1045cc2d34c8db961f58acf96b7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 14:09:39 crc kubenswrapper[4898]: E0313 14:09:39.181271 4898 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-ljrtz_openshift-operators_3bfc0332-bb59-42bf-bb70-462efa225c81_0(a01d8868dd6af20cebed5d1574243b61ad97b1045cc2d34c8db961f58acf96b7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/observability-operator-59bdc8b94-ljrtz" Mar 13 14:09:39 crc kubenswrapper[4898]: E0313 14:09:39.181296 4898 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-ljrtz_openshift-operators_3bfc0332-bb59-42bf-bb70-462efa225c81_0(a01d8868dd6af20cebed5d1574243b61ad97b1045cc2d34c8db961f58acf96b7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-ljrtz" Mar 13 14:09:39 crc kubenswrapper[4898]: E0313 14:09:39.181347 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-59bdc8b94-ljrtz_openshift-operators(3bfc0332-bb59-42bf-bb70-462efa225c81)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-59bdc8b94-ljrtz_openshift-operators(3bfc0332-bb59-42bf-bb70-462efa225c81)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-ljrtz_openshift-operators_3bfc0332-bb59-42bf-bb70-462efa225c81_0(a01d8868dd6af20cebed5d1574243b61ad97b1045cc2d34c8db961f58acf96b7): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/observability-operator-59bdc8b94-ljrtz" podUID="3bfc0332-bb59-42bf-bb70-462efa225c81" Mar 13 14:09:39 crc kubenswrapper[4898]: I0313 14:09:39.186279 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nv8gd\" (UniqueName: \"kubernetes.io/projected/79ead8ee-67ba-4831-b5d4-a1f128e94334-kube-api-access-nv8gd\") pod \"perses-operator-5bf474d74f-nkt76\" (UID: \"79ead8ee-67ba-4831-b5d4-a1f128e94334\") " pod="openshift-operators/perses-operator-5bf474d74f-nkt76" Mar 13 14:09:39 crc kubenswrapper[4898]: I0313 14:09:39.186386 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/79ead8ee-67ba-4831-b5d4-a1f128e94334-openshift-service-ca\") pod \"perses-operator-5bf474d74f-nkt76\" (UID: \"79ead8ee-67ba-4831-b5d4-a1f128e94334\") " pod="openshift-operators/perses-operator-5bf474d74f-nkt76" Mar 13 14:09:39 crc kubenswrapper[4898]: I0313 14:09:39.187553 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/79ead8ee-67ba-4831-b5d4-a1f128e94334-openshift-service-ca\") pod \"perses-operator-5bf474d74f-nkt76\" (UID: \"79ead8ee-67ba-4831-b5d4-a1f128e94334\") " pod="openshift-operators/perses-operator-5bf474d74f-nkt76" Mar 13 14:09:39 crc kubenswrapper[4898]: I0313 14:09:39.212366 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nv8gd\" (UniqueName: \"kubernetes.io/projected/79ead8ee-67ba-4831-b5d4-a1f128e94334-kube-api-access-nv8gd\") pod \"perses-operator-5bf474d74f-nkt76\" (UID: \"79ead8ee-67ba-4831-b5d4-a1f128e94334\") " pod="openshift-operators/perses-operator-5bf474d74f-nkt76" Mar 13 14:09:39 crc kubenswrapper[4898]: I0313 14:09:39.339732 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-nkt76" Mar 13 14:09:39 crc kubenswrapper[4898]: E0313 14:09:39.364992 4898 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-nkt76_openshift-operators_79ead8ee-67ba-4831-b5d4-a1f128e94334_0(af5e6a9b24c7090da4c8b0b588b4ab38c4ec87cf04e8a847cf9dcce5ac8d5474): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 14:09:39 crc kubenswrapper[4898]: E0313 14:09:39.365075 4898 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-nkt76_openshift-operators_79ead8ee-67ba-4831-b5d4-a1f128e94334_0(af5e6a9b24c7090da4c8b0b588b4ab38c4ec87cf04e8a847cf9dcce5ac8d5474): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-nkt76" Mar 13 14:09:39 crc kubenswrapper[4898]: E0313 14:09:39.365099 4898 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-nkt76_openshift-operators_79ead8ee-67ba-4831-b5d4-a1f128e94334_0(af5e6a9b24c7090da4c8b0b588b4ab38c4ec87cf04e8a847cf9dcce5ac8d5474): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/perses-operator-5bf474d74f-nkt76" Mar 13 14:09:39 crc kubenswrapper[4898]: E0313 14:09:39.365151 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5bf474d74f-nkt76_openshift-operators(79ead8ee-67ba-4831-b5d4-a1f128e94334)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5bf474d74f-nkt76_openshift-operators(79ead8ee-67ba-4831-b5d4-a1f128e94334)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-nkt76_openshift-operators_79ead8ee-67ba-4831-b5d4-a1f128e94334_0(af5e6a9b24c7090da4c8b0b588b4ab38c4ec87cf04e8a847cf9dcce5ac8d5474): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-5bf474d74f-nkt76" podUID="79ead8ee-67ba-4831-b5d4-a1f128e94334" Mar 13 14:09:40 crc kubenswrapper[4898]: I0313 14:09:40.517549 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" event={"ID":"4a0b9ad6-156f-418b-8eae-1d762f8161dd","Type":"ContainerStarted","Data":"2e24c39eec65888580ebf7712fdc5741633e4341d3fc40409cf2f068b2581ee1"} Mar 13 14:09:40 crc kubenswrapper[4898]: I0313 14:09:40.517858 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" Mar 13 14:09:40 crc kubenswrapper[4898]: I0313 14:09:40.517873 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" Mar 13 14:09:40 crc kubenswrapper[4898]: I0313 14:09:40.517882 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" Mar 13 14:09:40 crc kubenswrapper[4898]: I0313 14:09:40.560698 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" Mar 13 14:09:40 crc 
kubenswrapper[4898]: I0313 14:09:40.564101 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-l82wk"
Mar 13 14:09:40 crc kubenswrapper[4898]: I0313 14:09:40.590940 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" podStartSLOduration=7.590921168 podStartE2EDuration="7.590921168s" podCreationTimestamp="2026-03-13 14:09:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:09:40.587018206 +0000 UTC m=+815.588606455" watchObservedRunningTime="2026-03-13 14:09:40.590921168 +0000 UTC m=+815.592509407"
Mar 13 14:09:40 crc kubenswrapper[4898]: I0313 14:09:40.947964 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-5r9gm"]
Mar 13 14:09:40 crc kubenswrapper[4898]: I0313 14:09:40.948261 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-5r9gm"
Mar 13 14:09:40 crc kubenswrapper[4898]: I0313 14:09:40.948650 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-5r9gm"
Mar 13 14:09:40 crc kubenswrapper[4898]: I0313 14:09:40.981760 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz"]
Mar 13 14:09:40 crc kubenswrapper[4898]: I0313 14:09:40.981926 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz"
Mar 13 14:09:40 crc kubenswrapper[4898]: I0313 14:09:40.982483 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz"
Mar 13 14:09:40 crc kubenswrapper[4898]: I0313 14:09:40.991010 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm"]
Mar 13 14:09:40 crc kubenswrapper[4898]: E0313 14:09:40.991097 4898 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-5r9gm_openshift-operators_30c06063-b926-4f2e-b8d1-8c530cc5b0a9_0(82a93402f36533955f91ec508945b8cda8200417bb6d35b063d78bd40005522b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 13 14:09:40 crc kubenswrapper[4898]: E0313 14:09:40.991176 4898 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-5r9gm_openshift-operators_30c06063-b926-4f2e-b8d1-8c530cc5b0a9_0(82a93402f36533955f91ec508945b8cda8200417bb6d35b063d78bd40005522b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-5r9gm"
Mar 13 14:09:40 crc kubenswrapper[4898]: E0313 14:09:40.991203 4898 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-5r9gm_openshift-operators_30c06063-b926-4f2e-b8d1-8c530cc5b0a9_0(82a93402f36533955f91ec508945b8cda8200417bb6d35b063d78bd40005522b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-5r9gm"
Mar 13 14:09:40 crc kubenswrapper[4898]: E0313 14:09:40.991255 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-68bc856cb9-5r9gm_openshift-operators(30c06063-b926-4f2e-b8d1-8c530cc5b0a9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-68bc856cb9-5r9gm_openshift-operators(30c06063-b926-4f2e-b8d1-8c530cc5b0a9)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-5r9gm_openshift-operators_30c06063-b926-4f2e-b8d1-8c530cc5b0a9_0(82a93402f36533955f91ec508945b8cda8200417bb6d35b063d78bd40005522b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-5r9gm" podUID="30c06063-b926-4f2e-b8d1-8c530cc5b0a9"
Mar 13 14:09:40 crc kubenswrapper[4898]: I0313 14:09:40.991125 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm"
Mar 13 14:09:40 crc kubenswrapper[4898]: I0313 14:09:40.992076 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm"
Mar 13 14:09:40 crc kubenswrapper[4898]: I0313 14:09:40.995463 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-ljrtz"]
Mar 13 14:09:40 crc kubenswrapper[4898]: I0313 14:09:40.995553 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-ljrtz"
Mar 13 14:09:40 crc kubenswrapper[4898]: I0313 14:09:40.995985 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-ljrtz"
Mar 13 14:09:41 crc kubenswrapper[4898]: I0313 14:09:41.021810 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-nkt76"]
Mar 13 14:09:41 crc kubenswrapper[4898]: I0313 14:09:41.021926 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-nkt76"
Mar 13 14:09:41 crc kubenswrapper[4898]: I0313 14:09:41.022364 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-nkt76"
Mar 13 14:09:41 crc kubenswrapper[4898]: E0313 14:09:41.032388 4898 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz_openshift-operators_951cfcfc-3a8c-410e-a3f5-f5caa10511f5_0(fb1bea47dbbc4277faf1b20a987bbf5c1da7ad0636c1822d94e5bce05e1eae4a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 13 14:09:41 crc kubenswrapper[4898]: E0313 14:09:41.032445 4898 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz_openshift-operators_951cfcfc-3a8c-410e-a3f5-f5caa10511f5_0(fb1bea47dbbc4277faf1b20a987bbf5c1da7ad0636c1822d94e5bce05e1eae4a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz"
Mar 13 14:09:41 crc kubenswrapper[4898]: E0313 14:09:41.032467 4898 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz_openshift-operators_951cfcfc-3a8c-410e-a3f5-f5caa10511f5_0(fb1bea47dbbc4277faf1b20a987bbf5c1da7ad0636c1822d94e5bce05e1eae4a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz"
Mar 13 14:09:41 crc kubenswrapper[4898]: E0313 14:09:41.032508 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz_openshift-operators(951cfcfc-3a8c-410e-a3f5-f5caa10511f5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz_openshift-operators(951cfcfc-3a8c-410e-a3f5-f5caa10511f5)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz_openshift-operators_951cfcfc-3a8c-410e-a3f5-f5caa10511f5_0(fb1bea47dbbc4277faf1b20a987bbf5c1da7ad0636c1822d94e5bce05e1eae4a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz" podUID="951cfcfc-3a8c-410e-a3f5-f5caa10511f5"
Mar 13 14:09:41 crc kubenswrapper[4898]: E0313 14:09:41.037783 4898 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm_openshift-operators_8c190eee-747b-4a45-905c-fa0235080305_0(c09742bb7ffa4a1eaa8f0aae68549bee73726dfbbfdb07e2af4c746bc829cd48): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 13 14:09:41 crc kubenswrapper[4898]: E0313 14:09:41.037854 4898 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm_openshift-operators_8c190eee-747b-4a45-905c-fa0235080305_0(c09742bb7ffa4a1eaa8f0aae68549bee73726dfbbfdb07e2af4c746bc829cd48): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm"
Mar 13 14:09:41 crc kubenswrapper[4898]: E0313 14:09:41.037881 4898 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm_openshift-operators_8c190eee-747b-4a45-905c-fa0235080305_0(c09742bb7ffa4a1eaa8f0aae68549bee73726dfbbfdb07e2af4c746bc829cd48): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm"
Mar 13 14:09:41 crc kubenswrapper[4898]: E0313 14:09:41.037951 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm_openshift-operators(8c190eee-747b-4a45-905c-fa0235080305)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm_openshift-operators(8c190eee-747b-4a45-905c-fa0235080305)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm_openshift-operators_8c190eee-747b-4a45-905c-fa0235080305_0(c09742bb7ffa4a1eaa8f0aae68549bee73726dfbbfdb07e2af4c746bc829cd48): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm" podUID="8c190eee-747b-4a45-905c-fa0235080305"
Mar 13 14:09:41 crc kubenswrapper[4898]: E0313 14:09:41.063716 4898 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-ljrtz_openshift-operators_3bfc0332-bb59-42bf-bb70-462efa225c81_0(a4b948d1ba67e46bdf493134823185ec13e2283a0e8570c7b7a8bc515287a5f3): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 13 14:09:41 crc kubenswrapper[4898]: E0313 14:09:41.063780 4898 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-ljrtz_openshift-operators_3bfc0332-bb59-42bf-bb70-462efa225c81_0(a4b948d1ba67e46bdf493134823185ec13e2283a0e8570c7b7a8bc515287a5f3): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-ljrtz"
Mar 13 14:09:41 crc kubenswrapper[4898]: E0313 14:09:41.063804 4898 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-ljrtz_openshift-operators_3bfc0332-bb59-42bf-bb70-462efa225c81_0(a4b948d1ba67e46bdf493134823185ec13e2283a0e8570c7b7a8bc515287a5f3): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-ljrtz"
Mar 13 14:09:41 crc kubenswrapper[4898]: E0313 14:09:41.063848 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-59bdc8b94-ljrtz_openshift-operators(3bfc0332-bb59-42bf-bb70-462efa225c81)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-59bdc8b94-ljrtz_openshift-operators(3bfc0332-bb59-42bf-bb70-462efa225c81)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-ljrtz_openshift-operators_3bfc0332-bb59-42bf-bb70-462efa225c81_0(a4b948d1ba67e46bdf493134823185ec13e2283a0e8570c7b7a8bc515287a5f3): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-59bdc8b94-ljrtz" podUID="3bfc0332-bb59-42bf-bb70-462efa225c81"
Mar 13 14:09:41 crc kubenswrapper[4898]: E0313 14:09:41.069940 4898 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-nkt76_openshift-operators_79ead8ee-67ba-4831-b5d4-a1f128e94334_0(ab47d1d3c53b6eb35e4ba571d13de11e8ae1d29e947373707aff0bc8f5858f5b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 13 14:09:41 crc kubenswrapper[4898]: E0313 14:09:41.070015 4898 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-nkt76_openshift-operators_79ead8ee-67ba-4831-b5d4-a1f128e94334_0(ab47d1d3c53b6eb35e4ba571d13de11e8ae1d29e947373707aff0bc8f5858f5b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-nkt76"
Mar 13 14:09:41 crc kubenswrapper[4898]: E0313 14:09:41.070042 4898 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-nkt76_openshift-operators_79ead8ee-67ba-4831-b5d4-a1f128e94334_0(ab47d1d3c53b6eb35e4ba571d13de11e8ae1d29e947373707aff0bc8f5858f5b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-nkt76"
Mar 13 14:09:41 crc kubenswrapper[4898]: E0313 14:09:41.070109 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5bf474d74f-nkt76_openshift-operators(79ead8ee-67ba-4831-b5d4-a1f128e94334)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5bf474d74f-nkt76_openshift-operators(79ead8ee-67ba-4831-b5d4-a1f128e94334)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-nkt76_openshift-operators_79ead8ee-67ba-4831-b5d4-a1f128e94334_0(ab47d1d3c53b6eb35e4ba571d13de11e8ae1d29e947373707aff0bc8f5858f5b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-5bf474d74f-nkt76" podUID="79ead8ee-67ba-4831-b5d4-a1f128e94334"
Mar 13 14:09:44 crc kubenswrapper[4898]: I0313 14:09:44.739933 4898 scope.go:117] "RemoveContainer" containerID="725f30c48676665ebc628a8b35e81161dc13d717e27cad14806022f5ad267e0e"
Mar 13 14:09:45 crc kubenswrapper[4898]: I0313 14:09:45.546019 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6llfs_e521c857-9711-4f68-886f-38b233d7b05b/kube-multus/2.log"
Mar 13 14:09:45 crc kubenswrapper[4898]: I0313 14:09:45.546247 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6llfs" event={"ID":"e521c857-9711-4f68-886f-38b233d7b05b","Type":"ContainerStarted","Data":"df8ecda55c092e017992e19b5f998c9f088fed5b480868d3f462891075b0153f"}
Mar 13 14:09:49 crc kubenswrapper[4898]: I0313 14:09:49.134883 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 14:09:49 crc kubenswrapper[4898]: I0313 14:09:49.135132 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 14:09:51 crc kubenswrapper[4898]: I0313 14:09:51.739726 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz"
Mar 13 14:09:51 crc kubenswrapper[4898]: I0313 14:09:51.740681 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz"
Mar 13 14:09:52 crc kubenswrapper[4898]: I0313 14:09:52.236268 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz"]
Mar 13 14:09:52 crc kubenswrapper[4898]: I0313 14:09:52.599760 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz" event={"ID":"951cfcfc-3a8c-410e-a3f5-f5caa10511f5","Type":"ContainerStarted","Data":"23f045df4f9087c4ff8d6ed017051a37ccdf60108672e0428bb4de83d9dd4bf9"}
Mar 13 14:09:52 crc kubenswrapper[4898]: I0313 14:09:52.738435 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-5r9gm"
Mar 13 14:09:52 crc kubenswrapper[4898]: I0313 14:09:52.738524 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-ljrtz"
Mar 13 14:09:52 crc kubenswrapper[4898]: I0313 14:09:52.738981 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-5r9gm"
Mar 13 14:09:52 crc kubenswrapper[4898]: I0313 14:09:52.739312 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-ljrtz"
Mar 13 14:09:53 crc kubenswrapper[4898]: I0313 14:09:53.036649 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-ljrtz"]
Mar 13 14:09:53 crc kubenswrapper[4898]: W0313 14:09:53.049942 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3bfc0332_bb59_42bf_bb70_462efa225c81.slice/crio-3a86f8990e921c42fb38a682be0142db0b8c0896d0e7b7d5889c117b24ba14a4 WatchSource:0}: Error finding container 3a86f8990e921c42fb38a682be0142db0b8c0896d0e7b7d5889c117b24ba14a4: Status 404 returned error can't find the container with id 3a86f8990e921c42fb38a682be0142db0b8c0896d0e7b7d5889c117b24ba14a4
Mar 13 14:09:53 crc kubenswrapper[4898]: I0313 14:09:53.053009 4898 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 13 14:09:53 crc kubenswrapper[4898]: I0313 14:09:53.102613 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-5r9gm"]
Mar 13 14:09:53 crc kubenswrapper[4898]: I0313 14:09:53.656218 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-ljrtz" event={"ID":"3bfc0332-bb59-42bf-bb70-462efa225c81","Type":"ContainerStarted","Data":"3a86f8990e921c42fb38a682be0142db0b8c0896d0e7b7d5889c117b24ba14a4"}
Mar 13 14:09:53 crc kubenswrapper[4898]: I0313 14:09:53.658077 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-5r9gm" event={"ID":"30c06063-b926-4f2e-b8d1-8c530cc5b0a9","Type":"ContainerStarted","Data":"04906ee9b15921d94f0048d0f4560ee05f5365765a870aca2723697b270f581b"}
Mar 13 14:09:55 crc kubenswrapper[4898]: I0313 14:09:55.745101 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-nkt76"
Mar 13 14:09:55 crc kubenswrapper[4898]: I0313 14:09:55.745141 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm"
Mar 13 14:09:55 crc kubenswrapper[4898]: I0313 14:09:55.748235 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-nkt76"
Mar 13 14:09:55 crc kubenswrapper[4898]: I0313 14:09:55.748544 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm"
Mar 13 14:10:00 crc kubenswrapper[4898]: I0313 14:10:00.125818 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556850-n2t8z"]
Mar 13 14:10:00 crc kubenswrapper[4898]: I0313 14:10:00.127302 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556850-n2t8z"
Mar 13 14:10:00 crc kubenswrapper[4898]: I0313 14:10:00.129936 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 13 14:10:00 crc kubenswrapper[4898]: I0313 14:10:00.130025 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps"
Mar 13 14:10:00 crc kubenswrapper[4898]: I0313 14:10:00.133258 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556850-n2t8z"]
Mar 13 14:10:00 crc kubenswrapper[4898]: I0313 14:10:00.135601 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 13 14:10:00 crc kubenswrapper[4898]: I0313 14:10:00.254993 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6jtd\" (UniqueName: \"kubernetes.io/projected/a05c0334-f9cf-4640-a763-6d77b983193c-kube-api-access-r6jtd\") pod \"auto-csr-approver-29556850-n2t8z\" (UID: \"a05c0334-f9cf-4640-a763-6d77b983193c\") " pod="openshift-infra/auto-csr-approver-29556850-n2t8z"
Mar 13 14:10:00 crc kubenswrapper[4898]: I0313 14:10:00.355842 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6jtd\" (UniqueName: \"kubernetes.io/projected/a05c0334-f9cf-4640-a763-6d77b983193c-kube-api-access-r6jtd\") pod \"auto-csr-approver-29556850-n2t8z\" (UID: \"a05c0334-f9cf-4640-a763-6d77b983193c\") " pod="openshift-infra/auto-csr-approver-29556850-n2t8z"
Mar 13 14:10:00 crc kubenswrapper[4898]: I0313 14:10:00.382712 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6jtd\" (UniqueName: \"kubernetes.io/projected/a05c0334-f9cf-4640-a763-6d77b983193c-kube-api-access-r6jtd\") pod \"auto-csr-approver-29556850-n2t8z\" (UID: \"a05c0334-f9cf-4640-a763-6d77b983193c\") " pod="openshift-infra/auto-csr-approver-29556850-n2t8z"
Mar 13 14:10:00 crc kubenswrapper[4898]: I0313 14:10:00.450950 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556850-n2t8z"
Mar 13 14:10:01 crc kubenswrapper[4898]: I0313 14:10:01.807389 4898 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 13 14:10:01 crc kubenswrapper[4898]: I0313 14:10:01.954474 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-nkt76"]
Mar 13 14:10:01 crc kubenswrapper[4898]: W0313 14:10:01.959970 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79ead8ee_67ba_4831_b5d4_a1f128e94334.slice/crio-b7c5be9a61cc19a00ef13aa21a2c2477aeaa69117d267d31a61ade4241454c29 WatchSource:0}: Error finding container b7c5be9a61cc19a00ef13aa21a2c2477aeaa69117d267d31a61ade4241454c29: Status 404 returned error can't find the container with id b7c5be9a61cc19a00ef13aa21a2c2477aeaa69117d267d31a61ade4241454c29
Mar 13 14:10:01 crc kubenswrapper[4898]: I0313 14:10:01.997984 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm"]
Mar 13 14:10:02 crc kubenswrapper[4898]: W0313 14:10:02.001235 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c190eee_747b_4a45_905c_fa0235080305.slice/crio-96f9dd42ff5b1aebdc6ef0b6db373b3200a06de09fa1013ee3f3af54a01a2ecf WatchSource:0}: Error finding container 96f9dd42ff5b1aebdc6ef0b6db373b3200a06de09fa1013ee3f3af54a01a2ecf: Status 404 returned error can't find the container with id 96f9dd42ff5b1aebdc6ef0b6db373b3200a06de09fa1013ee3f3af54a01a2ecf
Mar 13 14:10:02 crc kubenswrapper[4898]: I0313 14:10:02.099130 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556850-n2t8z"]
Mar 13 14:10:02 crc kubenswrapper[4898]: W0313 14:10:02.107726 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda05c0334_f9cf_4640_a763_6d77b983193c.slice/crio-703960e572c96d130d5564f52c50c0edfa9bd69efc58b5c61ac8e3057d73fac2 WatchSource:0}: Error finding container 703960e572c96d130d5564f52c50c0edfa9bd69efc58b5c61ac8e3057d73fac2: Status 404 returned error can't find the container with id 703960e572c96d130d5564f52c50c0edfa9bd69efc58b5c61ac8e3057d73fac2
Mar 13 14:10:02 crc kubenswrapper[4898]: I0313 14:10:02.736720 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm" event={"ID":"8c190eee-747b-4a45-905c-fa0235080305","Type":"ContainerStarted","Data":"a76e2eb322540774d9f66e1ed73bc4b272395b3d03fa97bedca8cc39eff74e7a"}
Mar 13 14:10:02 crc kubenswrapper[4898]: I0313 14:10:02.737285 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm" event={"ID":"8c190eee-747b-4a45-905c-fa0235080305","Type":"ContainerStarted","Data":"96f9dd42ff5b1aebdc6ef0b6db373b3200a06de09fa1013ee3f3af54a01a2ecf"}
Mar 13 14:10:02 crc kubenswrapper[4898]: I0313 14:10:02.742534 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556850-n2t8z" event={"ID":"a05c0334-f9cf-4640-a763-6d77b983193c","Type":"ContainerStarted","Data":"703960e572c96d130d5564f52c50c0edfa9bd69efc58b5c61ac8e3057d73fac2"}
Mar 13 14:10:02 crc kubenswrapper[4898]: I0313 14:10:02.743737 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-nkt76" event={"ID":"79ead8ee-67ba-4831-b5d4-a1f128e94334","Type":"ContainerStarted","Data":"b7c5be9a61cc19a00ef13aa21a2c2477aeaa69117d267d31a61ade4241454c29"}
Mar 13 14:10:02 crc kubenswrapper[4898]: I0313 14:10:02.744979 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-ljrtz" event={"ID":"3bfc0332-bb59-42bf-bb70-462efa225c81","Type":"ContainerStarted","Data":"00f83bd17f1c6c8f7be365f8c10dd0d1fa4e2714540181da78b69f463649b54d"}
Mar 13 14:10:02 crc kubenswrapper[4898]: I0313 14:10:02.752327 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz" event={"ID":"951cfcfc-3a8c-410e-a3f5-f5caa10511f5","Type":"ContainerStarted","Data":"db50c338f94e0187fbc033a0ecb6512586df23aa14a822181e8a355b65fc1eb4"}
Mar 13 14:10:02 crc kubenswrapper[4898]: I0313 14:10:02.755753 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-5r9gm" event={"ID":"30c06063-b926-4f2e-b8d1-8c530cc5b0a9","Type":"ContainerStarted","Data":"636c13a9a9313e60b5edc2b20690895e5af55bdbfefb1b95d3ad2e8d0edc22f7"}
Mar 13 14:10:02 crc kubenswrapper[4898]: I0313 14:10:02.770945 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm" podStartSLOduration=24.770890918 podStartE2EDuration="24.770890918s" podCreationTimestamp="2026-03-13 14:09:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:10:02.760771693 +0000 UTC m=+837.762359942" watchObservedRunningTime="2026-03-13 14:10:02.770890918 +0000 UTC m=+837.772479177"
Mar 13 14:10:02 crc kubenswrapper[4898]: I0313 14:10:02.799622 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-ljrtz" podStartSLOduration=16.179661151 podStartE2EDuration="24.799592188s" podCreationTimestamp="2026-03-13 14:09:38 +0000 UTC" firstStartedPulling="2026-03-13 14:09:53.052636888 +0000 UTC m=+828.054225127" lastFinishedPulling="2026-03-13 14:10:01.672567925 +0000 UTC m=+836.674156164" observedRunningTime="2026-03-13 14:10:02.786072745 +0000 UTC m=+837.787661004" watchObservedRunningTime="2026-03-13 14:10:02.799592188 +0000 UTC m=+837.801180427"
Mar 13 14:10:02 crc kubenswrapper[4898]: I0313 14:10:02.815696 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-5r9gm" podStartSLOduration=16.335436934 podStartE2EDuration="24.815680279s" podCreationTimestamp="2026-03-13 14:09:38 +0000 UTC" firstStartedPulling="2026-03-13 14:09:53.116619571 +0000 UTC m=+828.118207810" lastFinishedPulling="2026-03-13 14:10:01.596862916 +0000 UTC m=+836.598451155" observedRunningTime="2026-03-13 14:10:02.814100057 +0000 UTC m=+837.815688306" watchObservedRunningTime="2026-03-13 14:10:02.815680279 +0000 UTC m=+837.817268518"
Mar 13 14:10:02 crc kubenswrapper[4898]: I0313 14:10:02.847424 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz" podStartSLOduration=15.500484057 podStartE2EDuration="24.847394768s" podCreationTimestamp="2026-03-13 14:09:38 +0000 UTC" firstStartedPulling="2026-03-13 14:09:52.250489899 +0000 UTC m=+827.252078138" lastFinishedPulling="2026-03-13 14:10:01.59740061 +0000 UTC m=+836.598988849" observedRunningTime="2026-03-13 14:10:02.837371386 +0000 UTC m=+837.838959645" watchObservedRunningTime="2026-03-13 14:10:02.847394768 +0000 UTC m=+837.848983007"
Mar 13 14:10:03 crc kubenswrapper[4898]: I0313 14:10:03.584073 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-l82wk"
Mar 13 14:10:03 crc kubenswrapper[4898]: I0313 14:10:03.771043 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-ljrtz"
Mar 13 14:10:03 crc kubenswrapper[4898]: I0313 14:10:03.772313 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-ljrtz"
Mar 13 14:10:04 crc kubenswrapper[4898]: I0313 14:10:04.773989 4898 generic.go:334] "Generic (PLEG): container finished" podID="a05c0334-f9cf-4640-a763-6d77b983193c" containerID="b0888e4d135b3b37fbe96fe16a03f870ba37d4188b89aa723dcdb2298a0e4ed8" exitCode=0
Mar 13 14:10:04 crc kubenswrapper[4898]: I0313 14:10:04.774184 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556850-n2t8z" event={"ID":"a05c0334-f9cf-4640-a763-6d77b983193c","Type":"ContainerDied","Data":"b0888e4d135b3b37fbe96fe16a03f870ba37d4188b89aa723dcdb2298a0e4ed8"}
Mar 13 14:10:05 crc kubenswrapper[4898]: I0313 14:10:05.785280 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-nkt76" event={"ID":"79ead8ee-67ba-4831-b5d4-a1f128e94334","Type":"ContainerStarted","Data":"1627f2dbe0116af304a8553ca13f81e420fb301d6d5599ae9403981065700b49"}
Mar 13 14:10:05 crc kubenswrapper[4898]: I0313 14:10:05.811853 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-nkt76" podStartSLOduration=24.572284196 podStartE2EDuration="27.811835316s" podCreationTimestamp="2026-03-13 14:09:38 +0000 UTC" firstStartedPulling="2026-03-13 14:10:01.962210527 +0000 UTC m=+836.963798766" lastFinishedPulling="2026-03-13 14:10:05.201761647 +0000 UTC m=+840.203349886" observedRunningTime="2026-03-13 14:10:05.805678975 +0000 UTC m=+840.807267214" watchObservedRunningTime="2026-03-13 14:10:05.811835316 +0000 UTC m=+840.813423555"
Mar 13 14:10:06 crc kubenswrapper[4898]: I0313 14:10:06.057988 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556850-n2t8z"
Mar 13 14:10:06 crc kubenswrapper[4898]: I0313 14:10:06.158332 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6jtd\" (UniqueName: \"kubernetes.io/projected/a05c0334-f9cf-4640-a763-6d77b983193c-kube-api-access-r6jtd\") pod \"a05c0334-f9cf-4640-a763-6d77b983193c\" (UID: \"a05c0334-f9cf-4640-a763-6d77b983193c\") "
Mar 13 14:10:06 crc kubenswrapper[4898]: I0313 14:10:06.164078 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a05c0334-f9cf-4640-a763-6d77b983193c-kube-api-access-r6jtd" (OuterVolumeSpecName: "kube-api-access-r6jtd") pod "a05c0334-f9cf-4640-a763-6d77b983193c" (UID: "a05c0334-f9cf-4640-a763-6d77b983193c"). InnerVolumeSpecName "kube-api-access-r6jtd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:10:06 crc kubenswrapper[4898]: I0313 14:10:06.260004 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6jtd\" (UniqueName: \"kubernetes.io/projected/a05c0334-f9cf-4640-a763-6d77b983193c-kube-api-access-r6jtd\") on node \"crc\" DevicePath \"\""
Mar 13 14:10:06 crc kubenswrapper[4898]: I0313 14:10:06.793620 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556850-n2t8z" event={"ID":"a05c0334-f9cf-4640-a763-6d77b983193c","Type":"ContainerDied","Data":"703960e572c96d130d5564f52c50c0edfa9bd69efc58b5c61ac8e3057d73fac2"}
Mar 13 14:10:06 crc kubenswrapper[4898]: I0313 14:10:06.793684 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="703960e572c96d130d5564f52c50c0edfa9bd69efc58b5c61ac8e3057d73fac2"
Mar 13 14:10:06 crc kubenswrapper[4898]: I0313 14:10:06.793729 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-nkt76"
Mar 13 14:10:06 crc kubenswrapper[4898]: I0313 14:10:06.793640 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556850-n2t8z"
Mar 13 14:10:07 crc kubenswrapper[4898]: I0313 14:10:07.108221 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556844-q7h28"]
Mar 13 14:10:07 crc kubenswrapper[4898]: I0313 14:10:07.113265 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556844-q7h28"]
Mar 13 14:10:07 crc kubenswrapper[4898]: I0313 14:10:07.748770 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd30282f-65c8-45d8-89f3-c6e2f16662d4" path="/var/lib/kubelet/pods/cd30282f-65c8-45d8-89f3-c6e2f16662d4/volumes"
Mar 13 14:10:10 crc kubenswrapper[4898]: I0313 14:10:10.823029 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-krzxz"]
Mar 13 14:10:10 crc kubenswrapper[4898]: E0313 14:10:10.823844 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a05c0334-f9cf-4640-a763-6d77b983193c" containerName="oc"
Mar 13 14:10:10 crc kubenswrapper[4898]: I0313 14:10:10.823860 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="a05c0334-f9cf-4640-a763-6d77b983193c" containerName="oc"
Mar 13 14:10:10 crc kubenswrapper[4898]: I0313 14:10:10.824040 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="a05c0334-f9cf-4640-a763-6d77b983193c" containerName="oc"
Mar 13 14:10:10 crc kubenswrapper[4898]: I0313 14:10:10.824587 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-krzxz"
Mar 13 14:10:10 crc kubenswrapper[4898]: I0313 14:10:10.826663 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt"
Mar 13 14:10:10 crc kubenswrapper[4898]: I0313 14:10:10.827466 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt"
Mar 13 14:10:10 crc kubenswrapper[4898]: I0313 14:10:10.827474 4898 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-7fcw5"
Mar 13 14:10:10 crc kubenswrapper[4898]: I0313 14:10:10.830387 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-krzxz"]
Mar 13 14:10:10 crc kubenswrapper[4898]: I0313 14:10:10.840292 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-fsdjh"]
Mar 13 14:10:10 crc kubenswrapper[4898]: I0313 14:10:10.841383 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-fsdjh"
Mar 13 14:10:10 crc kubenswrapper[4898]: I0313 14:10:10.845235 4898 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-ddxqr"
Mar 13 14:10:10 crc kubenswrapper[4898]: I0313 14:10:10.855964 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-cx9pw"]
Mar 13 14:10:10 crc kubenswrapper[4898]: I0313 14:10:10.856912 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-cx9pw" Mar 13 14:10:10 crc kubenswrapper[4898]: I0313 14:10:10.859646 4898 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-qxnnv" Mar 13 14:10:10 crc kubenswrapper[4898]: I0313 14:10:10.860671 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-fsdjh"] Mar 13 14:10:10 crc kubenswrapper[4898]: I0313 14:10:10.873439 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-cx9pw"] Mar 13 14:10:10 crc kubenswrapper[4898]: I0313 14:10:10.935431 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njm7l\" (UniqueName: \"kubernetes.io/projected/d00b7135-a080-4f0e-a23b-237ab821410f-kube-api-access-njm7l\") pod \"cert-manager-cainjector-cf98fcc89-krzxz\" (UID: \"d00b7135-a080-4f0e-a23b-237ab821410f\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-krzxz" Mar 13 14:10:10 crc kubenswrapper[4898]: I0313 14:10:10.935505 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkl9h\" (UniqueName: \"kubernetes.io/projected/b267a865-1a03-4f37-9d2a-83380d30da1d-kube-api-access-dkl9h\") pod \"cert-manager-858654f9db-fsdjh\" (UID: \"b267a865-1a03-4f37-9d2a-83380d30da1d\") " pod="cert-manager/cert-manager-858654f9db-fsdjh" Mar 13 14:10:10 crc kubenswrapper[4898]: I0313 14:10:10.935749 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rj5r\" (UniqueName: \"kubernetes.io/projected/7c1fa9c0-bb2e-4806-95fd-07fba426bdc8-kube-api-access-6rj5r\") pod \"cert-manager-webhook-687f57d79b-cx9pw\" (UID: \"7c1fa9c0-bb2e-4806-95fd-07fba426bdc8\") " pod="cert-manager/cert-manager-webhook-687f57d79b-cx9pw" Mar 13 14:10:11 crc kubenswrapper[4898]: I0313 14:10:11.038268 4898 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rj5r\" (UniqueName: \"kubernetes.io/projected/7c1fa9c0-bb2e-4806-95fd-07fba426bdc8-kube-api-access-6rj5r\") pod \"cert-manager-webhook-687f57d79b-cx9pw\" (UID: \"7c1fa9c0-bb2e-4806-95fd-07fba426bdc8\") " pod="cert-manager/cert-manager-webhook-687f57d79b-cx9pw" Mar 13 14:10:11 crc kubenswrapper[4898]: I0313 14:10:11.038422 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njm7l\" (UniqueName: \"kubernetes.io/projected/d00b7135-a080-4f0e-a23b-237ab821410f-kube-api-access-njm7l\") pod \"cert-manager-cainjector-cf98fcc89-krzxz\" (UID: \"d00b7135-a080-4f0e-a23b-237ab821410f\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-krzxz" Mar 13 14:10:11 crc kubenswrapper[4898]: I0313 14:10:11.038477 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkl9h\" (UniqueName: \"kubernetes.io/projected/b267a865-1a03-4f37-9d2a-83380d30da1d-kube-api-access-dkl9h\") pod \"cert-manager-858654f9db-fsdjh\" (UID: \"b267a865-1a03-4f37-9d2a-83380d30da1d\") " pod="cert-manager/cert-manager-858654f9db-fsdjh" Mar 13 14:10:11 crc kubenswrapper[4898]: I0313 14:10:11.057985 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njm7l\" (UniqueName: \"kubernetes.io/projected/d00b7135-a080-4f0e-a23b-237ab821410f-kube-api-access-njm7l\") pod \"cert-manager-cainjector-cf98fcc89-krzxz\" (UID: \"d00b7135-a080-4f0e-a23b-237ab821410f\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-krzxz" Mar 13 14:10:11 crc kubenswrapper[4898]: I0313 14:10:11.058859 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rj5r\" (UniqueName: \"kubernetes.io/projected/7c1fa9c0-bb2e-4806-95fd-07fba426bdc8-kube-api-access-6rj5r\") pod \"cert-manager-webhook-687f57d79b-cx9pw\" (UID: \"7c1fa9c0-bb2e-4806-95fd-07fba426bdc8\") " 
pod="cert-manager/cert-manager-webhook-687f57d79b-cx9pw" Mar 13 14:10:11 crc kubenswrapper[4898]: I0313 14:10:11.065025 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkl9h\" (UniqueName: \"kubernetes.io/projected/b267a865-1a03-4f37-9d2a-83380d30da1d-kube-api-access-dkl9h\") pod \"cert-manager-858654f9db-fsdjh\" (UID: \"b267a865-1a03-4f37-9d2a-83380d30da1d\") " pod="cert-manager/cert-manager-858654f9db-fsdjh" Mar 13 14:10:11 crc kubenswrapper[4898]: I0313 14:10:11.141333 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-krzxz" Mar 13 14:10:11 crc kubenswrapper[4898]: I0313 14:10:11.156801 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-fsdjh" Mar 13 14:10:11 crc kubenswrapper[4898]: I0313 14:10:11.169352 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-cx9pw" Mar 13 14:10:11 crc kubenswrapper[4898]: I0313 14:10:11.417124 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-krzxz"] Mar 13 14:10:11 crc kubenswrapper[4898]: I0313 14:10:11.533878 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-cx9pw"] Mar 13 14:10:11 crc kubenswrapper[4898]: W0313 14:10:11.687093 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb267a865_1a03_4f37_9d2a_83380d30da1d.slice/crio-1f1cc2f977f39cfba7be977819af837e3fdc046c2d70bb3689fff2da8ae75f53 WatchSource:0}: Error finding container 1f1cc2f977f39cfba7be977819af837e3fdc046c2d70bb3689fff2da8ae75f53: Status 404 returned error can't find the container with id 1f1cc2f977f39cfba7be977819af837e3fdc046c2d70bb3689fff2da8ae75f53 Mar 13 14:10:11 crc kubenswrapper[4898]: I0313 14:10:11.691266 
4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-fsdjh"] Mar 13 14:10:11 crc kubenswrapper[4898]: I0313 14:10:11.829569 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-fsdjh" event={"ID":"b267a865-1a03-4f37-9d2a-83380d30da1d","Type":"ContainerStarted","Data":"1f1cc2f977f39cfba7be977819af837e3fdc046c2d70bb3689fff2da8ae75f53"} Mar 13 14:10:11 crc kubenswrapper[4898]: I0313 14:10:11.830707 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-cx9pw" event={"ID":"7c1fa9c0-bb2e-4806-95fd-07fba426bdc8","Type":"ContainerStarted","Data":"8585a6e041128cc6364696bdcc7b704ae49c2d814eb51c69f1bee9d7d2485fa0"} Mar 13 14:10:11 crc kubenswrapper[4898]: I0313 14:10:11.831762 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-krzxz" event={"ID":"d00b7135-a080-4f0e-a23b-237ab821410f","Type":"ContainerStarted","Data":"1ccd14bd98922ce946bdcfd118dac94fa47d6c35a30a343750095db805780f83"} Mar 13 14:10:15 crc kubenswrapper[4898]: I0313 14:10:15.871713 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-krzxz" event={"ID":"d00b7135-a080-4f0e-a23b-237ab821410f","Type":"ContainerStarted","Data":"ae5facfb37b80f172262f70a7db74c6be7337536065ad2951208a9ec507e08f9"} Mar 13 14:10:15 crc kubenswrapper[4898]: I0313 14:10:15.873023 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-fsdjh" event={"ID":"b267a865-1a03-4f37-9d2a-83380d30da1d","Type":"ContainerStarted","Data":"bdb365e9cc950e99a6090096237536f256ed40a6c8240745e58c48ae8456e404"} Mar 13 14:10:15 crc kubenswrapper[4898]: I0313 14:10:15.874698 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-cx9pw" 
event={"ID":"7c1fa9c0-bb2e-4806-95fd-07fba426bdc8","Type":"ContainerStarted","Data":"cd3ee2c415f806671eb3ae90112b21e024a77788831d4c74c6503a512654d1e6"} Mar 13 14:10:15 crc kubenswrapper[4898]: I0313 14:10:15.874839 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-cx9pw" Mar 13 14:10:15 crc kubenswrapper[4898]: I0313 14:10:15.909524 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-cx9pw" podStartSLOduration=2.310198539 podStartE2EDuration="5.909494223s" podCreationTimestamp="2026-03-13 14:10:10 +0000 UTC" firstStartedPulling="2026-03-13 14:10:11.546091143 +0000 UTC m=+846.547679392" lastFinishedPulling="2026-03-13 14:10:15.145386827 +0000 UTC m=+850.146975076" observedRunningTime="2026-03-13 14:10:15.907215304 +0000 UTC m=+850.908803563" watchObservedRunningTime="2026-03-13 14:10:15.909494223 +0000 UTC m=+850.911082462" Mar 13 14:10:15 crc kubenswrapper[4898]: I0313 14:10:15.913812 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-krzxz" podStartSLOduration=2.921365227 podStartE2EDuration="5.913797736s" podCreationTimestamp="2026-03-13 14:10:10 +0000 UTC" firstStartedPulling="2026-03-13 14:10:11.431264201 +0000 UTC m=+846.432852440" lastFinishedPulling="2026-03-13 14:10:14.4236967 +0000 UTC m=+849.425284949" observedRunningTime="2026-03-13 14:10:15.886302867 +0000 UTC m=+850.887891126" watchObservedRunningTime="2026-03-13 14:10:15.913797736 +0000 UTC m=+850.915385975" Mar 13 14:10:15 crc kubenswrapper[4898]: I0313 14:10:15.937290 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-fsdjh" podStartSLOduration=2.400737166 podStartE2EDuration="5.937273119s" podCreationTimestamp="2026-03-13 14:10:10 +0000 UTC" firstStartedPulling="2026-03-13 14:10:11.689299477 +0000 UTC m=+846.690887716" 
lastFinishedPulling="2026-03-13 14:10:15.22583543 +0000 UTC m=+850.227423669" observedRunningTime="2026-03-13 14:10:15.935652907 +0000 UTC m=+850.937241146" watchObservedRunningTime="2026-03-13 14:10:15.937273119 +0000 UTC m=+850.938861358" Mar 13 14:10:19 crc kubenswrapper[4898]: I0313 14:10:19.134565 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 14:10:19 crc kubenswrapper[4898]: I0313 14:10:19.134881 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 14:10:19 crc kubenswrapper[4898]: I0313 14:10:19.134939 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" Mar 13 14:10:19 crc kubenswrapper[4898]: I0313 14:10:19.135603 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5a348cbe99f8e01e53545f65e722853afafc6c3cafe54ec4136fd0f288299e87"} pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 14:10:19 crc kubenswrapper[4898]: I0313 14:10:19.135673 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" 
containerID="cri-o://5a348cbe99f8e01e53545f65e722853afafc6c3cafe54ec4136fd0f288299e87" gracePeriod=600 Mar 13 14:10:19 crc kubenswrapper[4898]: I0313 14:10:19.342750 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-nkt76" Mar 13 14:10:19 crc kubenswrapper[4898]: I0313 14:10:19.920918 4898 generic.go:334] "Generic (PLEG): container finished" podID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerID="5a348cbe99f8e01e53545f65e722853afafc6c3cafe54ec4136fd0f288299e87" exitCode=0 Mar 13 14:10:19 crc kubenswrapper[4898]: I0313 14:10:19.920968 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerDied","Data":"5a348cbe99f8e01e53545f65e722853afafc6c3cafe54ec4136fd0f288299e87"} Mar 13 14:10:19 crc kubenswrapper[4898]: I0313 14:10:19.921000 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerStarted","Data":"b58828da596890620679d1e69bfdfd0b7cd0cf06254eed4031e215964351d8c6"} Mar 13 14:10:19 crc kubenswrapper[4898]: I0313 14:10:19.921024 4898 scope.go:117] "RemoveContainer" containerID="87afe240e3b86dba51997a01c599db519fabde9560e41dee3b537bab350f3092" Mar 13 14:10:21 crc kubenswrapper[4898]: I0313 14:10:21.171164 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-cx9pw" Mar 13 14:10:32 crc kubenswrapper[4898]: I0313 14:10:32.382302 4898 scope.go:117] "RemoveContainer" containerID="59d89d033eeab55992b3ec00208c3b5cec577e8de3d6a36471e4a08df49334b0" Mar 13 14:10:42 crc kubenswrapper[4898]: I0313 14:10:42.842501 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d54qwls"] Mar 13 
14:10:42 crc kubenswrapper[4898]: I0313 14:10:42.844634 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d54qwls" Mar 13 14:10:42 crc kubenswrapper[4898]: I0313 14:10:42.848651 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 13 14:10:42 crc kubenswrapper[4898]: I0313 14:10:42.856007 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d54qwls"] Mar 13 14:10:42 crc kubenswrapper[4898]: I0313 14:10:42.900923 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/964a321b-4be6-444e-8c20-3fc586008da7-bundle\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d54qwls\" (UID: \"964a321b-4be6-444e-8c20-3fc586008da7\") " pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d54qwls" Mar 13 14:10:42 crc kubenswrapper[4898]: I0313 14:10:42.901203 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/964a321b-4be6-444e-8c20-3fc586008da7-util\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d54qwls\" (UID: \"964a321b-4be6-444e-8c20-3fc586008da7\") " pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d54qwls" Mar 13 14:10:42 crc kubenswrapper[4898]: I0313 14:10:42.901234 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkhfg\" (UniqueName: \"kubernetes.io/projected/964a321b-4be6-444e-8c20-3fc586008da7-kube-api-access-vkhfg\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d54qwls\" (UID: \"964a321b-4be6-444e-8c20-3fc586008da7\") " 
pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d54qwls" Mar 13 14:10:43 crc kubenswrapper[4898]: I0313 14:10:43.002085 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/964a321b-4be6-444e-8c20-3fc586008da7-bundle\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d54qwls\" (UID: \"964a321b-4be6-444e-8c20-3fc586008da7\") " pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d54qwls" Mar 13 14:10:43 crc kubenswrapper[4898]: I0313 14:10:43.002154 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/964a321b-4be6-444e-8c20-3fc586008da7-util\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d54qwls\" (UID: \"964a321b-4be6-444e-8c20-3fc586008da7\") " pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d54qwls" Mar 13 14:10:43 crc kubenswrapper[4898]: I0313 14:10:43.002189 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkhfg\" (UniqueName: \"kubernetes.io/projected/964a321b-4be6-444e-8c20-3fc586008da7-kube-api-access-vkhfg\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d54qwls\" (UID: \"964a321b-4be6-444e-8c20-3fc586008da7\") " pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d54qwls" Mar 13 14:10:43 crc kubenswrapper[4898]: I0313 14:10:43.002620 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/964a321b-4be6-444e-8c20-3fc586008da7-bundle\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d54qwls\" (UID: \"964a321b-4be6-444e-8c20-3fc586008da7\") " pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d54qwls" Mar 13 14:10:43 crc kubenswrapper[4898]: I0313 14:10:43.002738 4898 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/964a321b-4be6-444e-8c20-3fc586008da7-util\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d54qwls\" (UID: \"964a321b-4be6-444e-8c20-3fc586008da7\") " pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d54qwls" Mar 13 14:10:43 crc kubenswrapper[4898]: I0313 14:10:43.021553 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkhfg\" (UniqueName: \"kubernetes.io/projected/964a321b-4be6-444e-8c20-3fc586008da7-kube-api-access-vkhfg\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d54qwls\" (UID: \"964a321b-4be6-444e-8c20-3fc586008da7\") " pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d54qwls" Mar 13 14:10:43 crc kubenswrapper[4898]: I0313 14:10:43.170248 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d54qwls" Mar 13 14:10:43 crc kubenswrapper[4898]: I0313 14:10:43.252703 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cstz4q"] Mar 13 14:10:43 crc kubenswrapper[4898]: I0313 14:10:43.254189 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cstz4q" Mar 13 14:10:43 crc kubenswrapper[4898]: I0313 14:10:43.263413 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cstz4q"] Mar 13 14:10:43 crc kubenswrapper[4898]: I0313 14:10:43.307013 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kmk7\" (UniqueName: \"kubernetes.io/projected/e0f45cf5-8d8f-472b-87f5-64e5c8192622-kube-api-access-5kmk7\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cstz4q\" (UID: \"e0f45cf5-8d8f-472b-87f5-64e5c8192622\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cstz4q" Mar 13 14:10:43 crc kubenswrapper[4898]: I0313 14:10:43.307332 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e0f45cf5-8d8f-472b-87f5-64e5c8192622-util\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cstz4q\" (UID: \"e0f45cf5-8d8f-472b-87f5-64e5c8192622\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cstz4q" Mar 13 14:10:43 crc kubenswrapper[4898]: I0313 14:10:43.307524 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e0f45cf5-8d8f-472b-87f5-64e5c8192622-bundle\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cstz4q\" (UID: \"e0f45cf5-8d8f-472b-87f5-64e5c8192622\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cstz4q" Mar 13 14:10:43 crc kubenswrapper[4898]: I0313 14:10:43.408587 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/e0f45cf5-8d8f-472b-87f5-64e5c8192622-bundle\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cstz4q\" (UID: \"e0f45cf5-8d8f-472b-87f5-64e5c8192622\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cstz4q" Mar 13 14:10:43 crc kubenswrapper[4898]: I0313 14:10:43.408673 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kmk7\" (UniqueName: \"kubernetes.io/projected/e0f45cf5-8d8f-472b-87f5-64e5c8192622-kube-api-access-5kmk7\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cstz4q\" (UID: \"e0f45cf5-8d8f-472b-87f5-64e5c8192622\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cstz4q" Mar 13 14:10:43 crc kubenswrapper[4898]: I0313 14:10:43.408712 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e0f45cf5-8d8f-472b-87f5-64e5c8192622-util\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cstz4q\" (UID: \"e0f45cf5-8d8f-472b-87f5-64e5c8192622\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cstz4q" Mar 13 14:10:43 crc kubenswrapper[4898]: I0313 14:10:43.409148 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e0f45cf5-8d8f-472b-87f5-64e5c8192622-bundle\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cstz4q\" (UID: \"e0f45cf5-8d8f-472b-87f5-64e5c8192622\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cstz4q" Mar 13 14:10:43 crc kubenswrapper[4898]: I0313 14:10:43.409173 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e0f45cf5-8d8f-472b-87f5-64e5c8192622-util\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cstz4q\" (UID: 
\"e0f45cf5-8d8f-472b-87f5-64e5c8192622\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cstz4q" Mar 13 14:10:43 crc kubenswrapper[4898]: I0313 14:10:43.426595 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kmk7\" (UniqueName: \"kubernetes.io/projected/e0f45cf5-8d8f-472b-87f5-64e5c8192622-kube-api-access-5kmk7\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cstz4q\" (UID: \"e0f45cf5-8d8f-472b-87f5-64e5c8192622\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cstz4q" Mar 13 14:10:43 crc kubenswrapper[4898]: I0313 14:10:43.577587 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cstz4q" Mar 13 14:10:43 crc kubenswrapper[4898]: I0313 14:10:43.600622 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d54qwls"] Mar 13 14:10:43 crc kubenswrapper[4898]: W0313 14:10:43.603965 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod964a321b_4be6_444e_8c20_3fc586008da7.slice/crio-7e17dec2ad4d7267ad062316a6b30662c69ae0cc24b25bd895d7c515a6ce4cd7 WatchSource:0}: Error finding container 7e17dec2ad4d7267ad062316a6b30662c69ae0cc24b25bd895d7c515a6ce4cd7: Status 404 returned error can't find the container with id 7e17dec2ad4d7267ad062316a6b30662c69ae0cc24b25bd895d7c515a6ce4cd7 Mar 13 14:10:43 crc kubenswrapper[4898]: I0313 14:10:43.801175 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cstz4q"] Mar 13 14:10:44 crc kubenswrapper[4898]: I0313 14:10:44.111593 4898 generic.go:334] "Generic (PLEG): container finished" podID="964a321b-4be6-444e-8c20-3fc586008da7" 
containerID="f40a97a6c9bc482e22eb5f7a44f7d05df46612ccdb769024461402ccaac5bfc7" exitCode=0 Mar 13 14:10:44 crc kubenswrapper[4898]: I0313 14:10:44.111694 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d54qwls" event={"ID":"964a321b-4be6-444e-8c20-3fc586008da7","Type":"ContainerDied","Data":"f40a97a6c9bc482e22eb5f7a44f7d05df46612ccdb769024461402ccaac5bfc7"} Mar 13 14:10:44 crc kubenswrapper[4898]: I0313 14:10:44.111729 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d54qwls" event={"ID":"964a321b-4be6-444e-8c20-3fc586008da7","Type":"ContainerStarted","Data":"7e17dec2ad4d7267ad062316a6b30662c69ae0cc24b25bd895d7c515a6ce4cd7"} Mar 13 14:10:44 crc kubenswrapper[4898]: I0313 14:10:44.116214 4898 generic.go:334] "Generic (PLEG): container finished" podID="e0f45cf5-8d8f-472b-87f5-64e5c8192622" containerID="65369742e289128b731e475df8f6329c63c8ba84f236a7ee566b63b276731bf3" exitCode=0 Mar 13 14:10:44 crc kubenswrapper[4898]: I0313 14:10:44.116270 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cstz4q" event={"ID":"e0f45cf5-8d8f-472b-87f5-64e5c8192622","Type":"ContainerDied","Data":"65369742e289128b731e475df8f6329c63c8ba84f236a7ee566b63b276731bf3"} Mar 13 14:10:44 crc kubenswrapper[4898]: I0313 14:10:44.116306 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cstz4q" event={"ID":"e0f45cf5-8d8f-472b-87f5-64e5c8192622","Type":"ContainerStarted","Data":"5645baba009813ce27646233637e36e5d24b1130e7a1c6181af39f5864ede086"} Mar 13 14:10:46 crc kubenswrapper[4898]: I0313 14:10:46.132268 4898 generic.go:334] "Generic (PLEG): container finished" podID="e0f45cf5-8d8f-472b-87f5-64e5c8192622" 
containerID="36862814fe1f0cff6a24426793b563f39aa7810a1ced2cae484a35cfc03c21ba" exitCode=0
Mar 13 14:10:46 crc kubenswrapper[4898]: I0313 14:10:46.132826 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cstz4q" event={"ID":"e0f45cf5-8d8f-472b-87f5-64e5c8192622","Type":"ContainerDied","Data":"36862814fe1f0cff6a24426793b563f39aa7810a1ced2cae484a35cfc03c21ba"}
Mar 13 14:10:46 crc kubenswrapper[4898]: I0313 14:10:46.137176 4898 generic.go:334] "Generic (PLEG): container finished" podID="964a321b-4be6-444e-8c20-3fc586008da7" containerID="2be8f772924496c9e9df01ab97a8584fca928e05b67506e23db047d5b801b2d5" exitCode=0
Mar 13 14:10:46 crc kubenswrapper[4898]: I0313 14:10:46.137238 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d54qwls" event={"ID":"964a321b-4be6-444e-8c20-3fc586008da7","Type":"ContainerDied","Data":"2be8f772924496c9e9df01ab97a8584fca928e05b67506e23db047d5b801b2d5"}
Mar 13 14:10:46 crc kubenswrapper[4898]: I0313 14:10:46.598736 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-22tbp"]
Mar 13 14:10:46 crc kubenswrapper[4898]: I0313 14:10:46.601124 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-22tbp"
Mar 13 14:10:46 crc kubenswrapper[4898]: I0313 14:10:46.610395 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-22tbp"]
Mar 13 14:10:46 crc kubenswrapper[4898]: I0313 14:10:46.655934 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5de1b020-889c-4fb0-b067-cdeb543f0b64-utilities\") pod \"redhat-operators-22tbp\" (UID: \"5de1b020-889c-4fb0-b067-cdeb543f0b64\") " pod="openshift-marketplace/redhat-operators-22tbp"
Mar 13 14:10:46 crc kubenswrapper[4898]: I0313 14:10:46.656352 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tl22c\" (UniqueName: \"kubernetes.io/projected/5de1b020-889c-4fb0-b067-cdeb543f0b64-kube-api-access-tl22c\") pod \"redhat-operators-22tbp\" (UID: \"5de1b020-889c-4fb0-b067-cdeb543f0b64\") " pod="openshift-marketplace/redhat-operators-22tbp"
Mar 13 14:10:46 crc kubenswrapper[4898]: I0313 14:10:46.656511 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5de1b020-889c-4fb0-b067-cdeb543f0b64-catalog-content\") pod \"redhat-operators-22tbp\" (UID: \"5de1b020-889c-4fb0-b067-cdeb543f0b64\") " pod="openshift-marketplace/redhat-operators-22tbp"
Mar 13 14:10:46 crc kubenswrapper[4898]: I0313 14:10:46.757666 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tl22c\" (UniqueName: \"kubernetes.io/projected/5de1b020-889c-4fb0-b067-cdeb543f0b64-kube-api-access-tl22c\") pod \"redhat-operators-22tbp\" (UID: \"5de1b020-889c-4fb0-b067-cdeb543f0b64\") " pod="openshift-marketplace/redhat-operators-22tbp"
Mar 13 14:10:46 crc kubenswrapper[4898]: I0313 14:10:46.757744 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5de1b020-889c-4fb0-b067-cdeb543f0b64-catalog-content\") pod \"redhat-operators-22tbp\" (UID: \"5de1b020-889c-4fb0-b067-cdeb543f0b64\") " pod="openshift-marketplace/redhat-operators-22tbp"
Mar 13 14:10:46 crc kubenswrapper[4898]: I0313 14:10:46.757799 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5de1b020-889c-4fb0-b067-cdeb543f0b64-utilities\") pod \"redhat-operators-22tbp\" (UID: \"5de1b020-889c-4fb0-b067-cdeb543f0b64\") " pod="openshift-marketplace/redhat-operators-22tbp"
Mar 13 14:10:46 crc kubenswrapper[4898]: I0313 14:10:46.758231 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5de1b020-889c-4fb0-b067-cdeb543f0b64-utilities\") pod \"redhat-operators-22tbp\" (UID: \"5de1b020-889c-4fb0-b067-cdeb543f0b64\") " pod="openshift-marketplace/redhat-operators-22tbp"
Mar 13 14:10:46 crc kubenswrapper[4898]: I0313 14:10:46.758416 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5de1b020-889c-4fb0-b067-cdeb543f0b64-catalog-content\") pod \"redhat-operators-22tbp\" (UID: \"5de1b020-889c-4fb0-b067-cdeb543f0b64\") " pod="openshift-marketplace/redhat-operators-22tbp"
Mar 13 14:10:46 crc kubenswrapper[4898]: I0313 14:10:46.779463 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tl22c\" (UniqueName: \"kubernetes.io/projected/5de1b020-889c-4fb0-b067-cdeb543f0b64-kube-api-access-tl22c\") pod \"redhat-operators-22tbp\" (UID: \"5de1b020-889c-4fb0-b067-cdeb543f0b64\") " pod="openshift-marketplace/redhat-operators-22tbp"
Mar 13 14:10:47 crc kubenswrapper[4898]: I0313 14:10:47.003178 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-22tbp"
Mar 13 14:10:47 crc kubenswrapper[4898]: I0313 14:10:47.145304 4898 generic.go:334] "Generic (PLEG): container finished" podID="964a321b-4be6-444e-8c20-3fc586008da7" containerID="46895da33872cbf4500006c61e182af881c3f4ab68348bff7bd9f862e9008de2" exitCode=0
Mar 13 14:10:47 crc kubenswrapper[4898]: I0313 14:10:47.145369 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d54qwls" event={"ID":"964a321b-4be6-444e-8c20-3fc586008da7","Type":"ContainerDied","Data":"46895da33872cbf4500006c61e182af881c3f4ab68348bff7bd9f862e9008de2"}
Mar 13 14:10:47 crc kubenswrapper[4898]: I0313 14:10:47.148941 4898 generic.go:334] "Generic (PLEG): container finished" podID="e0f45cf5-8d8f-472b-87f5-64e5c8192622" containerID="8afda24d2b140242bb216f07c14a1b8110a2b19a062c69dd7a28423dd8517e0f" exitCode=0
Mar 13 14:10:47 crc kubenswrapper[4898]: I0313 14:10:47.149180 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cstz4q" event={"ID":"e0f45cf5-8d8f-472b-87f5-64e5c8192622","Type":"ContainerDied","Data":"8afda24d2b140242bb216f07c14a1b8110a2b19a062c69dd7a28423dd8517e0f"}
Mar 13 14:10:47 crc kubenswrapper[4898]: I0313 14:10:47.276837 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-22tbp"]
Mar 13 14:10:47 crc kubenswrapper[4898]: W0313 14:10:47.281495 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5de1b020_889c_4fb0_b067_cdeb543f0b64.slice/crio-249664907f9e2e0549b46b4ce9e51616428d129694559809cf7f3f144f337489 WatchSource:0}: Error finding container 249664907f9e2e0549b46b4ce9e51616428d129694559809cf7f3f144f337489: Status 404 returned error can't find the container with id 249664907f9e2e0549b46b4ce9e51616428d129694559809cf7f3f144f337489
Mar 13 14:10:48 crc kubenswrapper[4898]: I0313 14:10:48.155676 4898 generic.go:334] "Generic (PLEG): container finished" podID="5de1b020-889c-4fb0-b067-cdeb543f0b64" containerID="ffa5f5636f6ad5a5d8c0a7e72c6f19e3edb102f03e8614a36057cf6da4b4aead" exitCode=0
Mar 13 14:10:48 crc kubenswrapper[4898]: I0313 14:10:48.156650 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-22tbp" event={"ID":"5de1b020-889c-4fb0-b067-cdeb543f0b64","Type":"ContainerDied","Data":"ffa5f5636f6ad5a5d8c0a7e72c6f19e3edb102f03e8614a36057cf6da4b4aead"}
Mar 13 14:10:48 crc kubenswrapper[4898]: I0313 14:10:48.156673 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-22tbp" event={"ID":"5de1b020-889c-4fb0-b067-cdeb543f0b64","Type":"ContainerStarted","Data":"249664907f9e2e0549b46b4ce9e51616428d129694559809cf7f3f144f337489"}
Mar 13 14:10:48 crc kubenswrapper[4898]: I0313 14:10:48.467576 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cstz4q"
Mar 13 14:10:48 crc kubenswrapper[4898]: I0313 14:10:48.472733 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d54qwls"
Mar 13 14:10:48 crc kubenswrapper[4898]: I0313 14:10:48.486186 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e0f45cf5-8d8f-472b-87f5-64e5c8192622-util\") pod \"e0f45cf5-8d8f-472b-87f5-64e5c8192622\" (UID: \"e0f45cf5-8d8f-472b-87f5-64e5c8192622\") "
Mar 13 14:10:48 crc kubenswrapper[4898]: I0313 14:10:48.486243 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/964a321b-4be6-444e-8c20-3fc586008da7-util\") pod \"964a321b-4be6-444e-8c20-3fc586008da7\" (UID: \"964a321b-4be6-444e-8c20-3fc586008da7\") "
Mar 13 14:10:48 crc kubenswrapper[4898]: I0313 14:10:48.486289 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5kmk7\" (UniqueName: \"kubernetes.io/projected/e0f45cf5-8d8f-472b-87f5-64e5c8192622-kube-api-access-5kmk7\") pod \"e0f45cf5-8d8f-472b-87f5-64e5c8192622\" (UID: \"e0f45cf5-8d8f-472b-87f5-64e5c8192622\") "
Mar 13 14:10:48 crc kubenswrapper[4898]: I0313 14:10:48.486355 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/964a321b-4be6-444e-8c20-3fc586008da7-bundle\") pod \"964a321b-4be6-444e-8c20-3fc586008da7\" (UID: \"964a321b-4be6-444e-8c20-3fc586008da7\") "
Mar 13 14:10:48 crc kubenswrapper[4898]: I0313 14:10:48.486387 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vkhfg\" (UniqueName: \"kubernetes.io/projected/964a321b-4be6-444e-8c20-3fc586008da7-kube-api-access-vkhfg\") pod \"964a321b-4be6-444e-8c20-3fc586008da7\" (UID: \"964a321b-4be6-444e-8c20-3fc586008da7\") "
Mar 13 14:10:48 crc kubenswrapper[4898]: I0313 14:10:48.486409 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e0f45cf5-8d8f-472b-87f5-64e5c8192622-bundle\") pod \"e0f45cf5-8d8f-472b-87f5-64e5c8192622\" (UID: \"e0f45cf5-8d8f-472b-87f5-64e5c8192622\") "
Mar 13 14:10:48 crc kubenswrapper[4898]: I0313 14:10:48.487337 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/964a321b-4be6-444e-8c20-3fc586008da7-bundle" (OuterVolumeSpecName: "bundle") pod "964a321b-4be6-444e-8c20-3fc586008da7" (UID: "964a321b-4be6-444e-8c20-3fc586008da7"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 14:10:48 crc kubenswrapper[4898]: I0313 14:10:48.488558 4898 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/964a321b-4be6-444e-8c20-3fc586008da7-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 14:10:48 crc kubenswrapper[4898]: I0313 14:10:48.489215 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0f45cf5-8d8f-472b-87f5-64e5c8192622-bundle" (OuterVolumeSpecName: "bundle") pod "e0f45cf5-8d8f-472b-87f5-64e5c8192622" (UID: "e0f45cf5-8d8f-472b-87f5-64e5c8192622"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 14:10:48 crc kubenswrapper[4898]: I0313 14:10:48.495359 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0f45cf5-8d8f-472b-87f5-64e5c8192622-kube-api-access-5kmk7" (OuterVolumeSpecName: "kube-api-access-5kmk7") pod "e0f45cf5-8d8f-472b-87f5-64e5c8192622" (UID: "e0f45cf5-8d8f-472b-87f5-64e5c8192622"). InnerVolumeSpecName "kube-api-access-5kmk7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:10:48 crc kubenswrapper[4898]: I0313 14:10:48.502445 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/964a321b-4be6-444e-8c20-3fc586008da7-util" (OuterVolumeSpecName: "util") pod "964a321b-4be6-444e-8c20-3fc586008da7" (UID: "964a321b-4be6-444e-8c20-3fc586008da7"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 14:10:48 crc kubenswrapper[4898]: I0313 14:10:48.506265 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0f45cf5-8d8f-472b-87f5-64e5c8192622-util" (OuterVolumeSpecName: "util") pod "e0f45cf5-8d8f-472b-87f5-64e5c8192622" (UID: "e0f45cf5-8d8f-472b-87f5-64e5c8192622"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 14:10:48 crc kubenswrapper[4898]: I0313 14:10:48.542163 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/964a321b-4be6-444e-8c20-3fc586008da7-kube-api-access-vkhfg" (OuterVolumeSpecName: "kube-api-access-vkhfg") pod "964a321b-4be6-444e-8c20-3fc586008da7" (UID: "964a321b-4be6-444e-8c20-3fc586008da7"). InnerVolumeSpecName "kube-api-access-vkhfg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:10:48 crc kubenswrapper[4898]: I0313 14:10:48.590433 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vkhfg\" (UniqueName: \"kubernetes.io/projected/964a321b-4be6-444e-8c20-3fc586008da7-kube-api-access-vkhfg\") on node \"crc\" DevicePath \"\""
Mar 13 14:10:48 crc kubenswrapper[4898]: I0313 14:10:48.590680 4898 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e0f45cf5-8d8f-472b-87f5-64e5c8192622-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 14:10:48 crc kubenswrapper[4898]: I0313 14:10:48.590743 4898 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e0f45cf5-8d8f-472b-87f5-64e5c8192622-util\") on node \"crc\" DevicePath \"\""
Mar 13 14:10:48 crc kubenswrapper[4898]: I0313 14:10:48.590801 4898 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/964a321b-4be6-444e-8c20-3fc586008da7-util\") on node \"crc\" DevicePath \"\""
Mar 13 14:10:48 crc kubenswrapper[4898]: I0313 14:10:48.590857 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5kmk7\" (UniqueName: \"kubernetes.io/projected/e0f45cf5-8d8f-472b-87f5-64e5c8192622-kube-api-access-5kmk7\") on node \"crc\" DevicePath \"\""
Mar 13 14:10:49 crc kubenswrapper[4898]: I0313 14:10:49.165972 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d54qwls"
Mar 13 14:10:49 crc kubenswrapper[4898]: I0313 14:10:49.165993 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d54qwls" event={"ID":"964a321b-4be6-444e-8c20-3fc586008da7","Type":"ContainerDied","Data":"7e17dec2ad4d7267ad062316a6b30662c69ae0cc24b25bd895d7c515a6ce4cd7"}
Mar 13 14:10:49 crc kubenswrapper[4898]: I0313 14:10:49.166046 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e17dec2ad4d7267ad062316a6b30662c69ae0cc24b25bd895d7c515a6ce4cd7"
Mar 13 14:10:49 crc kubenswrapper[4898]: I0313 14:10:49.168097 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cstz4q" event={"ID":"e0f45cf5-8d8f-472b-87f5-64e5c8192622","Type":"ContainerDied","Data":"5645baba009813ce27646233637e36e5d24b1130e7a1c6181af39f5864ede086"}
Mar 13 14:10:49 crc kubenswrapper[4898]: I0313 14:10:49.168131 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5645baba009813ce27646233637e36e5d24b1130e7a1c6181af39f5864ede086"
Mar 13 14:10:49 crc kubenswrapper[4898]: I0313 14:10:49.168211 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cstz4q"
Mar 13 14:10:50 crc kubenswrapper[4898]: I0313 14:10:50.175834 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-22tbp" event={"ID":"5de1b020-889c-4fb0-b067-cdeb543f0b64","Type":"ContainerStarted","Data":"8a662e914e1ef5840c7d25299683174356269f87fbe4400915a9153e947174c7"}
Mar 13 14:10:51 crc kubenswrapper[4898]: I0313 14:10:51.197282 4898 generic.go:334] "Generic (PLEG): container finished" podID="5de1b020-889c-4fb0-b067-cdeb543f0b64" containerID="8a662e914e1ef5840c7d25299683174356269f87fbe4400915a9153e947174c7" exitCode=0
Mar 13 14:10:51 crc kubenswrapper[4898]: I0313 14:10:51.197350 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-22tbp" event={"ID":"5de1b020-889c-4fb0-b067-cdeb543f0b64","Type":"ContainerDied","Data":"8a662e914e1ef5840c7d25299683174356269f87fbe4400915a9153e947174c7"}
Mar 13 14:10:53 crc kubenswrapper[4898]: I0313 14:10:53.101755 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/cluster-logging-operator-66689c4bbf-bcn59"]
Mar 13 14:10:53 crc kubenswrapper[4898]: E0313 14:10:53.102602 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="964a321b-4be6-444e-8c20-3fc586008da7" containerName="util"
Mar 13 14:10:53 crc kubenswrapper[4898]: I0313 14:10:53.102618 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="964a321b-4be6-444e-8c20-3fc586008da7" containerName="util"
Mar 13 14:10:53 crc kubenswrapper[4898]: E0313 14:10:53.102631 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="964a321b-4be6-444e-8c20-3fc586008da7" containerName="extract"
Mar 13 14:10:53 crc kubenswrapper[4898]: I0313 14:10:53.102639 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="964a321b-4be6-444e-8c20-3fc586008da7" containerName="extract"
Mar 13 14:10:53 crc kubenswrapper[4898]: E0313 14:10:53.102653 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0f45cf5-8d8f-472b-87f5-64e5c8192622" containerName="extract"
Mar 13 14:10:53 crc kubenswrapper[4898]: I0313 14:10:53.102661 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0f45cf5-8d8f-472b-87f5-64e5c8192622" containerName="extract"
Mar 13 14:10:53 crc kubenswrapper[4898]: E0313 14:10:53.102678 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="964a321b-4be6-444e-8c20-3fc586008da7" containerName="pull"
Mar 13 14:10:53 crc kubenswrapper[4898]: I0313 14:10:53.102685 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="964a321b-4be6-444e-8c20-3fc586008da7" containerName="pull"
Mar 13 14:10:53 crc kubenswrapper[4898]: E0313 14:10:53.102695 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0f45cf5-8d8f-472b-87f5-64e5c8192622" containerName="util"
Mar 13 14:10:53 crc kubenswrapper[4898]: I0313 14:10:53.102703 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0f45cf5-8d8f-472b-87f5-64e5c8192622" containerName="util"
Mar 13 14:10:53 crc kubenswrapper[4898]: E0313 14:10:53.102716 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0f45cf5-8d8f-472b-87f5-64e5c8192622" containerName="pull"
Mar 13 14:10:53 crc kubenswrapper[4898]: I0313 14:10:53.102725 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0f45cf5-8d8f-472b-87f5-64e5c8192622" containerName="pull"
Mar 13 14:10:53 crc kubenswrapper[4898]: I0313 14:10:53.102875 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="964a321b-4be6-444e-8c20-3fc586008da7" containerName="extract"
Mar 13 14:10:53 crc kubenswrapper[4898]: I0313 14:10:53.102913 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0f45cf5-8d8f-472b-87f5-64e5c8192622" containerName="extract"
Mar 13 14:10:53 crc kubenswrapper[4898]: I0313 14:10:53.103460 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/cluster-logging-operator-66689c4bbf-bcn59"
Mar 13 14:10:53 crc kubenswrapper[4898]: I0313 14:10:53.105177 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"cluster-logging-operator-dockercfg-t77xd"
Mar 13 14:10:53 crc kubenswrapper[4898]: I0313 14:10:53.105785 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"openshift-service-ca.crt"
Mar 13 14:10:53 crc kubenswrapper[4898]: I0313 14:10:53.107607 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"kube-root-ca.crt"
Mar 13 14:10:53 crc kubenswrapper[4898]: I0313 14:10:53.122275 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/cluster-logging-operator-66689c4bbf-bcn59"]
Mar 13 14:10:53 crc kubenswrapper[4898]: I0313 14:10:53.160551 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-559mq\" (UniqueName: \"kubernetes.io/projected/c5cfd1be-ede5-4678-99c5-17f232b97d81-kube-api-access-559mq\") pod \"cluster-logging-operator-66689c4bbf-bcn59\" (UID: \"c5cfd1be-ede5-4678-99c5-17f232b97d81\") " pod="openshift-logging/cluster-logging-operator-66689c4bbf-bcn59"
Mar 13 14:10:53 crc kubenswrapper[4898]: I0313 14:10:53.214884 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-22tbp" event={"ID":"5de1b020-889c-4fb0-b067-cdeb543f0b64","Type":"ContainerStarted","Data":"4615e570e10fb8e73d204f621789c7f2db42ba41d4ea79361fccac7790ba3df4"}
Mar 13 14:10:53 crc kubenswrapper[4898]: I0313 14:10:53.246262 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-22tbp" podStartSLOduration=3.443707631 podStartE2EDuration="7.246239269s" podCreationTimestamp="2026-03-13 14:10:46 +0000 UTC" firstStartedPulling="2026-03-13 14:10:48.157726282 +0000 UTC m=+883.159314521" lastFinishedPulling="2026-03-13 14:10:51.96025791 +0000 UTC m=+886.961846159" observedRunningTime="2026-03-13 14:10:53.241828564 +0000 UTC m=+888.243416823" watchObservedRunningTime="2026-03-13 14:10:53.246239269 +0000 UTC m=+888.247827508"
Mar 13 14:10:53 crc kubenswrapper[4898]: I0313 14:10:53.261328 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-559mq\" (UniqueName: \"kubernetes.io/projected/c5cfd1be-ede5-4678-99c5-17f232b97d81-kube-api-access-559mq\") pod \"cluster-logging-operator-66689c4bbf-bcn59\" (UID: \"c5cfd1be-ede5-4678-99c5-17f232b97d81\") " pod="openshift-logging/cluster-logging-operator-66689c4bbf-bcn59"
Mar 13 14:10:53 crc kubenswrapper[4898]: I0313 14:10:53.279427 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-559mq\" (UniqueName: \"kubernetes.io/projected/c5cfd1be-ede5-4678-99c5-17f232b97d81-kube-api-access-559mq\") pod \"cluster-logging-operator-66689c4bbf-bcn59\" (UID: \"c5cfd1be-ede5-4678-99c5-17f232b97d81\") " pod="openshift-logging/cluster-logging-operator-66689c4bbf-bcn59"
Mar 13 14:10:53 crc kubenswrapper[4898]: I0313 14:10:53.426726 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/cluster-logging-operator-66689c4bbf-bcn59"
Mar 13 14:10:53 crc kubenswrapper[4898]: I0313 14:10:53.830499 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/cluster-logging-operator-66689c4bbf-bcn59"]
Mar 13 14:10:53 crc kubenswrapper[4898]: W0313 14:10:53.835490 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc5cfd1be_ede5_4678_99c5_17f232b97d81.slice/crio-e11b7a347d9bfe96231e8aa73f69aa352ae98df446aa55c531163226ed42dfca WatchSource:0}: Error finding container e11b7a347d9bfe96231e8aa73f69aa352ae98df446aa55c531163226ed42dfca: Status 404 returned error can't find the container with id e11b7a347d9bfe96231e8aa73f69aa352ae98df446aa55c531163226ed42dfca
Mar 13 14:10:54 crc kubenswrapper[4898]: I0313 14:10:54.221627 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/cluster-logging-operator-66689c4bbf-bcn59" event={"ID":"c5cfd1be-ede5-4678-99c5-17f232b97d81","Type":"ContainerStarted","Data":"e11b7a347d9bfe96231e8aa73f69aa352ae98df446aa55c531163226ed42dfca"}
Mar 13 14:10:57 crc kubenswrapper[4898]: I0313 14:10:57.003527 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-22tbp"
Mar 13 14:10:57 crc kubenswrapper[4898]: I0313 14:10:57.004036 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-22tbp"
Mar 13 14:10:58 crc kubenswrapper[4898]: I0313 14:10:58.047165 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-22tbp" podUID="5de1b020-889c-4fb0-b067-cdeb543f0b64" containerName="registry-server" probeResult="failure" output=<
Mar 13 14:10:58 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s
Mar 13 14:10:58 crc kubenswrapper[4898]: >
Mar 13 14:11:02 crc kubenswrapper[4898]: I0313 14:11:02.788147 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-5fb555ff84-j52b8"]
Mar 13 14:11:02 crc kubenswrapper[4898]: I0313 14:11:02.815287 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-5fb555ff84-j52b8"
Mar 13 14:11:02 crc kubenswrapper[4898]: I0313 14:11:02.820304 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-dockercfg-45rjv"
Mar 13 14:11:02 crc kubenswrapper[4898]: I0313 14:11:02.820485 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"loki-operator-manager-config"
Mar 13 14:11:02 crc kubenswrapper[4898]: I0313 14:11:02.820588 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"openshift-service-ca.crt"
Mar 13 14:11:02 crc kubenswrapper[4898]: I0313 14:11:02.820748 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-service-cert"
Mar 13 14:11:02 crc kubenswrapper[4898]: I0313 14:11:02.820856 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-metrics"
Mar 13 14:11:02 crc kubenswrapper[4898]: I0313 14:11:02.820951 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"kube-root-ca.crt"
Mar 13 14:11:02 crc kubenswrapper[4898]: I0313 14:11:02.827680 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-5fb555ff84-j52b8"]
Mar 13 14:11:02 crc kubenswrapper[4898]: I0313 14:11:02.927234 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/2cd05b5b-32da-4560-a761-72221b99e2c6-manager-config\") pod \"loki-operator-controller-manager-5fb555ff84-j52b8\" (UID: \"2cd05b5b-32da-4560-a761-72221b99e2c6\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5fb555ff84-j52b8"
Mar 13 14:11:02 crc kubenswrapper[4898]: I0313 14:11:02.927304 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2cd05b5b-32da-4560-a761-72221b99e2c6-apiservice-cert\") pod \"loki-operator-controller-manager-5fb555ff84-j52b8\" (UID: \"2cd05b5b-32da-4560-a761-72221b99e2c6\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5fb555ff84-j52b8"
Mar 13 14:11:02 crc kubenswrapper[4898]: I0313 14:11:02.927335 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvklx\" (UniqueName: \"kubernetes.io/projected/2cd05b5b-32da-4560-a761-72221b99e2c6-kube-api-access-vvklx\") pod \"loki-operator-controller-manager-5fb555ff84-j52b8\" (UID: \"2cd05b5b-32da-4560-a761-72221b99e2c6\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5fb555ff84-j52b8"
Mar 13 14:11:02 crc kubenswrapper[4898]: I0313 14:11:02.927377 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2cd05b5b-32da-4560-a761-72221b99e2c6-webhook-cert\") pod \"loki-operator-controller-manager-5fb555ff84-j52b8\" (UID: \"2cd05b5b-32da-4560-a761-72221b99e2c6\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5fb555ff84-j52b8"
Mar 13 14:11:02 crc kubenswrapper[4898]: I0313 14:11:02.927422 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2cd05b5b-32da-4560-a761-72221b99e2c6-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-5fb555ff84-j52b8\" (UID: \"2cd05b5b-32da-4560-a761-72221b99e2c6\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5fb555ff84-j52b8"
Mar 13 14:11:03 crc kubenswrapper[4898]: I0313 14:11:03.028586 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/2cd05b5b-32da-4560-a761-72221b99e2c6-manager-config\") pod \"loki-operator-controller-manager-5fb555ff84-j52b8\" (UID: \"2cd05b5b-32da-4560-a761-72221b99e2c6\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5fb555ff84-j52b8"
Mar 13 14:11:03 crc kubenswrapper[4898]: I0313 14:11:03.028649 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2cd05b5b-32da-4560-a761-72221b99e2c6-apiservice-cert\") pod \"loki-operator-controller-manager-5fb555ff84-j52b8\" (UID: \"2cd05b5b-32da-4560-a761-72221b99e2c6\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5fb555ff84-j52b8"
Mar 13 14:11:03 crc kubenswrapper[4898]: I0313 14:11:03.028679 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvklx\" (UniqueName: \"kubernetes.io/projected/2cd05b5b-32da-4560-a761-72221b99e2c6-kube-api-access-vvklx\") pod \"loki-operator-controller-manager-5fb555ff84-j52b8\" (UID: \"2cd05b5b-32da-4560-a761-72221b99e2c6\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5fb555ff84-j52b8"
Mar 13 14:11:03 crc kubenswrapper[4898]: I0313 14:11:03.028727 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2cd05b5b-32da-4560-a761-72221b99e2c6-webhook-cert\") pod \"loki-operator-controller-manager-5fb555ff84-j52b8\" (UID: \"2cd05b5b-32da-4560-a761-72221b99e2c6\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5fb555ff84-j52b8"
Mar 13 14:11:03 crc kubenswrapper[4898]: I0313 14:11:03.028772 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2cd05b5b-32da-4560-a761-72221b99e2c6-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-5fb555ff84-j52b8\" (UID: \"2cd05b5b-32da-4560-a761-72221b99e2c6\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5fb555ff84-j52b8"
Mar 13 14:11:03 crc kubenswrapper[4898]: I0313 14:11:03.029849 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/2cd05b5b-32da-4560-a761-72221b99e2c6-manager-config\") pod \"loki-operator-controller-manager-5fb555ff84-j52b8\" (UID: \"2cd05b5b-32da-4560-a761-72221b99e2c6\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5fb555ff84-j52b8"
Mar 13 14:11:03 crc kubenswrapper[4898]: I0313 14:11:03.035161 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2cd05b5b-32da-4560-a761-72221b99e2c6-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-5fb555ff84-j52b8\" (UID: \"2cd05b5b-32da-4560-a761-72221b99e2c6\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5fb555ff84-j52b8"
Mar 13 14:11:03 crc kubenswrapper[4898]: I0313 14:11:03.035962 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2cd05b5b-32da-4560-a761-72221b99e2c6-webhook-cert\") pod \"loki-operator-controller-manager-5fb555ff84-j52b8\" (UID: \"2cd05b5b-32da-4560-a761-72221b99e2c6\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5fb555ff84-j52b8"
Mar 13 14:11:03 crc kubenswrapper[4898]: I0313 14:11:03.036415 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2cd05b5b-32da-4560-a761-72221b99e2c6-apiservice-cert\") pod \"loki-operator-controller-manager-5fb555ff84-j52b8\" (UID: \"2cd05b5b-32da-4560-a761-72221b99e2c6\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5fb555ff84-j52b8"
Mar 13 14:11:03 crc kubenswrapper[4898]: I0313 14:11:03.061547 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvklx\" (UniqueName: \"kubernetes.io/projected/2cd05b5b-32da-4560-a761-72221b99e2c6-kube-api-access-vvklx\") pod \"loki-operator-controller-manager-5fb555ff84-j52b8\" (UID: \"2cd05b5b-32da-4560-a761-72221b99e2c6\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5fb555ff84-j52b8"
Mar 13 14:11:03 crc kubenswrapper[4898]: I0313 14:11:03.133790 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-5fb555ff84-j52b8"
Mar 13 14:11:03 crc kubenswrapper[4898]: I0313 14:11:03.592655 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-5fb555ff84-j52b8"]
Mar 13 14:11:03 crc kubenswrapper[4898]: W0313 14:11:03.612206 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2cd05b5b_32da_4560_a761_72221b99e2c6.slice/crio-6b2feeaff613f0c9ee548ef7838db98cfa828fbf888348fd720a846c579b54e0 WatchSource:0}: Error finding container 6b2feeaff613f0c9ee548ef7838db98cfa828fbf888348fd720a846c579b54e0: Status 404 returned error can't find the container with id 6b2feeaff613f0c9ee548ef7838db98cfa828fbf888348fd720a846c579b54e0
Mar 13 14:11:04 crc kubenswrapper[4898]: I0313 14:11:04.300273 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-5fb555ff84-j52b8" event={"ID":"2cd05b5b-32da-4560-a761-72221b99e2c6","Type":"ContainerStarted","Data":"6b2feeaff613f0c9ee548ef7838db98cfa828fbf888348fd720a846c579b54e0"}
Mar 13 14:11:07 crc kubenswrapper[4898]: I0313 14:11:07.053950 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-22tbp"
Mar 13 14:11:07 crc kubenswrapper[4898]: I0313 14:11:07.099813 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-22tbp"
Mar 13 14:11:09 crc kubenswrapper[4898]: I0313 14:11:09.386821 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-22tbp"]
Mar 13 14:11:09 crc kubenswrapper[4898]: I0313 14:11:09.387293 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-22tbp" podUID="5de1b020-889c-4fb0-b067-cdeb543f0b64" containerName="registry-server" containerID="cri-o://4615e570e10fb8e73d204f621789c7f2db42ba41d4ea79361fccac7790ba3df4" gracePeriod=2
Mar 13 14:11:09 crc kubenswrapper[4898]: I0313 14:11:09.855008 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-22tbp"
Mar 13 14:11:09 crc kubenswrapper[4898]: I0313 14:11:09.931760 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5de1b020-889c-4fb0-b067-cdeb543f0b64-utilities\") pod \"5de1b020-889c-4fb0-b067-cdeb543f0b64\" (UID: \"5de1b020-889c-4fb0-b067-cdeb543f0b64\") "
Mar 13 14:11:09 crc kubenswrapper[4898]: I0313 14:11:09.931941 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tl22c\" (UniqueName: \"kubernetes.io/projected/5de1b020-889c-4fb0-b067-cdeb543f0b64-kube-api-access-tl22c\") pod \"5de1b020-889c-4fb0-b067-cdeb543f0b64\" (UID: \"5de1b020-889c-4fb0-b067-cdeb543f0b64\") "
Mar 13 14:11:09 crc kubenswrapper[4898]: I0313 14:11:09.932015 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5de1b020-889c-4fb0-b067-cdeb543f0b64-catalog-content\") pod \"5de1b020-889c-4fb0-b067-cdeb543f0b64\" (UID: \"5de1b020-889c-4fb0-b067-cdeb543f0b64\") "
Mar 13 14:11:09 crc kubenswrapper[4898]: I0313 14:11:09.932718 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5de1b020-889c-4fb0-b067-cdeb543f0b64-utilities" (OuterVolumeSpecName: "utilities") pod "5de1b020-889c-4fb0-b067-cdeb543f0b64" (UID: "5de1b020-889c-4fb0-b067-cdeb543f0b64"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 14:11:09 crc kubenswrapper[4898]: I0313 14:11:09.945754 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5de1b020-889c-4fb0-b067-cdeb543f0b64-kube-api-access-tl22c" (OuterVolumeSpecName: "kube-api-access-tl22c") pod "5de1b020-889c-4fb0-b067-cdeb543f0b64" (UID: "5de1b020-889c-4fb0-b067-cdeb543f0b64"). InnerVolumeSpecName "kube-api-access-tl22c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:11:10 crc kubenswrapper[4898]: I0313 14:11:10.034160 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tl22c\" (UniqueName: \"kubernetes.io/projected/5de1b020-889c-4fb0-b067-cdeb543f0b64-kube-api-access-tl22c\") on node \"crc\" DevicePath \"\""
Mar 13 14:11:10 crc kubenswrapper[4898]: I0313 14:11:10.034198 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5de1b020-889c-4fb0-b067-cdeb543f0b64-utilities\") on node \"crc\" DevicePath \"\""
Mar 13 14:11:10 crc kubenswrapper[4898]: I0313 14:11:10.089747 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5de1b020-889c-4fb0-b067-cdeb543f0b64-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5de1b020-889c-4fb0-b067-cdeb543f0b64" (UID: "5de1b020-889c-4fb0-b067-cdeb543f0b64"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 14:11:10 crc kubenswrapper[4898]: I0313 14:11:10.134932 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5de1b020-889c-4fb0-b067-cdeb543f0b64-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 13 14:11:10 crc kubenswrapper[4898]: I0313 14:11:10.351445 4898 generic.go:334] "Generic (PLEG): container finished" podID="5de1b020-889c-4fb0-b067-cdeb543f0b64" containerID="4615e570e10fb8e73d204f621789c7f2db42ba41d4ea79361fccac7790ba3df4" exitCode=0
Mar 13 14:11:10 crc kubenswrapper[4898]: I0313 14:11:10.351523 4898 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/redhat-operators-22tbp" Mar 13 14:11:10 crc kubenswrapper[4898]: I0313 14:11:10.351510 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-22tbp" event={"ID":"5de1b020-889c-4fb0-b067-cdeb543f0b64","Type":"ContainerDied","Data":"4615e570e10fb8e73d204f621789c7f2db42ba41d4ea79361fccac7790ba3df4"} Mar 13 14:11:10 crc kubenswrapper[4898]: I0313 14:11:10.351961 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-22tbp" event={"ID":"5de1b020-889c-4fb0-b067-cdeb543f0b64","Type":"ContainerDied","Data":"249664907f9e2e0549b46b4ce9e51616428d129694559809cf7f3f144f337489"} Mar 13 14:11:10 crc kubenswrapper[4898]: I0313 14:11:10.351988 4898 scope.go:117] "RemoveContainer" containerID="4615e570e10fb8e73d204f621789c7f2db42ba41d4ea79361fccac7790ba3df4" Mar 13 14:11:10 crc kubenswrapper[4898]: I0313 14:11:10.358472 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/cluster-logging-operator-66689c4bbf-bcn59" event={"ID":"c5cfd1be-ede5-4678-99c5-17f232b97d81","Type":"ContainerStarted","Data":"1c9aab7f14f3dfc6bc9ae40ad727b38c16bd058db7a4f49a41298a530cfaec69"} Mar 13 14:11:10 crc kubenswrapper[4898]: I0313 14:11:10.390010 4898 scope.go:117] "RemoveContainer" containerID="8a662e914e1ef5840c7d25299683174356269f87fbe4400915a9153e947174c7" Mar 13 14:11:10 crc kubenswrapper[4898]: I0313 14:11:10.418785 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/cluster-logging-operator-66689c4bbf-bcn59" podStartSLOduration=1.766979964 podStartE2EDuration="17.418764282s" podCreationTimestamp="2026-03-13 14:10:53 +0000 UTC" firstStartedPulling="2026-03-13 14:10:53.839092447 +0000 UTC m=+888.840680686" lastFinishedPulling="2026-03-13 14:11:09.490876765 +0000 UTC m=+904.492465004" observedRunningTime="2026-03-13 14:11:10.396333796 +0000 UTC m=+905.397922045" 
watchObservedRunningTime="2026-03-13 14:11:10.418764282 +0000 UTC m=+905.420352541" Mar 13 14:11:10 crc kubenswrapper[4898]: I0313 14:11:10.422256 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-22tbp"] Mar 13 14:11:10 crc kubenswrapper[4898]: I0313 14:11:10.440715 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-22tbp"] Mar 13 14:11:10 crc kubenswrapper[4898]: I0313 14:11:10.453218 4898 scope.go:117] "RemoveContainer" containerID="ffa5f5636f6ad5a5d8c0a7e72c6f19e3edb102f03e8614a36057cf6da4b4aead" Mar 13 14:11:10 crc kubenswrapper[4898]: I0313 14:11:10.507182 4898 scope.go:117] "RemoveContainer" containerID="4615e570e10fb8e73d204f621789c7f2db42ba41d4ea79361fccac7790ba3df4" Mar 13 14:11:10 crc kubenswrapper[4898]: E0313 14:11:10.510806 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4615e570e10fb8e73d204f621789c7f2db42ba41d4ea79361fccac7790ba3df4\": container with ID starting with 4615e570e10fb8e73d204f621789c7f2db42ba41d4ea79361fccac7790ba3df4 not found: ID does not exist" containerID="4615e570e10fb8e73d204f621789c7f2db42ba41d4ea79361fccac7790ba3df4" Mar 13 14:11:10 crc kubenswrapper[4898]: I0313 14:11:10.510846 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4615e570e10fb8e73d204f621789c7f2db42ba41d4ea79361fccac7790ba3df4"} err="failed to get container status \"4615e570e10fb8e73d204f621789c7f2db42ba41d4ea79361fccac7790ba3df4\": rpc error: code = NotFound desc = could not find container \"4615e570e10fb8e73d204f621789c7f2db42ba41d4ea79361fccac7790ba3df4\": container with ID starting with 4615e570e10fb8e73d204f621789c7f2db42ba41d4ea79361fccac7790ba3df4 not found: ID does not exist" Mar 13 14:11:10 crc kubenswrapper[4898]: I0313 14:11:10.510867 4898 scope.go:117] "RemoveContainer" 
containerID="8a662e914e1ef5840c7d25299683174356269f87fbe4400915a9153e947174c7" Mar 13 14:11:10 crc kubenswrapper[4898]: E0313 14:11:10.511666 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a662e914e1ef5840c7d25299683174356269f87fbe4400915a9153e947174c7\": container with ID starting with 8a662e914e1ef5840c7d25299683174356269f87fbe4400915a9153e947174c7 not found: ID does not exist" containerID="8a662e914e1ef5840c7d25299683174356269f87fbe4400915a9153e947174c7" Mar 13 14:11:10 crc kubenswrapper[4898]: I0313 14:11:10.511713 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a662e914e1ef5840c7d25299683174356269f87fbe4400915a9153e947174c7"} err="failed to get container status \"8a662e914e1ef5840c7d25299683174356269f87fbe4400915a9153e947174c7\": rpc error: code = NotFound desc = could not find container \"8a662e914e1ef5840c7d25299683174356269f87fbe4400915a9153e947174c7\": container with ID starting with 8a662e914e1ef5840c7d25299683174356269f87fbe4400915a9153e947174c7 not found: ID does not exist" Mar 13 14:11:10 crc kubenswrapper[4898]: I0313 14:11:10.511758 4898 scope.go:117] "RemoveContainer" containerID="ffa5f5636f6ad5a5d8c0a7e72c6f19e3edb102f03e8614a36057cf6da4b4aead" Mar 13 14:11:10 crc kubenswrapper[4898]: E0313 14:11:10.513847 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffa5f5636f6ad5a5d8c0a7e72c6f19e3edb102f03e8614a36057cf6da4b4aead\": container with ID starting with ffa5f5636f6ad5a5d8c0a7e72c6f19e3edb102f03e8614a36057cf6da4b4aead not found: ID does not exist" containerID="ffa5f5636f6ad5a5d8c0a7e72c6f19e3edb102f03e8614a36057cf6da4b4aead" Mar 13 14:11:10 crc kubenswrapper[4898]: I0313 14:11:10.513875 4898 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ffa5f5636f6ad5a5d8c0a7e72c6f19e3edb102f03e8614a36057cf6da4b4aead"} err="failed to get container status \"ffa5f5636f6ad5a5d8c0a7e72c6f19e3edb102f03e8614a36057cf6da4b4aead\": rpc error: code = NotFound desc = could not find container \"ffa5f5636f6ad5a5d8c0a7e72c6f19e3edb102f03e8614a36057cf6da4b4aead\": container with ID starting with ffa5f5636f6ad5a5d8c0a7e72c6f19e3edb102f03e8614a36057cf6da4b4aead not found: ID does not exist" Mar 13 14:11:11 crc kubenswrapper[4898]: I0313 14:11:11.746697 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5de1b020-889c-4fb0-b067-cdeb543f0b64" path="/var/lib/kubelet/pods/5de1b020-889c-4fb0-b067-cdeb543f0b64/volumes" Mar 13 14:11:13 crc kubenswrapper[4898]: I0313 14:11:13.380592 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-5fb555ff84-j52b8" event={"ID":"2cd05b5b-32da-4560-a761-72221b99e2c6","Type":"ContainerStarted","Data":"a617f169f1dfdd2506afa41a7275d981d2a519aa1e27fbcd8c4664f8adc494fa"} Mar 13 14:11:21 crc kubenswrapper[4898]: I0313 14:11:21.437075 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-5fb555ff84-j52b8" event={"ID":"2cd05b5b-32da-4560-a761-72221b99e2c6","Type":"ContainerStarted","Data":"f4b6459bd7650966303989bbf78aca39fcd321d5efb09073863a0f765a6e9309"} Mar 13 14:11:21 crc kubenswrapper[4898]: I0313 14:11:21.439292 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators-redhat/loki-operator-controller-manager-5fb555ff84-j52b8" Mar 13 14:11:21 crc kubenswrapper[4898]: I0313 14:11:21.441826 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators-redhat/loki-operator-controller-manager-5fb555ff84-j52b8" Mar 13 14:11:21 crc kubenswrapper[4898]: I0313 14:11:21.495022 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-operators-redhat/loki-operator-controller-manager-5fb555ff84-j52b8" podStartSLOduration=2.413564004 podStartE2EDuration="19.49500316s" podCreationTimestamp="2026-03-13 14:11:02 +0000 UTC" firstStartedPulling="2026-03-13 14:11:03.616515314 +0000 UTC m=+898.618103553" lastFinishedPulling="2026-03-13 14:11:20.69795445 +0000 UTC m=+915.699542709" observedRunningTime="2026-03-13 14:11:21.483248547 +0000 UTC m=+916.484836796" watchObservedRunningTime="2026-03-13 14:11:21.49500316 +0000 UTC m=+916.496591399" Mar 13 14:11:26 crc kubenswrapper[4898]: I0313 14:11:26.315953 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["minio-dev/minio"] Mar 13 14:11:26 crc kubenswrapper[4898]: E0313 14:11:26.317071 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5de1b020-889c-4fb0-b067-cdeb543f0b64" containerName="registry-server" Mar 13 14:11:26 crc kubenswrapper[4898]: I0313 14:11:26.317143 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="5de1b020-889c-4fb0-b067-cdeb543f0b64" containerName="registry-server" Mar 13 14:11:26 crc kubenswrapper[4898]: E0313 14:11:26.317207 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5de1b020-889c-4fb0-b067-cdeb543f0b64" containerName="extract-utilities" Mar 13 14:11:26 crc kubenswrapper[4898]: I0313 14:11:26.317218 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="5de1b020-889c-4fb0-b067-cdeb543f0b64" containerName="extract-utilities" Mar 13 14:11:26 crc kubenswrapper[4898]: E0313 14:11:26.317237 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5de1b020-889c-4fb0-b067-cdeb543f0b64" containerName="extract-content" Mar 13 14:11:26 crc kubenswrapper[4898]: I0313 14:11:26.317248 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="5de1b020-889c-4fb0-b067-cdeb543f0b64" containerName="extract-content" Mar 13 14:11:26 crc kubenswrapper[4898]: I0313 14:11:26.317590 4898 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="5de1b020-889c-4fb0-b067-cdeb543f0b64" containerName="registry-server" Mar 13 14:11:26 crc kubenswrapper[4898]: I0313 14:11:26.319456 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="minio-dev/minio" Mar 13 14:11:26 crc kubenswrapper[4898]: I0313 14:11:26.322378 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"kube-root-ca.crt" Mar 13 14:11:26 crc kubenswrapper[4898]: I0313 14:11:26.322766 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"openshift-service-ca.crt" Mar 13 14:11:26 crc kubenswrapper[4898]: I0313 14:11:26.327986 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Mar 13 14:11:26 crc kubenswrapper[4898]: I0313 14:11:26.494740 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d0d82100-9843-444c-8945-e270e6e1491b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d0d82100-9843-444c-8945-e270e6e1491b\") pod \"minio\" (UID: \"9ce14c7d-1e04-483e-9738-ae3c512ef76f\") " pod="minio-dev/minio" Mar 13 14:11:26 crc kubenswrapper[4898]: I0313 14:11:26.494894 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gt9v\" (UniqueName: \"kubernetes.io/projected/9ce14c7d-1e04-483e-9738-ae3c512ef76f-kube-api-access-6gt9v\") pod \"minio\" (UID: \"9ce14c7d-1e04-483e-9738-ae3c512ef76f\") " pod="minio-dev/minio" Mar 13 14:11:26 crc kubenswrapper[4898]: I0313 14:11:26.596748 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d0d82100-9843-444c-8945-e270e6e1491b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d0d82100-9843-444c-8945-e270e6e1491b\") pod \"minio\" (UID: \"9ce14c7d-1e04-483e-9738-ae3c512ef76f\") " pod="minio-dev/minio" Mar 13 14:11:26 crc kubenswrapper[4898]: I0313 14:11:26.596851 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-6gt9v\" (UniqueName: \"kubernetes.io/projected/9ce14c7d-1e04-483e-9738-ae3c512ef76f-kube-api-access-6gt9v\") pod \"minio\" (UID: \"9ce14c7d-1e04-483e-9738-ae3c512ef76f\") " pod="minio-dev/minio" Mar 13 14:11:26 crc kubenswrapper[4898]: I0313 14:11:26.601809 4898 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 13 14:11:26 crc kubenswrapper[4898]: I0313 14:11:26.601879 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d0d82100-9843-444c-8945-e270e6e1491b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d0d82100-9843-444c-8945-e270e6e1491b\") pod \"minio\" (UID: \"9ce14c7d-1e04-483e-9738-ae3c512ef76f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/14e71af032d1929f261b00fb9b63ff890d85884402f51a4b1beb60df0fe69582/globalmount\"" pod="minio-dev/minio" Mar 13 14:11:26 crc kubenswrapper[4898]: I0313 14:11:26.634645 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gt9v\" (UniqueName: \"kubernetes.io/projected/9ce14c7d-1e04-483e-9738-ae3c512ef76f-kube-api-access-6gt9v\") pod \"minio\" (UID: \"9ce14c7d-1e04-483e-9738-ae3c512ef76f\") " pod="minio-dev/minio" Mar 13 14:11:26 crc kubenswrapper[4898]: I0313 14:11:26.648126 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d0d82100-9843-444c-8945-e270e6e1491b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d0d82100-9843-444c-8945-e270e6e1491b\") pod \"minio\" (UID: \"9ce14c7d-1e04-483e-9738-ae3c512ef76f\") " pod="minio-dev/minio" Mar 13 14:11:26 crc kubenswrapper[4898]: I0313 14:11:26.939323 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="minio-dev/minio" Mar 13 14:11:27 crc kubenswrapper[4898]: I0313 14:11:27.177879 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Mar 13 14:11:27 crc kubenswrapper[4898]: W0313 14:11:27.182099 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ce14c7d_1e04_483e_9738_ae3c512ef76f.slice/crio-103eef127bf515cf8543f30dfb93161b72d04fe01b3826bceca3334bcde0096c WatchSource:0}: Error finding container 103eef127bf515cf8543f30dfb93161b72d04fe01b3826bceca3334bcde0096c: Status 404 returned error can't find the container with id 103eef127bf515cf8543f30dfb93161b72d04fe01b3826bceca3334bcde0096c Mar 13 14:11:27 crc kubenswrapper[4898]: I0313 14:11:27.482340 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"9ce14c7d-1e04-483e-9738-ae3c512ef76f","Type":"ContainerStarted","Data":"103eef127bf515cf8543f30dfb93161b72d04fe01b3826bceca3334bcde0096c"} Mar 13 14:11:30 crc kubenswrapper[4898]: I0313 14:11:30.507784 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"9ce14c7d-1e04-483e-9738-ae3c512ef76f","Type":"ContainerStarted","Data":"0e85cb6a69d039542f475b1ae85442734bc5fe33fa28ac8903dbcf3ab518a878"} Mar 13 14:11:30 crc kubenswrapper[4898]: I0313 14:11:30.528518 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="minio-dev/minio" podStartSLOduration=3.461573106 podStartE2EDuration="6.528500602s" podCreationTimestamp="2026-03-13 14:11:24 +0000 UTC" firstStartedPulling="2026-03-13 14:11:27.183920212 +0000 UTC m=+922.185508451" lastFinishedPulling="2026-03-13 14:11:30.250847708 +0000 UTC m=+925.252435947" observedRunningTime="2026-03-13 14:11:30.522773675 +0000 UTC m=+925.524361914" watchObservedRunningTime="2026-03-13 14:11:30.528500602 +0000 UTC m=+925.530088841" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.275693 4898 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-distributor-9c6b6d984-vvj56"] Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.278710 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-distributor-9c6b6d984-vvj56" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.281438 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-distributor-grpc" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.281449 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-config" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.281689 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-distributor-http" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.281635 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-dockercfg-gqrj6" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.281687 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-ca-bundle" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.291502 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-distributor-9c6b6d984-vvj56"] Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.414403 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-querier-6dcbdf8bb8-qr6bw"] Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.415285 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qr6bw" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.418335 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-querier-http" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.418555 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-querier-grpc" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.427203 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-s3" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.427351 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-querier-6dcbdf8bb8-qr6bw"] Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.448636 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/510657b4-32e2-4fa5-9c09-17869a295736-logging-loki-ca-bundle\") pod \"logging-loki-distributor-9c6b6d984-vvj56\" (UID: \"510657b4-32e2-4fa5-9c09-17869a295736\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-vvj56" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.448696 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/510657b4-32e2-4fa5-9c09-17869a295736-logging-loki-distributor-http\") pod \"logging-loki-distributor-9c6b6d984-vvj56\" (UID: \"510657b4-32e2-4fa5-9c09-17869a295736\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-vvj56" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.448716 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cl5z\" (UniqueName: 
\"kubernetes.io/projected/510657b4-32e2-4fa5-9c09-17869a295736-kube-api-access-9cl5z\") pod \"logging-loki-distributor-9c6b6d984-vvj56\" (UID: \"510657b4-32e2-4fa5-9c09-17869a295736\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-vvj56" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.448757 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/510657b4-32e2-4fa5-9c09-17869a295736-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-9c6b6d984-vvj56\" (UID: \"510657b4-32e2-4fa5-9c09-17869a295736\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-vvj56" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.448797 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/510657b4-32e2-4fa5-9c09-17869a295736-config\") pod \"logging-loki-distributor-9c6b6d984-vvj56\" (UID: \"510657b4-32e2-4fa5-9c09-17869a295736\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-vvj56" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.486925 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-query-frontend-ff66c4dc9-mwqzz"] Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.489807 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-mwqzz" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.493233 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-query-frontend-http" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.493688 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-query-frontend-grpc" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.543614 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-query-frontend-ff66c4dc9-mwqzz"] Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.550073 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/5e81d88f-c63b-4f0c-ba17-f1171350c28d-logging-loki-s3\") pod \"logging-loki-querier-6dcbdf8bb8-qr6bw\" (UID: \"5e81d88f-c63b-4f0c-ba17-f1171350c28d\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qr6bw" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.550124 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/510657b4-32e2-4fa5-9c09-17869a295736-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-9c6b6d984-vvj56\" (UID: \"510657b4-32e2-4fa5-9c09-17869a295736\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-vvj56" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.550155 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rnk9\" (UniqueName: \"kubernetes.io/projected/5e81d88f-c63b-4f0c-ba17-f1171350c28d-kube-api-access-9rnk9\") pod \"logging-loki-querier-6dcbdf8bb8-qr6bw\" (UID: \"5e81d88f-c63b-4f0c-ba17-f1171350c28d\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qr6bw" Mar 13 14:11:36 crc 
kubenswrapper[4898]: I0313 14:11:36.550387 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/510657b4-32e2-4fa5-9c09-17869a295736-config\") pod \"logging-loki-distributor-9c6b6d984-vvj56\" (UID: \"510657b4-32e2-4fa5-9c09-17869a295736\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-vvj56" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.550469 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e81d88f-c63b-4f0c-ba17-f1171350c28d-config\") pod \"logging-loki-querier-6dcbdf8bb8-qr6bw\" (UID: \"5e81d88f-c63b-4f0c-ba17-f1171350c28d\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qr6bw" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.550514 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/5e81d88f-c63b-4f0c-ba17-f1171350c28d-logging-loki-querier-http\") pod \"logging-loki-querier-6dcbdf8bb8-qr6bw\" (UID: \"5e81d88f-c63b-4f0c-ba17-f1171350c28d\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qr6bw" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.550597 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/510657b4-32e2-4fa5-9c09-17869a295736-logging-loki-ca-bundle\") pod \"logging-loki-distributor-9c6b6d984-vvj56\" (UID: \"510657b4-32e2-4fa5-9c09-17869a295736\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-vvj56" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.550654 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/5e81d88f-c63b-4f0c-ba17-f1171350c28d-logging-loki-querier-grpc\") pod 
\"logging-loki-querier-6dcbdf8bb8-qr6bw\" (UID: \"5e81d88f-c63b-4f0c-ba17-f1171350c28d\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qr6bw" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.550721 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/510657b4-32e2-4fa5-9c09-17869a295736-logging-loki-distributor-http\") pod \"logging-loki-distributor-9c6b6d984-vvj56\" (UID: \"510657b4-32e2-4fa5-9c09-17869a295736\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-vvj56" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.550758 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cl5z\" (UniqueName: \"kubernetes.io/projected/510657b4-32e2-4fa5-9c09-17869a295736-kube-api-access-9cl5z\") pod \"logging-loki-distributor-9c6b6d984-vvj56\" (UID: \"510657b4-32e2-4fa5-9c09-17869a295736\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-vvj56" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.550793 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e81d88f-c63b-4f0c-ba17-f1171350c28d-logging-loki-ca-bundle\") pod \"logging-loki-querier-6dcbdf8bb8-qr6bw\" (UID: \"5e81d88f-c63b-4f0c-ba17-f1171350c28d\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qr6bw" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.551327 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/510657b4-32e2-4fa5-9c09-17869a295736-logging-loki-ca-bundle\") pod \"logging-loki-distributor-9c6b6d984-vvj56\" (UID: \"510657b4-32e2-4fa5-9c09-17869a295736\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-vvj56" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.551539 4898 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/510657b4-32e2-4fa5-9c09-17869a295736-config\") pod \"logging-loki-distributor-9c6b6d984-vvj56\" (UID: \"510657b4-32e2-4fa5-9c09-17869a295736\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-vvj56" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.555799 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/510657b4-32e2-4fa5-9c09-17869a295736-logging-loki-distributor-http\") pod \"logging-loki-distributor-9c6b6d984-vvj56\" (UID: \"510657b4-32e2-4fa5-9c09-17869a295736\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-vvj56" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.556237 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/510657b4-32e2-4fa5-9c09-17869a295736-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-9c6b6d984-vvj56\" (UID: \"510657b4-32e2-4fa5-9c09-17869a295736\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-vvj56" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.579702 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cl5z\" (UniqueName: \"kubernetes.io/projected/510657b4-32e2-4fa5-9c09-17869a295736-kube-api-access-9cl5z\") pod \"logging-loki-distributor-9c6b6d984-vvj56\" (UID: \"510657b4-32e2-4fa5-9c09-17869a295736\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-vvj56" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.604185 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r"] Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.606192 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-distributor-9c6b6d984-vvj56" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.606226 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.609516 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-http" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.609940 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.610031 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-gateway" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.611082 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-gateway-ca-bundle" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.613994 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-client-http" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.617735 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x"] Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.619123 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.621130 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-dockercfg-56dfj" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.626109 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r"] Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.652313 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbxbn\" (UniqueName: \"kubernetes.io/projected/e519fed6-a687-4a01-a979-598e81122ad1-kube-api-access-dbxbn\") pod \"logging-loki-query-frontend-ff66c4dc9-mwqzz\" (UID: \"e519fed6-a687-4a01-a979-598e81122ad1\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-mwqzz" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.652369 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e519fed6-a687-4a01-a979-598e81122ad1-config\") pod \"logging-loki-query-frontend-ff66c4dc9-mwqzz\" (UID: \"e519fed6-a687-4a01-a979-598e81122ad1\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-mwqzz" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.652397 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/5e81d88f-c63b-4f0c-ba17-f1171350c28d-logging-loki-querier-grpc\") pod \"logging-loki-querier-6dcbdf8bb8-qr6bw\" (UID: \"5e81d88f-c63b-4f0c-ba17-f1171350c28d\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qr6bw" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.652432 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/5e81d88f-c63b-4f0c-ba17-f1171350c28d-logging-loki-ca-bundle\") pod \"logging-loki-querier-6dcbdf8bb8-qr6bw\" (UID: \"5e81d88f-c63b-4f0c-ba17-f1171350c28d\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qr6bw" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.652458 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/e519fed6-a687-4a01-a979-598e81122ad1-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-ff66c4dc9-mwqzz\" (UID: \"e519fed6-a687-4a01-a979-598e81122ad1\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-mwqzz" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.652488 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/5e81d88f-c63b-4f0c-ba17-f1171350c28d-logging-loki-s3\") pod \"logging-loki-querier-6dcbdf8bb8-qr6bw\" (UID: \"5e81d88f-c63b-4f0c-ba17-f1171350c28d\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qr6bw" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.652510 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rnk9\" (UniqueName: \"kubernetes.io/projected/5e81d88f-c63b-4f0c-ba17-f1171350c28d-kube-api-access-9rnk9\") pod \"logging-loki-querier-6dcbdf8bb8-qr6bw\" (UID: \"5e81d88f-c63b-4f0c-ba17-f1171350c28d\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qr6bw" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.652540 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/e519fed6-a687-4a01-a979-598e81122ad1-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-ff66c4dc9-mwqzz\" (UID: \"e519fed6-a687-4a01-a979-598e81122ad1\") " 
pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-mwqzz" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.652557 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e519fed6-a687-4a01-a979-598e81122ad1-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-ff66c4dc9-mwqzz\" (UID: \"e519fed6-a687-4a01-a979-598e81122ad1\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-mwqzz" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.652581 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e81d88f-c63b-4f0c-ba17-f1171350c28d-config\") pod \"logging-loki-querier-6dcbdf8bb8-qr6bw\" (UID: \"5e81d88f-c63b-4f0c-ba17-f1171350c28d\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qr6bw" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.652600 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/5e81d88f-c63b-4f0c-ba17-f1171350c28d-logging-loki-querier-http\") pod \"logging-loki-querier-6dcbdf8bb8-qr6bw\" (UID: \"5e81d88f-c63b-4f0c-ba17-f1171350c28d\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qr6bw" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.649890 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x"] Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.657769 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e81d88f-c63b-4f0c-ba17-f1171350c28d-logging-loki-ca-bundle\") pod \"logging-loki-querier-6dcbdf8bb8-qr6bw\" (UID: \"5e81d88f-c63b-4f0c-ba17-f1171350c28d\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qr6bw" Mar 13 14:11:36 crc 
kubenswrapper[4898]: I0313 14:11:36.658285 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e81d88f-c63b-4f0c-ba17-f1171350c28d-config\") pod \"logging-loki-querier-6dcbdf8bb8-qr6bw\" (UID: \"5e81d88f-c63b-4f0c-ba17-f1171350c28d\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qr6bw" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.661408 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/5e81d88f-c63b-4f0c-ba17-f1171350c28d-logging-loki-querier-http\") pod \"logging-loki-querier-6dcbdf8bb8-qr6bw\" (UID: \"5e81d88f-c63b-4f0c-ba17-f1171350c28d\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qr6bw" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.664369 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/5e81d88f-c63b-4f0c-ba17-f1171350c28d-logging-loki-querier-grpc\") pod \"logging-loki-querier-6dcbdf8bb8-qr6bw\" (UID: \"5e81d88f-c63b-4f0c-ba17-f1171350c28d\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qr6bw" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.677764 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rnk9\" (UniqueName: \"kubernetes.io/projected/5e81d88f-c63b-4f0c-ba17-f1171350c28d-kube-api-access-9rnk9\") pod \"logging-loki-querier-6dcbdf8bb8-qr6bw\" (UID: \"5e81d88f-c63b-4f0c-ba17-f1171350c28d\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qr6bw" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.679633 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/5e81d88f-c63b-4f0c-ba17-f1171350c28d-logging-loki-s3\") pod \"logging-loki-querier-6dcbdf8bb8-qr6bw\" (UID: \"5e81d88f-c63b-4f0c-ba17-f1171350c28d\") " 
pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qr6bw" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.737080 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qr6bw" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.753673 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/077fcbe8-c497-44b4-82f9-ff8e317cbe83-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-c6d797ccf-9qh4r\" (UID: \"077fcbe8-c497-44b4-82f9-ff8e317cbe83\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.753720 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e519fed6-a687-4a01-a979-598e81122ad1-config\") pod \"logging-loki-query-frontend-ff66c4dc9-mwqzz\" (UID: \"e519fed6-a687-4a01-a979-598e81122ad1\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-mwqzz" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.753742 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/077fcbe8-c497-44b4-82f9-ff8e317cbe83-tls-secret\") pod \"logging-loki-gateway-c6d797ccf-9qh4r\" (UID: \"077fcbe8-c497-44b4-82f9-ff8e317cbe83\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.753763 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/077fcbe8-c497-44b4-82f9-ff8e317cbe83-tenants\") pod \"logging-loki-gateway-c6d797ccf-9qh4r\" (UID: \"077fcbe8-c497-44b4-82f9-ff8e317cbe83\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" Mar 13 14:11:36 crc 
kubenswrapper[4898]: I0313 14:11:36.753789 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/13ee53e6-2549-4dd8-91ac-80e4ef2c9d99-lokistack-gateway\") pod \"logging-loki-gateway-c6d797ccf-8ng9x\" (UID: \"13ee53e6-2549-4dd8-91ac-80e4ef2c9d99\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.753809 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/13ee53e6-2549-4dd8-91ac-80e4ef2c9d99-logging-loki-ca-bundle\") pod \"logging-loki-gateway-c6d797ccf-8ng9x\" (UID: \"13ee53e6-2549-4dd8-91ac-80e4ef2c9d99\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.753826 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/077fcbe8-c497-44b4-82f9-ff8e317cbe83-lokistack-gateway\") pod \"logging-loki-gateway-c6d797ccf-9qh4r\" (UID: \"077fcbe8-c497-44b4-82f9-ff8e317cbe83\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.753840 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/13ee53e6-2549-4dd8-91ac-80e4ef2c9d99-rbac\") pod \"logging-loki-gateway-c6d797ccf-8ng9x\" (UID: \"13ee53e6-2549-4dd8-91ac-80e4ef2c9d99\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.753855 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/13ee53e6-2549-4dd8-91ac-80e4ef2c9d99-tenants\") pod 
\"logging-loki-gateway-c6d797ccf-8ng9x\" (UID: \"13ee53e6-2549-4dd8-91ac-80e4ef2c9d99\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.753874 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/077fcbe8-c497-44b4-82f9-ff8e317cbe83-rbac\") pod \"logging-loki-gateway-c6d797ccf-9qh4r\" (UID: \"077fcbe8-c497-44b4-82f9-ff8e317cbe83\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.753919 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/e519fed6-a687-4a01-a979-598e81122ad1-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-ff66c4dc9-mwqzz\" (UID: \"e519fed6-a687-4a01-a979-598e81122ad1\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-mwqzz" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.753954 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/13ee53e6-2549-4dd8-91ac-80e4ef2c9d99-tls-secret\") pod \"logging-loki-gateway-c6d797ccf-8ng9x\" (UID: \"13ee53e6-2549-4dd8-91ac-80e4ef2c9d99\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.753969 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/13ee53e6-2549-4dd8-91ac-80e4ef2c9d99-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-c6d797ccf-8ng9x\" (UID: \"13ee53e6-2549-4dd8-91ac-80e4ef2c9d99\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.753984 4898 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9b9d\" (UniqueName: \"kubernetes.io/projected/13ee53e6-2549-4dd8-91ac-80e4ef2c9d99-kube-api-access-l9b9d\") pod \"logging-loki-gateway-c6d797ccf-8ng9x\" (UID: \"13ee53e6-2549-4dd8-91ac-80e4ef2c9d99\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.754008 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/13ee53e6-2549-4dd8-91ac-80e4ef2c9d99-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-c6d797ccf-8ng9x\" (UID: \"13ee53e6-2549-4dd8-91ac-80e4ef2c9d99\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.754034 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/e519fed6-a687-4a01-a979-598e81122ad1-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-ff66c4dc9-mwqzz\" (UID: \"e519fed6-a687-4a01-a979-598e81122ad1\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-mwqzz" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.754055 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e519fed6-a687-4a01-a979-598e81122ad1-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-ff66c4dc9-mwqzz\" (UID: \"e519fed6-a687-4a01-a979-598e81122ad1\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-mwqzz" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.754071 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: 
\"kubernetes.io/secret/077fcbe8-c497-44b4-82f9-ff8e317cbe83-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-c6d797ccf-9qh4r\" (UID: \"077fcbe8-c497-44b4-82f9-ff8e317cbe83\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.754101 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4z6th\" (UniqueName: \"kubernetes.io/projected/077fcbe8-c497-44b4-82f9-ff8e317cbe83-kube-api-access-4z6th\") pod \"logging-loki-gateway-c6d797ccf-9qh4r\" (UID: \"077fcbe8-c497-44b4-82f9-ff8e317cbe83\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.754118 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/077fcbe8-c497-44b4-82f9-ff8e317cbe83-logging-loki-ca-bundle\") pod \"logging-loki-gateway-c6d797ccf-9qh4r\" (UID: \"077fcbe8-c497-44b4-82f9-ff8e317cbe83\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.754135 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbxbn\" (UniqueName: \"kubernetes.io/projected/e519fed6-a687-4a01-a979-598e81122ad1-kube-api-access-dbxbn\") pod \"logging-loki-query-frontend-ff66c4dc9-mwqzz\" (UID: \"e519fed6-a687-4a01-a979-598e81122ad1\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-mwqzz" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.756812 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e519fed6-a687-4a01-a979-598e81122ad1-config\") pod \"logging-loki-query-frontend-ff66c4dc9-mwqzz\" (UID: \"e519fed6-a687-4a01-a979-598e81122ad1\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-mwqzz" Mar 13 
14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.757638 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e519fed6-a687-4a01-a979-598e81122ad1-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-ff66c4dc9-mwqzz\" (UID: \"e519fed6-a687-4a01-a979-598e81122ad1\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-mwqzz" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.760077 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/e519fed6-a687-4a01-a979-598e81122ad1-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-ff66c4dc9-mwqzz\" (UID: \"e519fed6-a687-4a01-a979-598e81122ad1\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-mwqzz" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.778281 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbxbn\" (UniqueName: \"kubernetes.io/projected/e519fed6-a687-4a01-a979-598e81122ad1-kube-api-access-dbxbn\") pod \"logging-loki-query-frontend-ff66c4dc9-mwqzz\" (UID: \"e519fed6-a687-4a01-a979-598e81122ad1\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-mwqzz" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.790590 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/e519fed6-a687-4a01-a979-598e81122ad1-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-ff66c4dc9-mwqzz\" (UID: \"e519fed6-a687-4a01-a979-598e81122ad1\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-mwqzz" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.806327 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-mwqzz" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.855683 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/13ee53e6-2549-4dd8-91ac-80e4ef2c9d99-rbac\") pod \"logging-loki-gateway-c6d797ccf-8ng9x\" (UID: \"13ee53e6-2549-4dd8-91ac-80e4ef2c9d99\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.855715 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/13ee53e6-2549-4dd8-91ac-80e4ef2c9d99-tenants\") pod \"logging-loki-gateway-c6d797ccf-8ng9x\" (UID: \"13ee53e6-2549-4dd8-91ac-80e4ef2c9d99\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.855740 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/077fcbe8-c497-44b4-82f9-ff8e317cbe83-rbac\") pod \"logging-loki-gateway-c6d797ccf-9qh4r\" (UID: \"077fcbe8-c497-44b4-82f9-ff8e317cbe83\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.855780 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/13ee53e6-2549-4dd8-91ac-80e4ef2c9d99-tls-secret\") pod \"logging-loki-gateway-c6d797ccf-8ng9x\" (UID: \"13ee53e6-2549-4dd8-91ac-80e4ef2c9d99\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.855799 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/13ee53e6-2549-4dd8-91ac-80e4ef2c9d99-logging-loki-gateway-ca-bundle\") pod 
\"logging-loki-gateway-c6d797ccf-8ng9x\" (UID: \"13ee53e6-2549-4dd8-91ac-80e4ef2c9d99\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.855815 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9b9d\" (UniqueName: \"kubernetes.io/projected/13ee53e6-2549-4dd8-91ac-80e4ef2c9d99-kube-api-access-l9b9d\") pod \"logging-loki-gateway-c6d797ccf-8ng9x\" (UID: \"13ee53e6-2549-4dd8-91ac-80e4ef2c9d99\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.855846 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/13ee53e6-2549-4dd8-91ac-80e4ef2c9d99-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-c6d797ccf-8ng9x\" (UID: \"13ee53e6-2549-4dd8-91ac-80e4ef2c9d99\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.855875 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/077fcbe8-c497-44b4-82f9-ff8e317cbe83-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-c6d797ccf-9qh4r\" (UID: \"077fcbe8-c497-44b4-82f9-ff8e317cbe83\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.855934 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4z6th\" (UniqueName: \"kubernetes.io/projected/077fcbe8-c497-44b4-82f9-ff8e317cbe83-kube-api-access-4z6th\") pod \"logging-loki-gateway-c6d797ccf-9qh4r\" (UID: \"077fcbe8-c497-44b4-82f9-ff8e317cbe83\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.855963 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/077fcbe8-c497-44b4-82f9-ff8e317cbe83-logging-loki-ca-bundle\") pod \"logging-loki-gateway-c6d797ccf-9qh4r\" (UID: \"077fcbe8-c497-44b4-82f9-ff8e317cbe83\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.855990 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/077fcbe8-c497-44b4-82f9-ff8e317cbe83-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-c6d797ccf-9qh4r\" (UID: \"077fcbe8-c497-44b4-82f9-ff8e317cbe83\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.856047 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/077fcbe8-c497-44b4-82f9-ff8e317cbe83-tls-secret\") pod \"logging-loki-gateway-c6d797ccf-9qh4r\" (UID: \"077fcbe8-c497-44b4-82f9-ff8e317cbe83\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.856064 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/077fcbe8-c497-44b4-82f9-ff8e317cbe83-tenants\") pod \"logging-loki-gateway-c6d797ccf-9qh4r\" (UID: \"077fcbe8-c497-44b4-82f9-ff8e317cbe83\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.856086 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/13ee53e6-2549-4dd8-91ac-80e4ef2c9d99-lokistack-gateway\") pod \"logging-loki-gateway-c6d797ccf-8ng9x\" (UID: \"13ee53e6-2549-4dd8-91ac-80e4ef2c9d99\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" 
Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.856104 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/13ee53e6-2549-4dd8-91ac-80e4ef2c9d99-logging-loki-ca-bundle\") pod \"logging-loki-gateway-c6d797ccf-8ng9x\" (UID: \"13ee53e6-2549-4dd8-91ac-80e4ef2c9d99\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.856122 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/077fcbe8-c497-44b4-82f9-ff8e317cbe83-lokistack-gateway\") pod \"logging-loki-gateway-c6d797ccf-9qh4r\" (UID: \"077fcbe8-c497-44b4-82f9-ff8e317cbe83\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.856748 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/13ee53e6-2549-4dd8-91ac-80e4ef2c9d99-rbac\") pod \"logging-loki-gateway-c6d797ccf-8ng9x\" (UID: \"13ee53e6-2549-4dd8-91ac-80e4ef2c9d99\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.857044 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/077fcbe8-c497-44b4-82f9-ff8e317cbe83-lokistack-gateway\") pod \"logging-loki-gateway-c6d797ccf-9qh4r\" (UID: \"077fcbe8-c497-44b4-82f9-ff8e317cbe83\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.857953 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/077fcbe8-c497-44b4-82f9-ff8e317cbe83-logging-loki-ca-bundle\") pod \"logging-loki-gateway-c6d797ccf-9qh4r\" (UID: 
\"077fcbe8-c497-44b4-82f9-ff8e317cbe83\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.858461 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/077fcbe8-c497-44b4-82f9-ff8e317cbe83-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-c6d797ccf-9qh4r\" (UID: \"077fcbe8-c497-44b4-82f9-ff8e317cbe83\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" Mar 13 14:11:36 crc kubenswrapper[4898]: E0313 14:11:36.858530 4898 secret.go:188] Couldn't get secret openshift-logging/logging-loki-gateway-http: secret "logging-loki-gateway-http" not found Mar 13 14:11:36 crc kubenswrapper[4898]: E0313 14:11:36.858576 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/077fcbe8-c497-44b4-82f9-ff8e317cbe83-tls-secret podName:077fcbe8-c497-44b4-82f9-ff8e317cbe83 nodeName:}" failed. No retries permitted until 2026-03-13 14:11:37.358561552 +0000 UTC m=+932.360149791 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-secret" (UniqueName: "kubernetes.io/secret/077fcbe8-c497-44b4-82f9-ff8e317cbe83-tls-secret") pod "logging-loki-gateway-c6d797ccf-9qh4r" (UID: "077fcbe8-c497-44b4-82f9-ff8e317cbe83") : secret "logging-loki-gateway-http" not found Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.859869 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/077fcbe8-c497-44b4-82f9-ff8e317cbe83-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-c6d797ccf-9qh4r\" (UID: \"077fcbe8-c497-44b4-82f9-ff8e317cbe83\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" Mar 13 14:11:36 crc kubenswrapper[4898]: E0313 14:11:36.860098 4898 secret.go:188] Couldn't get secret openshift-logging/logging-loki-gateway-http: secret "logging-loki-gateway-http" not found Mar 13 14:11:36 crc kubenswrapper[4898]: E0313 14:11:36.860436 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13ee53e6-2549-4dd8-91ac-80e4ef2c9d99-tls-secret podName:13ee53e6-2549-4dd8-91ac-80e4ef2c9d99 nodeName:}" failed. No retries permitted until 2026-03-13 14:11:37.36041552 +0000 UTC m=+932.362003779 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-secret" (UniqueName: "kubernetes.io/secret/13ee53e6-2549-4dd8-91ac-80e4ef2c9d99-tls-secret") pod "logging-loki-gateway-c6d797ccf-8ng9x" (UID: "13ee53e6-2549-4dd8-91ac-80e4ef2c9d99") : secret "logging-loki-gateway-http" not found Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.860725 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/13ee53e6-2549-4dd8-91ac-80e4ef2c9d99-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-c6d797ccf-8ng9x\" (UID: \"13ee53e6-2549-4dd8-91ac-80e4ef2c9d99\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.861528 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/13ee53e6-2549-4dd8-91ac-80e4ef2c9d99-logging-loki-ca-bundle\") pod \"logging-loki-gateway-c6d797ccf-8ng9x\" (UID: \"13ee53e6-2549-4dd8-91ac-80e4ef2c9d99\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.862177 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/13ee53e6-2549-4dd8-91ac-80e4ef2c9d99-lokistack-gateway\") pod \"logging-loki-gateway-c6d797ccf-8ng9x\" (UID: \"13ee53e6-2549-4dd8-91ac-80e4ef2c9d99\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.862916 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/077fcbe8-c497-44b4-82f9-ff8e317cbe83-rbac\") pod \"logging-loki-gateway-c6d797ccf-9qh4r\" (UID: \"077fcbe8-c497-44b4-82f9-ff8e317cbe83\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.865183 4898 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/13ee53e6-2549-4dd8-91ac-80e4ef2c9d99-tenants\") pod \"logging-loki-gateway-c6d797ccf-8ng9x\" (UID: \"13ee53e6-2549-4dd8-91ac-80e4ef2c9d99\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.866399 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/13ee53e6-2549-4dd8-91ac-80e4ef2c9d99-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-c6d797ccf-8ng9x\" (UID: \"13ee53e6-2549-4dd8-91ac-80e4ef2c9d99\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.870169 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/077fcbe8-c497-44b4-82f9-ff8e317cbe83-tenants\") pod \"logging-loki-gateway-c6d797ccf-9qh4r\" (UID: \"077fcbe8-c497-44b4-82f9-ff8e317cbe83\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.892185 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4z6th\" (UniqueName: \"kubernetes.io/projected/077fcbe8-c497-44b4-82f9-ff8e317cbe83-kube-api-access-4z6th\") pod \"logging-loki-gateway-c6d797ccf-9qh4r\" (UID: \"077fcbe8-c497-44b4-82f9-ff8e317cbe83\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.892867 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9b9d\" (UniqueName: \"kubernetes.io/projected/13ee53e6-2549-4dd8-91ac-80e4ef2c9d99-kube-api-access-l9b9d\") pod \"logging-loki-gateway-c6d797ccf-8ng9x\" (UID: \"13ee53e6-2549-4dd8-91ac-80e4ef2c9d99\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" Mar 13 14:11:37 
crc kubenswrapper[4898]: I0313 14:11:37.154659 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-distributor-9c6b6d984-vvj56"] Mar 13 14:11:37 crc kubenswrapper[4898]: W0313 14:11:37.159138 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod510657b4_32e2_4fa5_9c09_17869a295736.slice/crio-588ef4ead368ab2e5638371f5bbdf6b639ba8c8fe67b8e61bb03ae9fc3e68a34 WatchSource:0}: Error finding container 588ef4ead368ab2e5638371f5bbdf6b639ba8c8fe67b8e61bb03ae9fc3e68a34: Status 404 returned error can't find the container with id 588ef4ead368ab2e5638371f5bbdf6b639ba8c8fe67b8e61bb03ae9fc3e68a34 Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.222741 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-querier-6dcbdf8bb8-qr6bw"] Mar 13 14:11:37 crc kubenswrapper[4898]: W0313 14:11:37.230031 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e81d88f_c63b_4f0c_ba17_f1171350c28d.slice/crio-506b3c6c7a6fe4b77d437078eecd30760f329a4c67a298239690107de86bbf68 WatchSource:0}: Error finding container 506b3c6c7a6fe4b77d437078eecd30760f329a4c67a298239690107de86bbf68: Status 404 returned error can't find the container with id 506b3c6c7a6fe4b77d437078eecd30760f329a4c67a298239690107de86bbf68 Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.271235 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-query-frontend-ff66c4dc9-mwqzz"] Mar 13 14:11:37 crc kubenswrapper[4898]: W0313 14:11:37.281271 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode519fed6_a687_4a01_a979_598e81122ad1.slice/crio-3ec77fb340706082cc184c315dd056ae13f90d153c10d3f4f8a73f9bcbd8a242 WatchSource:0}: Error finding container 
3ec77fb340706082cc184c315dd056ae13f90d153c10d3f4f8a73f9bcbd8a242: Status 404 returned error can't find the container with id 3ec77fb340706082cc184c315dd056ae13f90d153c10d3f4f8a73f9bcbd8a242 Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.366240 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/077fcbe8-c497-44b4-82f9-ff8e317cbe83-tls-secret\") pod \"logging-loki-gateway-c6d797ccf-9qh4r\" (UID: \"077fcbe8-c497-44b4-82f9-ff8e317cbe83\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.366350 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/13ee53e6-2549-4dd8-91ac-80e4ef2c9d99-tls-secret\") pod \"logging-loki-gateway-c6d797ccf-8ng9x\" (UID: \"13ee53e6-2549-4dd8-91ac-80e4ef2c9d99\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.370873 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/13ee53e6-2549-4dd8-91ac-80e4ef2c9d99-tls-secret\") pod \"logging-loki-gateway-c6d797ccf-8ng9x\" (UID: \"13ee53e6-2549-4dd8-91ac-80e4ef2c9d99\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.373647 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/077fcbe8-c497-44b4-82f9-ff8e317cbe83-tls-secret\") pod \"logging-loki-gateway-c6d797ccf-9qh4r\" (UID: \"077fcbe8-c497-44b4-82f9-ff8e317cbe83\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.410272 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-ingester-0"] Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 
14:11:37.411681 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-ingester-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.416085 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-ingester-grpc" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.416669 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-ingester-http" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.418394 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-ingester-0"] Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.464714 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-compactor-0"] Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.466405 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-compactor-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.468358 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-compactor-grpc" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.468457 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-compactor-http" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.489343 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-compactor-0"] Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.531024 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.560220 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"] Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.561797 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-index-gateway-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.566802 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-index-gateway-http" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.567087 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-index-gateway-grpc" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.569953 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/9c5fee8d-2246-4e34-8ddd-ce710e155d73-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"9c5fee8d-2246-4e34-8ddd-ce710e155d73\") " pod="openshift-logging/logging-loki-compactor-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.570002 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-13c35950-5bd9-4551-b262-990db297da3e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-13c35950-5bd9-4551-b262-990db297da3e\") pod \"logging-loki-compactor-0\" (UID: \"9c5fee8d-2246-4e34-8ddd-ce710e155d73\") " pod="openshift-logging/logging-loki-compactor-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.570037 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-49c190c1-aab1-4988-aa72-b6786999be24\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-49c190c1-aab1-4988-aa72-b6786999be24\") pod 
\"logging-loki-ingester-0\" (UID: \"2194d847-4858-4f46-ab8b-c2d78cf5677e\") " pod="openshift-logging/logging-loki-ingester-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.570078 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2194d847-4858-4f46-ab8b-c2d78cf5677e-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"2194d847-4858-4f46-ab8b-c2d78cf5677e\") " pod="openshift-logging/logging-loki-ingester-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.570117 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/2194d847-4858-4f46-ab8b-c2d78cf5677e-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"2194d847-4858-4f46-ab8b-c2d78cf5677e\") " pod="openshift-logging/logging-loki-ingester-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.570138 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/2194d847-4858-4f46-ab8b-c2d78cf5677e-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"2194d847-4858-4f46-ab8b-c2d78cf5677e\") " pod="openshift-logging/logging-loki-ingester-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.570186 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c5fee8d-2246-4e34-8ddd-ce710e155d73-config\") pod \"logging-loki-compactor-0\" (UID: \"9c5fee8d-2246-4e34-8ddd-ce710e155d73\") " pod="openshift-logging/logging-loki-compactor-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.570208 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-compactor-http\" (UniqueName: 
\"kubernetes.io/secret/9c5fee8d-2246-4e34-8ddd-ce710e155d73-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"9c5fee8d-2246-4e34-8ddd-ce710e155d73\") " pod="openshift-logging/logging-loki-compactor-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.570230 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/2194d847-4858-4f46-ab8b-c2d78cf5677e-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"2194d847-4858-4f46-ab8b-c2d78cf5677e\") " pod="openshift-logging/logging-loki-ingester-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.570258 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-15ed4531-bf9c-4d7a-a111-9dbf850db61b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-15ed4531-bf9c-4d7a-a111-9dbf850db61b\") pod \"logging-loki-ingester-0\" (UID: \"2194d847-4858-4f46-ab8b-c2d78cf5677e\") " pod="openshift-logging/logging-loki-ingester-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.570278 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7xx5\" (UniqueName: \"kubernetes.io/projected/2194d847-4858-4f46-ab8b-c2d78cf5677e-kube-api-access-z7xx5\") pod \"logging-loki-ingester-0\" (UID: \"2194d847-4858-4f46-ab8b-c2d78cf5677e\") " pod="openshift-logging/logging-loki-ingester-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.570307 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpqdw\" (UniqueName: \"kubernetes.io/projected/9c5fee8d-2246-4e34-8ddd-ce710e155d73-kube-api-access-zpqdw\") pod \"logging-loki-compactor-0\" (UID: \"9c5fee8d-2246-4e34-8ddd-ce710e155d73\") " pod="openshift-logging/logging-loki-compactor-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 
14:11:37.570330 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2194d847-4858-4f46-ab8b-c2d78cf5677e-config\") pod \"logging-loki-ingester-0\" (UID: \"2194d847-4858-4f46-ab8b-c2d78cf5677e\") " pod="openshift-logging/logging-loki-ingester-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.570357 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c5fee8d-2246-4e34-8ddd-ce710e155d73-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"9c5fee8d-2246-4e34-8ddd-ce710e155d73\") " pod="openshift-logging/logging-loki-compactor-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.570391 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/9c5fee8d-2246-4e34-8ddd-ce710e155d73-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"9c5fee8d-2246-4e34-8ddd-ce710e155d73\") " pod="openshift-logging/logging-loki-compactor-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.577670 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"] Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.582774 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qr6bw" event={"ID":"5e81d88f-c63b-4f0c-ba17-f1171350c28d","Type":"ContainerStarted","Data":"506b3c6c7a6fe4b77d437078eecd30760f329a4c67a298239690107de86bbf68"} Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.584537 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-distributor-9c6b6d984-vvj56" 
event={"ID":"510657b4-32e2-4fa5-9c09-17869a295736","Type":"ContainerStarted","Data":"588ef4ead368ab2e5638371f5bbdf6b639ba8c8fe67b8e61bb03ae9fc3e68a34"} Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.592537 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-mwqzz" event={"ID":"e519fed6-a687-4a01-a979-598e81122ad1","Type":"ContainerStarted","Data":"3ec77fb340706082cc184c315dd056ae13f90d153c10d3f4f8a73f9bcbd8a242"} Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.620258 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.673745 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/2194d847-4858-4f46-ab8b-c2d78cf5677e-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"2194d847-4858-4f46-ab8b-c2d78cf5677e\") " pod="openshift-logging/logging-loki-ingester-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.673808 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c5fee8d-2246-4e34-8ddd-ce710e155d73-config\") pod \"logging-loki-compactor-0\" (UID: \"9c5fee8d-2246-4e34-8ddd-ce710e155d73\") " pod="openshift-logging/logging-loki-compactor-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.673830 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/2194d847-4858-4f46-ab8b-c2d78cf5677e-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"2194d847-4858-4f46-ab8b-c2d78cf5677e\") " pod="openshift-logging/logging-loki-ingester-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.673878 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/6a1df267-1145-4fe1-9455-57df3d043e3a-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"6a1df267-1145-4fe1-9455-57df3d043e3a\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.673919 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/9c5fee8d-2246-4e34-8ddd-ce710e155d73-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"9c5fee8d-2246-4e34-8ddd-ce710e155d73\") " pod="openshift-logging/logging-loki-compactor-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.673958 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/6a1df267-1145-4fe1-9455-57df3d043e3a-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"6a1df267-1145-4fe1-9455-57df3d043e3a\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.673983 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-13c35950-5bd9-4551-b262-990db297da3e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-13c35950-5bd9-4551-b262-990db297da3e\") pod \"logging-loki-compactor-0\" (UID: \"9c5fee8d-2246-4e34-8ddd-ce710e155d73\") " pod="openshift-logging/logging-loki-compactor-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.674010 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-49c190c1-aab1-4988-aa72-b6786999be24\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-49c190c1-aab1-4988-aa72-b6786999be24\") pod \"logging-loki-ingester-0\" (UID: 
\"2194d847-4858-4f46-ab8b-c2d78cf5677e\") " pod="openshift-logging/logging-loki-ingester-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.674035 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-8606fd04-7f35-4896-8750-940576cee64d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8606fd04-7f35-4896-8750-940576cee64d\") pod \"logging-loki-index-gateway-0\" (UID: \"6a1df267-1145-4fe1-9455-57df3d043e3a\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.674060 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2194d847-4858-4f46-ab8b-c2d78cf5677e-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"2194d847-4858-4f46-ab8b-c2d78cf5677e\") " pod="openshift-logging/logging-loki-ingester-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.674086 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6a1df267-1145-4fe1-9455-57df3d043e3a-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"6a1df267-1145-4fe1-9455-57df3d043e3a\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.674111 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/2194d847-4858-4f46-ab8b-c2d78cf5677e-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"2194d847-4858-4f46-ab8b-c2d78cf5677e\") " pod="openshift-logging/logging-loki-ingester-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.674142 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-compactor-http\" (UniqueName: 
\"kubernetes.io/secret/9c5fee8d-2246-4e34-8ddd-ce710e155d73-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"9c5fee8d-2246-4e34-8ddd-ce710e155d73\") " pod="openshift-logging/logging-loki-compactor-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.674166 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-15ed4531-bf9c-4d7a-a111-9dbf850db61b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-15ed4531-bf9c-4d7a-a111-9dbf850db61b\") pod \"logging-loki-ingester-0\" (UID: \"2194d847-4858-4f46-ab8b-c2d78cf5677e\") " pod="openshift-logging/logging-loki-ingester-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.674186 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7xx5\" (UniqueName: \"kubernetes.io/projected/2194d847-4858-4f46-ab8b-c2d78cf5677e-kube-api-access-z7xx5\") pod \"logging-loki-ingester-0\" (UID: \"2194d847-4858-4f46-ab8b-c2d78cf5677e\") " pod="openshift-logging/logging-loki-ingester-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.674210 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpqdw\" (UniqueName: \"kubernetes.io/projected/9c5fee8d-2246-4e34-8ddd-ce710e155d73-kube-api-access-zpqdw\") pod \"logging-loki-compactor-0\" (UID: \"9c5fee8d-2246-4e34-8ddd-ce710e155d73\") " pod="openshift-logging/logging-loki-compactor-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.674230 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2194d847-4858-4f46-ab8b-c2d78cf5677e-config\") pod \"logging-loki-ingester-0\" (UID: \"2194d847-4858-4f46-ab8b-c2d78cf5677e\") " pod="openshift-logging/logging-loki-ingester-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.674252 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/9c5fee8d-2246-4e34-8ddd-ce710e155d73-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"9c5fee8d-2246-4e34-8ddd-ce710e155d73\") " pod="openshift-logging/logging-loki-compactor-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.674288 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9b865\" (UniqueName: \"kubernetes.io/projected/6a1df267-1145-4fe1-9455-57df3d043e3a-kube-api-access-9b865\") pod \"logging-loki-index-gateway-0\" (UID: \"6a1df267-1145-4fe1-9455-57df3d043e3a\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.674311 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/6a1df267-1145-4fe1-9455-57df3d043e3a-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"6a1df267-1145-4fe1-9455-57df3d043e3a\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.674338 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/9c5fee8d-2246-4e34-8ddd-ce710e155d73-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"9c5fee8d-2246-4e34-8ddd-ce710e155d73\") " pod="openshift-logging/logging-loki-compactor-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.674361 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a1df267-1145-4fe1-9455-57df3d043e3a-config\") pod \"logging-loki-index-gateway-0\" (UID: \"6a1df267-1145-4fe1-9455-57df3d043e3a\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.676194 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c5fee8d-2246-4e34-8ddd-ce710e155d73-config\") pod \"logging-loki-compactor-0\" (UID: \"9c5fee8d-2246-4e34-8ddd-ce710e155d73\") " pod="openshift-logging/logging-loki-compactor-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.681422 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c5fee8d-2246-4e34-8ddd-ce710e155d73-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"9c5fee8d-2246-4e34-8ddd-ce710e155d73\") " pod="openshift-logging/logging-loki-compactor-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.681556 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2194d847-4858-4f46-ab8b-c2d78cf5677e-config\") pod \"logging-loki-ingester-0\" (UID: \"2194d847-4858-4f46-ab8b-c2d78cf5677e\") " pod="openshift-logging/logging-loki-ingester-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.688549 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/9c5fee8d-2246-4e34-8ddd-ce710e155d73-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"9c5fee8d-2246-4e34-8ddd-ce710e155d73\") " pod="openshift-logging/logging-loki-compactor-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.689003 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/2194d847-4858-4f46-ab8b-c2d78cf5677e-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"2194d847-4858-4f46-ab8b-c2d78cf5677e\") " pod="openshift-logging/logging-loki-ingester-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.689496 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/2194d847-4858-4f46-ab8b-c2d78cf5677e-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"2194d847-4858-4f46-ab8b-c2d78cf5677e\") " pod="openshift-logging/logging-loki-ingester-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.693721 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/9c5fee8d-2246-4e34-8ddd-ce710e155d73-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"9c5fee8d-2246-4e34-8ddd-ce710e155d73\") " pod="openshift-logging/logging-loki-compactor-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.703881 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/9c5fee8d-2246-4e34-8ddd-ce710e155d73-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"9c5fee8d-2246-4e34-8ddd-ce710e155d73\") " pod="openshift-logging/logging-loki-compactor-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.704476 4898 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.704504 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-13c35950-5bd9-4551-b262-990db297da3e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-13c35950-5bd9-4551-b262-990db297da3e\") pod \"logging-loki-compactor-0\" (UID: \"9c5fee8d-2246-4e34-8ddd-ce710e155d73\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/26bedc4c1bfff6b9b760c8762a5570d09416afeb434fd9c5f3285e55a6b76a3f/globalmount\"" pod="openshift-logging/logging-loki-compactor-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.705103 4898 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.705127 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-15ed4531-bf9c-4d7a-a111-9dbf850db61b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-15ed4531-bf9c-4d7a-a111-9dbf850db61b\") pod \"logging-loki-ingester-0\" (UID: \"2194d847-4858-4f46-ab8b-c2d78cf5677e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/016fef35710af0d853b02b8986f3c982d52ae12cf5f6a1b2c3955701e9084c35/globalmount\"" pod="openshift-logging/logging-loki-ingester-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.706497 4898 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.706521 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-49c190c1-aab1-4988-aa72-b6786999be24\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-49c190c1-aab1-4988-aa72-b6786999be24\") pod \"logging-loki-ingester-0\" (UID: \"2194d847-4858-4f46-ab8b-c2d78cf5677e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/e38058b9a8e2fce17e58dc2f056548d59a67467bfbb80e5ac42f189464a94097/globalmount\"" pod="openshift-logging/logging-loki-ingester-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.718741 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/2194d847-4858-4f46-ab8b-c2d78cf5677e-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"2194d847-4858-4f46-ab8b-c2d78cf5677e\") " pod="openshift-logging/logging-loki-ingester-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.721560 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpqdw\" (UniqueName: 
\"kubernetes.io/projected/9c5fee8d-2246-4e34-8ddd-ce710e155d73-kube-api-access-zpqdw\") pod \"logging-loki-compactor-0\" (UID: \"9c5fee8d-2246-4e34-8ddd-ce710e155d73\") " pod="openshift-logging/logging-loki-compactor-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.726090 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7xx5\" (UniqueName: \"kubernetes.io/projected/2194d847-4858-4f46-ab8b-c2d78cf5677e-kube-api-access-z7xx5\") pod \"logging-loki-ingester-0\" (UID: \"2194d847-4858-4f46-ab8b-c2d78cf5677e\") " pod="openshift-logging/logging-loki-ingester-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.736880 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/2194d847-4858-4f46-ab8b-c2d78cf5677e-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"2194d847-4858-4f46-ab8b-c2d78cf5677e\") " pod="openshift-logging/logging-loki-ingester-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.778844 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a1df267-1145-4fe1-9455-57df3d043e3a-config\") pod \"logging-loki-index-gateway-0\" (UID: \"6a1df267-1145-4fe1-9455-57df3d043e3a\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.778994 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/6a1df267-1145-4fe1-9455-57df3d043e3a-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"6a1df267-1145-4fe1-9455-57df3d043e3a\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.779029 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/6a1df267-1145-4fe1-9455-57df3d043e3a-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"6a1df267-1145-4fe1-9455-57df3d043e3a\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.779063 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-8606fd04-7f35-4896-8750-940576cee64d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8606fd04-7f35-4896-8750-940576cee64d\") pod \"logging-loki-index-gateway-0\" (UID: \"6a1df267-1145-4fe1-9455-57df3d043e3a\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.779087 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6a1df267-1145-4fe1-9455-57df3d043e3a-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"6a1df267-1145-4fe1-9455-57df3d043e3a\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.779163 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9b865\" (UniqueName: \"kubernetes.io/projected/6a1df267-1145-4fe1-9455-57df3d043e3a-kube-api-access-9b865\") pod \"logging-loki-index-gateway-0\" (UID: \"6a1df267-1145-4fe1-9455-57df3d043e3a\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.779178 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/6a1df267-1145-4fe1-9455-57df3d043e3a-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"6a1df267-1145-4fe1-9455-57df3d043e3a\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 
14:11:37.781935 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6a1df267-1145-4fe1-9455-57df3d043e3a-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"6a1df267-1145-4fe1-9455-57df3d043e3a\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.789590 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a1df267-1145-4fe1-9455-57df3d043e3a-config\") pod \"logging-loki-index-gateway-0\" (UID: \"6a1df267-1145-4fe1-9455-57df3d043e3a\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.793480 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/6a1df267-1145-4fe1-9455-57df3d043e3a-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"6a1df267-1145-4fe1-9455-57df3d043e3a\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.832912 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/6a1df267-1145-4fe1-9455-57df3d043e3a-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"6a1df267-1145-4fe1-9455-57df3d043e3a\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.833666 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/6a1df267-1145-4fe1-9455-57df3d043e3a-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"6a1df267-1145-4fe1-9455-57df3d043e3a\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 13 14:11:37 crc kubenswrapper[4898]: 
I0313 14:11:37.838137 4898 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.838180 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-8606fd04-7f35-4896-8750-940576cee64d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8606fd04-7f35-4896-8750-940576cee64d\") pod \"logging-loki-index-gateway-0\" (UID: \"6a1df267-1145-4fe1-9455-57df3d043e3a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/cd4ecb341920b75f201d3b77609b4adaa1255e86603e4072d1d6e42d9336ac62/globalmount\"" pod="openshift-logging/logging-loki-index-gateway-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.838594 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9b865\" (UniqueName: \"kubernetes.io/projected/6a1df267-1145-4fe1-9455-57df3d043e3a-kube-api-access-9b865\") pod \"logging-loki-index-gateway-0\" (UID: \"6a1df267-1145-4fe1-9455-57df3d043e3a\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.874628 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-13c35950-5bd9-4551-b262-990db297da3e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-13c35950-5bd9-4551-b262-990db297da3e\") pod \"logging-loki-compactor-0\" (UID: \"9c5fee8d-2246-4e34-8ddd-ce710e155d73\") " pod="openshift-logging/logging-loki-compactor-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.879372 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-49c190c1-aab1-4988-aa72-b6786999be24\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-49c190c1-aab1-4988-aa72-b6786999be24\") pod \"logging-loki-ingester-0\" (UID: \"2194d847-4858-4f46-ab8b-c2d78cf5677e\") " 
pod="openshift-logging/logging-loki-ingester-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.881864 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-8606fd04-7f35-4896-8750-940576cee64d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8606fd04-7f35-4896-8750-940576cee64d\") pod \"logging-loki-index-gateway-0\" (UID: \"6a1df267-1145-4fe1-9455-57df3d043e3a\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.885659 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-15ed4531-bf9c-4d7a-a111-9dbf850db61b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-15ed4531-bf9c-4d7a-a111-9dbf850db61b\") pod \"logging-loki-ingester-0\" (UID: \"2194d847-4858-4f46-ab8b-c2d78cf5677e\") " pod="openshift-logging/logging-loki-ingester-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.941256 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-index-gateway-0" Mar 13 14:11:38 crc kubenswrapper[4898]: I0313 14:11:38.031334 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-ingester-0" Mar 13 14:11:38 crc kubenswrapper[4898]: I0313 14:11:38.088612 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-compactor-0" Mar 13 14:11:38 crc kubenswrapper[4898]: I0313 14:11:38.195091 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r"] Mar 13 14:11:38 crc kubenswrapper[4898]: W0313 14:11:38.214585 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod077fcbe8_c497_44b4_82f9_ff8e317cbe83.slice/crio-f50e2c5a54d1997f4cda6ef4178ad487333556f8d2cda9de1e6ca463c19b00ac WatchSource:0}: Error finding container f50e2c5a54d1997f4cda6ef4178ad487333556f8d2cda9de1e6ca463c19b00ac: Status 404 returned error can't find the container with id f50e2c5a54d1997f4cda6ef4178ad487333556f8d2cda9de1e6ca463c19b00ac Mar 13 14:11:38 crc kubenswrapper[4898]: I0313 14:11:38.249017 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x"] Mar 13 14:11:38 crc kubenswrapper[4898]: W0313 14:11:38.253010 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13ee53e6_2549_4dd8_91ac_80e4ef2c9d99.slice/crio-d4e6f0db0fb92bf509209da909097bd4a98e043a22823219e25307febd47f969 WatchSource:0}: Error finding container d4e6f0db0fb92bf509209da909097bd4a98e043a22823219e25307febd47f969: Status 404 returned error can't find the container with id d4e6f0db0fb92bf509209da909097bd4a98e043a22823219e25307febd47f969 Mar 13 14:11:38 crc kubenswrapper[4898]: I0313 14:11:38.365045 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"] Mar 13 14:11:38 crc kubenswrapper[4898]: W0313 14:11:38.372445 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a1df267_1145_4fe1_9455_57df3d043e3a.slice/crio-1083cfac66719ac52de747eb490bc373e33b01f1717b2da53d7b9fd6397a1a10 
WatchSource:0}: Error finding container 1083cfac66719ac52de747eb490bc373e33b01f1717b2da53d7b9fd6397a1a10: Status 404 returned error can't find the container with id 1083cfac66719ac52de747eb490bc373e33b01f1717b2da53d7b9fd6397a1a10 Mar 13 14:11:38 crc kubenswrapper[4898]: I0313 14:11:38.474415 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-ingester-0"] Mar 13 14:11:38 crc kubenswrapper[4898]: I0313 14:11:38.543025 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-compactor-0"] Mar 13 14:11:38 crc kubenswrapper[4898]: I0313 14:11:38.601514 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-index-gateway-0" event={"ID":"6a1df267-1145-4fe1-9455-57df3d043e3a","Type":"ContainerStarted","Data":"1083cfac66719ac52de747eb490bc373e33b01f1717b2da53d7b9fd6397a1a10"} Mar 13 14:11:38 crc kubenswrapper[4898]: I0313 14:11:38.602845 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-ingester-0" event={"ID":"2194d847-4858-4f46-ab8b-c2d78cf5677e","Type":"ContainerStarted","Data":"ee1f39ee6f8150eb25864a4baa6d2a69f9e4a6977b91fc642c6e3a79f5142dc0"} Mar 13 14:11:38 crc kubenswrapper[4898]: I0313 14:11:38.604058 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" event={"ID":"13ee53e6-2549-4dd8-91ac-80e4ef2c9d99","Type":"ContainerStarted","Data":"d4e6f0db0fb92bf509209da909097bd4a98e043a22823219e25307febd47f969"} Mar 13 14:11:38 crc kubenswrapper[4898]: I0313 14:11:38.605393 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-compactor-0" event={"ID":"9c5fee8d-2246-4e34-8ddd-ce710e155d73","Type":"ContainerStarted","Data":"3bf1ce175d38c42a8ebc947669edd6e8fdc8bcff57ba456df7fabf96ebc626fb"} Mar 13 14:11:38 crc kubenswrapper[4898]: I0313 14:11:38.607113 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" event={"ID":"077fcbe8-c497-44b4-82f9-ff8e317cbe83","Type":"ContainerStarted","Data":"f50e2c5a54d1997f4cda6ef4178ad487333556f8d2cda9de1e6ca463c19b00ac"} Mar 13 14:11:40 crc kubenswrapper[4898]: I0313 14:11:40.627048 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-ingester-0" event={"ID":"2194d847-4858-4f46-ab8b-c2d78cf5677e","Type":"ContainerStarted","Data":"51a34eecfff5bcd76e412488f2f647a11e51123d8bb67c5a49eb859212690fa7"} Mar 13 14:11:40 crc kubenswrapper[4898]: I0313 14:11:40.627863 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-ingester-0" Mar 13 14:11:40 crc kubenswrapper[4898]: I0313 14:11:40.632713 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qr6bw" event={"ID":"5e81d88f-c63b-4f0c-ba17-f1171350c28d","Type":"ContainerStarted","Data":"05de7a1630ecce8bb89db3fddb88fc8886f0a968022649e8dc17f53ced676a33"} Mar 13 14:11:40 crc kubenswrapper[4898]: I0313 14:11:40.632800 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qr6bw" Mar 13 14:11:40 crc kubenswrapper[4898]: I0313 14:11:40.642882 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-distributor-9c6b6d984-vvj56" event={"ID":"510657b4-32e2-4fa5-9c09-17869a295736","Type":"ContainerStarted","Data":"04c209ea08736554f11756ae1c12fd1213b662bea820893d317d4535954da8a3"} Mar 13 14:11:40 crc kubenswrapper[4898]: I0313 14:11:40.643124 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-distributor-9c6b6d984-vvj56" Mar 13 14:11:40 crc kubenswrapper[4898]: I0313 14:11:40.649032 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-ingester-0" podStartSLOduration=3.041300067 
podStartE2EDuration="4.649009235s" podCreationTimestamp="2026-03-13 14:11:36 +0000 UTC" firstStartedPulling="2026-03-13 14:11:38.490757501 +0000 UTC m=+933.492345740" lastFinishedPulling="2026-03-13 14:11:40.098466679 +0000 UTC m=+935.100054908" observedRunningTime="2026-03-13 14:11:40.648535903 +0000 UTC m=+935.650124162" watchObservedRunningTime="2026-03-13 14:11:40.649009235 +0000 UTC m=+935.650597474" Mar 13 14:11:40 crc kubenswrapper[4898]: I0313 14:11:40.654313 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-compactor-0" event={"ID":"9c5fee8d-2246-4e34-8ddd-ce710e155d73","Type":"ContainerStarted","Data":"b3c9648de0517f31a7ed9f100e721331593c713852257f57a4ccdacd68fa8784"} Mar 13 14:11:40 crc kubenswrapper[4898]: I0313 14:11:40.654400 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-compactor-0" Mar 13 14:11:40 crc kubenswrapper[4898]: I0313 14:11:40.672673 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qr6bw" podStartSLOduration=1.7771506179999998 podStartE2EDuration="4.672655223s" podCreationTimestamp="2026-03-13 14:11:36 +0000 UTC" firstStartedPulling="2026-03-13 14:11:37.233092799 +0000 UTC m=+932.234681038" lastFinishedPulling="2026-03-13 14:11:40.128597404 +0000 UTC m=+935.130185643" observedRunningTime="2026-03-13 14:11:40.667019258 +0000 UTC m=+935.668607507" watchObservedRunningTime="2026-03-13 14:11:40.672655223 +0000 UTC m=+935.674243462" Mar 13 14:11:40 crc kubenswrapper[4898]: I0313 14:11:40.678370 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-mwqzz" event={"ID":"e519fed6-a687-4a01-a979-598e81122ad1","Type":"ContainerStarted","Data":"40d45b908d4063d63b06ce169c42a918c60d9804baaaf0dc0fc2eebdb0e61d6f"} Mar 13 14:11:40 crc kubenswrapper[4898]: I0313 14:11:40.678452 4898 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-mwqzz" Mar 13 14:11:40 crc kubenswrapper[4898]: I0313 14:11:40.680809 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-index-gateway-0" event={"ID":"6a1df267-1145-4fe1-9455-57df3d043e3a","Type":"ContainerStarted","Data":"d0a283a5eb4a2ba6a67b1232156e05c6727addae52bb5b5751da0673e217bf3a"} Mar 13 14:11:40 crc kubenswrapper[4898]: I0313 14:11:40.681501 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-index-gateway-0" Mar 13 14:11:40 crc kubenswrapper[4898]: I0313 14:11:40.711009 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-distributor-9c6b6d984-vvj56" podStartSLOduration=1.782903647 podStartE2EDuration="4.7109897s" podCreationTimestamp="2026-03-13 14:11:36 +0000 UTC" firstStartedPulling="2026-03-13 14:11:37.161131208 +0000 UTC m=+932.162719447" lastFinishedPulling="2026-03-13 14:11:40.089217261 +0000 UTC m=+935.090805500" observedRunningTime="2026-03-13 14:11:40.708033324 +0000 UTC m=+935.709621583" watchObservedRunningTime="2026-03-13 14:11:40.7109897 +0000 UTC m=+935.712577949" Mar 13 14:11:40 crc kubenswrapper[4898]: I0313 14:11:40.738128 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-compactor-0" podStartSLOduration=3.168368207 podStartE2EDuration="4.738107388s" podCreationTimestamp="2026-03-13 14:11:36 +0000 UTC" firstStartedPulling="2026-03-13 14:11:38.560013103 +0000 UTC m=+933.561601342" lastFinishedPulling="2026-03-13 14:11:40.129752284 +0000 UTC m=+935.131340523" observedRunningTime="2026-03-13 14:11:40.72849744 +0000 UTC m=+935.730085689" watchObservedRunningTime="2026-03-13 14:11:40.738107388 +0000 UTC m=+935.739695627" Mar 13 14:11:40 crc kubenswrapper[4898]: I0313 14:11:40.770217 4898 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-mwqzz" podStartSLOduration=1.926409959 podStartE2EDuration="4.770194883s" podCreationTimestamp="2026-03-13 14:11:36 +0000 UTC" firstStartedPulling="2026-03-13 14:11:37.283991819 +0000 UTC m=+932.285580068" lastFinishedPulling="2026-03-13 14:11:40.127776753 +0000 UTC m=+935.129364992" observedRunningTime="2026-03-13 14:11:40.745702053 +0000 UTC m=+935.747290312" watchObservedRunningTime="2026-03-13 14:11:40.770194883 +0000 UTC m=+935.771783122" Mar 13 14:11:40 crc kubenswrapper[4898]: I0313 14:11:40.788641 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-index-gateway-0" podStartSLOduration=3.036833172 podStartE2EDuration="4.788618557s" podCreationTimestamp="2026-03-13 14:11:36 +0000 UTC" firstStartedPulling="2026-03-13 14:11:38.375387142 +0000 UTC m=+933.376975381" lastFinishedPulling="2026-03-13 14:11:40.127172527 +0000 UTC m=+935.128760766" observedRunningTime="2026-03-13 14:11:40.762839434 +0000 UTC m=+935.764427683" watchObservedRunningTime="2026-03-13 14:11:40.788618557 +0000 UTC m=+935.790206796" Mar 13 14:11:42 crc kubenswrapper[4898]: I0313 14:11:42.754661 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" event={"ID":"13ee53e6-2549-4dd8-91ac-80e4ef2c9d99","Type":"ContainerStarted","Data":"fcf8e9a8d3e5e7f7951cb2af14626b6e5da83135311a91da93925d8b17fce58e"} Mar 13 14:11:42 crc kubenswrapper[4898]: I0313 14:11:42.756958 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" event={"ID":"077fcbe8-c497-44b4-82f9-ff8e317cbe83","Type":"ContainerStarted","Data":"ce5d448a85a6fe29bfd4df4184532deeda10f71ada8ffc4d7a6222cbefeeb36f"} Mar 13 14:11:44 crc kubenswrapper[4898]: I0313 14:11:44.772468 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" 
event={"ID":"13ee53e6-2549-4dd8-91ac-80e4ef2c9d99","Type":"ContainerStarted","Data":"149addd5742530e63e0bc98feb66cfd6d0db4bf52fa1f9082406acdbde10302b"} Mar 13 14:11:44 crc kubenswrapper[4898]: I0313 14:11:44.773148 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" Mar 13 14:11:44 crc kubenswrapper[4898]: I0313 14:11:44.773168 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" Mar 13 14:11:44 crc kubenswrapper[4898]: I0313 14:11:44.775428 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" event={"ID":"077fcbe8-c497-44b4-82f9-ff8e317cbe83","Type":"ContainerStarted","Data":"dfa779b5c6a3bc447019fc31e561acce75042dc2e0deee52bceb23bc194bec24"} Mar 13 14:11:44 crc kubenswrapper[4898]: I0313 14:11:44.776063 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" Mar 13 14:11:44 crc kubenswrapper[4898]: I0313 14:11:44.785668 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" Mar 13 14:11:44 crc kubenswrapper[4898]: I0313 14:11:44.787475 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" Mar 13 14:11:44 crc kubenswrapper[4898]: I0313 14:11:44.793022 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" Mar 13 14:11:44 crc kubenswrapper[4898]: I0313 14:11:44.804316 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" podStartSLOduration=2.743102504 podStartE2EDuration="8.804295786s" podCreationTimestamp="2026-03-13 14:11:36 +0000 UTC" 
firstStartedPulling="2026-03-13 14:11:38.257445947 +0000 UTC m=+933.259034186" lastFinishedPulling="2026-03-13 14:11:44.318639229 +0000 UTC m=+939.320227468" observedRunningTime="2026-03-13 14:11:44.799292137 +0000 UTC m=+939.800880386" watchObservedRunningTime="2026-03-13 14:11:44.804295786 +0000 UTC m=+939.805884045" Mar 13 14:11:44 crc kubenswrapper[4898]: I0313 14:11:44.838678 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" podStartSLOduration=2.744409508 podStartE2EDuration="8.83865849s" podCreationTimestamp="2026-03-13 14:11:36 +0000 UTC" firstStartedPulling="2026-03-13 14:11:38.218364502 +0000 UTC m=+933.219952731" lastFinishedPulling="2026-03-13 14:11:44.312613474 +0000 UTC m=+939.314201713" observedRunningTime="2026-03-13 14:11:44.834469822 +0000 UTC m=+939.836058081" watchObservedRunningTime="2026-03-13 14:11:44.83865849 +0000 UTC m=+939.840246729" Mar 13 14:11:45 crc kubenswrapper[4898]: I0313 14:11:45.784514 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" Mar 13 14:11:45 crc kubenswrapper[4898]: I0313 14:11:45.806418 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" Mar 13 14:11:56 crc kubenswrapper[4898]: I0313 14:11:56.611499 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-distributor-9c6b6d984-vvj56" Mar 13 14:11:56 crc kubenswrapper[4898]: I0313 14:11:56.743468 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qr6bw" Mar 13 14:11:56 crc kubenswrapper[4898]: I0313 14:11:56.816346 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-mwqzz" Mar 13 14:11:57 crc kubenswrapper[4898]: 
I0313 14:11:57.948912 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-index-gateway-0" Mar 13 14:11:58 crc kubenswrapper[4898]: I0313 14:11:58.041451 4898 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: this instance owns no tokens Mar 13 14:11:58 crc kubenswrapper[4898]: I0313 14:11:58.041505 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="2194d847-4858-4f46-ab8b-c2d78cf5677e" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 13 14:11:58 crc kubenswrapper[4898]: I0313 14:11:58.095964 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-compactor-0" Mar 13 14:12:00 crc kubenswrapper[4898]: I0313 14:12:00.137434 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556852-z8l5q"] Mar 13 14:12:00 crc kubenswrapper[4898]: I0313 14:12:00.139392 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556852-z8l5q" Mar 13 14:12:00 crc kubenswrapper[4898]: I0313 14:12:00.141824 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 14:12:00 crc kubenswrapper[4898]: I0313 14:12:00.142106 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps" Mar 13 14:12:00 crc kubenswrapper[4898]: I0313 14:12:00.142628 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 14:12:00 crc kubenswrapper[4898]: I0313 14:12:00.144865 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556852-z8l5q"] Mar 13 14:12:00 crc kubenswrapper[4898]: I0313 14:12:00.261072 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6r75v\" (UniqueName: \"kubernetes.io/projected/da054881-deef-4491-9685-5f35ee9fc45f-kube-api-access-6r75v\") pod \"auto-csr-approver-29556852-z8l5q\" (UID: \"da054881-deef-4491-9685-5f35ee9fc45f\") " pod="openshift-infra/auto-csr-approver-29556852-z8l5q" Mar 13 14:12:00 crc kubenswrapper[4898]: I0313 14:12:00.362940 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6r75v\" (UniqueName: \"kubernetes.io/projected/da054881-deef-4491-9685-5f35ee9fc45f-kube-api-access-6r75v\") pod \"auto-csr-approver-29556852-z8l5q\" (UID: \"da054881-deef-4491-9685-5f35ee9fc45f\") " pod="openshift-infra/auto-csr-approver-29556852-z8l5q" Mar 13 14:12:00 crc kubenswrapper[4898]: I0313 14:12:00.394131 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6r75v\" (UniqueName: \"kubernetes.io/projected/da054881-deef-4491-9685-5f35ee9fc45f-kube-api-access-6r75v\") pod \"auto-csr-approver-29556852-z8l5q\" (UID: \"da054881-deef-4491-9685-5f35ee9fc45f\") " 
pod="openshift-infra/auto-csr-approver-29556852-z8l5q" Mar 13 14:12:00 crc kubenswrapper[4898]: I0313 14:12:00.463597 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556852-z8l5q" Mar 13 14:12:00 crc kubenswrapper[4898]: I0313 14:12:00.727930 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556852-z8l5q"] Mar 13 14:12:00 crc kubenswrapper[4898]: W0313 14:12:00.731059 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda054881_deef_4491_9685_5f35ee9fc45f.slice/crio-f7218a8006a20357a5e70b15e04df0fe0dbc1453185a2df0602bc419da843703 WatchSource:0}: Error finding container f7218a8006a20357a5e70b15e04df0fe0dbc1453185a2df0602bc419da843703: Status 404 returned error can't find the container with id f7218a8006a20357a5e70b15e04df0fe0dbc1453185a2df0602bc419da843703 Mar 13 14:12:00 crc kubenswrapper[4898]: I0313 14:12:00.938481 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556852-z8l5q" event={"ID":"da054881-deef-4491-9685-5f35ee9fc45f","Type":"ContainerStarted","Data":"f7218a8006a20357a5e70b15e04df0fe0dbc1453185a2df0602bc419da843703"} Mar 13 14:12:02 crc kubenswrapper[4898]: I0313 14:12:02.957115 4898 generic.go:334] "Generic (PLEG): container finished" podID="da054881-deef-4491-9685-5f35ee9fc45f" containerID="5010d4732869bc8d7e0532b7c193085ea8336efd3a5d0f8cd686f3caf758e4d9" exitCode=0 Mar 13 14:12:02 crc kubenswrapper[4898]: I0313 14:12:02.957167 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556852-z8l5q" event={"ID":"da054881-deef-4491-9685-5f35ee9fc45f","Type":"ContainerDied","Data":"5010d4732869bc8d7e0532b7c193085ea8336efd3a5d0f8cd686f3caf758e4d9"} Mar 13 14:12:04 crc kubenswrapper[4898]: I0313 14:12:04.307017 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556852-z8l5q"
Mar 13 14:12:04 crc kubenswrapper[4898]: I0313 14:12:04.434455 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6r75v\" (UniqueName: \"kubernetes.io/projected/da054881-deef-4491-9685-5f35ee9fc45f-kube-api-access-6r75v\") pod \"da054881-deef-4491-9685-5f35ee9fc45f\" (UID: \"da054881-deef-4491-9685-5f35ee9fc45f\") "
Mar 13 14:12:04 crc kubenswrapper[4898]: I0313 14:12:04.441630 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da054881-deef-4491-9685-5f35ee9fc45f-kube-api-access-6r75v" (OuterVolumeSpecName: "kube-api-access-6r75v") pod "da054881-deef-4491-9685-5f35ee9fc45f" (UID: "da054881-deef-4491-9685-5f35ee9fc45f"). InnerVolumeSpecName "kube-api-access-6r75v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:12:04 crc kubenswrapper[4898]: I0313 14:12:04.537172 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6r75v\" (UniqueName: \"kubernetes.io/projected/da054881-deef-4491-9685-5f35ee9fc45f-kube-api-access-6r75v\") on node \"crc\" DevicePath \"\""
Mar 13 14:12:04 crc kubenswrapper[4898]: I0313 14:12:04.975395 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556852-z8l5q" event={"ID":"da054881-deef-4491-9685-5f35ee9fc45f","Type":"ContainerDied","Data":"f7218a8006a20357a5e70b15e04df0fe0dbc1453185a2df0602bc419da843703"}
Mar 13 14:12:04 crc kubenswrapper[4898]: I0313 14:12:04.975436 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7218a8006a20357a5e70b15e04df0fe0dbc1453185a2df0602bc419da843703"
Mar 13 14:12:04 crc kubenswrapper[4898]: I0313 14:12:04.975509 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556852-z8l5q"
Mar 13 14:12:05 crc kubenswrapper[4898]: I0313 14:12:05.385686 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556846-d6zpp"]
Mar 13 14:12:05 crc kubenswrapper[4898]: I0313 14:12:05.391753 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556846-d6zpp"]
Mar 13 14:12:05 crc kubenswrapper[4898]: I0313 14:12:05.757860 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="666e4c5d-e464-4b8a-b167-bc7624fc3e10" path="/var/lib/kubelet/pods/666e4c5d-e464-4b8a-b167-bc7624fc3e10/volumes"
Mar 13 14:12:08 crc kubenswrapper[4898]: I0313 14:12:08.040424 4898 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: this instance owns no tokens
Mar 13 14:12:08 crc kubenswrapper[4898]: I0313 14:12:08.040501 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="2194d847-4858-4f46-ab8b-c2d78cf5677e" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Mar 13 14:12:18 crc kubenswrapper[4898]: I0313 14:12:18.067170 4898 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: waiting for 15s after being ready
Mar 13 14:12:18 crc kubenswrapper[4898]: I0313 14:12:18.067855 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="2194d847-4858-4f46-ab8b-c2d78cf5677e" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Mar 13 14:12:18 crc kubenswrapper[4898]: I0313 14:12:18.877706 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-x8rbg"]
Mar 13 14:12:18 crc kubenswrapper[4898]: E0313 14:12:18.878874 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da054881-deef-4491-9685-5f35ee9fc45f" containerName="oc"
Mar 13 14:12:18 crc kubenswrapper[4898]: I0313 14:12:18.878930 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="da054881-deef-4491-9685-5f35ee9fc45f" containerName="oc"
Mar 13 14:12:18 crc kubenswrapper[4898]: I0313 14:12:18.879167 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="da054881-deef-4491-9685-5f35ee9fc45f" containerName="oc"
Mar 13 14:12:18 crc kubenswrapper[4898]: I0313 14:12:18.881167 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x8rbg"
Mar 13 14:12:18 crc kubenswrapper[4898]: I0313 14:12:18.885862 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x8rbg"]
Mar 13 14:12:19 crc kubenswrapper[4898]: I0313 14:12:19.019532 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/715d729a-a993-4a3a-98a2-58f904ef7f6b-utilities\") pod \"redhat-marketplace-x8rbg\" (UID: \"715d729a-a993-4a3a-98a2-58f904ef7f6b\") " pod="openshift-marketplace/redhat-marketplace-x8rbg"
Mar 13 14:12:19 crc kubenswrapper[4898]: I0313 14:12:19.019998 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/715d729a-a993-4a3a-98a2-58f904ef7f6b-catalog-content\") pod \"redhat-marketplace-x8rbg\" (UID: \"715d729a-a993-4a3a-98a2-58f904ef7f6b\") " pod="openshift-marketplace/redhat-marketplace-x8rbg"
Mar 13 14:12:19 crc kubenswrapper[4898]: I0313 14:12:19.020103 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvt8c\" (UniqueName: \"kubernetes.io/projected/715d729a-a993-4a3a-98a2-58f904ef7f6b-kube-api-access-rvt8c\") pod \"redhat-marketplace-x8rbg\" (UID: \"715d729a-a993-4a3a-98a2-58f904ef7f6b\") " pod="openshift-marketplace/redhat-marketplace-x8rbg"
Mar 13 14:12:19 crc kubenswrapper[4898]: I0313 14:12:19.121890 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/715d729a-a993-4a3a-98a2-58f904ef7f6b-catalog-content\") pod \"redhat-marketplace-x8rbg\" (UID: \"715d729a-a993-4a3a-98a2-58f904ef7f6b\") " pod="openshift-marketplace/redhat-marketplace-x8rbg"
Mar 13 14:12:19 crc kubenswrapper[4898]: I0313 14:12:19.122044 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvt8c\" (UniqueName: \"kubernetes.io/projected/715d729a-a993-4a3a-98a2-58f904ef7f6b-kube-api-access-rvt8c\") pod \"redhat-marketplace-x8rbg\" (UID: \"715d729a-a993-4a3a-98a2-58f904ef7f6b\") " pod="openshift-marketplace/redhat-marketplace-x8rbg"
Mar 13 14:12:19 crc kubenswrapper[4898]: I0313 14:12:19.122183 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/715d729a-a993-4a3a-98a2-58f904ef7f6b-utilities\") pod \"redhat-marketplace-x8rbg\" (UID: \"715d729a-a993-4a3a-98a2-58f904ef7f6b\") " pod="openshift-marketplace/redhat-marketplace-x8rbg"
Mar 13 14:12:19 crc kubenswrapper[4898]: I0313 14:12:19.122802 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/715d729a-a993-4a3a-98a2-58f904ef7f6b-catalog-content\") pod \"redhat-marketplace-x8rbg\" (UID: \"715d729a-a993-4a3a-98a2-58f904ef7f6b\") " pod="openshift-marketplace/redhat-marketplace-x8rbg"
Mar 13 14:12:19 crc kubenswrapper[4898]: I0313 14:12:19.122887 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/715d729a-a993-4a3a-98a2-58f904ef7f6b-utilities\") pod \"redhat-marketplace-x8rbg\" (UID: \"715d729a-a993-4a3a-98a2-58f904ef7f6b\") " pod="openshift-marketplace/redhat-marketplace-x8rbg"
Mar 13 14:12:19 crc kubenswrapper[4898]: I0313 14:12:19.134487 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 14:12:19 crc kubenswrapper[4898]: I0313 14:12:19.134577 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 14:12:19 crc kubenswrapper[4898]: I0313 14:12:19.154276 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvt8c\" (UniqueName: \"kubernetes.io/projected/715d729a-a993-4a3a-98a2-58f904ef7f6b-kube-api-access-rvt8c\") pod \"redhat-marketplace-x8rbg\" (UID: \"715d729a-a993-4a3a-98a2-58f904ef7f6b\") " pod="openshift-marketplace/redhat-marketplace-x8rbg"
Mar 13 14:12:19 crc kubenswrapper[4898]: I0313 14:12:19.209217 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x8rbg"
Mar 13 14:12:19 crc kubenswrapper[4898]: I0313 14:12:19.637350 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x8rbg"]
Mar 13 14:12:20 crc kubenswrapper[4898]: I0313 14:12:20.105418 4898 generic.go:334] "Generic (PLEG): container finished" podID="715d729a-a993-4a3a-98a2-58f904ef7f6b" containerID="03e5d76c2ef48d4d76f4111af3bc7856f5e1b17ed603f9c78a2d13ffcf05edab" exitCode=0
Mar 13 14:12:20 crc kubenswrapper[4898]: I0313 14:12:20.105486 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x8rbg" event={"ID":"715d729a-a993-4a3a-98a2-58f904ef7f6b","Type":"ContainerDied","Data":"03e5d76c2ef48d4d76f4111af3bc7856f5e1b17ed603f9c78a2d13ffcf05edab"}
Mar 13 14:12:20 crc kubenswrapper[4898]: I0313 14:12:20.105522 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x8rbg" event={"ID":"715d729a-a993-4a3a-98a2-58f904ef7f6b","Type":"ContainerStarted","Data":"9b9efbda102cfb59931b3b2dfe9347c23bb3d5bb9a19770896f214f7234be802"}
Mar 13 14:12:23 crc kubenswrapper[4898]: I0313 14:12:23.133664 4898 generic.go:334] "Generic (PLEG): container finished" podID="715d729a-a993-4a3a-98a2-58f904ef7f6b" containerID="2a25837fc0c0e8cd6200157bc14ddc8f0fab1783c71bf24d20c54a3a2768ca93" exitCode=0
Mar 13 14:12:23 crc kubenswrapper[4898]: I0313 14:12:23.133777 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x8rbg" event={"ID":"715d729a-a993-4a3a-98a2-58f904ef7f6b","Type":"ContainerDied","Data":"2a25837fc0c0e8cd6200157bc14ddc8f0fab1783c71bf24d20c54a3a2768ca93"}
Mar 13 14:12:26 crc kubenswrapper[4898]: I0313 14:12:26.158501 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x8rbg" event={"ID":"715d729a-a993-4a3a-98a2-58f904ef7f6b","Type":"ContainerStarted","Data":"1d57ef123c695ddf9c7483fec39ed71eba899163e636c0e64229cca48af10fdf"}
Mar 13 14:12:28 crc kubenswrapper[4898]: I0313 14:12:28.038120 4898 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: waiting for 15s after being ready
Mar 13 14:12:28 crc kubenswrapper[4898]: I0313 14:12:28.038509 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="2194d847-4858-4f46-ab8b-c2d78cf5677e" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Mar 13 14:12:29 crc kubenswrapper[4898]: I0313 14:12:29.210455 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-x8rbg"
Mar 13 14:12:29 crc kubenswrapper[4898]: I0313 14:12:29.210504 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-x8rbg"
Mar 13 14:12:29 crc kubenswrapper[4898]: I0313 14:12:29.283656 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-x8rbg"
Mar 13 14:12:29 crc kubenswrapper[4898]: I0313 14:12:29.320477 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-x8rbg" podStartSLOduration=6.4356755 podStartE2EDuration="11.32045733s" podCreationTimestamp="2026-03-13 14:12:18 +0000 UTC" firstStartedPulling="2026-03-13 14:12:20.107315926 +0000 UTC m=+975.108904175" lastFinishedPulling="2026-03-13 14:12:24.992097726 +0000 UTC m=+979.993686005" observedRunningTime="2026-03-13 14:12:26.185429122 +0000 UTC m=+981.187017371" watchObservedRunningTime="2026-03-13 14:12:29.32045733 +0000 UTC m=+984.322045569"
Mar 13 14:12:29 crc kubenswrapper[4898]: I0313 14:12:29.536675 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8vtbq"]
Mar 13 14:12:29 crc kubenswrapper[4898]: I0313 14:12:29.538574 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8vtbq"
Mar 13 14:12:29 crc kubenswrapper[4898]: I0313 14:12:29.555627 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8vtbq"]
Mar 13 14:12:29 crc kubenswrapper[4898]: I0313 14:12:29.703589 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd6314a3-ad6a-48ea-b54a-a2d1415b287e-utilities\") pod \"community-operators-8vtbq\" (UID: \"fd6314a3-ad6a-48ea-b54a-a2d1415b287e\") " pod="openshift-marketplace/community-operators-8vtbq"
Mar 13 14:12:29 crc kubenswrapper[4898]: I0313 14:12:29.703708 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd6314a3-ad6a-48ea-b54a-a2d1415b287e-catalog-content\") pod \"community-operators-8vtbq\" (UID: \"fd6314a3-ad6a-48ea-b54a-a2d1415b287e\") " pod="openshift-marketplace/community-operators-8vtbq"
Mar 13 14:12:29 crc kubenswrapper[4898]: I0313 14:12:29.703949 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmzfb\" (UniqueName: \"kubernetes.io/projected/fd6314a3-ad6a-48ea-b54a-a2d1415b287e-kube-api-access-xmzfb\") pod \"community-operators-8vtbq\" (UID: \"fd6314a3-ad6a-48ea-b54a-a2d1415b287e\") " pod="openshift-marketplace/community-operators-8vtbq"
Mar 13 14:12:29 crc kubenswrapper[4898]: I0313 14:12:29.806301 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd6314a3-ad6a-48ea-b54a-a2d1415b287e-utilities\") pod \"community-operators-8vtbq\" (UID: \"fd6314a3-ad6a-48ea-b54a-a2d1415b287e\") " pod="openshift-marketplace/community-operators-8vtbq"
Mar 13 14:12:29 crc kubenswrapper[4898]: I0313 14:12:29.806366 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd6314a3-ad6a-48ea-b54a-a2d1415b287e-catalog-content\") pod \"community-operators-8vtbq\" (UID: \"fd6314a3-ad6a-48ea-b54a-a2d1415b287e\") " pod="openshift-marketplace/community-operators-8vtbq"
Mar 13 14:12:29 crc kubenswrapper[4898]: I0313 14:12:29.806403 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmzfb\" (UniqueName: \"kubernetes.io/projected/fd6314a3-ad6a-48ea-b54a-a2d1415b287e-kube-api-access-xmzfb\") pod \"community-operators-8vtbq\" (UID: \"fd6314a3-ad6a-48ea-b54a-a2d1415b287e\") " pod="openshift-marketplace/community-operators-8vtbq"
Mar 13 14:12:29 crc kubenswrapper[4898]: I0313 14:12:29.807113 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd6314a3-ad6a-48ea-b54a-a2d1415b287e-catalog-content\") pod \"community-operators-8vtbq\" (UID: \"fd6314a3-ad6a-48ea-b54a-a2d1415b287e\") " pod="openshift-marketplace/community-operators-8vtbq"
Mar 13 14:12:29 crc kubenswrapper[4898]: I0313 14:12:29.807128 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd6314a3-ad6a-48ea-b54a-a2d1415b287e-utilities\") pod \"community-operators-8vtbq\" (UID: \"fd6314a3-ad6a-48ea-b54a-a2d1415b287e\") " pod="openshift-marketplace/community-operators-8vtbq"
Mar 13 14:12:29 crc kubenswrapper[4898]: I0313 14:12:29.830676 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmzfb\" (UniqueName: \"kubernetes.io/projected/fd6314a3-ad6a-48ea-b54a-a2d1415b287e-kube-api-access-xmzfb\") pod \"community-operators-8vtbq\" (UID: \"fd6314a3-ad6a-48ea-b54a-a2d1415b287e\") " pod="openshift-marketplace/community-operators-8vtbq"
Mar 13 14:12:29 crc kubenswrapper[4898]: I0313 14:12:29.857409 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8vtbq"
Mar 13 14:12:30 crc kubenswrapper[4898]: I0313 14:12:30.236511 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-x8rbg"
Mar 13 14:12:30 crc kubenswrapper[4898]: I0313 14:12:30.389360 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8vtbq"]
Mar 13 14:12:31 crc kubenswrapper[4898]: I0313 14:12:31.212380 4898 generic.go:334] "Generic (PLEG): container finished" podID="fd6314a3-ad6a-48ea-b54a-a2d1415b287e" containerID="b5a3fa8a34341c83229a787919abeec42db4b594e78e4b2bca60e4579eec7b84" exitCode=0
Mar 13 14:12:31 crc kubenswrapper[4898]: I0313 14:12:31.212610 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8vtbq" event={"ID":"fd6314a3-ad6a-48ea-b54a-a2d1415b287e","Type":"ContainerDied","Data":"b5a3fa8a34341c83229a787919abeec42db4b594e78e4b2bca60e4579eec7b84"}
Mar 13 14:12:31 crc kubenswrapper[4898]: I0313 14:12:31.212715 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8vtbq" event={"ID":"fd6314a3-ad6a-48ea-b54a-a2d1415b287e","Type":"ContainerStarted","Data":"04b8094af8db214b2ee0dda013e626767a66d4ace84639cedc7f083561522032"}
Mar 13 14:12:32 crc kubenswrapper[4898]: I0313 14:12:32.500867 4898 scope.go:117] "RemoveContainer" containerID="9c70e0bed8678da48508773f6b5163cca47cd975b196edd773fb1f955ef9672b"
Mar 13 14:12:32 crc kubenswrapper[4898]: I0313 14:12:32.542448 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x8rbg"]
Mar 13 14:12:33 crc kubenswrapper[4898]: I0313 14:12:33.230327 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-x8rbg" podUID="715d729a-a993-4a3a-98a2-58f904ef7f6b" containerName="registry-server" containerID="cri-o://1d57ef123c695ddf9c7483fec39ed71eba899163e636c0e64229cca48af10fdf" gracePeriod=2
Mar 13 14:12:33 crc kubenswrapper[4898]: I0313 14:12:33.647020 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x8rbg"
Mar 13 14:12:33 crc kubenswrapper[4898]: I0313 14:12:33.775377 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/715d729a-a993-4a3a-98a2-58f904ef7f6b-catalog-content\") pod \"715d729a-a993-4a3a-98a2-58f904ef7f6b\" (UID: \"715d729a-a993-4a3a-98a2-58f904ef7f6b\") "
Mar 13 14:12:33 crc kubenswrapper[4898]: I0313 14:12:33.775454 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvt8c\" (UniqueName: \"kubernetes.io/projected/715d729a-a993-4a3a-98a2-58f904ef7f6b-kube-api-access-rvt8c\") pod \"715d729a-a993-4a3a-98a2-58f904ef7f6b\" (UID: \"715d729a-a993-4a3a-98a2-58f904ef7f6b\") "
Mar 13 14:12:33 crc kubenswrapper[4898]: I0313 14:12:33.775497 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/715d729a-a993-4a3a-98a2-58f904ef7f6b-utilities\") pod \"715d729a-a993-4a3a-98a2-58f904ef7f6b\" (UID: \"715d729a-a993-4a3a-98a2-58f904ef7f6b\") "
Mar 13 14:12:33 crc kubenswrapper[4898]: I0313 14:12:33.777295 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/715d729a-a993-4a3a-98a2-58f904ef7f6b-utilities" (OuterVolumeSpecName: "utilities") pod "715d729a-a993-4a3a-98a2-58f904ef7f6b" (UID: "715d729a-a993-4a3a-98a2-58f904ef7f6b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 14:12:33 crc kubenswrapper[4898]: I0313 14:12:33.782890 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/715d729a-a993-4a3a-98a2-58f904ef7f6b-kube-api-access-rvt8c" (OuterVolumeSpecName: "kube-api-access-rvt8c") pod "715d729a-a993-4a3a-98a2-58f904ef7f6b" (UID: "715d729a-a993-4a3a-98a2-58f904ef7f6b"). InnerVolumeSpecName "kube-api-access-rvt8c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:12:33 crc kubenswrapper[4898]: I0313 14:12:33.805757 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/715d729a-a993-4a3a-98a2-58f904ef7f6b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "715d729a-a993-4a3a-98a2-58f904ef7f6b" (UID: "715d729a-a993-4a3a-98a2-58f904ef7f6b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 14:12:33 crc kubenswrapper[4898]: I0313 14:12:33.878622 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/715d729a-a993-4a3a-98a2-58f904ef7f6b-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 13 14:12:33 crc kubenswrapper[4898]: I0313 14:12:33.878684 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvt8c\" (UniqueName: \"kubernetes.io/projected/715d729a-a993-4a3a-98a2-58f904ef7f6b-kube-api-access-rvt8c\") on node \"crc\" DevicePath \"\""
Mar 13 14:12:33 crc kubenswrapper[4898]: I0313 14:12:33.878700 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/715d729a-a993-4a3a-98a2-58f904ef7f6b-utilities\") on node \"crc\" DevicePath \"\""
Mar 13 14:12:34 crc kubenswrapper[4898]: I0313 14:12:34.240164 4898 generic.go:334] "Generic (PLEG): container finished" podID="715d729a-a993-4a3a-98a2-58f904ef7f6b" containerID="1d57ef123c695ddf9c7483fec39ed71eba899163e636c0e64229cca48af10fdf" exitCode=0
Mar 13 14:12:34 crc kubenswrapper[4898]: I0313 14:12:34.240206 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x8rbg" event={"ID":"715d729a-a993-4a3a-98a2-58f904ef7f6b","Type":"ContainerDied","Data":"1d57ef123c695ddf9c7483fec39ed71eba899163e636c0e64229cca48af10fdf"}
Mar 13 14:12:34 crc kubenswrapper[4898]: I0313 14:12:34.240235 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x8rbg" event={"ID":"715d729a-a993-4a3a-98a2-58f904ef7f6b","Type":"ContainerDied","Data":"9b9efbda102cfb59931b3b2dfe9347c23bb3d5bb9a19770896f214f7234be802"}
Mar 13 14:12:34 crc kubenswrapper[4898]: I0313 14:12:34.240253 4898 scope.go:117] "RemoveContainer" containerID="1d57ef123c695ddf9c7483fec39ed71eba899163e636c0e64229cca48af10fdf"
Mar 13 14:12:34 crc kubenswrapper[4898]: I0313 14:12:34.240844 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x8rbg"
Mar 13 14:12:34 crc kubenswrapper[4898]: I0313 14:12:34.272380 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x8rbg"]
Mar 13 14:12:34 crc kubenswrapper[4898]: I0313 14:12:34.277497 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-x8rbg"]
Mar 13 14:12:34 crc kubenswrapper[4898]: I0313 14:12:34.286632 4898 scope.go:117] "RemoveContainer" containerID="2a25837fc0c0e8cd6200157bc14ddc8f0fab1783c71bf24d20c54a3a2768ca93"
Mar 13 14:12:34 crc kubenswrapper[4898]: I0313 14:12:34.344316 4898 scope.go:117] "RemoveContainer" containerID="03e5d76c2ef48d4d76f4111af3bc7856f5e1b17ed603f9c78a2d13ffcf05edab"
Mar 13 14:12:34 crc kubenswrapper[4898]: I0313 14:12:34.385178 4898 scope.go:117] "RemoveContainer" containerID="1d57ef123c695ddf9c7483fec39ed71eba899163e636c0e64229cca48af10fdf"
Mar 13 14:12:34 crc kubenswrapper[4898]: E0313 14:12:34.385736 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d57ef123c695ddf9c7483fec39ed71eba899163e636c0e64229cca48af10fdf\": container with ID starting with 1d57ef123c695ddf9c7483fec39ed71eba899163e636c0e64229cca48af10fdf not found: ID does not exist" containerID="1d57ef123c695ddf9c7483fec39ed71eba899163e636c0e64229cca48af10fdf"
Mar 13 14:12:34 crc kubenswrapper[4898]: I0313 14:12:34.385773 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d57ef123c695ddf9c7483fec39ed71eba899163e636c0e64229cca48af10fdf"} err="failed to get container status \"1d57ef123c695ddf9c7483fec39ed71eba899163e636c0e64229cca48af10fdf\": rpc error: code = NotFound desc = could not find container \"1d57ef123c695ddf9c7483fec39ed71eba899163e636c0e64229cca48af10fdf\": container with ID starting with 1d57ef123c695ddf9c7483fec39ed71eba899163e636c0e64229cca48af10fdf not found: ID does not exist"
Mar 13 14:12:34 crc kubenswrapper[4898]: I0313 14:12:34.385798 4898 scope.go:117] "RemoveContainer" containerID="2a25837fc0c0e8cd6200157bc14ddc8f0fab1783c71bf24d20c54a3a2768ca93"
Mar 13 14:12:34 crc kubenswrapper[4898]: E0313 14:12:34.386257 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a25837fc0c0e8cd6200157bc14ddc8f0fab1783c71bf24d20c54a3a2768ca93\": container with ID starting with 2a25837fc0c0e8cd6200157bc14ddc8f0fab1783c71bf24d20c54a3a2768ca93 not found: ID does not exist" containerID="2a25837fc0c0e8cd6200157bc14ddc8f0fab1783c71bf24d20c54a3a2768ca93"
Mar 13 14:12:34 crc kubenswrapper[4898]: I0313 14:12:34.386317 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a25837fc0c0e8cd6200157bc14ddc8f0fab1783c71bf24d20c54a3a2768ca93"} err="failed to get container status \"2a25837fc0c0e8cd6200157bc14ddc8f0fab1783c71bf24d20c54a3a2768ca93\": rpc error: code = NotFound desc = could not find container \"2a25837fc0c0e8cd6200157bc14ddc8f0fab1783c71bf24d20c54a3a2768ca93\": container with ID starting with 2a25837fc0c0e8cd6200157bc14ddc8f0fab1783c71bf24d20c54a3a2768ca93 not found: ID does not exist"
Mar 13 14:12:34 crc kubenswrapper[4898]: I0313 14:12:34.386368 4898 scope.go:117] "RemoveContainer" containerID="03e5d76c2ef48d4d76f4111af3bc7856f5e1b17ed603f9c78a2d13ffcf05edab"
Mar 13 14:12:34 crc kubenswrapper[4898]: E0313 14:12:34.386735 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03e5d76c2ef48d4d76f4111af3bc7856f5e1b17ed603f9c78a2d13ffcf05edab\": container with ID starting with 03e5d76c2ef48d4d76f4111af3bc7856f5e1b17ed603f9c78a2d13ffcf05edab not found: ID does not exist" containerID="03e5d76c2ef48d4d76f4111af3bc7856f5e1b17ed603f9c78a2d13ffcf05edab"
Mar 13 14:12:34 crc kubenswrapper[4898]: I0313 14:12:34.386810 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03e5d76c2ef48d4d76f4111af3bc7856f5e1b17ed603f9c78a2d13ffcf05edab"} err="failed to get container status \"03e5d76c2ef48d4d76f4111af3bc7856f5e1b17ed603f9c78a2d13ffcf05edab\": rpc error: code = NotFound desc = could not find container \"03e5d76c2ef48d4d76f4111af3bc7856f5e1b17ed603f9c78a2d13ffcf05edab\": container with ID starting with 03e5d76c2ef48d4d76f4111af3bc7856f5e1b17ed603f9c78a2d13ffcf05edab not found: ID does not exist"
Mar 13 14:12:35 crc kubenswrapper[4898]: I0313 14:12:35.251601 4898 generic.go:334] "Generic (PLEG): container finished" podID="fd6314a3-ad6a-48ea-b54a-a2d1415b287e" containerID="55f2353b52acfcba2e9a9d07e6194d7265595a9010a025929275d655e6d8ecf7" exitCode=0
Mar 13 14:12:35 crc kubenswrapper[4898]: I0313 14:12:35.251695 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8vtbq" event={"ID":"fd6314a3-ad6a-48ea-b54a-a2d1415b287e","Type":"ContainerDied","Data":"55f2353b52acfcba2e9a9d07e6194d7265595a9010a025929275d655e6d8ecf7"}
Mar 13 14:12:35 crc kubenswrapper[4898]: I0313 14:12:35.749961 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="715d729a-a993-4a3a-98a2-58f904ef7f6b" path="/var/lib/kubelet/pods/715d729a-a993-4a3a-98a2-58f904ef7f6b/volumes"
Mar 13 14:12:38 crc kubenswrapper[4898]: I0313 14:12:38.043139 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-ingester-0"
Mar 13 14:12:39 crc kubenswrapper[4898]: I0313 14:12:39.298840 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8vtbq" event={"ID":"fd6314a3-ad6a-48ea-b54a-a2d1415b287e","Type":"ContainerStarted","Data":"a2c7fc3c93c843330a8d6bb55cc7d70838af37cb6b7c3b8a25f9ff794013c615"}
Mar 13 14:12:39 crc kubenswrapper[4898]: I0313 14:12:39.324355 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8vtbq" podStartSLOduration=3.8433358970000002 podStartE2EDuration="10.324330541s" podCreationTimestamp="2026-03-13 14:12:29 +0000 UTC" firstStartedPulling="2026-03-13 14:12:31.215354757 +0000 UTC m=+986.216943026" lastFinishedPulling="2026-03-13 14:12:37.696349391 +0000 UTC m=+992.697937670" observedRunningTime="2026-03-13 14:12:39.316249313 +0000 UTC m=+994.317837562" watchObservedRunningTime="2026-03-13 14:12:39.324330541 +0000 UTC m=+994.325918810"
Mar 13 14:12:39 crc kubenswrapper[4898]: I0313 14:12:39.859039 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8vtbq"
Mar 13 14:12:39 crc kubenswrapper[4898]: I0313 14:12:39.875368 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8vtbq"
Mar 13 14:12:40 crc kubenswrapper[4898]: I0313 14:12:40.915273 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-8vtbq" podUID="fd6314a3-ad6a-48ea-b54a-a2d1415b287e" containerName="registry-server" probeResult="failure" output=<
Mar 13 14:12:40 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s
Mar 13 14:12:40 crc kubenswrapper[4898]: >
Mar 13 14:12:49 crc kubenswrapper[4898]: I0313 14:12:49.134968 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 14:12:49 crc kubenswrapper[4898]: I0313 14:12:49.135551 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 14:12:49 crc kubenswrapper[4898]: I0313 14:12:49.921728 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8vtbq"
Mar 13 14:12:49 crc kubenswrapper[4898]: I0313 14:12:49.976698 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8vtbq"
Mar 13 14:12:50 crc kubenswrapper[4898]: I0313 14:12:50.160570 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8vtbq"]
Mar 13 14:12:51 crc kubenswrapper[4898]: I0313 14:12:51.400823 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8vtbq" podUID="fd6314a3-ad6a-48ea-b54a-a2d1415b287e" containerName="registry-server" containerID="cri-o://a2c7fc3c93c843330a8d6bb55cc7d70838af37cb6b7c3b8a25f9ff794013c615" gracePeriod=2
Mar 13 14:12:51 crc kubenswrapper[4898]: I0313 14:12:51.785436 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8vtbq"
Mar 13 14:12:51 crc kubenswrapper[4898]: I0313 14:12:51.897647 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd6314a3-ad6a-48ea-b54a-a2d1415b287e-catalog-content\") pod \"fd6314a3-ad6a-48ea-b54a-a2d1415b287e\" (UID: \"fd6314a3-ad6a-48ea-b54a-a2d1415b287e\") "
Mar 13 14:12:51 crc kubenswrapper[4898]: I0313 14:12:51.897761 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd6314a3-ad6a-48ea-b54a-a2d1415b287e-utilities\") pod \"fd6314a3-ad6a-48ea-b54a-a2d1415b287e\" (UID: \"fd6314a3-ad6a-48ea-b54a-a2d1415b287e\") "
Mar 13 14:12:51 crc kubenswrapper[4898]: I0313 14:12:51.897833 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmzfb\" (UniqueName: \"kubernetes.io/projected/fd6314a3-ad6a-48ea-b54a-a2d1415b287e-kube-api-access-xmzfb\") pod \"fd6314a3-ad6a-48ea-b54a-a2d1415b287e\" (UID: \"fd6314a3-ad6a-48ea-b54a-a2d1415b287e\") "
Mar 13 14:12:51 crc kubenswrapper[4898]: I0313 14:12:51.899161 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd6314a3-ad6a-48ea-b54a-a2d1415b287e-utilities" (OuterVolumeSpecName: "utilities") pod "fd6314a3-ad6a-48ea-b54a-a2d1415b287e" (UID: "fd6314a3-ad6a-48ea-b54a-a2d1415b287e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 14:12:51 crc kubenswrapper[4898]: I0313 14:12:51.902958 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd6314a3-ad6a-48ea-b54a-a2d1415b287e-kube-api-access-xmzfb" (OuterVolumeSpecName: "kube-api-access-xmzfb") pod "fd6314a3-ad6a-48ea-b54a-a2d1415b287e" (UID: "fd6314a3-ad6a-48ea-b54a-a2d1415b287e"). InnerVolumeSpecName "kube-api-access-xmzfb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:12:51 crc kubenswrapper[4898]: I0313 14:12:51.958417 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd6314a3-ad6a-48ea-b54a-a2d1415b287e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fd6314a3-ad6a-48ea-b54a-a2d1415b287e" (UID: "fd6314a3-ad6a-48ea-b54a-a2d1415b287e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 14:12:52 crc kubenswrapper[4898]: I0313 14:12:52.000011 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd6314a3-ad6a-48ea-b54a-a2d1415b287e-utilities\") on node \"crc\" DevicePath \"\""
Mar 13 14:12:52 crc kubenswrapper[4898]: I0313 14:12:52.000047 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xmzfb\" (UniqueName: \"kubernetes.io/projected/fd6314a3-ad6a-48ea-b54a-a2d1415b287e-kube-api-access-xmzfb\") on node \"crc\" DevicePath \"\""
Mar 13 14:12:52 crc kubenswrapper[4898]: I0313 14:12:52.000062 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd6314a3-ad6a-48ea-b54a-a2d1415b287e-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 13 14:12:52 crc kubenswrapper[4898]: I0313 14:12:52.412619 4898 generic.go:334] "Generic (PLEG): container finished" podID="fd6314a3-ad6a-48ea-b54a-a2d1415b287e" containerID="a2c7fc3c93c843330a8d6bb55cc7d70838af37cb6b7c3b8a25f9ff794013c615" exitCode=0
Mar 13 14:12:52 crc kubenswrapper[4898]: I0313 14:12:52.412699 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8vtbq"
Mar 13 14:12:52 crc kubenswrapper[4898]: I0313 14:12:52.412729 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8vtbq" event={"ID":"fd6314a3-ad6a-48ea-b54a-a2d1415b287e","Type":"ContainerDied","Data":"a2c7fc3c93c843330a8d6bb55cc7d70838af37cb6b7c3b8a25f9ff794013c615"}
Mar 13 14:12:52 crc kubenswrapper[4898]: I0313 14:12:52.412924 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8vtbq" event={"ID":"fd6314a3-ad6a-48ea-b54a-a2d1415b287e","Type":"ContainerDied","Data":"04b8094af8db214b2ee0dda013e626767a66d4ace84639cedc7f083561522032"}
Mar 13 14:12:52 crc kubenswrapper[4898]: I0313 14:12:52.412947 4898 scope.go:117] "RemoveContainer" containerID="a2c7fc3c93c843330a8d6bb55cc7d70838af37cb6b7c3b8a25f9ff794013c615"
Mar 13 14:12:52 crc kubenswrapper[4898]: I0313 14:12:52.435865 4898 scope.go:117] "RemoveContainer" containerID="55f2353b52acfcba2e9a9d07e6194d7265595a9010a025929275d655e6d8ecf7"
Mar 13 14:12:52 crc kubenswrapper[4898]: I0313 14:12:52.470316 4898 scope.go:117] "RemoveContainer" containerID="b5a3fa8a34341c83229a787919abeec42db4b594e78e4b2bca60e4579eec7b84"
Mar 13 14:12:52 crc kubenswrapper[4898]: I0313 14:12:52.485431 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8vtbq"]
Mar 13 14:12:52 crc kubenswrapper[4898]: I0313 14:12:52.498966 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8vtbq"]
Mar 13 14:12:52 crc kubenswrapper[4898]: I0313 14:12:52.501648 4898 scope.go:117] "RemoveContainer" containerID="a2c7fc3c93c843330a8d6bb55cc7d70838af37cb6b7c3b8a25f9ff794013c615"
Mar 13 14:12:52 crc kubenswrapper[4898]: E0313 14:12:52.502171 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2c7fc3c93c843330a8d6bb55cc7d70838af37cb6b7c3b8a25f9ff794013c615\": container with ID starting with a2c7fc3c93c843330a8d6bb55cc7d70838af37cb6b7c3b8a25f9ff794013c615 not found: ID does not exist" containerID="a2c7fc3c93c843330a8d6bb55cc7d70838af37cb6b7c3b8a25f9ff794013c615"
Mar 13 14:12:52 crc kubenswrapper[4898]: I0313 14:12:52.502243 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2c7fc3c93c843330a8d6bb55cc7d70838af37cb6b7c3b8a25f9ff794013c615"} err="failed to get container status \"a2c7fc3c93c843330a8d6bb55cc7d70838af37cb6b7c3b8a25f9ff794013c615\": rpc error: code = NotFound desc = could not find container \"a2c7fc3c93c843330a8d6bb55cc7d70838af37cb6b7c3b8a25f9ff794013c615\": container with ID starting with a2c7fc3c93c843330a8d6bb55cc7d70838af37cb6b7c3b8a25f9ff794013c615 not found: ID does not exist"
Mar 13 14:12:52 crc kubenswrapper[4898]: I0313 14:12:52.502284 4898 scope.go:117] "RemoveContainer" containerID="55f2353b52acfcba2e9a9d07e6194d7265595a9010a025929275d655e6d8ecf7"
Mar 13 14:12:52 crc kubenswrapper[4898]: E0313 14:12:52.502730 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55f2353b52acfcba2e9a9d07e6194d7265595a9010a025929275d655e6d8ecf7\": container with ID starting with 55f2353b52acfcba2e9a9d07e6194d7265595a9010a025929275d655e6d8ecf7 not found: ID does not exist" containerID="55f2353b52acfcba2e9a9d07e6194d7265595a9010a025929275d655e6d8ecf7"
Mar 13 14:12:52 crc kubenswrapper[4898]: I0313 14:12:52.502806 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55f2353b52acfcba2e9a9d07e6194d7265595a9010a025929275d655e6d8ecf7"} err="failed to get container status \"55f2353b52acfcba2e9a9d07e6194d7265595a9010a025929275d655e6d8ecf7\": rpc error: code = NotFound desc = could not find container \"55f2353b52acfcba2e9a9d07e6194d7265595a9010a025929275d655e6d8ecf7\": container with ID
starting with 55f2353b52acfcba2e9a9d07e6194d7265595a9010a025929275d655e6d8ecf7 not found: ID does not exist" Mar 13 14:12:52 crc kubenswrapper[4898]: I0313 14:12:52.502869 4898 scope.go:117] "RemoveContainer" containerID="b5a3fa8a34341c83229a787919abeec42db4b594e78e4b2bca60e4579eec7b84" Mar 13 14:12:52 crc kubenswrapper[4898]: E0313 14:12:52.503417 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5a3fa8a34341c83229a787919abeec42db4b594e78e4b2bca60e4579eec7b84\": container with ID starting with b5a3fa8a34341c83229a787919abeec42db4b594e78e4b2bca60e4579eec7b84 not found: ID does not exist" containerID="b5a3fa8a34341c83229a787919abeec42db4b594e78e4b2bca60e4579eec7b84" Mar 13 14:12:52 crc kubenswrapper[4898]: I0313 14:12:52.503448 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5a3fa8a34341c83229a787919abeec42db4b594e78e4b2bca60e4579eec7b84"} err="failed to get container status \"b5a3fa8a34341c83229a787919abeec42db4b594e78e4b2bca60e4579eec7b84\": rpc error: code = NotFound desc = could not find container \"b5a3fa8a34341c83229a787919abeec42db4b594e78e4b2bca60e4579eec7b84\": container with ID starting with b5a3fa8a34341c83229a787919abeec42db4b594e78e4b2bca60e4579eec7b84 not found: ID does not exist" Mar 13 14:12:53 crc kubenswrapper[4898]: I0313 14:12:53.750695 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd6314a3-ad6a-48ea-b54a-a2d1415b287e" path="/var/lib/kubelet/pods/fd6314a3-ad6a-48ea-b54a-a2d1415b287e/volumes" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.331195 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/collector-5nmqm"] Mar 13 14:12:55 crc kubenswrapper[4898]: E0313 14:12:55.331456 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="715d729a-a993-4a3a-98a2-58f904ef7f6b" containerName="extract-utilities" Mar 13 14:12:55 crc kubenswrapper[4898]: 
I0313 14:12:55.331467 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="715d729a-a993-4a3a-98a2-58f904ef7f6b" containerName="extract-utilities" Mar 13 14:12:55 crc kubenswrapper[4898]: E0313 14:12:55.331482 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="715d729a-a993-4a3a-98a2-58f904ef7f6b" containerName="extract-content" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.331488 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="715d729a-a993-4a3a-98a2-58f904ef7f6b" containerName="extract-content" Mar 13 14:12:55 crc kubenswrapper[4898]: E0313 14:12:55.331496 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="715d729a-a993-4a3a-98a2-58f904ef7f6b" containerName="registry-server" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.331502 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="715d729a-a993-4a3a-98a2-58f904ef7f6b" containerName="registry-server" Mar 13 14:12:55 crc kubenswrapper[4898]: E0313 14:12:55.331511 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd6314a3-ad6a-48ea-b54a-a2d1415b287e" containerName="extract-content" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.331518 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd6314a3-ad6a-48ea-b54a-a2d1415b287e" containerName="extract-content" Mar 13 14:12:55 crc kubenswrapper[4898]: E0313 14:12:55.331528 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd6314a3-ad6a-48ea-b54a-a2d1415b287e" containerName="registry-server" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.331534 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd6314a3-ad6a-48ea-b54a-a2d1415b287e" containerName="registry-server" Mar 13 14:12:55 crc kubenswrapper[4898]: E0313 14:12:55.331551 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd6314a3-ad6a-48ea-b54a-a2d1415b287e" containerName="extract-utilities" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 
14:12:55.331556 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd6314a3-ad6a-48ea-b54a-a2d1415b287e" containerName="extract-utilities" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.331690 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd6314a3-ad6a-48ea-b54a-a2d1415b287e" containerName="registry-server" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.331707 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="715d729a-a993-4a3a-98a2-58f904ef7f6b" containerName="registry-server" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.332238 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-5nmqm" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.336697 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-config" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.340721 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-metrics" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.340805 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-syslog-receiver" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.342017 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-dockercfg-6h2rk" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.342176 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-token" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.353728 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-trustbundle" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.365354 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-5nmqm"] Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.437031 
4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-49w6l"] Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.445197 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-49w6l" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.466165 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-49w6l"] Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.478711 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-collector-token\") pod \"collector-5nmqm\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " pod="openshift-logging/collector-5nmqm" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.478756 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-trusted-ca\") pod \"collector-5nmqm\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " pod="openshift-logging/collector-5nmqm" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.478782 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-datadir\") pod \"collector-5nmqm\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " pod="openshift-logging/collector-5nmqm" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.478837 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-sa-token\") pod \"collector-5nmqm\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " pod="openshift-logging/collector-5nmqm" 
Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.478872 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-tmp\") pod \"collector-5nmqm\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " pod="openshift-logging/collector-5nmqm" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.478924 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-collector-syslog-receiver\") pod \"collector-5nmqm\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " pod="openshift-logging/collector-5nmqm" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.478958 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-metrics\") pod \"collector-5nmqm\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " pod="openshift-logging/collector-5nmqm" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.479002 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-config-openshift-service-cacrt\") pod \"collector-5nmqm\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " pod="openshift-logging/collector-5nmqm" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.479041 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-config\") pod \"collector-5nmqm\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " pod="openshift-logging/collector-5nmqm" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 
14:12:55.479062 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmldq\" (UniqueName: \"kubernetes.io/projected/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-kube-api-access-rmldq\") pod \"collector-5nmqm\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " pod="openshift-logging/collector-5nmqm" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.479102 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-entrypoint\") pod \"collector-5nmqm\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " pod="openshift-logging/collector-5nmqm" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.511247 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-logging/collector-5nmqm"] Mar 13 14:12:55 crc kubenswrapper[4898]: E0313 14:12:55.511791 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[collector-syslog-receiver collector-token config config-openshift-service-cacrt datadir entrypoint kube-api-access-rmldq metrics sa-token tmp trusted-ca], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-logging/collector-5nmqm" podUID="45b41ab9-a5cd-41ec-8714-9d13c0ca0550" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.580398 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-tmp\") pod \"collector-5nmqm\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " pod="openshift-logging/collector-5nmqm" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.580497 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-collector-syslog-receiver\") pod 
\"collector-5nmqm\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " pod="openshift-logging/collector-5nmqm" Mar 13 14:12:55 crc kubenswrapper[4898]: E0313 14:12:55.580667 4898 secret.go:188] Couldn't get secret openshift-logging/collector-syslog-receiver: secret "collector-syslog-receiver" not found Mar 13 14:12:55 crc kubenswrapper[4898]: E0313 14:12:55.580739 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-collector-syslog-receiver podName:45b41ab9-a5cd-41ec-8714-9d13c0ca0550 nodeName:}" failed. No retries permitted until 2026-03-13 14:12:56.080717577 +0000 UTC m=+1011.082305816 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "collector-syslog-receiver" (UniqueName: "kubernetes.io/secret/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-collector-syslog-receiver") pod "collector-5nmqm" (UID: "45b41ab9-a5cd-41ec-8714-9d13c0ca0550") : secret "collector-syslog-receiver" not found Mar 13 14:12:55 crc kubenswrapper[4898]: E0313 14:12:55.580760 4898 secret.go:188] Couldn't get secret openshift-logging/collector-metrics: secret "collector-metrics" not found Mar 13 14:12:55 crc kubenswrapper[4898]: E0313 14:12:55.580854 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-metrics podName:45b41ab9-a5cd-41ec-8714-9d13c0ca0550 nodeName:}" failed. No retries permitted until 2026-03-13 14:12:56.08081698 +0000 UTC m=+1011.082405219 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics" (UniqueName: "kubernetes.io/secret/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-metrics") pod "collector-5nmqm" (UID: "45b41ab9-a5cd-41ec-8714-9d13c0ca0550") : secret "collector-metrics" not found Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.581034 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-metrics\") pod \"collector-5nmqm\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " pod="openshift-logging/collector-5nmqm" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.581932 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-config-openshift-service-cacrt\") pod \"collector-5nmqm\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " pod="openshift-logging/collector-5nmqm" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.581999 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-config-openshift-service-cacrt\") pod \"collector-5nmqm\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " pod="openshift-logging/collector-5nmqm" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.582096 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdrmk\" (UniqueName: \"kubernetes.io/projected/dabf24b2-a9e2-4f67-91fd-1625e8ab3196-kube-api-access-xdrmk\") pod \"certified-operators-49w6l\" (UID: \"dabf24b2-a9e2-4f67-91fd-1625e8ab3196\") " pod="openshift-marketplace/certified-operators-49w6l" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.582163 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-config\") pod \"collector-5nmqm\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " pod="openshift-logging/collector-5nmqm" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.582343 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmldq\" (UniqueName: \"kubernetes.io/projected/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-kube-api-access-rmldq\") pod \"collector-5nmqm\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " pod="openshift-logging/collector-5nmqm" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.582376 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-entrypoint\") pod \"collector-5nmqm\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " pod="openshift-logging/collector-5nmqm" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.582404 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-collector-token\") pod \"collector-5nmqm\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " pod="openshift-logging/collector-5nmqm" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.582435 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dabf24b2-a9e2-4f67-91fd-1625e8ab3196-utilities\") pod \"certified-operators-49w6l\" (UID: \"dabf24b2-a9e2-4f67-91fd-1625e8ab3196\") " pod="openshift-marketplace/certified-operators-49w6l" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.582464 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-trusted-ca\") pod \"collector-5nmqm\" (UID: 
\"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " pod="openshift-logging/collector-5nmqm" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.582496 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-datadir\") pod \"collector-5nmqm\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " pod="openshift-logging/collector-5nmqm" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.582522 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dabf24b2-a9e2-4f67-91fd-1625e8ab3196-catalog-content\") pod \"certified-operators-49w6l\" (UID: \"dabf24b2-a9e2-4f67-91fd-1625e8ab3196\") " pod="openshift-marketplace/certified-operators-49w6l" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.582572 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-datadir\") pod \"collector-5nmqm\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " pod="openshift-logging/collector-5nmqm" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.582645 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-sa-token\") pod \"collector-5nmqm\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " pod="openshift-logging/collector-5nmqm" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.583239 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-entrypoint\") pod \"collector-5nmqm\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " pod="openshift-logging/collector-5nmqm" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.583968 4898 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-config\") pod \"collector-5nmqm\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " pod="openshift-logging/collector-5nmqm" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.584135 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-trusted-ca\") pod \"collector-5nmqm\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " pod="openshift-logging/collector-5nmqm" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.585936 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-tmp\") pod \"collector-5nmqm\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " pod="openshift-logging/collector-5nmqm" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.587361 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-collector-token\") pod \"collector-5nmqm\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " pod="openshift-logging/collector-5nmqm" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.601698 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-sa-token\") pod \"collector-5nmqm\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " pod="openshift-logging/collector-5nmqm" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.605885 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmldq\" (UniqueName: \"kubernetes.io/projected/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-kube-api-access-rmldq\") pod \"collector-5nmqm\" (UID: 
\"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " pod="openshift-logging/collector-5nmqm" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.684442 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdrmk\" (UniqueName: \"kubernetes.io/projected/dabf24b2-a9e2-4f67-91fd-1625e8ab3196-kube-api-access-xdrmk\") pod \"certified-operators-49w6l\" (UID: \"dabf24b2-a9e2-4f67-91fd-1625e8ab3196\") " pod="openshift-marketplace/certified-operators-49w6l" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.684527 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dabf24b2-a9e2-4f67-91fd-1625e8ab3196-utilities\") pod \"certified-operators-49w6l\" (UID: \"dabf24b2-a9e2-4f67-91fd-1625e8ab3196\") " pod="openshift-marketplace/certified-operators-49w6l" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.684568 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dabf24b2-a9e2-4f67-91fd-1625e8ab3196-catalog-content\") pod \"certified-operators-49w6l\" (UID: \"dabf24b2-a9e2-4f67-91fd-1625e8ab3196\") " pod="openshift-marketplace/certified-operators-49w6l" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.685218 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dabf24b2-a9e2-4f67-91fd-1625e8ab3196-utilities\") pod \"certified-operators-49w6l\" (UID: \"dabf24b2-a9e2-4f67-91fd-1625e8ab3196\") " pod="openshift-marketplace/certified-operators-49w6l" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.685270 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dabf24b2-a9e2-4f67-91fd-1625e8ab3196-catalog-content\") pod \"certified-operators-49w6l\" (UID: \"dabf24b2-a9e2-4f67-91fd-1625e8ab3196\") " 
pod="openshift-marketplace/certified-operators-49w6l" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.701542 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdrmk\" (UniqueName: \"kubernetes.io/projected/dabf24b2-a9e2-4f67-91fd-1625e8ab3196-kube-api-access-xdrmk\") pod \"certified-operators-49w6l\" (UID: \"dabf24b2-a9e2-4f67-91fd-1625e8ab3196\") " pod="openshift-marketplace/certified-operators-49w6l" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.778340 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-49w6l" Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.097735 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-metrics\") pod \"collector-5nmqm\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " pod="openshift-logging/collector-5nmqm" Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.098176 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-collector-syslog-receiver\") pod \"collector-5nmqm\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " pod="openshift-logging/collector-5nmqm" Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.102515 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-metrics\") pod \"collector-5nmqm\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " pod="openshift-logging/collector-5nmqm" Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.102537 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-syslog-receiver\" (UniqueName: 
\"kubernetes.io/secret/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-collector-syslog-receiver\") pod \"collector-5nmqm\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " pod="openshift-logging/collector-5nmqm" Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.279054 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-49w6l"] Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.444425 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-5nmqm" Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.445013 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-49w6l" event={"ID":"dabf24b2-a9e2-4f67-91fd-1625e8ab3196","Type":"ContainerStarted","Data":"eec0586c952301178a801e3a4bd7bdef5aee6ad7078781088b52abdeffb5eb46"} Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.445068 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-49w6l" event={"ID":"dabf24b2-a9e2-4f67-91fd-1625e8ab3196","Type":"ContainerStarted","Data":"cfd1bebb0f1b2036070fc4a91966055ef2a5218eac13ce03c1dd361aa46a7a97"} Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.509471 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/collector-5nmqm" Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.604506 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-tmp\") pod \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.604559 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-entrypoint\") pod \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.604624 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-sa-token\") pod \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.604651 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-datadir\") pod \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.604679 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-config-openshift-service-cacrt\") pod \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.604705 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmldq\" (UniqueName: 
\"kubernetes.io/projected/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-kube-api-access-rmldq\") pod \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.604751 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-metrics\") pod \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.604782 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-collector-syslog-receiver\") pod \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.604825 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-config\") pod \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.604925 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-trusted-ca\") pod \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.604950 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-collector-token\") pod \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.605582 4898 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-datadir" (OuterVolumeSpecName: "datadir") pod "45b41ab9-a5cd-41ec-8714-9d13c0ca0550" (UID: "45b41ab9-a5cd-41ec-8714-9d13c0ca0550"). InnerVolumeSpecName "datadir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.605747 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-entrypoint" (OuterVolumeSpecName: "entrypoint") pod "45b41ab9-a5cd-41ec-8714-9d13c0ca0550" (UID: "45b41ab9-a5cd-41ec-8714-9d13c0ca0550"). InnerVolumeSpecName "entrypoint". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.606329 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "45b41ab9-a5cd-41ec-8714-9d13c0ca0550" (UID: "45b41ab9-a5cd-41ec-8714-9d13c0ca0550"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.606936 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-config" (OuterVolumeSpecName: "config") pod "45b41ab9-a5cd-41ec-8714-9d13c0ca0550" (UID: "45b41ab9-a5cd-41ec-8714-9d13c0ca0550"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.607630 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-config-openshift-service-cacrt" (OuterVolumeSpecName: "config-openshift-service-cacrt") pod "45b41ab9-a5cd-41ec-8714-9d13c0ca0550" (UID: "45b41ab9-a5cd-41ec-8714-9d13c0ca0550"). InnerVolumeSpecName "config-openshift-service-cacrt". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.612582 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-collector-token" (OuterVolumeSpecName: "collector-token") pod "45b41ab9-a5cd-41ec-8714-9d13c0ca0550" (UID: "45b41ab9-a5cd-41ec-8714-9d13c0ca0550"). InnerVolumeSpecName "collector-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.612625 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-metrics" (OuterVolumeSpecName: "metrics") pod "45b41ab9-a5cd-41ec-8714-9d13c0ca0550" (UID: "45b41ab9-a5cd-41ec-8714-9d13c0ca0550"). InnerVolumeSpecName "metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.612646 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-tmp" (OuterVolumeSpecName: "tmp") pod "45b41ab9-a5cd-41ec-8714-9d13c0ca0550" (UID: "45b41ab9-a5cd-41ec-8714-9d13c0ca0550"). InnerVolumeSpecName "tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.612693 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-sa-token" (OuterVolumeSpecName: "sa-token") pod "45b41ab9-a5cd-41ec-8714-9d13c0ca0550" (UID: "45b41ab9-a5cd-41ec-8714-9d13c0ca0550"). InnerVolumeSpecName "sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.612892 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-kube-api-access-rmldq" (OuterVolumeSpecName: "kube-api-access-rmldq") pod "45b41ab9-a5cd-41ec-8714-9d13c0ca0550" (UID: "45b41ab9-a5cd-41ec-8714-9d13c0ca0550"). InnerVolumeSpecName "kube-api-access-rmldq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.613338 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-collector-syslog-receiver" (OuterVolumeSpecName: "collector-syslog-receiver") pod "45b41ab9-a5cd-41ec-8714-9d13c0ca0550" (UID: "45b41ab9-a5cd-41ec-8714-9d13c0ca0550"). InnerVolumeSpecName "collector-syslog-receiver". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.706420 4898 reconciler_common.go:293] "Volume detached for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-datadir\") on node \"crc\" DevicePath \"\"" Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.706452 4898 reconciler_common.go:293] "Volume detached for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-config-openshift-service-cacrt\") on node \"crc\" DevicePath \"\"" Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.706462 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmldq\" (UniqueName: \"kubernetes.io/projected/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-kube-api-access-rmldq\") on node \"crc\" DevicePath \"\"" Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.706471 4898 reconciler_common.go:293] "Volume detached for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-metrics\") on node \"crc\" DevicePath \"\"" Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.706479 4898 reconciler_common.go:293] "Volume detached for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-collector-syslog-receiver\") on node \"crc\" DevicePath \"\"" Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.706486 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-config\") on node \"crc\" DevicePath \"\"" Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.706495 4898 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 
14:12:56.706505 4898 reconciler_common.go:293] "Volume detached for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-collector-token\") on node \"crc\" DevicePath \"\"" Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.706513 4898 reconciler_common.go:293] "Volume detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-tmp\") on node \"crc\" DevicePath \"\"" Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.706521 4898 reconciler_common.go:293] "Volume detached for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-entrypoint\") on node \"crc\" DevicePath \"\"" Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.706528 4898 reconciler_common.go:293] "Volume detached for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-sa-token\") on node \"crc\" DevicePath \"\"" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.452988 4898 generic.go:334] "Generic (PLEG): container finished" podID="dabf24b2-a9e2-4f67-91fd-1625e8ab3196" containerID="eec0586c952301178a801e3a4bd7bdef5aee6ad7078781088b52abdeffb5eb46" exitCode=0 Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.453063 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/collector-5nmqm" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.453037 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-49w6l" event={"ID":"dabf24b2-a9e2-4f67-91fd-1625e8ab3196","Type":"ContainerDied","Data":"eec0586c952301178a801e3a4bd7bdef5aee6ad7078781088b52abdeffb5eb46"} Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.540800 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-logging/collector-5nmqm"] Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.551026 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-logging/collector-5nmqm"] Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.556312 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/collector-xcq52"] Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.557637 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-xcq52" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.562631 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-metrics" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.562651 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-config" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.562717 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-dockercfg-6h2rk" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.562835 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-syslog-receiver" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.563162 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-token" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.564617 4898 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-logging/collector-xcq52"] Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.573398 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-trustbundle" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.730814 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/824d10e9-5cdc-4dc5-b9a8-b151c779b900-tmp\") pod \"collector-xcq52\" (UID: \"824d10e9-5cdc-4dc5-b9a8-b151c779b900\") " pod="openshift-logging/collector-xcq52" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.730944 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/824d10e9-5cdc-4dc5-b9a8-b151c779b900-metrics\") pod \"collector-xcq52\" (UID: \"824d10e9-5cdc-4dc5-b9a8-b151c779b900\") " pod="openshift-logging/collector-xcq52" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.731000 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/824d10e9-5cdc-4dc5-b9a8-b151c779b900-collector-token\") pod \"collector-xcq52\" (UID: \"824d10e9-5cdc-4dc5-b9a8-b151c779b900\") " pod="openshift-logging/collector-xcq52" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.731075 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/824d10e9-5cdc-4dc5-b9a8-b151c779b900-config-openshift-service-cacrt\") pod \"collector-xcq52\" (UID: \"824d10e9-5cdc-4dc5-b9a8-b151c779b900\") " pod="openshift-logging/collector-xcq52" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.731197 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rm26g\" (UniqueName: 
\"kubernetes.io/projected/824d10e9-5cdc-4dc5-b9a8-b151c779b900-kube-api-access-rm26g\") pod \"collector-xcq52\" (UID: \"824d10e9-5cdc-4dc5-b9a8-b151c779b900\") " pod="openshift-logging/collector-xcq52" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.731296 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/824d10e9-5cdc-4dc5-b9a8-b151c779b900-datadir\") pod \"collector-xcq52\" (UID: \"824d10e9-5cdc-4dc5-b9a8-b151c779b900\") " pod="openshift-logging/collector-xcq52" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.731339 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/824d10e9-5cdc-4dc5-b9a8-b151c779b900-config\") pod \"collector-xcq52\" (UID: \"824d10e9-5cdc-4dc5-b9a8-b151c779b900\") " pod="openshift-logging/collector-xcq52" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.731401 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/824d10e9-5cdc-4dc5-b9a8-b151c779b900-entrypoint\") pod \"collector-xcq52\" (UID: \"824d10e9-5cdc-4dc5-b9a8-b151c779b900\") " pod="openshift-logging/collector-xcq52" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.731457 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/824d10e9-5cdc-4dc5-b9a8-b151c779b900-sa-token\") pod \"collector-xcq52\" (UID: \"824d10e9-5cdc-4dc5-b9a8-b151c779b900\") " pod="openshift-logging/collector-xcq52" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.731516 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/824d10e9-5cdc-4dc5-b9a8-b151c779b900-trusted-ca\") pod 
\"collector-xcq52\" (UID: \"824d10e9-5cdc-4dc5-b9a8-b151c779b900\") " pod="openshift-logging/collector-xcq52" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.731693 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/824d10e9-5cdc-4dc5-b9a8-b151c779b900-collector-syslog-receiver\") pod \"collector-xcq52\" (UID: \"824d10e9-5cdc-4dc5-b9a8-b151c779b900\") " pod="openshift-logging/collector-xcq52" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.750094 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45b41ab9-a5cd-41ec-8714-9d13c0ca0550" path="/var/lib/kubelet/pods/45b41ab9-a5cd-41ec-8714-9d13c0ca0550/volumes" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.833326 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/824d10e9-5cdc-4dc5-b9a8-b151c779b900-entrypoint\") pod \"collector-xcq52\" (UID: \"824d10e9-5cdc-4dc5-b9a8-b151c779b900\") " pod="openshift-logging/collector-xcq52" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.833373 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/824d10e9-5cdc-4dc5-b9a8-b151c779b900-sa-token\") pod \"collector-xcq52\" (UID: \"824d10e9-5cdc-4dc5-b9a8-b151c779b900\") " pod="openshift-logging/collector-xcq52" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.833401 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/824d10e9-5cdc-4dc5-b9a8-b151c779b900-trusted-ca\") pod \"collector-xcq52\" (UID: \"824d10e9-5cdc-4dc5-b9a8-b151c779b900\") " pod="openshift-logging/collector-xcq52" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.833442 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/824d10e9-5cdc-4dc5-b9a8-b151c779b900-collector-syslog-receiver\") pod \"collector-xcq52\" (UID: \"824d10e9-5cdc-4dc5-b9a8-b151c779b900\") " pod="openshift-logging/collector-xcq52" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.833504 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/824d10e9-5cdc-4dc5-b9a8-b151c779b900-tmp\") pod \"collector-xcq52\" (UID: \"824d10e9-5cdc-4dc5-b9a8-b151c779b900\") " pod="openshift-logging/collector-xcq52" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.833520 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/824d10e9-5cdc-4dc5-b9a8-b151c779b900-metrics\") pod \"collector-xcq52\" (UID: \"824d10e9-5cdc-4dc5-b9a8-b151c779b900\") " pod="openshift-logging/collector-xcq52" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.833539 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/824d10e9-5cdc-4dc5-b9a8-b151c779b900-collector-token\") pod \"collector-xcq52\" (UID: \"824d10e9-5cdc-4dc5-b9a8-b151c779b900\") " pod="openshift-logging/collector-xcq52" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.833554 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/824d10e9-5cdc-4dc5-b9a8-b151c779b900-config-openshift-service-cacrt\") pod \"collector-xcq52\" (UID: \"824d10e9-5cdc-4dc5-b9a8-b151c779b900\") " pod="openshift-logging/collector-xcq52" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.833583 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rm26g\" (UniqueName: \"kubernetes.io/projected/824d10e9-5cdc-4dc5-b9a8-b151c779b900-kube-api-access-rm26g\") pod 
\"collector-xcq52\" (UID: \"824d10e9-5cdc-4dc5-b9a8-b151c779b900\") " pod="openshift-logging/collector-xcq52" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.833600 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/824d10e9-5cdc-4dc5-b9a8-b151c779b900-datadir\") pod \"collector-xcq52\" (UID: \"824d10e9-5cdc-4dc5-b9a8-b151c779b900\") " pod="openshift-logging/collector-xcq52" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.833615 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/824d10e9-5cdc-4dc5-b9a8-b151c779b900-config\") pod \"collector-xcq52\" (UID: \"824d10e9-5cdc-4dc5-b9a8-b151c779b900\") " pod="openshift-logging/collector-xcq52" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.833849 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/824d10e9-5cdc-4dc5-b9a8-b151c779b900-datadir\") pod \"collector-xcq52\" (UID: \"824d10e9-5cdc-4dc5-b9a8-b151c779b900\") " pod="openshift-logging/collector-xcq52" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.834717 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/824d10e9-5cdc-4dc5-b9a8-b151c779b900-config\") pod \"collector-xcq52\" (UID: \"824d10e9-5cdc-4dc5-b9a8-b151c779b900\") " pod="openshift-logging/collector-xcq52" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.834841 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/824d10e9-5cdc-4dc5-b9a8-b151c779b900-trusted-ca\") pod \"collector-xcq52\" (UID: \"824d10e9-5cdc-4dc5-b9a8-b151c779b900\") " pod="openshift-logging/collector-xcq52" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.835118 4898 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/824d10e9-5cdc-4dc5-b9a8-b151c779b900-config-openshift-service-cacrt\") pod \"collector-xcq52\" (UID: \"824d10e9-5cdc-4dc5-b9a8-b151c779b900\") " pod="openshift-logging/collector-xcq52" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.835406 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/824d10e9-5cdc-4dc5-b9a8-b151c779b900-entrypoint\") pod \"collector-xcq52\" (UID: \"824d10e9-5cdc-4dc5-b9a8-b151c779b900\") " pod="openshift-logging/collector-xcq52" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.840858 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/824d10e9-5cdc-4dc5-b9a8-b151c779b900-metrics\") pod \"collector-xcq52\" (UID: \"824d10e9-5cdc-4dc5-b9a8-b151c779b900\") " pod="openshift-logging/collector-xcq52" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.841174 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/824d10e9-5cdc-4dc5-b9a8-b151c779b900-collector-syslog-receiver\") pod \"collector-xcq52\" (UID: \"824d10e9-5cdc-4dc5-b9a8-b151c779b900\") " pod="openshift-logging/collector-xcq52" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.841257 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/824d10e9-5cdc-4dc5-b9a8-b151c779b900-tmp\") pod \"collector-xcq52\" (UID: \"824d10e9-5cdc-4dc5-b9a8-b151c779b900\") " pod="openshift-logging/collector-xcq52" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.842716 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/824d10e9-5cdc-4dc5-b9a8-b151c779b900-collector-token\") pod \"collector-xcq52\" (UID: 
\"824d10e9-5cdc-4dc5-b9a8-b151c779b900\") " pod="openshift-logging/collector-xcq52" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.853563 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/824d10e9-5cdc-4dc5-b9a8-b151c779b900-sa-token\") pod \"collector-xcq52\" (UID: \"824d10e9-5cdc-4dc5-b9a8-b151c779b900\") " pod="openshift-logging/collector-xcq52" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.856578 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rm26g\" (UniqueName: \"kubernetes.io/projected/824d10e9-5cdc-4dc5-b9a8-b151c779b900-kube-api-access-rm26g\") pod \"collector-xcq52\" (UID: \"824d10e9-5cdc-4dc5-b9a8-b151c779b900\") " pod="openshift-logging/collector-xcq52" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.880628 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-xcq52" Mar 13 14:12:58 crc kubenswrapper[4898]: I0313 14:12:58.341400 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-xcq52"] Mar 13 14:12:58 crc kubenswrapper[4898]: W0313 14:12:58.348463 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod824d10e9_5cdc_4dc5_b9a8_b151c779b900.slice/crio-f807e9ba698af05d63ce18a3e24d716918f9ef4a1af45c0c0ecafbc443a73586 WatchSource:0}: Error finding container f807e9ba698af05d63ce18a3e24d716918f9ef4a1af45c0c0ecafbc443a73586: Status 404 returned error can't find the container with id f807e9ba698af05d63ce18a3e24d716918f9ef4a1af45c0c0ecafbc443a73586 Mar 13 14:12:58 crc kubenswrapper[4898]: I0313 14:12:58.464399 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/collector-xcq52" event={"ID":"824d10e9-5cdc-4dc5-b9a8-b151c779b900","Type":"ContainerStarted","Data":"f807e9ba698af05d63ce18a3e24d716918f9ef4a1af45c0c0ecafbc443a73586"} 
Mar 13 14:12:58 crc kubenswrapper[4898]: I0313 14:12:58.467014 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-49w6l" event={"ID":"dabf24b2-a9e2-4f67-91fd-1625e8ab3196","Type":"ContainerStarted","Data":"92189cd615cdfc6491f05f4f4284efe8b33b7bdc8049b452c130597e8d121477"} Mar 13 14:12:59 crc kubenswrapper[4898]: I0313 14:12:59.478274 4898 generic.go:334] "Generic (PLEG): container finished" podID="dabf24b2-a9e2-4f67-91fd-1625e8ab3196" containerID="92189cd615cdfc6491f05f4f4284efe8b33b7bdc8049b452c130597e8d121477" exitCode=0 Mar 13 14:12:59 crc kubenswrapper[4898]: I0313 14:12:59.478317 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-49w6l" event={"ID":"dabf24b2-a9e2-4f67-91fd-1625e8ab3196","Type":"ContainerDied","Data":"92189cd615cdfc6491f05f4f4284efe8b33b7bdc8049b452c130597e8d121477"} Mar 13 14:13:02 crc kubenswrapper[4898]: I0313 14:13:02.510476 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/collector-xcq52" event={"ID":"824d10e9-5cdc-4dc5-b9a8-b151c779b900","Type":"ContainerStarted","Data":"adc0a5c829ef5909ddc3032fabfbdbc6824e25e07fdf12185f17c84a2adb5373"} Mar 13 14:13:02 crc kubenswrapper[4898]: I0313 14:13:02.550527 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/collector-xcq52" podStartSLOduration=2.063796941 podStartE2EDuration="5.550495389s" podCreationTimestamp="2026-03-13 14:12:57 +0000 UTC" firstStartedPulling="2026-03-13 14:12:58.351112913 +0000 UTC m=+1013.352701162" lastFinishedPulling="2026-03-13 14:13:01.837811371 +0000 UTC m=+1016.839399610" observedRunningTime="2026-03-13 14:13:02.532263199 +0000 UTC m=+1017.533851438" watchObservedRunningTime="2026-03-13 14:13:02.550495389 +0000 UTC m=+1017.552083688" Mar 13 14:13:04 crc kubenswrapper[4898]: I0313 14:13:04.528279 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-49w6l" event={"ID":"dabf24b2-a9e2-4f67-91fd-1625e8ab3196","Type":"ContainerStarted","Data":"8159c15216bb6b05395c9b113a6419beaf598e1519e0565295ce1194c8cfdd8e"} Mar 13 14:13:04 crc kubenswrapper[4898]: I0313 14:13:04.566743 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-49w6l" podStartSLOduration=3.048838095 podStartE2EDuration="9.566705849s" podCreationTimestamp="2026-03-13 14:12:55 +0000 UTC" firstStartedPulling="2026-03-13 14:12:57.455393655 +0000 UTC m=+1012.456981934" lastFinishedPulling="2026-03-13 14:13:03.973261439 +0000 UTC m=+1018.974849688" observedRunningTime="2026-03-13 14:13:04.556830435 +0000 UTC m=+1019.558418714" watchObservedRunningTime="2026-03-13 14:13:04.566705849 +0000 UTC m=+1019.568294128" Mar 13 14:13:05 crc kubenswrapper[4898]: I0313 14:13:05.778857 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-49w6l" Mar 13 14:13:05 crc kubenswrapper[4898]: I0313 14:13:05.779330 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-49w6l" Mar 13 14:13:06 crc kubenswrapper[4898]: I0313 14:13:06.822975 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-49w6l" podUID="dabf24b2-a9e2-4f67-91fd-1625e8ab3196" containerName="registry-server" probeResult="failure" output=< Mar 13 14:13:06 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 14:13:06 crc kubenswrapper[4898]: > Mar 13 14:13:15 crc kubenswrapper[4898]: I0313 14:13:15.835082 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-49w6l" Mar 13 14:13:15 crc kubenswrapper[4898]: I0313 14:13:15.894730 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/certified-operators-49w6l"
Mar 13 14:13:16 crc kubenswrapper[4898]: I0313 14:13:16.069042 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-49w6l"]
Mar 13 14:13:17 crc kubenswrapper[4898]: I0313 14:13:17.687768 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-49w6l" podUID="dabf24b2-a9e2-4f67-91fd-1625e8ab3196" containerName="registry-server" containerID="cri-o://8159c15216bb6b05395c9b113a6419beaf598e1519e0565295ce1194c8cfdd8e" gracePeriod=2
Mar 13 14:13:18 crc kubenswrapper[4898]: I0313 14:13:18.105783 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-49w6l"
Mar 13 14:13:18 crc kubenswrapper[4898]: I0313 14:13:18.175610 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dabf24b2-a9e2-4f67-91fd-1625e8ab3196-catalog-content\") pod \"dabf24b2-a9e2-4f67-91fd-1625e8ab3196\" (UID: \"dabf24b2-a9e2-4f67-91fd-1625e8ab3196\") "
Mar 13 14:13:18 crc kubenswrapper[4898]: I0313 14:13:18.175722 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dabf24b2-a9e2-4f67-91fd-1625e8ab3196-utilities\") pod \"dabf24b2-a9e2-4f67-91fd-1625e8ab3196\" (UID: \"dabf24b2-a9e2-4f67-91fd-1625e8ab3196\") "
Mar 13 14:13:18 crc kubenswrapper[4898]: I0313 14:13:18.175923 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdrmk\" (UniqueName: \"kubernetes.io/projected/dabf24b2-a9e2-4f67-91fd-1625e8ab3196-kube-api-access-xdrmk\") pod \"dabf24b2-a9e2-4f67-91fd-1625e8ab3196\" (UID: \"dabf24b2-a9e2-4f67-91fd-1625e8ab3196\") "
Mar 13 14:13:18 crc kubenswrapper[4898]: I0313 14:13:18.177655 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dabf24b2-a9e2-4f67-91fd-1625e8ab3196-utilities" (OuterVolumeSpecName: "utilities") pod "dabf24b2-a9e2-4f67-91fd-1625e8ab3196" (UID: "dabf24b2-a9e2-4f67-91fd-1625e8ab3196"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 14:13:18 crc kubenswrapper[4898]: I0313 14:13:18.181687 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dabf24b2-a9e2-4f67-91fd-1625e8ab3196-kube-api-access-xdrmk" (OuterVolumeSpecName: "kube-api-access-xdrmk") pod "dabf24b2-a9e2-4f67-91fd-1625e8ab3196" (UID: "dabf24b2-a9e2-4f67-91fd-1625e8ab3196"). InnerVolumeSpecName "kube-api-access-xdrmk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:13:18 crc kubenswrapper[4898]: I0313 14:13:18.233218 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dabf24b2-a9e2-4f67-91fd-1625e8ab3196-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dabf24b2-a9e2-4f67-91fd-1625e8ab3196" (UID: "dabf24b2-a9e2-4f67-91fd-1625e8ab3196"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 14:13:18 crc kubenswrapper[4898]: I0313 14:13:18.277227 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dabf24b2-a9e2-4f67-91fd-1625e8ab3196-utilities\") on node \"crc\" DevicePath \"\""
Mar 13 14:13:18 crc kubenswrapper[4898]: I0313 14:13:18.277270 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdrmk\" (UniqueName: \"kubernetes.io/projected/dabf24b2-a9e2-4f67-91fd-1625e8ab3196-kube-api-access-xdrmk\") on node \"crc\" DevicePath \"\""
Mar 13 14:13:18 crc kubenswrapper[4898]: I0313 14:13:18.277280 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dabf24b2-a9e2-4f67-91fd-1625e8ab3196-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 13 14:13:18 crc kubenswrapper[4898]: I0313 14:13:18.699744 4898 generic.go:334] "Generic (PLEG): container finished" podID="dabf24b2-a9e2-4f67-91fd-1625e8ab3196" containerID="8159c15216bb6b05395c9b113a6419beaf598e1519e0565295ce1194c8cfdd8e" exitCode=0
Mar 13 14:13:18 crc kubenswrapper[4898]: I0313 14:13:18.699833 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-49w6l"
Mar 13 14:13:18 crc kubenswrapper[4898]: I0313 14:13:18.699818 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-49w6l" event={"ID":"dabf24b2-a9e2-4f67-91fd-1625e8ab3196","Type":"ContainerDied","Data":"8159c15216bb6b05395c9b113a6419beaf598e1519e0565295ce1194c8cfdd8e"}
Mar 13 14:13:18 crc kubenswrapper[4898]: I0313 14:13:18.700261 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-49w6l" event={"ID":"dabf24b2-a9e2-4f67-91fd-1625e8ab3196","Type":"ContainerDied","Data":"cfd1bebb0f1b2036070fc4a91966055ef2a5218eac13ce03c1dd361aa46a7a97"}
Mar 13 14:13:18 crc kubenswrapper[4898]: I0313 14:13:18.700300 4898 scope.go:117] "RemoveContainer" containerID="8159c15216bb6b05395c9b113a6419beaf598e1519e0565295ce1194c8cfdd8e"
Mar 13 14:13:18 crc kubenswrapper[4898]: I0313 14:13:18.739538 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-49w6l"]
Mar 13 14:13:18 crc kubenswrapper[4898]: I0313 14:13:18.740813 4898 scope.go:117] "RemoveContainer" containerID="92189cd615cdfc6491f05f4f4284efe8b33b7bdc8049b452c130597e8d121477"
Mar 13 14:13:18 crc kubenswrapper[4898]: I0313 14:13:18.747830 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-49w6l"]
Mar 13 14:13:18 crc kubenswrapper[4898]: I0313 14:13:18.761250 4898 scope.go:117] "RemoveContainer" containerID="eec0586c952301178a801e3a4bd7bdef5aee6ad7078781088b52abdeffb5eb46"
Mar 13 14:13:18 crc kubenswrapper[4898]: I0313 14:13:18.783044 4898 scope.go:117] "RemoveContainer" containerID="8159c15216bb6b05395c9b113a6419beaf598e1519e0565295ce1194c8cfdd8e"
Mar 13 14:13:18 crc kubenswrapper[4898]: E0313 14:13:18.783662 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8159c15216bb6b05395c9b113a6419beaf598e1519e0565295ce1194c8cfdd8e\": container with ID starting with 8159c15216bb6b05395c9b113a6419beaf598e1519e0565295ce1194c8cfdd8e not found: ID does not exist" containerID="8159c15216bb6b05395c9b113a6419beaf598e1519e0565295ce1194c8cfdd8e"
Mar 13 14:13:18 crc kubenswrapper[4898]: I0313 14:13:18.783717 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8159c15216bb6b05395c9b113a6419beaf598e1519e0565295ce1194c8cfdd8e"} err="failed to get container status \"8159c15216bb6b05395c9b113a6419beaf598e1519e0565295ce1194c8cfdd8e\": rpc error: code = NotFound desc = could not find container \"8159c15216bb6b05395c9b113a6419beaf598e1519e0565295ce1194c8cfdd8e\": container with ID starting with 8159c15216bb6b05395c9b113a6419beaf598e1519e0565295ce1194c8cfdd8e not found: ID does not exist"
Mar 13 14:13:18 crc kubenswrapper[4898]: I0313 14:13:18.783743 4898 scope.go:117] "RemoveContainer" containerID="92189cd615cdfc6491f05f4f4284efe8b33b7bdc8049b452c130597e8d121477"
Mar 13 14:13:18 crc kubenswrapper[4898]: E0313 14:13:18.784079 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92189cd615cdfc6491f05f4f4284efe8b33b7bdc8049b452c130597e8d121477\": container with ID starting with 92189cd615cdfc6491f05f4f4284efe8b33b7bdc8049b452c130597e8d121477 not found: ID does not exist" containerID="92189cd615cdfc6491f05f4f4284efe8b33b7bdc8049b452c130597e8d121477"
Mar 13 14:13:18 crc kubenswrapper[4898]: I0313 14:13:18.784100 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92189cd615cdfc6491f05f4f4284efe8b33b7bdc8049b452c130597e8d121477"} err="failed to get container status \"92189cd615cdfc6491f05f4f4284efe8b33b7bdc8049b452c130597e8d121477\": rpc error: code = NotFound desc = could not find container \"92189cd615cdfc6491f05f4f4284efe8b33b7bdc8049b452c130597e8d121477\": container with ID starting with 92189cd615cdfc6491f05f4f4284efe8b33b7bdc8049b452c130597e8d121477 not found: ID does not exist"
Mar 13 14:13:18 crc kubenswrapper[4898]: I0313 14:13:18.784150 4898 scope.go:117] "RemoveContainer" containerID="eec0586c952301178a801e3a4bd7bdef5aee6ad7078781088b52abdeffb5eb46"
Mar 13 14:13:18 crc kubenswrapper[4898]: E0313 14:13:18.784389 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eec0586c952301178a801e3a4bd7bdef5aee6ad7078781088b52abdeffb5eb46\": container with ID starting with eec0586c952301178a801e3a4bd7bdef5aee6ad7078781088b52abdeffb5eb46 not found: ID does not exist" containerID="eec0586c952301178a801e3a4bd7bdef5aee6ad7078781088b52abdeffb5eb46"
Mar 13 14:13:18 crc kubenswrapper[4898]: I0313 14:13:18.784420 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eec0586c952301178a801e3a4bd7bdef5aee6ad7078781088b52abdeffb5eb46"} err="failed to get container status \"eec0586c952301178a801e3a4bd7bdef5aee6ad7078781088b52abdeffb5eb46\": rpc error: code = NotFound desc = could not find container \"eec0586c952301178a801e3a4bd7bdef5aee6ad7078781088b52abdeffb5eb46\": container with ID starting with eec0586c952301178a801e3a4bd7bdef5aee6ad7078781088b52abdeffb5eb46 not found: ID does not exist"
Mar 13 14:13:19 crc kubenswrapper[4898]: I0313 14:13:19.134731 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 14:13:19 crc kubenswrapper[4898]: I0313 14:13:19.134831 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 14:13:19 crc kubenswrapper[4898]: I0313 14:13:19.134946 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj"
Mar 13 14:13:19 crc kubenswrapper[4898]: I0313 14:13:19.136472 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b58828da596890620679d1e69bfdfd0b7cd0cf06254eed4031e215964351d8c6"} pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 13 14:13:19 crc kubenswrapper[4898]: I0313 14:13:19.136613 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" containerID="cri-o://b58828da596890620679d1e69bfdfd0b7cd0cf06254eed4031e215964351d8c6" gracePeriod=600
Mar 13 14:13:19 crc kubenswrapper[4898]: I0313 14:13:19.709856 4898 generic.go:334] "Generic (PLEG): container finished" podID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerID="b58828da596890620679d1e69bfdfd0b7cd0cf06254eed4031e215964351d8c6" exitCode=0
Mar 13 14:13:19 crc kubenswrapper[4898]: I0313 14:13:19.709944 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerDied","Data":"b58828da596890620679d1e69bfdfd0b7cd0cf06254eed4031e215964351d8c6"}
Mar 13 14:13:19 crc kubenswrapper[4898]: I0313 14:13:19.710712 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerStarted","Data":"7b5d3972dfd92a1b971338153ac5467cf67b2057ca35cfb382b56be42ddca2ed"}
Mar 13 14:13:19 crc kubenswrapper[4898]: I0313 14:13:19.710796 4898 scope.go:117] "RemoveContainer" containerID="5a348cbe99f8e01e53545f65e722853afafc6c3cafe54ec4136fd0f288299e87"
Mar 13 14:13:19 crc kubenswrapper[4898]: I0313 14:13:19.753868 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dabf24b2-a9e2-4f67-91fd-1625e8ab3196" path="/var/lib/kubelet/pods/dabf24b2-a9e2-4f67-91fd-1625e8ab3196/volumes"
Mar 13 14:13:36 crc kubenswrapper[4898]: I0313 14:13:36.219078 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tx5wh"]
Mar 13 14:13:36 crc kubenswrapper[4898]: E0313 14:13:36.219956 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dabf24b2-a9e2-4f67-91fd-1625e8ab3196" containerName="extract-content"
Mar 13 14:13:36 crc kubenswrapper[4898]: I0313 14:13:36.219971 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="dabf24b2-a9e2-4f67-91fd-1625e8ab3196" containerName="extract-content"
Mar 13 14:13:36 crc kubenswrapper[4898]: E0313 14:13:36.219986 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dabf24b2-a9e2-4f67-91fd-1625e8ab3196" containerName="registry-server"
Mar 13 14:13:36 crc kubenswrapper[4898]: I0313 14:13:36.219995 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="dabf24b2-a9e2-4f67-91fd-1625e8ab3196" containerName="registry-server"
Mar 13 14:13:36 crc kubenswrapper[4898]: E0313 14:13:36.220005 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dabf24b2-a9e2-4f67-91fd-1625e8ab3196" containerName="extract-utilities"
Mar 13 14:13:36 crc kubenswrapper[4898]: I0313 14:13:36.220014 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="dabf24b2-a9e2-4f67-91fd-1625e8ab3196" containerName="extract-utilities"
Mar 13 14:13:36 crc kubenswrapper[4898]: I0313 14:13:36.220171 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="dabf24b2-a9e2-4f67-91fd-1625e8ab3196" containerName="registry-server"
Mar 13 14:13:36 crc kubenswrapper[4898]: I0313 14:13:36.221406 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tx5wh"
Mar 13 14:13:36 crc kubenswrapper[4898]: I0313 14:13:36.223997 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Mar 13 14:13:36 crc kubenswrapper[4898]: I0313 14:13:36.230676 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tx5wh"]
Mar 13 14:13:36 crc kubenswrapper[4898]: I0313 14:13:36.280230 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrwdv\" (UniqueName: \"kubernetes.io/projected/53800f20-93f5-4ab5-9feb-eb325fa0f945-kube-api-access-zrwdv\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tx5wh\" (UID: \"53800f20-93f5-4ab5-9feb-eb325fa0f945\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tx5wh"
Mar 13 14:13:36 crc kubenswrapper[4898]: I0313 14:13:36.280293 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53800f20-93f5-4ab5-9feb-eb325fa0f945-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tx5wh\" (UID: \"53800f20-93f5-4ab5-9feb-eb325fa0f945\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tx5wh"
Mar 13 14:13:36 crc kubenswrapper[4898]: I0313 14:13:36.280320 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53800f20-93f5-4ab5-9feb-eb325fa0f945-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tx5wh\" (UID: \"53800f20-93f5-4ab5-9feb-eb325fa0f945\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tx5wh"
Mar 13 14:13:36 crc kubenswrapper[4898]: I0313 14:13:36.382040 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrwdv\" (UniqueName: \"kubernetes.io/projected/53800f20-93f5-4ab5-9feb-eb325fa0f945-kube-api-access-zrwdv\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tx5wh\" (UID: \"53800f20-93f5-4ab5-9feb-eb325fa0f945\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tx5wh"
Mar 13 14:13:36 crc kubenswrapper[4898]: I0313 14:13:36.382313 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53800f20-93f5-4ab5-9feb-eb325fa0f945-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tx5wh\" (UID: \"53800f20-93f5-4ab5-9feb-eb325fa0f945\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tx5wh"
Mar 13 14:13:36 crc kubenswrapper[4898]: I0313 14:13:36.382346 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53800f20-93f5-4ab5-9feb-eb325fa0f945-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tx5wh\" (UID: \"53800f20-93f5-4ab5-9feb-eb325fa0f945\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tx5wh"
Mar 13 14:13:36 crc kubenswrapper[4898]: I0313 14:13:36.383058 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53800f20-93f5-4ab5-9feb-eb325fa0f945-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tx5wh\" (UID: \"53800f20-93f5-4ab5-9feb-eb325fa0f945\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tx5wh"
Mar 13 14:13:36 crc kubenswrapper[4898]: I0313 14:13:36.383840 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53800f20-93f5-4ab5-9feb-eb325fa0f945-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tx5wh\" (UID: \"53800f20-93f5-4ab5-9feb-eb325fa0f945\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tx5wh"
Mar 13 14:13:36 crc kubenswrapper[4898]: I0313 14:13:36.404862 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrwdv\" (UniqueName: \"kubernetes.io/projected/53800f20-93f5-4ab5-9feb-eb325fa0f945-kube-api-access-zrwdv\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tx5wh\" (UID: \"53800f20-93f5-4ab5-9feb-eb325fa0f945\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tx5wh"
Mar 13 14:13:36 crc kubenswrapper[4898]: I0313 14:13:36.543600 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tx5wh"
Mar 13 14:13:37 crc kubenswrapper[4898]: I0313 14:13:37.033856 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tx5wh"]
Mar 13 14:13:37 crc kubenswrapper[4898]: I0313 14:13:37.860994 4898 generic.go:334] "Generic (PLEG): container finished" podID="53800f20-93f5-4ab5-9feb-eb325fa0f945" containerID="422f90857314e3fc7868e74b78a746d6e5e6560e9451136c0b0897c2dd1d6ab7" exitCode=0
Mar 13 14:13:37 crc kubenswrapper[4898]: I0313 14:13:37.861054 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tx5wh" event={"ID":"53800f20-93f5-4ab5-9feb-eb325fa0f945","Type":"ContainerDied","Data":"422f90857314e3fc7868e74b78a746d6e5e6560e9451136c0b0897c2dd1d6ab7"}
Mar 13 14:13:37 crc kubenswrapper[4898]: I0313 14:13:37.861089 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tx5wh" event={"ID":"53800f20-93f5-4ab5-9feb-eb325fa0f945","Type":"ContainerStarted","Data":"527fe8e97992861f8440939a2bfd5b861f20af1baa416c51694239c5c3dd1778"}
Mar 13 14:13:39 crc kubenswrapper[4898]: I0313 14:13:39.875477 4898 generic.go:334] "Generic (PLEG): container finished" podID="53800f20-93f5-4ab5-9feb-eb325fa0f945" containerID="002e8c3dd41fac16b9a3c92a467e8e12ac12ce2cbb5f7c5ab5b3b50c26ff7e4a" exitCode=0
Mar 13 14:13:39 crc kubenswrapper[4898]: I0313 14:13:39.875554 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tx5wh" event={"ID":"53800f20-93f5-4ab5-9feb-eb325fa0f945","Type":"ContainerDied","Data":"002e8c3dd41fac16b9a3c92a467e8e12ac12ce2cbb5f7c5ab5b3b50c26ff7e4a"}
Mar 13 14:13:40 crc kubenswrapper[4898]: I0313 14:13:40.884797 4898 generic.go:334] "Generic (PLEG): container finished" podID="53800f20-93f5-4ab5-9feb-eb325fa0f945" containerID="2d43404568d4928eb590843474ccba68a6b5932b3bb07dac9c8d1ba42b0f996e" exitCode=0
Mar 13 14:13:40 crc kubenswrapper[4898]: I0313 14:13:40.884841 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tx5wh" event={"ID":"53800f20-93f5-4ab5-9feb-eb325fa0f945","Type":"ContainerDied","Data":"2d43404568d4928eb590843474ccba68a6b5932b3bb07dac9c8d1ba42b0f996e"}
Mar 13 14:13:42 crc kubenswrapper[4898]: I0313 14:13:42.211178 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tx5wh"
Mar 13 14:13:42 crc kubenswrapper[4898]: I0313 14:13:42.392597 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53800f20-93f5-4ab5-9feb-eb325fa0f945-util\") pod \"53800f20-93f5-4ab5-9feb-eb325fa0f945\" (UID: \"53800f20-93f5-4ab5-9feb-eb325fa0f945\") "
Mar 13 14:13:42 crc kubenswrapper[4898]: I0313 14:13:42.392718 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrwdv\" (UniqueName: \"kubernetes.io/projected/53800f20-93f5-4ab5-9feb-eb325fa0f945-kube-api-access-zrwdv\") pod \"53800f20-93f5-4ab5-9feb-eb325fa0f945\" (UID: \"53800f20-93f5-4ab5-9feb-eb325fa0f945\") "
Mar 13 14:13:42 crc kubenswrapper[4898]: I0313 14:13:42.392812 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53800f20-93f5-4ab5-9feb-eb325fa0f945-bundle\") pod \"53800f20-93f5-4ab5-9feb-eb325fa0f945\" (UID: \"53800f20-93f5-4ab5-9feb-eb325fa0f945\") "
Mar 13 14:13:42 crc kubenswrapper[4898]: I0313 14:13:42.393359 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53800f20-93f5-4ab5-9feb-eb325fa0f945-bundle" (OuterVolumeSpecName: "bundle") pod "53800f20-93f5-4ab5-9feb-eb325fa0f945" (UID: "53800f20-93f5-4ab5-9feb-eb325fa0f945"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 14:13:42 crc kubenswrapper[4898]: I0313 14:13:42.398333 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53800f20-93f5-4ab5-9feb-eb325fa0f945-kube-api-access-zrwdv" (OuterVolumeSpecName: "kube-api-access-zrwdv") pod "53800f20-93f5-4ab5-9feb-eb325fa0f945" (UID: "53800f20-93f5-4ab5-9feb-eb325fa0f945"). InnerVolumeSpecName "kube-api-access-zrwdv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:13:42 crc kubenswrapper[4898]: I0313 14:13:42.406249 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53800f20-93f5-4ab5-9feb-eb325fa0f945-util" (OuterVolumeSpecName: "util") pod "53800f20-93f5-4ab5-9feb-eb325fa0f945" (UID: "53800f20-93f5-4ab5-9feb-eb325fa0f945"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 14:13:42 crc kubenswrapper[4898]: I0313 14:13:42.494439 4898 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53800f20-93f5-4ab5-9feb-eb325fa0f945-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 14:13:42 crc kubenswrapper[4898]: I0313 14:13:42.494679 4898 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53800f20-93f5-4ab5-9feb-eb325fa0f945-util\") on node \"crc\" DevicePath \"\""
Mar 13 14:13:42 crc kubenswrapper[4898]: I0313 14:13:42.494691 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrwdv\" (UniqueName: \"kubernetes.io/projected/53800f20-93f5-4ab5-9feb-eb325fa0f945-kube-api-access-zrwdv\") on node \"crc\" DevicePath \"\""
Mar 13 14:13:42 crc kubenswrapper[4898]: I0313 14:13:42.900487 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tx5wh" event={"ID":"53800f20-93f5-4ab5-9feb-eb325fa0f945","Type":"ContainerDied","Data":"527fe8e97992861f8440939a2bfd5b861f20af1baa416c51694239c5c3dd1778"}
Mar 13 14:13:42 crc kubenswrapper[4898]: I0313 14:13:42.900535 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="527fe8e97992861f8440939a2bfd5b861f20af1baa416c51694239c5c3dd1778"
Mar 13 14:13:42 crc kubenswrapper[4898]: I0313 14:13:42.900609 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tx5wh"
Mar 13 14:13:45 crc kubenswrapper[4898]: I0313 14:13:45.343730 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-hmwt2"]
Mar 13 14:13:45 crc kubenswrapper[4898]: E0313 14:13:45.344017 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53800f20-93f5-4ab5-9feb-eb325fa0f945" containerName="extract"
Mar 13 14:13:45 crc kubenswrapper[4898]: I0313 14:13:45.344029 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="53800f20-93f5-4ab5-9feb-eb325fa0f945" containerName="extract"
Mar 13 14:13:45 crc kubenswrapper[4898]: E0313 14:13:45.344036 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53800f20-93f5-4ab5-9feb-eb325fa0f945" containerName="pull"
Mar 13 14:13:45 crc kubenswrapper[4898]: I0313 14:13:45.344043 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="53800f20-93f5-4ab5-9feb-eb325fa0f945" containerName="pull"
Mar 13 14:13:45 crc kubenswrapper[4898]: E0313 14:13:45.344071 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53800f20-93f5-4ab5-9feb-eb325fa0f945" containerName="util"
Mar 13 14:13:45 crc kubenswrapper[4898]: I0313 14:13:45.344078 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="53800f20-93f5-4ab5-9feb-eb325fa0f945" containerName="util"
Mar 13 14:13:45 crc kubenswrapper[4898]: I0313 14:13:45.344222 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="53800f20-93f5-4ab5-9feb-eb325fa0f945" containerName="extract"
Mar 13 14:13:45 crc kubenswrapper[4898]: I0313 14:13:45.344814 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-hmwt2"
Mar 13 14:13:45 crc kubenswrapper[4898]: I0313 14:13:45.347619 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt"
Mar 13 14:13:45 crc kubenswrapper[4898]: I0313 14:13:45.347618 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt"
Mar 13 14:13:45 crc kubenswrapper[4898]: I0313 14:13:45.348435 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-b8t8b"
Mar 13 14:13:45 crc kubenswrapper[4898]: I0313 14:13:45.360978 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-hmwt2"]
Mar 13 14:13:45 crc kubenswrapper[4898]: I0313 14:13:45.442001 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mg64p\" (UniqueName: \"kubernetes.io/projected/84d4e279-f74c-48fd-9514-1a697341ac6a-kube-api-access-mg64p\") pod \"nmstate-operator-796d4cfff4-hmwt2\" (UID: \"84d4e279-f74c-48fd-9514-1a697341ac6a\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-hmwt2"
Mar 13 14:13:45 crc kubenswrapper[4898]: I0313 14:13:45.543457 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mg64p\" (UniqueName: \"kubernetes.io/projected/84d4e279-f74c-48fd-9514-1a697341ac6a-kube-api-access-mg64p\") pod \"nmstate-operator-796d4cfff4-hmwt2\" (UID: \"84d4e279-f74c-48fd-9514-1a697341ac6a\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-hmwt2"
Mar 13 14:13:45 crc kubenswrapper[4898]: I0313 14:13:45.560451 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mg64p\" (UniqueName: \"kubernetes.io/projected/84d4e279-f74c-48fd-9514-1a697341ac6a-kube-api-access-mg64p\") pod \"nmstate-operator-796d4cfff4-hmwt2\" (UID: \"84d4e279-f74c-48fd-9514-1a697341ac6a\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-hmwt2"
Mar 13 14:13:45 crc kubenswrapper[4898]: I0313 14:13:45.691070 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-hmwt2"
Mar 13 14:13:46 crc kubenswrapper[4898]: I0313 14:13:46.267795 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-hmwt2"]
Mar 13 14:13:46 crc kubenswrapper[4898]: W0313 14:13:46.271412 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84d4e279_f74c_48fd_9514_1a697341ac6a.slice/crio-a7db08a5effaf4f91c8d554abc0403da376e8417dc4b0c1f63250f72d50e88d5 WatchSource:0}: Error finding container a7db08a5effaf4f91c8d554abc0403da376e8417dc4b0c1f63250f72d50e88d5: Status 404 returned error can't find the container with id a7db08a5effaf4f91c8d554abc0403da376e8417dc4b0c1f63250f72d50e88d5
Mar 13 14:13:46 crc kubenswrapper[4898]: I0313 14:13:46.930248 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-hmwt2" event={"ID":"84d4e279-f74c-48fd-9514-1a697341ac6a","Type":"ContainerStarted","Data":"a7db08a5effaf4f91c8d554abc0403da376e8417dc4b0c1f63250f72d50e88d5"}
Mar 13 14:13:49 crc kubenswrapper[4898]: I0313 14:13:49.955616 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-hmwt2" event={"ID":"84d4e279-f74c-48fd-9514-1a697341ac6a","Type":"ContainerStarted","Data":"ffab93626eb911b918b7ea9fec209fc64c303e83994c82fcd6ec8b826f9cc21f"}
Mar 13 14:13:49 crc kubenswrapper[4898]: I0313 14:13:49.980377 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-796d4cfff4-hmwt2" podStartSLOduration=1.966667137 podStartE2EDuration="4.9803488s" podCreationTimestamp="2026-03-13 14:13:45 +0000 UTC" firstStartedPulling="2026-03-13 14:13:46.274486832 +0000 UTC m=+1061.276075071" lastFinishedPulling="2026-03-13 14:13:49.288168495 +0000 UTC m=+1064.289756734" observedRunningTime="2026-03-13 14:13:49.97196283 +0000 UTC m=+1064.973551139" watchObservedRunningTime="2026-03-13 14:13:49.9803488 +0000 UTC m=+1064.981937049"
Mar 13 14:13:50 crc kubenswrapper[4898]: I0313 14:13:50.975148 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-m8j8d"]
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:50.976317 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-m8j8d"
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:50.985206 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-c8fgd"]
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:50.986943 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-c8fgd"
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:50.987616 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-zbnmq"
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:50.987622 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook"
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.002650 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-c8fgd"]
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.012565 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-m8j8d"]
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.012595 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-fpgr7"]
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.013383 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-fpgr7"
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.123686 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tqk2\" (UniqueName: \"kubernetes.io/projected/a9193e72-6911-4df4-8b26-04b2537f68a9-kube-api-access-5tqk2\") pod \"nmstate-webhook-5f558f5558-m8j8d\" (UID: \"a9193e72-6911-4df4-8b26-04b2537f68a9\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-m8j8d"
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.123763 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/e4761153-ed4e-4264-8f21-b4de31a4bbb8-ovs-socket\") pod \"nmstate-handler-fpgr7\" (UID: \"e4761153-ed4e-4264-8f21-b4de31a4bbb8\") " pod="openshift-nmstate/nmstate-handler-fpgr7"
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.123797 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zwrb\" (UniqueName: \"kubernetes.io/projected/e4761153-ed4e-4264-8f21-b4de31a4bbb8-kube-api-access-5zwrb\") pod \"nmstate-handler-fpgr7\" (UID: \"e4761153-ed4e-4264-8f21-b4de31a4bbb8\") " pod="openshift-nmstate/nmstate-handler-fpgr7"
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.123864 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfp9h\" (UniqueName: \"kubernetes.io/projected/35105fc0-dff0-4480-8635-cbbeec82d124-kube-api-access-hfp9h\") pod \"nmstate-metrics-9b8c8685d-c8fgd\" (UID: \"35105fc0-dff0-4480-8635-cbbeec82d124\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-c8fgd"
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.124086 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/e4761153-ed4e-4264-8f21-b4de31a4bbb8-nmstate-lock\") pod \"nmstate-handler-fpgr7\" (UID: \"e4761153-ed4e-4264-8f21-b4de31a4bbb8\") " pod="openshift-nmstate/nmstate-handler-fpgr7"
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.124115 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/e4761153-ed4e-4264-8f21-b4de31a4bbb8-dbus-socket\") pod \"nmstate-handler-fpgr7\" (UID: \"e4761153-ed4e-4264-8f21-b4de31a4bbb8\") " pod="openshift-nmstate/nmstate-handler-fpgr7"
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.124159 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/a9193e72-6911-4df4-8b26-04b2537f68a9-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-m8j8d\" (UID: \"a9193e72-6911-4df4-8b26-04b2537f68a9\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-m8j8d"
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.147445 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-9m8s6"]
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.148549 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-9m8s6"
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.157374 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert"
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.157630 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-49nnv"
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.157696 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf"
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.168425 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-9m8s6"]
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.225581 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/e4761153-ed4e-4264-8f21-b4de31a4bbb8-ovs-socket\") pod \"nmstate-handler-fpgr7\" (UID: \"e4761153-ed4e-4264-8f21-b4de31a4bbb8\") " pod="openshift-nmstate/nmstate-handler-fpgr7"
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.225646 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zwrb\" (UniqueName: \"kubernetes.io/projected/e4761153-ed4e-4264-8f21-b4de31a4bbb8-kube-api-access-5zwrb\") pod \"nmstate-handler-fpgr7\" (UID: \"e4761153-ed4e-4264-8f21-b4de31a4bbb8\") " pod="openshift-nmstate/nmstate-handler-fpgr7"
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.225717 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfp9h\" (UniqueName: \"kubernetes.io/projected/35105fc0-dff0-4480-8635-cbbeec82d124-kube-api-access-hfp9h\") pod \"nmstate-metrics-9b8c8685d-c8fgd\" (UID: \"35105fc0-dff0-4480-8635-cbbeec82d124\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-c8fgd"
Mar 13 14:13:51 crc
kubenswrapper[4898]: I0313 14:13:51.225748 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/e4761153-ed4e-4264-8f21-b4de31a4bbb8-nmstate-lock\") pod \"nmstate-handler-fpgr7\" (UID: \"e4761153-ed4e-4264-8f21-b4de31a4bbb8\") " pod="openshift-nmstate/nmstate-handler-fpgr7" Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.225780 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/e4761153-ed4e-4264-8f21-b4de31a4bbb8-dbus-socket\") pod \"nmstate-handler-fpgr7\" (UID: \"e4761153-ed4e-4264-8f21-b4de31a4bbb8\") " pod="openshift-nmstate/nmstate-handler-fpgr7" Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.225825 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/a9193e72-6911-4df4-8b26-04b2537f68a9-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-m8j8d\" (UID: \"a9193e72-6911-4df4-8b26-04b2537f68a9\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-m8j8d" Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.225886 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tqk2\" (UniqueName: \"kubernetes.io/projected/a9193e72-6911-4df4-8b26-04b2537f68a9-kube-api-access-5tqk2\") pod \"nmstate-webhook-5f558f5558-m8j8d\" (UID: \"a9193e72-6911-4df4-8b26-04b2537f68a9\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-m8j8d" Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.226466 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/e4761153-ed4e-4264-8f21-b4de31a4bbb8-ovs-socket\") pod \"nmstate-handler-fpgr7\" (UID: \"e4761153-ed4e-4264-8f21-b4de31a4bbb8\") " pod="openshift-nmstate/nmstate-handler-fpgr7" Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.226812 4898 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/e4761153-ed4e-4264-8f21-b4de31a4bbb8-nmstate-lock\") pod \"nmstate-handler-fpgr7\" (UID: \"e4761153-ed4e-4264-8f21-b4de31a4bbb8\") " pod="openshift-nmstate/nmstate-handler-fpgr7" Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.227099 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/e4761153-ed4e-4264-8f21-b4de31a4bbb8-dbus-socket\") pod \"nmstate-handler-fpgr7\" (UID: \"e4761153-ed4e-4264-8f21-b4de31a4bbb8\") " pod="openshift-nmstate/nmstate-handler-fpgr7" Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.234952 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/a9193e72-6911-4df4-8b26-04b2537f68a9-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-m8j8d\" (UID: \"a9193e72-6911-4df4-8b26-04b2537f68a9\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-m8j8d" Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.247107 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zwrb\" (UniqueName: \"kubernetes.io/projected/e4761153-ed4e-4264-8f21-b4de31a4bbb8-kube-api-access-5zwrb\") pod \"nmstate-handler-fpgr7\" (UID: \"e4761153-ed4e-4264-8f21-b4de31a4bbb8\") " pod="openshift-nmstate/nmstate-handler-fpgr7" Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.249093 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tqk2\" (UniqueName: \"kubernetes.io/projected/a9193e72-6911-4df4-8b26-04b2537f68a9-kube-api-access-5tqk2\") pod \"nmstate-webhook-5f558f5558-m8j8d\" (UID: \"a9193e72-6911-4df4-8b26-04b2537f68a9\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-m8j8d" Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.277636 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-hfp9h\" (UniqueName: \"kubernetes.io/projected/35105fc0-dff0-4480-8635-cbbeec82d124-kube-api-access-hfp9h\") pod \"nmstate-metrics-9b8c8685d-c8fgd\" (UID: \"35105fc0-dff0-4480-8635-cbbeec82d124\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-c8fgd" Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.327563 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/b707c4ee-39e1-4fc6-812a-f61e722c1079-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-9m8s6\" (UID: \"b707c4ee-39e1-4fc6-812a-f61e722c1079\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-9m8s6" Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.327681 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjdb9\" (UniqueName: \"kubernetes.io/projected/b707c4ee-39e1-4fc6-812a-f61e722c1079-kube-api-access-pjdb9\") pod \"nmstate-console-plugin-86f58fcf4-9m8s6\" (UID: \"b707c4ee-39e1-4fc6-812a-f61e722c1079\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-9m8s6" Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.327704 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b707c4ee-39e1-4fc6-812a-f61e722c1079-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-9m8s6\" (UID: \"b707c4ee-39e1-4fc6-812a-f61e722c1079\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-9m8s6" Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.339288 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-m8j8d" Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.356314 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-6ddbb5776b-mx8sz"] Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.357420 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6ddbb5776b-mx8sz" Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.359479 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-c8fgd" Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.368223 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-fpgr7" Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.379421 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6ddbb5776b-mx8sz"] Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.429261 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/b707c4ee-39e1-4fc6-812a-f61e722c1079-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-9m8s6\" (UID: \"b707c4ee-39e1-4fc6-812a-f61e722c1079\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-9m8s6" Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.429428 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjdb9\" (UniqueName: \"kubernetes.io/projected/b707c4ee-39e1-4fc6-812a-f61e722c1079-kube-api-access-pjdb9\") pod \"nmstate-console-plugin-86f58fcf4-9m8s6\" (UID: \"b707c4ee-39e1-4fc6-812a-f61e722c1079\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-9m8s6" Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.429474 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" 
(UniqueName: \"kubernetes.io/configmap/b707c4ee-39e1-4fc6-812a-f61e722c1079-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-9m8s6\" (UID: \"b707c4ee-39e1-4fc6-812a-f61e722c1079\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-9m8s6" Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.430861 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b707c4ee-39e1-4fc6-812a-f61e722c1079-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-9m8s6\" (UID: \"b707c4ee-39e1-4fc6-812a-f61e722c1079\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-9m8s6" Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.433773 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/b707c4ee-39e1-4fc6-812a-f61e722c1079-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-9m8s6\" (UID: \"b707c4ee-39e1-4fc6-812a-f61e722c1079\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-9m8s6" Mar 13 14:13:51 crc kubenswrapper[4898]: W0313 14:13:51.440050 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4761153_ed4e_4264_8f21_b4de31a4bbb8.slice/crio-f3cf77f913a23536b768c77e85d72ab96f7623b941d7622636b49e942dcd6386 WatchSource:0}: Error finding container f3cf77f913a23536b768c77e85d72ab96f7623b941d7622636b49e942dcd6386: Status 404 returned error can't find the container with id f3cf77f913a23536b768c77e85d72ab96f7623b941d7622636b49e942dcd6386 Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.452806 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjdb9\" (UniqueName: \"kubernetes.io/projected/b707c4ee-39e1-4fc6-812a-f61e722c1079-kube-api-access-pjdb9\") pod \"nmstate-console-plugin-86f58fcf4-9m8s6\" (UID: \"b707c4ee-39e1-4fc6-812a-f61e722c1079\") " 
pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-9m8s6" Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.471092 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-9m8s6" Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.530843 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-trusted-ca-bundle\") pod \"console-6ddbb5776b-mx8sz\" (UID: \"8f6fd2de-efa6-4d17-aa5e-4f44ced1f822\") " pod="openshift-console/console-6ddbb5776b-mx8sz" Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.530891 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-oauth-serving-cert\") pod \"console-6ddbb5776b-mx8sz\" (UID: \"8f6fd2de-efa6-4d17-aa5e-4f44ced1f822\") " pod="openshift-console/console-6ddbb5776b-mx8sz" Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.531096 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98jnr\" (UniqueName: \"kubernetes.io/projected/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-kube-api-access-98jnr\") pod \"console-6ddbb5776b-mx8sz\" (UID: \"8f6fd2de-efa6-4d17-aa5e-4f44ced1f822\") " pod="openshift-console/console-6ddbb5776b-mx8sz" Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.531151 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-console-config\") pod \"console-6ddbb5776b-mx8sz\" (UID: \"8f6fd2de-efa6-4d17-aa5e-4f44ced1f822\") " pod="openshift-console/console-6ddbb5776b-mx8sz" Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.531173 4898 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-service-ca\") pod \"console-6ddbb5776b-mx8sz\" (UID: \"8f6fd2de-efa6-4d17-aa5e-4f44ced1f822\") " pod="openshift-console/console-6ddbb5776b-mx8sz" Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.531210 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-console-serving-cert\") pod \"console-6ddbb5776b-mx8sz\" (UID: \"8f6fd2de-efa6-4d17-aa5e-4f44ced1f822\") " pod="openshift-console/console-6ddbb5776b-mx8sz" Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.531262 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-console-oauth-config\") pod \"console-6ddbb5776b-mx8sz\" (UID: \"8f6fd2de-efa6-4d17-aa5e-4f44ced1f822\") " pod="openshift-console/console-6ddbb5776b-mx8sz" Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.633016 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98jnr\" (UniqueName: \"kubernetes.io/projected/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-kube-api-access-98jnr\") pod \"console-6ddbb5776b-mx8sz\" (UID: \"8f6fd2de-efa6-4d17-aa5e-4f44ced1f822\") " pod="openshift-console/console-6ddbb5776b-mx8sz" Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.633112 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-console-config\") pod \"console-6ddbb5776b-mx8sz\" (UID: \"8f6fd2de-efa6-4d17-aa5e-4f44ced1f822\") " pod="openshift-console/console-6ddbb5776b-mx8sz" Mar 13 14:13:51 crc kubenswrapper[4898]: 
I0313 14:13:51.633139 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-service-ca\") pod \"console-6ddbb5776b-mx8sz\" (UID: \"8f6fd2de-efa6-4d17-aa5e-4f44ced1f822\") " pod="openshift-console/console-6ddbb5776b-mx8sz" Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.633187 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-console-serving-cert\") pod \"console-6ddbb5776b-mx8sz\" (UID: \"8f6fd2de-efa6-4d17-aa5e-4f44ced1f822\") " pod="openshift-console/console-6ddbb5776b-mx8sz" Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.633250 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-console-oauth-config\") pod \"console-6ddbb5776b-mx8sz\" (UID: \"8f6fd2de-efa6-4d17-aa5e-4f44ced1f822\") " pod="openshift-console/console-6ddbb5776b-mx8sz" Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.633284 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-trusted-ca-bundle\") pod \"console-6ddbb5776b-mx8sz\" (UID: \"8f6fd2de-efa6-4d17-aa5e-4f44ced1f822\") " pod="openshift-console/console-6ddbb5776b-mx8sz" Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.633305 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-oauth-serving-cert\") pod \"console-6ddbb5776b-mx8sz\" (UID: \"8f6fd2de-efa6-4d17-aa5e-4f44ced1f822\") " pod="openshift-console/console-6ddbb5776b-mx8sz" Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.634428 4898 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-service-ca\") pod \"console-6ddbb5776b-mx8sz\" (UID: \"8f6fd2de-efa6-4d17-aa5e-4f44ced1f822\") " pod="openshift-console/console-6ddbb5776b-mx8sz" Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.634727 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-console-config\") pod \"console-6ddbb5776b-mx8sz\" (UID: \"8f6fd2de-efa6-4d17-aa5e-4f44ced1f822\") " pod="openshift-console/console-6ddbb5776b-mx8sz" Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.635220 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-oauth-serving-cert\") pod \"console-6ddbb5776b-mx8sz\" (UID: \"8f6fd2de-efa6-4d17-aa5e-4f44ced1f822\") " pod="openshift-console/console-6ddbb5776b-mx8sz" Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.635245 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-trusted-ca-bundle\") pod \"console-6ddbb5776b-mx8sz\" (UID: \"8f6fd2de-efa6-4d17-aa5e-4f44ced1f822\") " pod="openshift-console/console-6ddbb5776b-mx8sz" Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.649226 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-console-serving-cert\") pod \"console-6ddbb5776b-mx8sz\" (UID: \"8f6fd2de-efa6-4d17-aa5e-4f44ced1f822\") " pod="openshift-console/console-6ddbb5776b-mx8sz" Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.649412 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-console-oauth-config\") pod \"console-6ddbb5776b-mx8sz\" (UID: \"8f6fd2de-efa6-4d17-aa5e-4f44ced1f822\") " pod="openshift-console/console-6ddbb5776b-mx8sz" Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.652343 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98jnr\" (UniqueName: \"kubernetes.io/projected/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-kube-api-access-98jnr\") pod \"console-6ddbb5776b-mx8sz\" (UID: \"8f6fd2de-efa6-4d17-aa5e-4f44ced1f822\") " pod="openshift-console/console-6ddbb5776b-mx8sz" Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.733787 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6ddbb5776b-mx8sz" Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.879067 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-m8j8d"] Mar 13 14:13:51 crc kubenswrapper[4898]: W0313 14:13:51.889148 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9193e72_6911_4df4_8b26_04b2537f68a9.slice/crio-c11c0b93f3e43194c0a8f9c279c5835e0bb1a5fbe1e76e42b208c898761e9079 WatchSource:0}: Error finding container c11c0b93f3e43194c0a8f9c279c5835e0bb1a5fbe1e76e42b208c898761e9079: Status 404 returned error can't find the container with id c11c0b93f3e43194c0a8f9c279c5835e0bb1a5fbe1e76e42b208c898761e9079 Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.926574 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-c8fgd"] Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.980187 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-m8j8d" 
event={"ID":"a9193e72-6911-4df4-8b26-04b2537f68a9","Type":"ContainerStarted","Data":"c11c0b93f3e43194c0a8f9c279c5835e0bb1a5fbe1e76e42b208c898761e9079"} Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.982370 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-fpgr7" event={"ID":"e4761153-ed4e-4264-8f21-b4de31a4bbb8","Type":"ContainerStarted","Data":"f3cf77f913a23536b768c77e85d72ab96f7623b941d7622636b49e942dcd6386"} Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.983432 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-c8fgd" event={"ID":"35105fc0-dff0-4480-8635-cbbeec82d124","Type":"ContainerStarted","Data":"d8eb96047c72aec16ebf70fe77d6b642481a3aa674e6c39193660ebe147346e9"} Mar 13 14:13:52 crc kubenswrapper[4898]: I0313 14:13:52.013824 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-9m8s6"] Mar 13 14:13:52 crc kubenswrapper[4898]: I0313 14:13:52.180610 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6ddbb5776b-mx8sz"] Mar 13 14:13:52 crc kubenswrapper[4898]: W0313 14:13:52.184514 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f6fd2de_efa6_4d17_aa5e_4f44ced1f822.slice/crio-25f0f90d86df62bf31d201c4dfd7dca1ef0e3998bd4fd076756d1a9449231afa WatchSource:0}: Error finding container 25f0f90d86df62bf31d201c4dfd7dca1ef0e3998bd4fd076756d1a9449231afa: Status 404 returned error can't find the container with id 25f0f90d86df62bf31d201c4dfd7dca1ef0e3998bd4fd076756d1a9449231afa Mar 13 14:13:52 crc kubenswrapper[4898]: I0313 14:13:52.994938 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-9m8s6" 
event={"ID":"b707c4ee-39e1-4fc6-812a-f61e722c1079","Type":"ContainerStarted","Data":"4f5b664ce3a49a4a9347e418a7f4e569dd9d364a61493834d164b5cf58547792"} Mar 13 14:13:52 crc kubenswrapper[4898]: I0313 14:13:52.996219 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6ddbb5776b-mx8sz" event={"ID":"8f6fd2de-efa6-4d17-aa5e-4f44ced1f822","Type":"ContainerStarted","Data":"5fefdf0ffc03648b76b936160e7ddb5fac4056b104b304ca9509ff6217a2c4fc"} Mar 13 14:13:52 crc kubenswrapper[4898]: I0313 14:13:52.996261 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6ddbb5776b-mx8sz" event={"ID":"8f6fd2de-efa6-4d17-aa5e-4f44ced1f822","Type":"ContainerStarted","Data":"25f0f90d86df62bf31d201c4dfd7dca1ef0e3998bd4fd076756d1a9449231afa"} Mar 13 14:13:53 crc kubenswrapper[4898]: I0313 14:13:53.023424 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6ddbb5776b-mx8sz" podStartSLOduration=2.023407525 podStartE2EDuration="2.023407525s" podCreationTimestamp="2026-03-13 14:13:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:13:53.021328751 +0000 UTC m=+1068.022917000" watchObservedRunningTime="2026-03-13 14:13:53.023407525 +0000 UTC m=+1068.024995774" Mar 13 14:13:56 crc kubenswrapper[4898]: I0313 14:13:56.049580 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-fpgr7" event={"ID":"e4761153-ed4e-4264-8f21-b4de31a4bbb8","Type":"ContainerStarted","Data":"65a143f64e3c36ce5848d2ec35e4a19d110bc74a2f71a90fed40c46bcaecaa29"} Mar 13 14:13:56 crc kubenswrapper[4898]: I0313 14:13:56.049943 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-fpgr7" Mar 13 14:13:56 crc kubenswrapper[4898]: I0313 14:13:56.051348 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-9m8s6" event={"ID":"b707c4ee-39e1-4fc6-812a-f61e722c1079","Type":"ContainerStarted","Data":"67a5158448d0e030c1a30934e7dd23db401ce35e94a196becdb13aa6078a4b98"} Mar 13 14:13:56 crc kubenswrapper[4898]: I0313 14:13:56.053727 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-c8fgd" event={"ID":"35105fc0-dff0-4480-8635-cbbeec82d124","Type":"ContainerStarted","Data":"3e7a7b37583b1e5e3b8f111f7438b8d95b9b9b3c1c267e26122804512d690869"} Mar 13 14:13:56 crc kubenswrapper[4898]: I0313 14:13:56.054887 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-m8j8d" event={"ID":"a9193e72-6911-4df4-8b26-04b2537f68a9","Type":"ContainerStarted","Data":"b3f66f08bca8f7bcd172456023ef18bc5bdca02a16923c5dda15fb815f41cda5"} Mar 13 14:13:56 crc kubenswrapper[4898]: I0313 14:13:56.055069 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-m8j8d" Mar 13 14:13:56 crc kubenswrapper[4898]: I0313 14:13:56.092379 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-fpgr7" podStartSLOduration=2.292144794 podStartE2EDuration="6.092359761s" podCreationTimestamp="2026-03-13 14:13:50 +0000 UTC" firstStartedPulling="2026-03-13 14:13:51.454966881 +0000 UTC m=+1066.456555120" lastFinishedPulling="2026-03-13 14:13:55.255181848 +0000 UTC m=+1070.256770087" observedRunningTime="2026-03-13 14:13:56.087346779 +0000 UTC m=+1071.088935038" watchObservedRunningTime="2026-03-13 14:13:56.092359761 +0000 UTC m=+1071.093948000" Mar 13 14:13:56 crc kubenswrapper[4898]: I0313 14:13:56.121494 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-m8j8d" podStartSLOduration=2.788425762 podStartE2EDuration="6.121467025s" podCreationTimestamp="2026-03-13 14:13:50 +0000 UTC" 
firstStartedPulling="2026-03-13 14:13:51.89305419 +0000 UTC m=+1066.894642429" lastFinishedPulling="2026-03-13 14:13:55.226095453 +0000 UTC m=+1070.227683692" observedRunningTime="2026-03-13 14:13:56.112183211 +0000 UTC m=+1071.113771450" watchObservedRunningTime="2026-03-13 14:13:56.121467025 +0000 UTC m=+1071.123055274"
Mar 13 14:13:56 crc kubenswrapper[4898]: I0313 14:13:56.133935 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-9m8s6" podStartSLOduration=1.92650786 podStartE2EDuration="5.133883152s" podCreationTimestamp="2026-03-13 14:13:51 +0000 UTC" firstStartedPulling="2026-03-13 14:13:52.017812968 +0000 UTC m=+1067.019401207" lastFinishedPulling="2026-03-13 14:13:55.22518826 +0000 UTC m=+1070.226776499" observedRunningTime="2026-03-13 14:13:56.127015451 +0000 UTC m=+1071.128603690" watchObservedRunningTime="2026-03-13 14:13:56.133883152 +0000 UTC m=+1071.135471401"
Mar 13 14:13:59 crc kubenswrapper[4898]: I0313 14:13:59.103418 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-c8fgd" event={"ID":"35105fc0-dff0-4480-8635-cbbeec82d124","Type":"ContainerStarted","Data":"b3ef85f675b604ef99b0a9b009e4b0d613504667c6746bbb6830e5276d1293d7"}
Mar 13 14:13:59 crc kubenswrapper[4898]: I0313 14:13:59.130582 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-c8fgd" podStartSLOduration=2.72853263 podStartE2EDuration="9.130553609s" podCreationTimestamp="2026-03-13 14:13:50 +0000 UTC" firstStartedPulling="2026-03-13 14:13:51.936743408 +0000 UTC m=+1066.938331647" lastFinishedPulling="2026-03-13 14:13:58.338764387 +0000 UTC m=+1073.340352626" observedRunningTime="2026-03-13 14:13:59.12717982 +0000 UTC m=+1074.128768109" watchObservedRunningTime="2026-03-13 14:13:59.130553609 +0000 UTC m=+1074.132141898"
Mar 13 14:14:00 crc kubenswrapper[4898]: I0313 14:14:00.139644 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556854-6mxhz"]
Mar 13 14:14:00 crc kubenswrapper[4898]: I0313 14:14:00.141266 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556854-6mxhz"
Mar 13 14:14:00 crc kubenswrapper[4898]: I0313 14:14:00.144175 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 13 14:14:00 crc kubenswrapper[4898]: I0313 14:14:00.144415 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 13 14:14:00 crc kubenswrapper[4898]: I0313 14:14:00.145632 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps"
Mar 13 14:14:00 crc kubenswrapper[4898]: I0313 14:14:00.153322 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556854-6mxhz"]
Mar 13 14:14:00 crc kubenswrapper[4898]: I0313 14:14:00.305539 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xz5zd\" (UniqueName: \"kubernetes.io/projected/35372caa-772c-434c-8fb2-3b82926c1521-kube-api-access-xz5zd\") pod \"auto-csr-approver-29556854-6mxhz\" (UID: \"35372caa-772c-434c-8fb2-3b82926c1521\") " pod="openshift-infra/auto-csr-approver-29556854-6mxhz"
Mar 13 14:14:00 crc kubenswrapper[4898]: I0313 14:14:00.407501 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xz5zd\" (UniqueName: \"kubernetes.io/projected/35372caa-772c-434c-8fb2-3b82926c1521-kube-api-access-xz5zd\") pod \"auto-csr-approver-29556854-6mxhz\" (UID: \"35372caa-772c-434c-8fb2-3b82926c1521\") " pod="openshift-infra/auto-csr-approver-29556854-6mxhz"
Mar 13 14:14:00 crc kubenswrapper[4898]: I0313 14:14:00.429843 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xz5zd\" (UniqueName: \"kubernetes.io/projected/35372caa-772c-434c-8fb2-3b82926c1521-kube-api-access-xz5zd\") pod \"auto-csr-approver-29556854-6mxhz\" (UID: \"35372caa-772c-434c-8fb2-3b82926c1521\") " pod="openshift-infra/auto-csr-approver-29556854-6mxhz"
Mar 13 14:14:00 crc kubenswrapper[4898]: I0313 14:14:00.464434 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556854-6mxhz"
Mar 13 14:14:00 crc kubenswrapper[4898]: I0313 14:14:00.941758 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556854-6mxhz"]
Mar 13 14:14:01 crc kubenswrapper[4898]: I0313 14:14:01.122036 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556854-6mxhz" event={"ID":"35372caa-772c-434c-8fb2-3b82926c1521","Type":"ContainerStarted","Data":"12a8614ae5db33ffe748f5edebb3d67a71dd9d1797f096a66e2d95044041c69a"}
Mar 13 14:14:01 crc kubenswrapper[4898]: I0313 14:14:01.404146 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-fpgr7"
Mar 13 14:14:01 crc kubenswrapper[4898]: I0313 14:14:01.735039 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6ddbb5776b-mx8sz"
Mar 13 14:14:01 crc kubenswrapper[4898]: I0313 14:14:01.735141 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-6ddbb5776b-mx8sz"
Mar 13 14:14:01 crc kubenswrapper[4898]: I0313 14:14:01.750834 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6ddbb5776b-mx8sz"
Mar 13 14:14:02 crc kubenswrapper[4898]: I0313 14:14:02.129284 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556854-6mxhz" event={"ID":"35372caa-772c-434c-8fb2-3b82926c1521","Type":"ContainerStarted","Data":"ad86fe4efa1fa3496cbed8d6aa93dada393bba49f5fe2d8062f7e0508875ea38"}
Mar 13 14:14:02 crc kubenswrapper[4898]: I0313 14:14:02.134220 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6ddbb5776b-mx8sz"
Mar 13 14:14:02 crc kubenswrapper[4898]: I0313 14:14:02.143963 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556854-6mxhz" podStartSLOduration=1.354426773 podStartE2EDuration="2.143938114s" podCreationTimestamp="2026-03-13 14:14:00 +0000 UTC" firstStartedPulling="2026-03-13 14:14:00.948357255 +0000 UTC m=+1075.949945494" lastFinishedPulling="2026-03-13 14:14:01.737868596 +0000 UTC m=+1076.739456835" observedRunningTime="2026-03-13 14:14:02.14072379 +0000 UTC m=+1077.142312029" watchObservedRunningTime="2026-03-13 14:14:02.143938114 +0000 UTC m=+1077.145526353"
Mar 13 14:14:02 crc kubenswrapper[4898]: I0313 14:14:02.207786 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-758c8fb5b-pxts9"]
Mar 13 14:14:03 crc kubenswrapper[4898]: I0313 14:14:03.136530 4898 generic.go:334] "Generic (PLEG): container finished" podID="35372caa-772c-434c-8fb2-3b82926c1521" containerID="ad86fe4efa1fa3496cbed8d6aa93dada393bba49f5fe2d8062f7e0508875ea38" exitCode=0
Mar 13 14:14:03 crc kubenswrapper[4898]: I0313 14:14:03.136644 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556854-6mxhz" event={"ID":"35372caa-772c-434c-8fb2-3b82926c1521","Type":"ContainerDied","Data":"ad86fe4efa1fa3496cbed8d6aa93dada393bba49f5fe2d8062f7e0508875ea38"}
Mar 13 14:14:04 crc kubenswrapper[4898]: I0313 14:14:04.481340 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556854-6mxhz"
Mar 13 14:14:04 crc kubenswrapper[4898]: I0313 14:14:04.584396 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xz5zd\" (UniqueName: \"kubernetes.io/projected/35372caa-772c-434c-8fb2-3b82926c1521-kube-api-access-xz5zd\") pod \"35372caa-772c-434c-8fb2-3b82926c1521\" (UID: \"35372caa-772c-434c-8fb2-3b82926c1521\") "
Mar 13 14:14:04 crc kubenswrapper[4898]: I0313 14:14:04.592891 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35372caa-772c-434c-8fb2-3b82926c1521-kube-api-access-xz5zd" (OuterVolumeSpecName: "kube-api-access-xz5zd") pod "35372caa-772c-434c-8fb2-3b82926c1521" (UID: "35372caa-772c-434c-8fb2-3b82926c1521"). InnerVolumeSpecName "kube-api-access-xz5zd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:14:04 crc kubenswrapper[4898]: I0313 14:14:04.686614 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xz5zd\" (UniqueName: \"kubernetes.io/projected/35372caa-772c-434c-8fb2-3b82926c1521-kube-api-access-xz5zd\") on node \"crc\" DevicePath \"\""
Mar 13 14:14:05 crc kubenswrapper[4898]: I0313 14:14:05.156669 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556854-6mxhz" event={"ID":"35372caa-772c-434c-8fb2-3b82926c1521","Type":"ContainerDied","Data":"12a8614ae5db33ffe748f5edebb3d67a71dd9d1797f096a66e2d95044041c69a"}
Mar 13 14:14:05 crc kubenswrapper[4898]: I0313 14:14:05.157054 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12a8614ae5db33ffe748f5edebb3d67a71dd9d1797f096a66e2d95044041c69a"
Mar 13 14:14:05 crc kubenswrapper[4898]: I0313 14:14:05.156750 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556854-6mxhz"
Mar 13 14:14:05 crc kubenswrapper[4898]: I0313 14:14:05.216487 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556848-wlplx"]
Mar 13 14:14:05 crc kubenswrapper[4898]: I0313 14:14:05.225686 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556848-wlplx"]
Mar 13 14:14:05 crc kubenswrapper[4898]: I0313 14:14:05.754900 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe4a848e-c06e-4205-a1a6-8b14b620096c" path="/var/lib/kubelet/pods/fe4a848e-c06e-4205-a1a6-8b14b620096c/volumes"
Mar 13 14:14:11 crc kubenswrapper[4898]: I0313 14:14:11.347926 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-m8j8d"
Mar 13 14:14:27 crc kubenswrapper[4898]: I0313 14:14:27.253133 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-758c8fb5b-pxts9" podUID="571e1a76-1585-4c39-887c-d9c3f735a908" containerName="console" containerID="cri-o://4cfca99c86a53c5141c727b6fd37b0d688489277e5d5aa3a145d30faadc4d08d" gracePeriod=15
Mar 13 14:14:27 crc kubenswrapper[4898]: I0313 14:14:27.683607 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-758c8fb5b-pxts9_571e1a76-1585-4c39-887c-d9c3f735a908/console/0.log"
Mar 13 14:14:27 crc kubenswrapper[4898]: I0313 14:14:27.683668 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-758c8fb5b-pxts9"
Mar 13 14:14:27 crc kubenswrapper[4898]: I0313 14:14:27.840604 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/571e1a76-1585-4c39-887c-d9c3f735a908-console-config\") pod \"571e1a76-1585-4c39-887c-d9c3f735a908\" (UID: \"571e1a76-1585-4c39-887c-d9c3f735a908\") "
Mar 13 14:14:27 crc kubenswrapper[4898]: I0313 14:14:27.840843 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twb6p\" (UniqueName: \"kubernetes.io/projected/571e1a76-1585-4c39-887c-d9c3f735a908-kube-api-access-twb6p\") pod \"571e1a76-1585-4c39-887c-d9c3f735a908\" (UID: \"571e1a76-1585-4c39-887c-d9c3f735a908\") "
Mar 13 14:14:27 crc kubenswrapper[4898]: I0313 14:14:27.840889 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/571e1a76-1585-4c39-887c-d9c3f735a908-service-ca\") pod \"571e1a76-1585-4c39-887c-d9c3f735a908\" (UID: \"571e1a76-1585-4c39-887c-d9c3f735a908\") "
Mar 13 14:14:27 crc kubenswrapper[4898]: I0313 14:14:27.840969 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/571e1a76-1585-4c39-887c-d9c3f735a908-trusted-ca-bundle\") pod \"571e1a76-1585-4c39-887c-d9c3f735a908\" (UID: \"571e1a76-1585-4c39-887c-d9c3f735a908\") "
Mar 13 14:14:27 crc kubenswrapper[4898]: I0313 14:14:27.840986 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/571e1a76-1585-4c39-887c-d9c3f735a908-oauth-serving-cert\") pod \"571e1a76-1585-4c39-887c-d9c3f735a908\" (UID: \"571e1a76-1585-4c39-887c-d9c3f735a908\") "
Mar 13 14:14:27 crc kubenswrapper[4898]: I0313 14:14:27.841007 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/571e1a76-1585-4c39-887c-d9c3f735a908-console-oauth-config\") pod \"571e1a76-1585-4c39-887c-d9c3f735a908\" (UID: \"571e1a76-1585-4c39-887c-d9c3f735a908\") "
Mar 13 14:14:27 crc kubenswrapper[4898]: I0313 14:14:27.841030 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/571e1a76-1585-4c39-887c-d9c3f735a908-console-serving-cert\") pod \"571e1a76-1585-4c39-887c-d9c3f735a908\" (UID: \"571e1a76-1585-4c39-887c-d9c3f735a908\") "
Mar 13 14:14:27 crc kubenswrapper[4898]: I0313 14:14:27.841428 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/571e1a76-1585-4c39-887c-d9c3f735a908-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "571e1a76-1585-4c39-887c-d9c3f735a908" (UID: "571e1a76-1585-4c39-887c-d9c3f735a908"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 14:14:27 crc kubenswrapper[4898]: I0313 14:14:27.841450 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/571e1a76-1585-4c39-887c-d9c3f735a908-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "571e1a76-1585-4c39-887c-d9c3f735a908" (UID: "571e1a76-1585-4c39-887c-d9c3f735a908"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 14:14:27 crc kubenswrapper[4898]: I0313 14:14:27.841489 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/571e1a76-1585-4c39-887c-d9c3f735a908-console-config" (OuterVolumeSpecName: "console-config") pod "571e1a76-1585-4c39-887c-d9c3f735a908" (UID: "571e1a76-1585-4c39-887c-d9c3f735a908"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 14:14:27 crc kubenswrapper[4898]: I0313 14:14:27.841693 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/571e1a76-1585-4c39-887c-d9c3f735a908-service-ca" (OuterVolumeSpecName: "service-ca") pod "571e1a76-1585-4c39-887c-d9c3f735a908" (UID: "571e1a76-1585-4c39-887c-d9c3f735a908"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 14:14:27 crc kubenswrapper[4898]: I0313 14:14:27.845398 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/571e1a76-1585-4c39-887c-d9c3f735a908-kube-api-access-twb6p" (OuterVolumeSpecName: "kube-api-access-twb6p") pod "571e1a76-1585-4c39-887c-d9c3f735a908" (UID: "571e1a76-1585-4c39-887c-d9c3f735a908"). InnerVolumeSpecName "kube-api-access-twb6p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:14:27 crc kubenswrapper[4898]: I0313 14:14:27.852079 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/571e1a76-1585-4c39-887c-d9c3f735a908-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "571e1a76-1585-4c39-887c-d9c3f735a908" (UID: "571e1a76-1585-4c39-887c-d9c3f735a908"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:14:27 crc kubenswrapper[4898]: I0313 14:14:27.863189 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/571e1a76-1585-4c39-887c-d9c3f735a908-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "571e1a76-1585-4c39-887c-d9c3f735a908" (UID: "571e1a76-1585-4c39-887c-d9c3f735a908"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:14:27 crc kubenswrapper[4898]: I0313 14:14:27.945326 4898 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/571e1a76-1585-4c39-887c-d9c3f735a908-service-ca\") on node \"crc\" DevicePath \"\""
Mar 13 14:14:27 crc kubenswrapper[4898]: I0313 14:14:27.945365 4898 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/571e1a76-1585-4c39-887c-d9c3f735a908-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 14:14:27 crc kubenswrapper[4898]: I0313 14:14:27.945379 4898 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/571e1a76-1585-4c39-887c-d9c3f735a908-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 13 14:14:27 crc kubenswrapper[4898]: I0313 14:14:27.945389 4898 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/571e1a76-1585-4c39-887c-d9c3f735a908-console-oauth-config\") on node \"crc\" DevicePath \"\""
Mar 13 14:14:27 crc kubenswrapper[4898]: I0313 14:14:27.945401 4898 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/571e1a76-1585-4c39-887c-d9c3f735a908-console-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 13 14:14:27 crc kubenswrapper[4898]: I0313 14:14:27.945414 4898 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/571e1a76-1585-4c39-887c-d9c3f735a908-console-config\") on node \"crc\" DevicePath \"\""
Mar 13 14:14:27 crc kubenswrapper[4898]: I0313 14:14:27.945426 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twb6p\" (UniqueName: \"kubernetes.io/projected/571e1a76-1585-4c39-887c-d9c3f735a908-kube-api-access-twb6p\") on node \"crc\" DevicePath \"\""
Mar 13 14:14:28 crc kubenswrapper[4898]: I0313 14:14:28.367066 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-758c8fb5b-pxts9_571e1a76-1585-4c39-887c-d9c3f735a908/console/0.log"
Mar 13 14:14:28 crc kubenswrapper[4898]: I0313 14:14:28.367123 4898 generic.go:334] "Generic (PLEG): container finished" podID="571e1a76-1585-4c39-887c-d9c3f735a908" containerID="4cfca99c86a53c5141c727b6fd37b0d688489277e5d5aa3a145d30faadc4d08d" exitCode=2
Mar 13 14:14:28 crc kubenswrapper[4898]: I0313 14:14:28.367166 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-758c8fb5b-pxts9" event={"ID":"571e1a76-1585-4c39-887c-d9c3f735a908","Type":"ContainerDied","Data":"4cfca99c86a53c5141c727b6fd37b0d688489277e5d5aa3a145d30faadc4d08d"}
Mar 13 14:14:28 crc kubenswrapper[4898]: I0313 14:14:28.367199 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-758c8fb5b-pxts9"
Mar 13 14:14:28 crc kubenswrapper[4898]: I0313 14:14:28.367225 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-758c8fb5b-pxts9" event={"ID":"571e1a76-1585-4c39-887c-d9c3f735a908","Type":"ContainerDied","Data":"4b09f73c7fa831fe94f3a344d5bf8593ff107c618a4ee0a2a0be061afa612208"}
Mar 13 14:14:28 crc kubenswrapper[4898]: I0313 14:14:28.367248 4898 scope.go:117] "RemoveContainer" containerID="4cfca99c86a53c5141c727b6fd37b0d688489277e5d5aa3a145d30faadc4d08d"
Mar 13 14:14:28 crc kubenswrapper[4898]: I0313 14:14:28.402024 4898 scope.go:117] "RemoveContainer" containerID="4cfca99c86a53c5141c727b6fd37b0d688489277e5d5aa3a145d30faadc4d08d"
Mar 13 14:14:28 crc kubenswrapper[4898]: E0313 14:14:28.402442 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4cfca99c86a53c5141c727b6fd37b0d688489277e5d5aa3a145d30faadc4d08d\": container with ID starting with 4cfca99c86a53c5141c727b6fd37b0d688489277e5d5aa3a145d30faadc4d08d not found: ID does not exist" containerID="4cfca99c86a53c5141c727b6fd37b0d688489277e5d5aa3a145d30faadc4d08d"
Mar 13 14:14:28 crc kubenswrapper[4898]: I0313 14:14:28.402491 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cfca99c86a53c5141c727b6fd37b0d688489277e5d5aa3a145d30faadc4d08d"} err="failed to get container status \"4cfca99c86a53c5141c727b6fd37b0d688489277e5d5aa3a145d30faadc4d08d\": rpc error: code = NotFound desc = could not find container \"4cfca99c86a53c5141c727b6fd37b0d688489277e5d5aa3a145d30faadc4d08d\": container with ID starting with 4cfca99c86a53c5141c727b6fd37b0d688489277e5d5aa3a145d30faadc4d08d not found: ID does not exist"
Mar 13 14:14:28 crc kubenswrapper[4898]: I0313 14:14:28.414874 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-758c8fb5b-pxts9"]
Mar 13 14:14:28 crc kubenswrapper[4898]: I0313 14:14:28.423124 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-758c8fb5b-pxts9"]
Mar 13 14:14:29 crc kubenswrapper[4898]: I0313 14:14:29.754103 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="571e1a76-1585-4c39-887c-d9c3f735a908" path="/var/lib/kubelet/pods/571e1a76-1585-4c39-887c-d9c3f735a908/volumes"
Mar 13 14:14:29 crc kubenswrapper[4898]: I0313 14:14:29.957306 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k5mm8"]
Mar 13 14:14:29 crc kubenswrapper[4898]: E0313 14:14:29.957675 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="571e1a76-1585-4c39-887c-d9c3f735a908" containerName="console"
Mar 13 14:14:29 crc kubenswrapper[4898]: I0313 14:14:29.957696 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="571e1a76-1585-4c39-887c-d9c3f735a908" containerName="console"
Mar 13 14:14:29 crc kubenswrapper[4898]: E0313 14:14:29.957722 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35372caa-772c-434c-8fb2-3b82926c1521" containerName="oc"
Mar 13 14:14:29 crc kubenswrapper[4898]: I0313 14:14:29.957732 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="35372caa-772c-434c-8fb2-3b82926c1521" containerName="oc"
Mar 13 14:14:29 crc kubenswrapper[4898]: I0313 14:14:29.957937 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="35372caa-772c-434c-8fb2-3b82926c1521" containerName="oc"
Mar 13 14:14:29 crc kubenswrapper[4898]: I0313 14:14:29.957964 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="571e1a76-1585-4c39-887c-d9c3f735a908" containerName="console"
Mar 13 14:14:29 crc kubenswrapper[4898]: I0313 14:14:29.959684 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k5mm8"
Mar 13 14:14:29 crc kubenswrapper[4898]: I0313 14:14:29.963143 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Mar 13 14:14:29 crc kubenswrapper[4898]: I0313 14:14:29.975457 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k5mm8"]
Mar 13 14:14:29 crc kubenswrapper[4898]: I0313 14:14:29.976512 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53fae31e-a97e-443d-88c2-fa38af842855-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k5mm8\" (UID: \"53fae31e-a97e-443d-88c2-fa38af842855\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k5mm8"
Mar 13 14:14:29 crc kubenswrapper[4898]: I0313 14:14:29.976593 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrlvt\" (UniqueName: \"kubernetes.io/projected/53fae31e-a97e-443d-88c2-fa38af842855-kube-api-access-jrlvt\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k5mm8\" (UID: \"53fae31e-a97e-443d-88c2-fa38af842855\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k5mm8"
Mar 13 14:14:29 crc kubenswrapper[4898]: I0313 14:14:29.976780 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53fae31e-a97e-443d-88c2-fa38af842855-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k5mm8\" (UID: \"53fae31e-a97e-443d-88c2-fa38af842855\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k5mm8"
Mar 13 14:14:30 crc kubenswrapper[4898]: I0313 14:14:30.077809 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53fae31e-a97e-443d-88c2-fa38af842855-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k5mm8\" (UID: \"53fae31e-a97e-443d-88c2-fa38af842855\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k5mm8"
Mar 13 14:14:30 crc kubenswrapper[4898]: I0313 14:14:30.077894 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53fae31e-a97e-443d-88c2-fa38af842855-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k5mm8\" (UID: \"53fae31e-a97e-443d-88c2-fa38af842855\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k5mm8"
Mar 13 14:14:30 crc kubenswrapper[4898]: I0313 14:14:30.077940 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrlvt\" (UniqueName: \"kubernetes.io/projected/53fae31e-a97e-443d-88c2-fa38af842855-kube-api-access-jrlvt\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k5mm8\" (UID: \"53fae31e-a97e-443d-88c2-fa38af842855\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k5mm8"
Mar 13 14:14:30 crc kubenswrapper[4898]: I0313 14:14:30.078556 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53fae31e-a97e-443d-88c2-fa38af842855-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k5mm8\" (UID: \"53fae31e-a97e-443d-88c2-fa38af842855\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k5mm8"
Mar 13 14:14:30 crc kubenswrapper[4898]: I0313 14:14:30.078581 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53fae31e-a97e-443d-88c2-fa38af842855-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k5mm8\" (UID: \"53fae31e-a97e-443d-88c2-fa38af842855\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k5mm8"
Mar 13 14:14:30 crc kubenswrapper[4898]: I0313 14:14:30.100664 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrlvt\" (UniqueName: \"kubernetes.io/projected/53fae31e-a97e-443d-88c2-fa38af842855-kube-api-access-jrlvt\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k5mm8\" (UID: \"53fae31e-a97e-443d-88c2-fa38af842855\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k5mm8"
Mar 13 14:14:30 crc kubenswrapper[4898]: I0313 14:14:30.283830 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k5mm8"
Mar 13 14:14:30 crc kubenswrapper[4898]: I0313 14:14:30.739223 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k5mm8"]
Mar 13 14:14:31 crc kubenswrapper[4898]: I0313 14:14:31.396014 4898 generic.go:334] "Generic (PLEG): container finished" podID="53fae31e-a97e-443d-88c2-fa38af842855" containerID="429d1ad392a73aabf01acc7812c562b1be79f59b798457e6e1a695312a7b362d" exitCode=0
Mar 13 14:14:31 crc kubenswrapper[4898]: I0313 14:14:31.396423 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k5mm8" event={"ID":"53fae31e-a97e-443d-88c2-fa38af842855","Type":"ContainerDied","Data":"429d1ad392a73aabf01acc7812c562b1be79f59b798457e6e1a695312a7b362d"}
Mar 13 14:14:31 crc kubenswrapper[4898]: I0313 14:14:31.396479 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k5mm8" event={"ID":"53fae31e-a97e-443d-88c2-fa38af842855","Type":"ContainerStarted","Data":"323d3e074b5c24a04b5c3658c6de9f8d795ecd9605eb61941d76763b2c712df3"}
Mar 13 14:14:32 crc kubenswrapper[4898]: I0313 14:14:32.632307 4898 scope.go:117] "RemoveContainer" containerID="e5c3875fd4b0ad4fd5d4afba4c88238837f0d8b510bd53eb8f51d4cd510b00e3"
Mar 13 14:14:33 crc kubenswrapper[4898]: I0313 14:14:33.413339 4898 generic.go:334] "Generic (PLEG): container finished" podID="53fae31e-a97e-443d-88c2-fa38af842855" containerID="4012c0833e1dcb8cb1a98eb3cbe1c9311d9227c19a981851d7fa70805d402a65" exitCode=0
Mar 13 14:14:33 crc kubenswrapper[4898]: I0313 14:14:33.413391 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k5mm8" event={"ID":"53fae31e-a97e-443d-88c2-fa38af842855","Type":"ContainerDied","Data":"4012c0833e1dcb8cb1a98eb3cbe1c9311d9227c19a981851d7fa70805d402a65"}
Mar 13 14:14:34 crc kubenswrapper[4898]: I0313 14:14:34.427573 4898 generic.go:334] "Generic (PLEG): container finished" podID="53fae31e-a97e-443d-88c2-fa38af842855" containerID="45d423447e381b81b290026e1eb4ed79374436f22937a6fac96020fc29152f5d" exitCode=0
Mar 13 14:14:34 crc kubenswrapper[4898]: I0313 14:14:34.427663 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k5mm8" event={"ID":"53fae31e-a97e-443d-88c2-fa38af842855","Type":"ContainerDied","Data":"45d423447e381b81b290026e1eb4ed79374436f22937a6fac96020fc29152f5d"}
Mar 13 14:14:35 crc kubenswrapper[4898]: I0313 14:14:35.763682 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k5mm8"
Mar 13 14:14:35 crc kubenswrapper[4898]: I0313 14:14:35.881625 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53fae31e-a97e-443d-88c2-fa38af842855-bundle\") pod \"53fae31e-a97e-443d-88c2-fa38af842855\" (UID: \"53fae31e-a97e-443d-88c2-fa38af842855\") "
Mar 13 14:14:35 crc kubenswrapper[4898]: I0313 14:14:35.881717 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53fae31e-a97e-443d-88c2-fa38af842855-util\") pod \"53fae31e-a97e-443d-88c2-fa38af842855\" (UID: \"53fae31e-a97e-443d-88c2-fa38af842855\") "
Mar 13 14:14:35 crc kubenswrapper[4898]: I0313 14:14:35.881803 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrlvt\" (UniqueName: \"kubernetes.io/projected/53fae31e-a97e-443d-88c2-fa38af842855-kube-api-access-jrlvt\") pod \"53fae31e-a97e-443d-88c2-fa38af842855\" (UID: \"53fae31e-a97e-443d-88c2-fa38af842855\") "
Mar 13 14:14:35 crc kubenswrapper[4898]: I0313 14:14:35.883088 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53fae31e-a97e-443d-88c2-fa38af842855-bundle" (OuterVolumeSpecName: "bundle") pod "53fae31e-a97e-443d-88c2-fa38af842855" (UID: "53fae31e-a97e-443d-88c2-fa38af842855"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 14:14:35 crc kubenswrapper[4898]: I0313 14:14:35.887404 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53fae31e-a97e-443d-88c2-fa38af842855-kube-api-access-jrlvt" (OuterVolumeSpecName: "kube-api-access-jrlvt") pod "53fae31e-a97e-443d-88c2-fa38af842855" (UID: "53fae31e-a97e-443d-88c2-fa38af842855"). InnerVolumeSpecName "kube-api-access-jrlvt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:14:35 crc kubenswrapper[4898]: I0313 14:14:35.897027 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53fae31e-a97e-443d-88c2-fa38af842855-util" (OuterVolumeSpecName: "util") pod "53fae31e-a97e-443d-88c2-fa38af842855" (UID: "53fae31e-a97e-443d-88c2-fa38af842855"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 14:14:35 crc kubenswrapper[4898]: I0313 14:14:35.983821 4898 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53fae31e-a97e-443d-88c2-fa38af842855-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 14:14:35 crc kubenswrapper[4898]: I0313 14:14:35.983862 4898 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53fae31e-a97e-443d-88c2-fa38af842855-util\") on node \"crc\" DevicePath \"\""
Mar 13 14:14:35 crc kubenswrapper[4898]: I0313 14:14:35.983876 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrlvt\" (UniqueName: \"kubernetes.io/projected/53fae31e-a97e-443d-88c2-fa38af842855-kube-api-access-jrlvt\") on node \"crc\" DevicePath \"\""
Mar 13 14:14:36 crc kubenswrapper[4898]: I0313 14:14:36.452332 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k5mm8" event={"ID":"53fae31e-a97e-443d-88c2-fa38af842855","Type":"ContainerDied","Data":"323d3e074b5c24a04b5c3658c6de9f8d795ecd9605eb61941d76763b2c712df3"}
Mar 13 14:14:36 crc kubenswrapper[4898]: I0313 14:14:36.452403 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="323d3e074b5c24a04b5c3658c6de9f8d795ecd9605eb61941d76763b2c712df3"
Mar 13 14:14:36 crc kubenswrapper[4898]: I0313 14:14:36.452415 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k5mm8"
Mar 13 14:14:44 crc kubenswrapper[4898]: I0313 14:14:44.477302 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-cf7c75c99-qxdbx"]
Mar 13 14:14:44 crc kubenswrapper[4898]: E0313 14:14:44.478200 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53fae31e-a97e-443d-88c2-fa38af842855" containerName="util"
Mar 13 14:14:44 crc kubenswrapper[4898]: I0313 14:14:44.478216 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="53fae31e-a97e-443d-88c2-fa38af842855" containerName="util"
Mar 13 14:14:44 crc kubenswrapper[4898]: E0313 14:14:44.478234 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53fae31e-a97e-443d-88c2-fa38af842855" containerName="pull"
Mar 13 14:14:44 crc kubenswrapper[4898]: I0313 14:14:44.478241 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="53fae31e-a97e-443d-88c2-fa38af842855" containerName="pull"
Mar 13 14:14:44 crc kubenswrapper[4898]: E0313 14:14:44.478256 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53fae31e-a97e-443d-88c2-fa38af842855" containerName="extract"
Mar 13 14:14:44 crc kubenswrapper[4898]: I0313 14:14:44.478266 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="53fae31e-a97e-443d-88c2-fa38af842855" containerName="extract"
Mar 13 14:14:44 crc kubenswrapper[4898]: I0313 14:14:44.478450 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="53fae31e-a97e-443d-88c2-fa38af842855" containerName="extract"
Mar 13 14:14:44 crc kubenswrapper[4898]: I0313 14:14:44.479118 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-cf7c75c99-qxdbx"
Mar 13 14:14:44 crc kubenswrapper[4898]: I0313 14:14:44.481800 4898 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert"
Mar 13 14:14:44 crc kubenswrapper[4898]: I0313 14:14:44.482045 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt"
Mar 13 14:14:44 crc kubenswrapper[4898]: I0313 14:14:44.482167 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt"
Mar 13 14:14:44 crc kubenswrapper[4898]: I0313 14:14:44.482715 4898 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-6d6mt"
Mar 13 14:14:44 crc kubenswrapper[4898]: I0313 14:14:44.485802 4898 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert"
Mar 13 14:14:44 crc kubenswrapper[4898]: I0313 14:14:44.495419 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-cf7c75c99-qxdbx"]
Mar 13 14:14:44 crc kubenswrapper[4898]: I0313 14:14:44.617703 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e000d86e-e7a8-49ed-9184-fdd67dfe797d-apiservice-cert\") pod \"metallb-operator-controller-manager-cf7c75c99-qxdbx\" (UID: \"e000d86e-e7a8-49ed-9184-fdd67dfe797d\") " pod="metallb-system/metallb-operator-controller-manager-cf7c75c99-qxdbx"
Mar 13 14:14:44 crc kubenswrapper[4898]: I0313 14:14:44.618092 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e000d86e-e7a8-49ed-9184-fdd67dfe797d-webhook-cert\") pod \"metallb-operator-controller-manager-cf7c75c99-qxdbx\" (UID: \"e000d86e-e7a8-49ed-9184-fdd67dfe797d\") " pod="metallb-system/metallb-operator-controller-manager-cf7c75c99-qxdbx"
Mar 13 14:14:44 crc kubenswrapper[4898]: I0313 14:14:44.618171 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztz9s\" (UniqueName: \"kubernetes.io/projected/e000d86e-e7a8-49ed-9184-fdd67dfe797d-kube-api-access-ztz9s\") pod \"metallb-operator-controller-manager-cf7c75c99-qxdbx\" (UID: \"e000d86e-e7a8-49ed-9184-fdd67dfe797d\") " pod="metallb-system/metallb-operator-controller-manager-cf7c75c99-qxdbx"
Mar 13 14:14:44 crc kubenswrapper[4898]: I0313 14:14:44.720120 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e000d86e-e7a8-49ed-9184-fdd67dfe797d-apiservice-cert\") pod \"metallb-operator-controller-manager-cf7c75c99-qxdbx\" (UID: \"e000d86e-e7a8-49ed-9184-fdd67dfe797d\") " pod="metallb-system/metallb-operator-controller-manager-cf7c75c99-qxdbx"
Mar 13 14:14:44 crc kubenswrapper[4898]: I0313 14:14:44.720194 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e000d86e-e7a8-49ed-9184-fdd67dfe797d-webhook-cert\") pod \"metallb-operator-controller-manager-cf7c75c99-qxdbx\" (UID: \"e000d86e-e7a8-49ed-9184-fdd67dfe797d\") " pod="metallb-system/metallb-operator-controller-manager-cf7c75c99-qxdbx"
Mar 13 14:14:44 crc kubenswrapper[4898]: I0313 14:14:44.720251 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztz9s\" (UniqueName: \"kubernetes.io/projected/e000d86e-e7a8-49ed-9184-fdd67dfe797d-kube-api-access-ztz9s\") pod \"metallb-operator-controller-manager-cf7c75c99-qxdbx\" (UID: \"e000d86e-e7a8-49ed-9184-fdd67dfe797d\") " pod="metallb-system/metallb-operator-controller-manager-cf7c75c99-qxdbx"
Mar 13 14:14:44 crc kubenswrapper[4898]: I0313 14:14:44.726157 4898
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e000d86e-e7a8-49ed-9184-fdd67dfe797d-apiservice-cert\") pod \"metallb-operator-controller-manager-cf7c75c99-qxdbx\" (UID: \"e000d86e-e7a8-49ed-9184-fdd67dfe797d\") " pod="metallb-system/metallb-operator-controller-manager-cf7c75c99-qxdbx" Mar 13 14:14:44 crc kubenswrapper[4898]: I0313 14:14:44.726183 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e000d86e-e7a8-49ed-9184-fdd67dfe797d-webhook-cert\") pod \"metallb-operator-controller-manager-cf7c75c99-qxdbx\" (UID: \"e000d86e-e7a8-49ed-9184-fdd67dfe797d\") " pod="metallb-system/metallb-operator-controller-manager-cf7c75c99-qxdbx" Mar 13 14:14:44 crc kubenswrapper[4898]: I0313 14:14:44.756675 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztz9s\" (UniqueName: \"kubernetes.io/projected/e000d86e-e7a8-49ed-9184-fdd67dfe797d-kube-api-access-ztz9s\") pod \"metallb-operator-controller-manager-cf7c75c99-qxdbx\" (UID: \"e000d86e-e7a8-49ed-9184-fdd67dfe797d\") " pod="metallb-system/metallb-operator-controller-manager-cf7c75c99-qxdbx" Mar 13 14:14:44 crc kubenswrapper[4898]: I0313 14:14:44.769409 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-67c6f6c5cb-d26qw"] Mar 13 14:14:44 crc kubenswrapper[4898]: I0313 14:14:44.770529 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-67c6f6c5cb-d26qw" Mar 13 14:14:44 crc kubenswrapper[4898]: I0313 14:14:44.776406 4898 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 13 14:14:44 crc kubenswrapper[4898]: I0313 14:14:44.777330 4898 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 13 14:14:44 crc kubenswrapper[4898]: I0313 14:14:44.777506 4898 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-7mmlg" Mar 13 14:14:44 crc kubenswrapper[4898]: I0313 14:14:44.790151 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-67c6f6c5cb-d26qw"] Mar 13 14:14:44 crc kubenswrapper[4898]: I0313 14:14:44.801496 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-cf7c75c99-qxdbx" Mar 13 14:14:44 crc kubenswrapper[4898]: I0313 14:14:44.821856 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/34b4f98c-a87c-4a97-9ac4-286afeb9e4bc-apiservice-cert\") pod \"metallb-operator-webhook-server-67c6f6c5cb-d26qw\" (UID: \"34b4f98c-a87c-4a97-9ac4-286afeb9e4bc\") " pod="metallb-system/metallb-operator-webhook-server-67c6f6c5cb-d26qw" Mar 13 14:14:44 crc kubenswrapper[4898]: I0313 14:14:44.822014 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgrzc\" (UniqueName: \"kubernetes.io/projected/34b4f98c-a87c-4a97-9ac4-286afeb9e4bc-kube-api-access-tgrzc\") pod \"metallb-operator-webhook-server-67c6f6c5cb-d26qw\" (UID: \"34b4f98c-a87c-4a97-9ac4-286afeb9e4bc\") " pod="metallb-system/metallb-operator-webhook-server-67c6f6c5cb-d26qw" Mar 13 14:14:44 crc kubenswrapper[4898]: 
I0313 14:14:44.822111 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/34b4f98c-a87c-4a97-9ac4-286afeb9e4bc-webhook-cert\") pod \"metallb-operator-webhook-server-67c6f6c5cb-d26qw\" (UID: \"34b4f98c-a87c-4a97-9ac4-286afeb9e4bc\") " pod="metallb-system/metallb-operator-webhook-server-67c6f6c5cb-d26qw" Mar 13 14:14:44 crc kubenswrapper[4898]: I0313 14:14:44.923172 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgrzc\" (UniqueName: \"kubernetes.io/projected/34b4f98c-a87c-4a97-9ac4-286afeb9e4bc-kube-api-access-tgrzc\") pod \"metallb-operator-webhook-server-67c6f6c5cb-d26qw\" (UID: \"34b4f98c-a87c-4a97-9ac4-286afeb9e4bc\") " pod="metallb-system/metallb-operator-webhook-server-67c6f6c5cb-d26qw" Mar 13 14:14:44 crc kubenswrapper[4898]: I0313 14:14:44.923306 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/34b4f98c-a87c-4a97-9ac4-286afeb9e4bc-webhook-cert\") pod \"metallb-operator-webhook-server-67c6f6c5cb-d26qw\" (UID: \"34b4f98c-a87c-4a97-9ac4-286afeb9e4bc\") " pod="metallb-system/metallb-operator-webhook-server-67c6f6c5cb-d26qw" Mar 13 14:14:44 crc kubenswrapper[4898]: I0313 14:14:44.923339 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/34b4f98c-a87c-4a97-9ac4-286afeb9e4bc-apiservice-cert\") pod \"metallb-operator-webhook-server-67c6f6c5cb-d26qw\" (UID: \"34b4f98c-a87c-4a97-9ac4-286afeb9e4bc\") " pod="metallb-system/metallb-operator-webhook-server-67c6f6c5cb-d26qw" Mar 13 14:14:44 crc kubenswrapper[4898]: I0313 14:14:44.930618 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/34b4f98c-a87c-4a97-9ac4-286afeb9e4bc-apiservice-cert\") pod 
\"metallb-operator-webhook-server-67c6f6c5cb-d26qw\" (UID: \"34b4f98c-a87c-4a97-9ac4-286afeb9e4bc\") " pod="metallb-system/metallb-operator-webhook-server-67c6f6c5cb-d26qw" Mar 13 14:14:44 crc kubenswrapper[4898]: I0313 14:14:44.944010 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/34b4f98c-a87c-4a97-9ac4-286afeb9e4bc-webhook-cert\") pod \"metallb-operator-webhook-server-67c6f6c5cb-d26qw\" (UID: \"34b4f98c-a87c-4a97-9ac4-286afeb9e4bc\") " pod="metallb-system/metallb-operator-webhook-server-67c6f6c5cb-d26qw" Mar 13 14:14:44 crc kubenswrapper[4898]: I0313 14:14:44.949803 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgrzc\" (UniqueName: \"kubernetes.io/projected/34b4f98c-a87c-4a97-9ac4-286afeb9e4bc-kube-api-access-tgrzc\") pod \"metallb-operator-webhook-server-67c6f6c5cb-d26qw\" (UID: \"34b4f98c-a87c-4a97-9ac4-286afeb9e4bc\") " pod="metallb-system/metallb-operator-webhook-server-67c6f6c5cb-d26qw" Mar 13 14:14:45 crc kubenswrapper[4898]: I0313 14:14:45.161663 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-67c6f6c5cb-d26qw" Mar 13 14:14:45 crc kubenswrapper[4898]: I0313 14:14:45.325053 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-cf7c75c99-qxdbx"] Mar 13 14:14:45 crc kubenswrapper[4898]: I0313 14:14:45.567442 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-cf7c75c99-qxdbx" event={"ID":"e000d86e-e7a8-49ed-9184-fdd67dfe797d","Type":"ContainerStarted","Data":"df1d17e4129d6fd4edf7571f204b641a8c16e11962457e0be5178efcce112d85"} Mar 13 14:14:45 crc kubenswrapper[4898]: I0313 14:14:45.650593 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-67c6f6c5cb-d26qw"] Mar 13 14:14:45 crc kubenswrapper[4898]: W0313 14:14:45.659164 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34b4f98c_a87c_4a97_9ac4_286afeb9e4bc.slice/crio-16db7a2ad5ebef1cec02a7e225cd353238ab67016ad829119d268c65e8e9f3f0 WatchSource:0}: Error finding container 16db7a2ad5ebef1cec02a7e225cd353238ab67016ad829119d268c65e8e9f3f0: Status 404 returned error can't find the container with id 16db7a2ad5ebef1cec02a7e225cd353238ab67016ad829119d268c65e8e9f3f0 Mar 13 14:14:46 crc kubenswrapper[4898]: I0313 14:14:46.578767 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-67c6f6c5cb-d26qw" event={"ID":"34b4f98c-a87c-4a97-9ac4-286afeb9e4bc","Type":"ContainerStarted","Data":"16db7a2ad5ebef1cec02a7e225cd353238ab67016ad829119d268c65e8e9f3f0"} Mar 13 14:14:49 crc kubenswrapper[4898]: I0313 14:14:49.603260 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-cf7c75c99-qxdbx" 
event={"ID":"e000d86e-e7a8-49ed-9184-fdd67dfe797d","Type":"ContainerStarted","Data":"b90a231f04c7fa997bdb6c4af06b79391dc9d6f78d2d6a60c3d05ec12274e55f"} Mar 13 14:14:49 crc kubenswrapper[4898]: I0313 14:14:49.604073 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-cf7c75c99-qxdbx" Mar 13 14:14:49 crc kubenswrapper[4898]: I0313 14:14:49.636969 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-cf7c75c99-qxdbx" podStartSLOduration=2.5032037860000003 podStartE2EDuration="5.636947813s" podCreationTimestamp="2026-03-13 14:14:44 +0000 UTC" firstStartedPulling="2026-03-13 14:14:45.349171288 +0000 UTC m=+1120.350759527" lastFinishedPulling="2026-03-13 14:14:48.482915315 +0000 UTC m=+1123.484503554" observedRunningTime="2026-03-13 14:14:49.623461929 +0000 UTC m=+1124.625050168" watchObservedRunningTime="2026-03-13 14:14:49.636947813 +0000 UTC m=+1124.638536052" Mar 13 14:14:50 crc kubenswrapper[4898]: I0313 14:14:50.612781 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-67c6f6c5cb-d26qw" event={"ID":"34b4f98c-a87c-4a97-9ac4-286afeb9e4bc","Type":"ContainerStarted","Data":"78965365c8028eb7d18d67270a6ee0613c969f858c38f2b673207ae3e402b3bf"} Mar 13 14:14:50 crc kubenswrapper[4898]: I0313 14:14:50.640508 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-67c6f6c5cb-d26qw" podStartSLOduration=1.8932910120000002 podStartE2EDuration="6.640487877s" podCreationTimestamp="2026-03-13 14:14:44 +0000 UTC" firstStartedPulling="2026-03-13 14:14:45.661999916 +0000 UTC m=+1120.663588155" lastFinishedPulling="2026-03-13 14:14:50.409196781 +0000 UTC m=+1125.410785020" observedRunningTime="2026-03-13 14:14:50.635012693 +0000 UTC m=+1125.636600962" watchObservedRunningTime="2026-03-13 14:14:50.640487877 +0000 UTC 
m=+1125.642076116" Mar 13 14:14:51 crc kubenswrapper[4898]: I0313 14:14:51.621281 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-67c6f6c5cb-d26qw" Mar 13 14:15:00 crc kubenswrapper[4898]: I0313 14:15:00.135360 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556855-r5t9l"] Mar 13 14:15:00 crc kubenswrapper[4898]: I0313 14:15:00.137141 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556855-r5t9l" Mar 13 14:15:00 crc kubenswrapper[4898]: I0313 14:15:00.139873 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 13 14:15:00 crc kubenswrapper[4898]: I0313 14:15:00.141381 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 13 14:15:00 crc kubenswrapper[4898]: I0313 14:15:00.150397 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556855-r5t9l"] Mar 13 14:15:00 crc kubenswrapper[4898]: I0313 14:15:00.278353 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d9b296c2-5046-40b3-9fca-be350cf5de3e-secret-volume\") pod \"collect-profiles-29556855-r5t9l\" (UID: \"d9b296c2-5046-40b3-9fca-be350cf5de3e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556855-r5t9l" Mar 13 14:15:00 crc kubenswrapper[4898]: I0313 14:15:00.278429 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d9b296c2-5046-40b3-9fca-be350cf5de3e-config-volume\") pod \"collect-profiles-29556855-r5t9l\" (UID: 
\"d9b296c2-5046-40b3-9fca-be350cf5de3e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556855-r5t9l" Mar 13 14:15:00 crc kubenswrapper[4898]: I0313 14:15:00.278483 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9t2bc\" (UniqueName: \"kubernetes.io/projected/d9b296c2-5046-40b3-9fca-be350cf5de3e-kube-api-access-9t2bc\") pod \"collect-profiles-29556855-r5t9l\" (UID: \"d9b296c2-5046-40b3-9fca-be350cf5de3e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556855-r5t9l" Mar 13 14:15:00 crc kubenswrapper[4898]: I0313 14:15:00.380677 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d9b296c2-5046-40b3-9fca-be350cf5de3e-secret-volume\") pod \"collect-profiles-29556855-r5t9l\" (UID: \"d9b296c2-5046-40b3-9fca-be350cf5de3e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556855-r5t9l" Mar 13 14:15:00 crc kubenswrapper[4898]: I0313 14:15:00.381088 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d9b296c2-5046-40b3-9fca-be350cf5de3e-config-volume\") pod \"collect-profiles-29556855-r5t9l\" (UID: \"d9b296c2-5046-40b3-9fca-be350cf5de3e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556855-r5t9l" Mar 13 14:15:00 crc kubenswrapper[4898]: I0313 14:15:00.381155 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9t2bc\" (UniqueName: \"kubernetes.io/projected/d9b296c2-5046-40b3-9fca-be350cf5de3e-kube-api-access-9t2bc\") pod \"collect-profiles-29556855-r5t9l\" (UID: \"d9b296c2-5046-40b3-9fca-be350cf5de3e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556855-r5t9l" Mar 13 14:15:00 crc kubenswrapper[4898]: I0313 14:15:00.382249 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-volume\" (UniqueName: \"kubernetes.io/configmap/d9b296c2-5046-40b3-9fca-be350cf5de3e-config-volume\") pod \"collect-profiles-29556855-r5t9l\" (UID: \"d9b296c2-5046-40b3-9fca-be350cf5de3e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556855-r5t9l" Mar 13 14:15:00 crc kubenswrapper[4898]: I0313 14:15:00.387172 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d9b296c2-5046-40b3-9fca-be350cf5de3e-secret-volume\") pod \"collect-profiles-29556855-r5t9l\" (UID: \"d9b296c2-5046-40b3-9fca-be350cf5de3e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556855-r5t9l" Mar 13 14:15:00 crc kubenswrapper[4898]: I0313 14:15:00.400098 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9t2bc\" (UniqueName: \"kubernetes.io/projected/d9b296c2-5046-40b3-9fca-be350cf5de3e-kube-api-access-9t2bc\") pod \"collect-profiles-29556855-r5t9l\" (UID: \"d9b296c2-5046-40b3-9fca-be350cf5de3e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556855-r5t9l" Mar 13 14:15:00 crc kubenswrapper[4898]: I0313 14:15:00.456305 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556855-r5t9l" Mar 13 14:15:00 crc kubenswrapper[4898]: I0313 14:15:00.919921 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556855-r5t9l"] Mar 13 14:15:00 crc kubenswrapper[4898]: W0313 14:15:00.923519 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9b296c2_5046_40b3_9fca_be350cf5de3e.slice/crio-d2e883d004cc3649ee14f3853acd916269a70d8ad9d71dfe40bbbccb882dec77 WatchSource:0}: Error finding container d2e883d004cc3649ee14f3853acd916269a70d8ad9d71dfe40bbbccb882dec77: Status 404 returned error can't find the container with id d2e883d004cc3649ee14f3853acd916269a70d8ad9d71dfe40bbbccb882dec77 Mar 13 14:15:01 crc kubenswrapper[4898]: I0313 14:15:01.698493 4898 generic.go:334] "Generic (PLEG): container finished" podID="d9b296c2-5046-40b3-9fca-be350cf5de3e" containerID="183f0c268935ae6820699911fc0be58b4d0e93db5e614c9661b0f4b96dcc6afd" exitCode=0 Mar 13 14:15:01 crc kubenswrapper[4898]: I0313 14:15:01.698606 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556855-r5t9l" event={"ID":"d9b296c2-5046-40b3-9fca-be350cf5de3e","Type":"ContainerDied","Data":"183f0c268935ae6820699911fc0be58b4d0e93db5e614c9661b0f4b96dcc6afd"} Mar 13 14:15:01 crc kubenswrapper[4898]: I0313 14:15:01.698767 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556855-r5t9l" event={"ID":"d9b296c2-5046-40b3-9fca-be350cf5de3e","Type":"ContainerStarted","Data":"d2e883d004cc3649ee14f3853acd916269a70d8ad9d71dfe40bbbccb882dec77"} Mar 13 14:15:03 crc kubenswrapper[4898]: I0313 14:15:03.130318 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556855-r5t9l" Mar 13 14:15:03 crc kubenswrapper[4898]: I0313 14:15:03.230886 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d9b296c2-5046-40b3-9fca-be350cf5de3e-config-volume\") pod \"d9b296c2-5046-40b3-9fca-be350cf5de3e\" (UID: \"d9b296c2-5046-40b3-9fca-be350cf5de3e\") " Mar 13 14:15:03 crc kubenswrapper[4898]: I0313 14:15:03.231213 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9t2bc\" (UniqueName: \"kubernetes.io/projected/d9b296c2-5046-40b3-9fca-be350cf5de3e-kube-api-access-9t2bc\") pod \"d9b296c2-5046-40b3-9fca-be350cf5de3e\" (UID: \"d9b296c2-5046-40b3-9fca-be350cf5de3e\") " Mar 13 14:15:03 crc kubenswrapper[4898]: I0313 14:15:03.231345 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d9b296c2-5046-40b3-9fca-be350cf5de3e-secret-volume\") pod \"d9b296c2-5046-40b3-9fca-be350cf5de3e\" (UID: \"d9b296c2-5046-40b3-9fca-be350cf5de3e\") " Mar 13 14:15:03 crc kubenswrapper[4898]: I0313 14:15:03.235426 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9b296c2-5046-40b3-9fca-be350cf5de3e-config-volume" (OuterVolumeSpecName: "config-volume") pod "d9b296c2-5046-40b3-9fca-be350cf5de3e" (UID: "d9b296c2-5046-40b3-9fca-be350cf5de3e"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:15:03 crc kubenswrapper[4898]: I0313 14:15:03.239035 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9b296c2-5046-40b3-9fca-be350cf5de3e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d9b296c2-5046-40b3-9fca-be350cf5de3e" (UID: "d9b296c2-5046-40b3-9fca-be350cf5de3e"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:15:03 crc kubenswrapper[4898]: I0313 14:15:03.239110 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9b296c2-5046-40b3-9fca-be350cf5de3e-kube-api-access-9t2bc" (OuterVolumeSpecName: "kube-api-access-9t2bc") pod "d9b296c2-5046-40b3-9fca-be350cf5de3e" (UID: "d9b296c2-5046-40b3-9fca-be350cf5de3e"). InnerVolumeSpecName "kube-api-access-9t2bc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:15:03 crc kubenswrapper[4898]: I0313 14:15:03.333077 4898 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d9b296c2-5046-40b3-9fca-be350cf5de3e-config-volume\") on node \"crc\" DevicePath \"\"" Mar 13 14:15:03 crc kubenswrapper[4898]: I0313 14:15:03.333112 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9t2bc\" (UniqueName: \"kubernetes.io/projected/d9b296c2-5046-40b3-9fca-be350cf5de3e-kube-api-access-9t2bc\") on node \"crc\" DevicePath \"\"" Mar 13 14:15:03 crc kubenswrapper[4898]: I0313 14:15:03.333124 4898 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d9b296c2-5046-40b3-9fca-be350cf5de3e-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 13 14:15:03 crc kubenswrapper[4898]: I0313 14:15:03.718833 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556855-r5t9l" event={"ID":"d9b296c2-5046-40b3-9fca-be350cf5de3e","Type":"ContainerDied","Data":"d2e883d004cc3649ee14f3853acd916269a70d8ad9d71dfe40bbbccb882dec77"} Mar 13 14:15:03 crc kubenswrapper[4898]: I0313 14:15:03.718887 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d2e883d004cc3649ee14f3853acd916269a70d8ad9d71dfe40bbbccb882dec77" Mar 13 14:15:03 crc kubenswrapper[4898]: I0313 14:15:03.718960 4898 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556855-r5t9l" Mar 13 14:15:05 crc kubenswrapper[4898]: I0313 14:15:05.169556 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-67c6f6c5cb-d26qw" Mar 13 14:15:19 crc kubenswrapper[4898]: I0313 14:15:19.134501 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 14:15:19 crc kubenswrapper[4898]: I0313 14:15:19.135184 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 14:15:24 crc kubenswrapper[4898]: I0313 14:15:24.807572 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-cf7c75c99-qxdbx" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.716008 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-bqmxg"] Mar 13 14:15:25 crc kubenswrapper[4898]: E0313 14:15:25.716848 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9b296c2-5046-40b3-9fca-be350cf5de3e" containerName="collect-profiles" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.716868 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9b296c2-5046-40b3-9fca-be350cf5de3e" containerName="collect-profiles" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.717157 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9b296c2-5046-40b3-9fca-be350cf5de3e" 
containerName="collect-profiles" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.723730 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-bqmxg" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.730868 4898 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-9sgbj" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.731148 4898 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.731396 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.736032 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-5p4w5"] Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.739254 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-5p4w5" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.749351 4898 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.772026 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-5p4w5"] Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.820965 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-g5gqr"] Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.822417 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-g5gqr" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.824715 4898 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-plgcw" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.824768 4898 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.824921 4898 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.825155 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.825872 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-cx422"] Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.827155 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-cx422" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.836731 4898 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.854572 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-cx422"] Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.863854 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/1a7fcb96-7168-4049-8c28-d3f740599e48-reloader\") pod \"frr-k8s-bqmxg\" (UID: \"1a7fcb96-7168-4049-8c28-d3f740599e48\") " pod="metallb-system/frr-k8s-bqmxg" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.863936 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/1a7fcb96-7168-4049-8c28-d3f740599e48-frr-conf\") pod \"frr-k8s-bqmxg\" (UID: \"1a7fcb96-7168-4049-8c28-d3f740599e48\") " pod="metallb-system/frr-k8s-bqmxg" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.863981 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/edfd91ee-1246-43b2-84a0-95ea069de402-memberlist\") pod \"speaker-g5gqr\" (UID: \"edfd91ee-1246-43b2-84a0-95ea069de402\") " pod="metallb-system/speaker-g5gqr" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.864026 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmcwx\" (UniqueName: \"kubernetes.io/projected/604b9c21-3e85-4c2e-9faf-962f44236911-kube-api-access-pmcwx\") pod \"frr-k8s-webhook-server-bcc4b6f68-5p4w5\" (UID: \"604b9c21-3e85-4c2e-9faf-962f44236911\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-5p4w5" Mar 13 14:15:25 crc 
kubenswrapper[4898]: I0313 14:15:25.864053 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/1a7fcb96-7168-4049-8c28-d3f740599e48-metrics\") pod \"frr-k8s-bqmxg\" (UID: \"1a7fcb96-7168-4049-8c28-d3f740599e48\") " pod="metallb-system/frr-k8s-bqmxg" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.864077 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/edfd91ee-1246-43b2-84a0-95ea069de402-metrics-certs\") pod \"speaker-g5gqr\" (UID: \"edfd91ee-1246-43b2-84a0-95ea069de402\") " pod="metallb-system/speaker-g5gqr" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.864115 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/edfd91ee-1246-43b2-84a0-95ea069de402-metallb-excludel2\") pod \"speaker-g5gqr\" (UID: \"edfd91ee-1246-43b2-84a0-95ea069de402\") " pod="metallb-system/speaker-g5gqr" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.864156 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpg26\" (UniqueName: \"kubernetes.io/projected/edfd91ee-1246-43b2-84a0-95ea069de402-kube-api-access-cpg26\") pod \"speaker-g5gqr\" (UID: \"edfd91ee-1246-43b2-84a0-95ea069de402\") " pod="metallb-system/speaker-g5gqr" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.864195 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/1a7fcb96-7168-4049-8c28-d3f740599e48-frr-startup\") pod \"frr-k8s-bqmxg\" (UID: \"1a7fcb96-7168-4049-8c28-d3f740599e48\") " pod="metallb-system/frr-k8s-bqmxg" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.864274 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/1a7fcb96-7168-4049-8c28-d3f740599e48-frr-sockets\") pod \"frr-k8s-bqmxg\" (UID: \"1a7fcb96-7168-4049-8c28-d3f740599e48\") " pod="metallb-system/frr-k8s-bqmxg" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.864297 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1a7fcb96-7168-4049-8c28-d3f740599e48-metrics-certs\") pod \"frr-k8s-bqmxg\" (UID: \"1a7fcb96-7168-4049-8c28-d3f740599e48\") " pod="metallb-system/frr-k8s-bqmxg" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.864323 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rv68p\" (UniqueName: \"kubernetes.io/projected/1a7fcb96-7168-4049-8c28-d3f740599e48-kube-api-access-rv68p\") pod \"frr-k8s-bqmxg\" (UID: \"1a7fcb96-7168-4049-8c28-d3f740599e48\") " pod="metallb-system/frr-k8s-bqmxg" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.864467 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/604b9c21-3e85-4c2e-9faf-962f44236911-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-5p4w5\" (UID: \"604b9c21-3e85-4c2e-9faf-962f44236911\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-5p4w5" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.965702 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmcwx\" (UniqueName: \"kubernetes.io/projected/604b9c21-3e85-4c2e-9faf-962f44236911-kube-api-access-pmcwx\") pod \"frr-k8s-webhook-server-bcc4b6f68-5p4w5\" (UID: \"604b9c21-3e85-4c2e-9faf-962f44236911\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-5p4w5" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.965745 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/1a7fcb96-7168-4049-8c28-d3f740599e48-metrics\") pod \"frr-k8s-bqmxg\" (UID: \"1a7fcb96-7168-4049-8c28-d3f740599e48\") " pod="metallb-system/frr-k8s-bqmxg" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.965766 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/edfd91ee-1246-43b2-84a0-95ea069de402-metrics-certs\") pod \"speaker-g5gqr\" (UID: \"edfd91ee-1246-43b2-84a0-95ea069de402\") " pod="metallb-system/speaker-g5gqr" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.965795 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/edfd91ee-1246-43b2-84a0-95ea069de402-metallb-excludel2\") pod \"speaker-g5gqr\" (UID: \"edfd91ee-1246-43b2-84a0-95ea069de402\") " pod="metallb-system/speaker-g5gqr" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.965826 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpg26\" (UniqueName: \"kubernetes.io/projected/edfd91ee-1246-43b2-84a0-95ea069de402-kube-api-access-cpg26\") pod \"speaker-g5gqr\" (UID: \"edfd91ee-1246-43b2-84a0-95ea069de402\") " pod="metallb-system/speaker-g5gqr" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.965843 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/1a7fcb96-7168-4049-8c28-d3f740599e48-frr-startup\") pod \"frr-k8s-bqmxg\" (UID: \"1a7fcb96-7168-4049-8c28-d3f740599e48\") " pod="metallb-system/frr-k8s-bqmxg" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.965869 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b231c7db-5056-4ec6-a64c-0aa8bdff336b-cert\") pod 
\"controller-7bb4cc7c98-cx422\" (UID: \"b231c7db-5056-4ec6-a64c-0aa8bdff336b\") " pod="metallb-system/controller-7bb4cc7c98-cx422" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.965886 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b231c7db-5056-4ec6-a64c-0aa8bdff336b-metrics-certs\") pod \"controller-7bb4cc7c98-cx422\" (UID: \"b231c7db-5056-4ec6-a64c-0aa8bdff336b\") " pod="metallb-system/controller-7bb4cc7c98-cx422" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.965923 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/1a7fcb96-7168-4049-8c28-d3f740599e48-frr-sockets\") pod \"frr-k8s-bqmxg\" (UID: \"1a7fcb96-7168-4049-8c28-d3f740599e48\") " pod="metallb-system/frr-k8s-bqmxg" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.965940 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1a7fcb96-7168-4049-8c28-d3f740599e48-metrics-certs\") pod \"frr-k8s-bqmxg\" (UID: \"1a7fcb96-7168-4049-8c28-d3f740599e48\") " pod="metallb-system/frr-k8s-bqmxg" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.965957 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rv68p\" (UniqueName: \"kubernetes.io/projected/1a7fcb96-7168-4049-8c28-d3f740599e48-kube-api-access-rv68p\") pod \"frr-k8s-bqmxg\" (UID: \"1a7fcb96-7168-4049-8c28-d3f740599e48\") " pod="metallb-system/frr-k8s-bqmxg" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.965974 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rb7vq\" (UniqueName: \"kubernetes.io/projected/b231c7db-5056-4ec6-a64c-0aa8bdff336b-kube-api-access-rb7vq\") pod \"controller-7bb4cc7c98-cx422\" (UID: 
\"b231c7db-5056-4ec6-a64c-0aa8bdff336b\") " pod="metallb-system/controller-7bb4cc7c98-cx422" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.966012 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/604b9c21-3e85-4c2e-9faf-962f44236911-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-5p4w5\" (UID: \"604b9c21-3e85-4c2e-9faf-962f44236911\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-5p4w5" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.966037 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/1a7fcb96-7168-4049-8c28-d3f740599e48-reloader\") pod \"frr-k8s-bqmxg\" (UID: \"1a7fcb96-7168-4049-8c28-d3f740599e48\") " pod="metallb-system/frr-k8s-bqmxg" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.966060 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/1a7fcb96-7168-4049-8c28-d3f740599e48-frr-conf\") pod \"frr-k8s-bqmxg\" (UID: \"1a7fcb96-7168-4049-8c28-d3f740599e48\") " pod="metallb-system/frr-k8s-bqmxg" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.966085 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/edfd91ee-1246-43b2-84a0-95ea069de402-memberlist\") pod \"speaker-g5gqr\" (UID: \"edfd91ee-1246-43b2-84a0-95ea069de402\") " pod="metallb-system/speaker-g5gqr" Mar 13 14:15:25 crc kubenswrapper[4898]: E0313 14:15:25.966198 4898 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 13 14:15:25 crc kubenswrapper[4898]: E0313 14:15:25.966237 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/edfd91ee-1246-43b2-84a0-95ea069de402-memberlist podName:edfd91ee-1246-43b2-84a0-95ea069de402 nodeName:}" failed. 
No retries permitted until 2026-03-13 14:15:26.466223932 +0000 UTC m=+1161.467812171 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/edfd91ee-1246-43b2-84a0-95ea069de402-memberlist") pod "speaker-g5gqr" (UID: "edfd91ee-1246-43b2-84a0-95ea069de402") : secret "metallb-memberlist" not found Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.966349 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/1a7fcb96-7168-4049-8c28-d3f740599e48-metrics\") pod \"frr-k8s-bqmxg\" (UID: \"1a7fcb96-7168-4049-8c28-d3f740599e48\") " pod="metallb-system/frr-k8s-bqmxg" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.966600 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/1a7fcb96-7168-4049-8c28-d3f740599e48-reloader\") pod \"frr-k8s-bqmxg\" (UID: \"1a7fcb96-7168-4049-8c28-d3f740599e48\") " pod="metallb-system/frr-k8s-bqmxg" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.966702 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/edfd91ee-1246-43b2-84a0-95ea069de402-metallb-excludel2\") pod \"speaker-g5gqr\" (UID: \"edfd91ee-1246-43b2-84a0-95ea069de402\") " pod="metallb-system/speaker-g5gqr" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.966978 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/1a7fcb96-7168-4049-8c28-d3f740599e48-frr-conf\") pod \"frr-k8s-bqmxg\" (UID: \"1a7fcb96-7168-4049-8c28-d3f740599e48\") " pod="metallb-system/frr-k8s-bqmxg" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.967282 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/1a7fcb96-7168-4049-8c28-d3f740599e48-frr-startup\") pod 
\"frr-k8s-bqmxg\" (UID: \"1a7fcb96-7168-4049-8c28-d3f740599e48\") " pod="metallb-system/frr-k8s-bqmxg" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.968173 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/1a7fcb96-7168-4049-8c28-d3f740599e48-frr-sockets\") pod \"frr-k8s-bqmxg\" (UID: \"1a7fcb96-7168-4049-8c28-d3f740599e48\") " pod="metallb-system/frr-k8s-bqmxg" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.971972 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/edfd91ee-1246-43b2-84a0-95ea069de402-metrics-certs\") pod \"speaker-g5gqr\" (UID: \"edfd91ee-1246-43b2-84a0-95ea069de402\") " pod="metallb-system/speaker-g5gqr" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.974205 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/604b9c21-3e85-4c2e-9faf-962f44236911-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-5p4w5\" (UID: \"604b9c21-3e85-4c2e-9faf-962f44236911\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-5p4w5" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.977732 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1a7fcb96-7168-4049-8c28-d3f740599e48-metrics-certs\") pod \"frr-k8s-bqmxg\" (UID: \"1a7fcb96-7168-4049-8c28-d3f740599e48\") " pod="metallb-system/frr-k8s-bqmxg" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.981867 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rv68p\" (UniqueName: \"kubernetes.io/projected/1a7fcb96-7168-4049-8c28-d3f740599e48-kube-api-access-rv68p\") pod \"frr-k8s-bqmxg\" (UID: \"1a7fcb96-7168-4049-8c28-d3f740599e48\") " pod="metallb-system/frr-k8s-bqmxg" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.983361 4898 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpg26\" (UniqueName: \"kubernetes.io/projected/edfd91ee-1246-43b2-84a0-95ea069de402-kube-api-access-cpg26\") pod \"speaker-g5gqr\" (UID: \"edfd91ee-1246-43b2-84a0-95ea069de402\") " pod="metallb-system/speaker-g5gqr" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.988774 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmcwx\" (UniqueName: \"kubernetes.io/projected/604b9c21-3e85-4c2e-9faf-962f44236911-kube-api-access-pmcwx\") pod \"frr-k8s-webhook-server-bcc4b6f68-5p4w5\" (UID: \"604b9c21-3e85-4c2e-9faf-962f44236911\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-5p4w5" Mar 13 14:15:26 crc kubenswrapper[4898]: I0313 14:15:26.056431 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-bqmxg" Mar 13 14:15:26 crc kubenswrapper[4898]: I0313 14:15:26.067098 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b231c7db-5056-4ec6-a64c-0aa8bdff336b-cert\") pod \"controller-7bb4cc7c98-cx422\" (UID: \"b231c7db-5056-4ec6-a64c-0aa8bdff336b\") " pod="metallb-system/controller-7bb4cc7c98-cx422" Mar 13 14:15:26 crc kubenswrapper[4898]: I0313 14:15:26.067436 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b231c7db-5056-4ec6-a64c-0aa8bdff336b-metrics-certs\") pod \"controller-7bb4cc7c98-cx422\" (UID: \"b231c7db-5056-4ec6-a64c-0aa8bdff336b\") " pod="metallb-system/controller-7bb4cc7c98-cx422" Mar 13 14:15:26 crc kubenswrapper[4898]: I0313 14:15:26.067466 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rb7vq\" (UniqueName: \"kubernetes.io/projected/b231c7db-5056-4ec6-a64c-0aa8bdff336b-kube-api-access-rb7vq\") pod \"controller-7bb4cc7c98-cx422\" (UID: 
\"b231c7db-5056-4ec6-a64c-0aa8bdff336b\") " pod="metallb-system/controller-7bb4cc7c98-cx422" Mar 13 14:15:26 crc kubenswrapper[4898]: I0313 14:15:26.068684 4898 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 13 14:15:26 crc kubenswrapper[4898]: I0313 14:15:26.072023 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b231c7db-5056-4ec6-a64c-0aa8bdff336b-metrics-certs\") pod \"controller-7bb4cc7c98-cx422\" (UID: \"b231c7db-5056-4ec6-a64c-0aa8bdff336b\") " pod="metallb-system/controller-7bb4cc7c98-cx422" Mar 13 14:15:26 crc kubenswrapper[4898]: I0313 14:15:26.082746 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b231c7db-5056-4ec6-a64c-0aa8bdff336b-cert\") pod \"controller-7bb4cc7c98-cx422\" (UID: \"b231c7db-5056-4ec6-a64c-0aa8bdff336b\") " pod="metallb-system/controller-7bb4cc7c98-cx422" Mar 13 14:15:26 crc kubenswrapper[4898]: I0313 14:15:26.086319 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rb7vq\" (UniqueName: \"kubernetes.io/projected/b231c7db-5056-4ec6-a64c-0aa8bdff336b-kube-api-access-rb7vq\") pod \"controller-7bb4cc7c98-cx422\" (UID: \"b231c7db-5056-4ec6-a64c-0aa8bdff336b\") " pod="metallb-system/controller-7bb4cc7c98-cx422" Mar 13 14:15:26 crc kubenswrapper[4898]: I0313 14:15:26.089344 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-5p4w5" Mar 13 14:15:26 crc kubenswrapper[4898]: I0313 14:15:26.154964 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-cx422" Mar 13 14:15:26 crc kubenswrapper[4898]: I0313 14:15:26.293425 4898 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 14:15:26 crc kubenswrapper[4898]: I0313 14:15:26.475028 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/edfd91ee-1246-43b2-84a0-95ea069de402-memberlist\") pod \"speaker-g5gqr\" (UID: \"edfd91ee-1246-43b2-84a0-95ea069de402\") " pod="metallb-system/speaker-g5gqr" Mar 13 14:15:26 crc kubenswrapper[4898]: E0313 14:15:26.475242 4898 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 13 14:15:26 crc kubenswrapper[4898]: E0313 14:15:26.475330 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/edfd91ee-1246-43b2-84a0-95ea069de402-memberlist podName:edfd91ee-1246-43b2-84a0-95ea069de402 nodeName:}" failed. No retries permitted until 2026-03-13 14:15:27.475312776 +0000 UTC m=+1162.476901015 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/edfd91ee-1246-43b2-84a0-95ea069de402-memberlist") pod "speaker-g5gqr" (UID: "edfd91ee-1246-43b2-84a0-95ea069de402") : secret "metallb-memberlist" not found Mar 13 14:15:26 crc kubenswrapper[4898]: W0313 14:15:26.578631 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod604b9c21_3e85_4c2e_9faf_962f44236911.slice/crio-2b552d780586afb0605f5b69a9f12b031dd06d6cf8adbd55a7e6ebd36992ffa0 WatchSource:0}: Error finding container 2b552d780586afb0605f5b69a9f12b031dd06d6cf8adbd55a7e6ebd36992ffa0: Status 404 returned error can't find the container with id 2b552d780586afb0605f5b69a9f12b031dd06d6cf8adbd55a7e6ebd36992ffa0 Mar 13 14:15:26 crc kubenswrapper[4898]: I0313 14:15:26.584400 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-5p4w5"] Mar 13 14:15:26 crc kubenswrapper[4898]: I0313 14:15:26.643201 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-cx422"] Mar 13 14:15:26 crc kubenswrapper[4898]: W0313 14:15:26.651603 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb231c7db_5056_4ec6_a64c_0aa8bdff336b.slice/crio-04ffe12997992c8a5b8bb20d4b14e8c303455b2a384628b905b1b8268f449f35 WatchSource:0}: Error finding container 04ffe12997992c8a5b8bb20d4b14e8c303455b2a384628b905b1b8268f449f35: Status 404 returned error can't find the container with id 04ffe12997992c8a5b8bb20d4b14e8c303455b2a384628b905b1b8268f449f35 Mar 13 14:15:26 crc kubenswrapper[4898]: I0313 14:15:26.897912 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-5p4w5" 
event={"ID":"604b9c21-3e85-4c2e-9faf-962f44236911","Type":"ContainerStarted","Data":"2b552d780586afb0605f5b69a9f12b031dd06d6cf8adbd55a7e6ebd36992ffa0"} Mar 13 14:15:26 crc kubenswrapper[4898]: I0313 14:15:26.898709 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bqmxg" event={"ID":"1a7fcb96-7168-4049-8c28-d3f740599e48","Type":"ContainerStarted","Data":"f6c1181539f490967414705ffcf64a141f8d718f3fbe65ea8c899231bca82f71"} Mar 13 14:15:26 crc kubenswrapper[4898]: I0313 14:15:26.900379 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-cx422" event={"ID":"b231c7db-5056-4ec6-a64c-0aa8bdff336b","Type":"ContainerStarted","Data":"8d7d216fe78f04af98d68f17acb880630f795a36fde0f4603b51086769954d5b"} Mar 13 14:15:26 crc kubenswrapper[4898]: I0313 14:15:26.900435 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-cx422" event={"ID":"b231c7db-5056-4ec6-a64c-0aa8bdff336b","Type":"ContainerStarted","Data":"50c031524bb2bf8d6fa08ff899e9bd2f79f477e7c3fd71037c11b22610fc948c"} Mar 13 14:15:26 crc kubenswrapper[4898]: I0313 14:15:26.900450 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-cx422" event={"ID":"b231c7db-5056-4ec6-a64c-0aa8bdff336b","Type":"ContainerStarted","Data":"04ffe12997992c8a5b8bb20d4b14e8c303455b2a384628b905b1b8268f449f35"} Mar 13 14:15:26 crc kubenswrapper[4898]: I0313 14:15:26.900533 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-cx422" Mar 13 14:15:26 crc kubenswrapper[4898]: I0313 14:15:26.922593 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-cx422" podStartSLOduration=1.9225746959999999 podStartE2EDuration="1.922574696s" podCreationTimestamp="2026-03-13 14:15:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-03-13 14:15:26.922057713 +0000 UTC m=+1161.923645992" watchObservedRunningTime="2026-03-13 14:15:26.922574696 +0000 UTC m=+1161.924162935" Mar 13 14:15:27 crc kubenswrapper[4898]: I0313 14:15:27.493091 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/edfd91ee-1246-43b2-84a0-95ea069de402-memberlist\") pod \"speaker-g5gqr\" (UID: \"edfd91ee-1246-43b2-84a0-95ea069de402\") " pod="metallb-system/speaker-g5gqr" Mar 13 14:15:27 crc kubenswrapper[4898]: I0313 14:15:27.498381 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/edfd91ee-1246-43b2-84a0-95ea069de402-memberlist\") pod \"speaker-g5gqr\" (UID: \"edfd91ee-1246-43b2-84a0-95ea069de402\") " pod="metallb-system/speaker-g5gqr" Mar 13 14:15:27 crc kubenswrapper[4898]: I0313 14:15:27.641510 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-g5gqr" Mar 13 14:15:27 crc kubenswrapper[4898]: W0313 14:15:27.680113 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podedfd91ee_1246_43b2_84a0_95ea069de402.slice/crio-83e4c4b5eb654c2ea9fe0d7dd5c89cc6ac265f4391f16fb24a87c7d392fac38a WatchSource:0}: Error finding container 83e4c4b5eb654c2ea9fe0d7dd5c89cc6ac265f4391f16fb24a87c7d392fac38a: Status 404 returned error can't find the container with id 83e4c4b5eb654c2ea9fe0d7dd5c89cc6ac265f4391f16fb24a87c7d392fac38a Mar 13 14:15:27 crc kubenswrapper[4898]: I0313 14:15:27.909258 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-g5gqr" event={"ID":"edfd91ee-1246-43b2-84a0-95ea069de402","Type":"ContainerStarted","Data":"83e4c4b5eb654c2ea9fe0d7dd5c89cc6ac265f4391f16fb24a87c7d392fac38a"} Mar 13 14:15:28 crc kubenswrapper[4898]: I0313 14:15:28.926252 4898 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="metallb-system/speaker-g5gqr" event={"ID":"edfd91ee-1246-43b2-84a0-95ea069de402","Type":"ContainerStarted","Data":"a39b33f1d3a64233bdc731a83d8fc40daebfa1230066627092def13b04432916"} Mar 13 14:15:28 crc kubenswrapper[4898]: I0313 14:15:28.926300 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-g5gqr" event={"ID":"edfd91ee-1246-43b2-84a0-95ea069de402","Type":"ContainerStarted","Data":"0a6c68b1de1e0d7624616ab17ffb737c14c0199744057f8f2a77ece2db6660a6"} Mar 13 14:15:28 crc kubenswrapper[4898]: I0313 14:15:28.926450 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-g5gqr" Mar 13 14:15:28 crc kubenswrapper[4898]: I0313 14:15:28.946141 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-g5gqr" podStartSLOduration=3.946115828 podStartE2EDuration="3.946115828s" podCreationTimestamp="2026-03-13 14:15:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:15:28.940892331 +0000 UTC m=+1163.942480580" watchObservedRunningTime="2026-03-13 14:15:28.946115828 +0000 UTC m=+1163.947704067" Mar 13 14:15:34 crc kubenswrapper[4898]: I0313 14:15:34.976562 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-5p4w5" event={"ID":"604b9c21-3e85-4c2e-9faf-962f44236911","Type":"ContainerStarted","Data":"c2e042db91577156269309aca160b7b8767ed8c465b79f1539698cd7a6652d49"} Mar 13 14:15:34 crc kubenswrapper[4898]: I0313 14:15:34.977865 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-5p4w5" Mar 13 14:15:34 crc kubenswrapper[4898]: I0313 14:15:34.978809 4898 generic.go:334] "Generic (PLEG): container finished" podID="1a7fcb96-7168-4049-8c28-d3f740599e48" 
containerID="475580e194896bcb384f743508d0bda7bb9c9de88dbe8f57a8106e5e13db5a02" exitCode=0 Mar 13 14:15:34 crc kubenswrapper[4898]: I0313 14:15:34.978883 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bqmxg" event={"ID":"1a7fcb96-7168-4049-8c28-d3f740599e48","Type":"ContainerDied","Data":"475580e194896bcb384f743508d0bda7bb9c9de88dbe8f57a8106e5e13db5a02"} Mar 13 14:15:35 crc kubenswrapper[4898]: I0313 14:15:35.005633 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-5p4w5" podStartSLOduration=2.381998356 podStartE2EDuration="10.005614839s" podCreationTimestamp="2026-03-13 14:15:25 +0000 UTC" firstStartedPulling="2026-03-13 14:15:26.580664654 +0000 UTC m=+1161.582252883" lastFinishedPulling="2026-03-13 14:15:34.204281127 +0000 UTC m=+1169.205869366" observedRunningTime="2026-03-13 14:15:35.003035772 +0000 UTC m=+1170.004624021" watchObservedRunningTime="2026-03-13 14:15:35.005614839 +0000 UTC m=+1170.007203088" Mar 13 14:15:35 crc kubenswrapper[4898]: I0313 14:15:35.993223 4898 generic.go:334] "Generic (PLEG): container finished" podID="1a7fcb96-7168-4049-8c28-d3f740599e48" containerID="479e8d77c1638a8eb14315977e43a32b6023d8fdd6daad9a116915f891125d98" exitCode=0 Mar 13 14:15:35 crc kubenswrapper[4898]: I0313 14:15:35.993290 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bqmxg" event={"ID":"1a7fcb96-7168-4049-8c28-d3f740599e48","Type":"ContainerDied","Data":"479e8d77c1638a8eb14315977e43a32b6023d8fdd6daad9a116915f891125d98"} Mar 13 14:15:36 crc kubenswrapper[4898]: I0313 14:15:36.162403 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-cx422" Mar 13 14:15:37 crc kubenswrapper[4898]: I0313 14:15:37.006050 4898 generic.go:334] "Generic (PLEG): container finished" podID="1a7fcb96-7168-4049-8c28-d3f740599e48" 
containerID="577e02f1140114d41ba083f8d1376f0da51fcfc7633acb7ea7637c8fd8269feb" exitCode=0 Mar 13 14:15:37 crc kubenswrapper[4898]: I0313 14:15:37.006113 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bqmxg" event={"ID":"1a7fcb96-7168-4049-8c28-d3f740599e48","Type":"ContainerDied","Data":"577e02f1140114d41ba083f8d1376f0da51fcfc7633acb7ea7637c8fd8269feb"} Mar 13 14:15:37 crc kubenswrapper[4898]: I0313 14:15:37.645373 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-g5gqr" Mar 13 14:15:38 crc kubenswrapper[4898]: I0313 14:15:38.017869 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bqmxg" event={"ID":"1a7fcb96-7168-4049-8c28-d3f740599e48","Type":"ContainerStarted","Data":"de801e5b6035a3dd9252ae2d1f506f270dfe0c2552670023b6eac179c4aedccb"} Mar 13 14:15:38 crc kubenswrapper[4898]: I0313 14:15:38.017937 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bqmxg" event={"ID":"1a7fcb96-7168-4049-8c28-d3f740599e48","Type":"ContainerStarted","Data":"7ef81ddc4c5349a073203bfca8c071431a5874e9b13e0ebd50b075b4cfc8ae59"} Mar 13 14:15:38 crc kubenswrapper[4898]: I0313 14:15:38.017945 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bqmxg" event={"ID":"1a7fcb96-7168-4049-8c28-d3f740599e48","Type":"ContainerStarted","Data":"7be51b5ddbec2ffddde6f1aa02f465de8d01bcf3f148bc9e5694be9c2d0a3885"} Mar 13 14:15:38 crc kubenswrapper[4898]: I0313 14:15:38.017970 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bqmxg" event={"ID":"1a7fcb96-7168-4049-8c28-d3f740599e48","Type":"ContainerStarted","Data":"bea18a0b4bb5bbad650ae6e8e67ba0820f4376de6521e05324b889bd9cd86809"} Mar 13 14:15:38 crc kubenswrapper[4898]: I0313 14:15:38.017979 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bqmxg" 
event={"ID":"1a7fcb96-7168-4049-8c28-d3f740599e48","Type":"ContainerStarted","Data":"5caa38ab91ff3bb9407ccdc085db3c665cadd77289d8c9655d76db3e95c56dc9"} Mar 13 14:15:39 crc kubenswrapper[4898]: I0313 14:15:39.031642 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bqmxg" event={"ID":"1a7fcb96-7168-4049-8c28-d3f740599e48","Type":"ContainerStarted","Data":"eb8a82178822c3aac08c0234f2ad15766fba6323092ad5172a80486bc675d301"} Mar 13 14:15:39 crc kubenswrapper[4898]: I0313 14:15:39.032107 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-bqmxg" Mar 13 14:15:39 crc kubenswrapper[4898]: I0313 14:15:39.066281 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-bqmxg" podStartSLOduration=6.204330854 podStartE2EDuration="14.066258137s" podCreationTimestamp="2026-03-13 14:15:25 +0000 UTC" firstStartedPulling="2026-03-13 14:15:26.29312156 +0000 UTC m=+1161.294709799" lastFinishedPulling="2026-03-13 14:15:34.155048843 +0000 UTC m=+1169.156637082" observedRunningTime="2026-03-13 14:15:39.064412979 +0000 UTC m=+1174.066001248" watchObservedRunningTime="2026-03-13 14:15:39.066258137 +0000 UTC m=+1174.067846396" Mar 13 14:15:40 crc kubenswrapper[4898]: I0313 14:15:40.830640 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-cw75t"] Mar 13 14:15:40 crc kubenswrapper[4898]: I0313 14:15:40.832022 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-cw75t" Mar 13 14:15:40 crc kubenswrapper[4898]: I0313 14:15:40.836410 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 13 14:15:40 crc kubenswrapper[4898]: I0313 14:15:40.837948 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-cw75t"] Mar 13 14:15:40 crc kubenswrapper[4898]: I0313 14:15:40.838948 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-bp8l6" Mar 13 14:15:40 crc kubenswrapper[4898]: I0313 14:15:40.839258 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 13 14:15:40 crc kubenswrapper[4898]: I0313 14:15:40.934000 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znkw5\" (UniqueName: \"kubernetes.io/projected/a3872ed6-e59e-42fe-a774-c457f7118f65-kube-api-access-znkw5\") pod \"openstack-operator-index-cw75t\" (UID: \"a3872ed6-e59e-42fe-a774-c457f7118f65\") " pod="openstack-operators/openstack-operator-index-cw75t" Mar 13 14:15:41 crc kubenswrapper[4898]: I0313 14:15:41.036185 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znkw5\" (UniqueName: \"kubernetes.io/projected/a3872ed6-e59e-42fe-a774-c457f7118f65-kube-api-access-znkw5\") pod \"openstack-operator-index-cw75t\" (UID: \"a3872ed6-e59e-42fe-a774-c457f7118f65\") " pod="openstack-operators/openstack-operator-index-cw75t" Mar 13 14:15:41 crc kubenswrapper[4898]: I0313 14:15:41.057107 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-bqmxg" Mar 13 14:15:41 crc kubenswrapper[4898]: I0313 14:15:41.077361 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znkw5\" 
(UniqueName: \"kubernetes.io/projected/a3872ed6-e59e-42fe-a774-c457f7118f65-kube-api-access-znkw5\") pod \"openstack-operator-index-cw75t\" (UID: \"a3872ed6-e59e-42fe-a774-c457f7118f65\") " pod="openstack-operators/openstack-operator-index-cw75t" Mar 13 14:15:41 crc kubenswrapper[4898]: I0313 14:15:41.106589 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-bqmxg" Mar 13 14:15:41 crc kubenswrapper[4898]: I0313 14:15:41.170345 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-cw75t" Mar 13 14:15:41 crc kubenswrapper[4898]: I0313 14:15:41.612184 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-cw75t"] Mar 13 14:15:42 crc kubenswrapper[4898]: I0313 14:15:42.055799 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-cw75t" event={"ID":"a3872ed6-e59e-42fe-a774-c457f7118f65","Type":"ContainerStarted","Data":"d1eb15a6f8d2097c70c293cd2b58d045d8fb2028cb2810ed0aa67bff167518c8"} Mar 13 14:15:44 crc kubenswrapper[4898]: I0313 14:15:44.198112 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-cw75t"] Mar 13 14:15:44 crc kubenswrapper[4898]: I0313 14:15:44.816466 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-9k7p6"] Mar 13 14:15:44 crc kubenswrapper[4898]: I0313 14:15:44.818979 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-9k7p6" Mar 13 14:15:44 crc kubenswrapper[4898]: I0313 14:15:44.823044 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-9k7p6"] Mar 13 14:15:45 crc kubenswrapper[4898]: I0313 14:15:45.012624 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7pvs\" (UniqueName: \"kubernetes.io/projected/478795f5-c2f6-4e9b-9ed6-e2c743c3f3b8-kube-api-access-v7pvs\") pod \"openstack-operator-index-9k7p6\" (UID: \"478795f5-c2f6-4e9b-9ed6-e2c743c3f3b8\") " pod="openstack-operators/openstack-operator-index-9k7p6" Mar 13 14:15:45 crc kubenswrapper[4898]: I0313 14:15:45.114799 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7pvs\" (UniqueName: \"kubernetes.io/projected/478795f5-c2f6-4e9b-9ed6-e2c743c3f3b8-kube-api-access-v7pvs\") pod \"openstack-operator-index-9k7p6\" (UID: \"478795f5-c2f6-4e9b-9ed6-e2c743c3f3b8\") " pod="openstack-operators/openstack-operator-index-9k7p6" Mar 13 14:15:45 crc kubenswrapper[4898]: I0313 14:15:45.132973 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7pvs\" (UniqueName: \"kubernetes.io/projected/478795f5-c2f6-4e9b-9ed6-e2c743c3f3b8-kube-api-access-v7pvs\") pod \"openstack-operator-index-9k7p6\" (UID: \"478795f5-c2f6-4e9b-9ed6-e2c743c3f3b8\") " pod="openstack-operators/openstack-operator-index-9k7p6" Mar 13 14:15:45 crc kubenswrapper[4898]: I0313 14:15:45.151579 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-9k7p6" Mar 13 14:15:45 crc kubenswrapper[4898]: I0313 14:15:45.599206 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-9k7p6"] Mar 13 14:15:45 crc kubenswrapper[4898]: W0313 14:15:45.792737 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod478795f5_c2f6_4e9b_9ed6_e2c743c3f3b8.slice/crio-563f2abeaf9a8290bd4eaa6a51c29469074d221de340d7876336cf384f67b85c WatchSource:0}: Error finding container 563f2abeaf9a8290bd4eaa6a51c29469074d221de340d7876336cf384f67b85c: Status 404 returned error can't find the container with id 563f2abeaf9a8290bd4eaa6a51c29469074d221de340d7876336cf384f67b85c Mar 13 14:15:46 crc kubenswrapper[4898]: I0313 14:15:46.095008 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-9k7p6" event={"ID":"478795f5-c2f6-4e9b-9ed6-e2c743c3f3b8","Type":"ContainerStarted","Data":"563f2abeaf9a8290bd4eaa6a51c29469074d221de340d7876336cf384f67b85c"} Mar 13 14:15:46 crc kubenswrapper[4898]: I0313 14:15:46.095499 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-5p4w5" Mar 13 14:15:49 crc kubenswrapper[4898]: I0313 14:15:49.134666 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 14:15:49 crc kubenswrapper[4898]: I0313 14:15:49.135534 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 14:15:51 crc kubenswrapper[4898]: I0313 14:15:51.134147 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-9k7p6" event={"ID":"478795f5-c2f6-4e9b-9ed6-e2c743c3f3b8","Type":"ContainerStarted","Data":"9d49c37f8ba27635a927bfd693d7763d0844726f817979fc7574c7aec133f0d7"} Mar 13 14:15:51 crc kubenswrapper[4898]: I0313 14:15:51.136090 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-cw75t" event={"ID":"a3872ed6-e59e-42fe-a774-c457f7118f65","Type":"ContainerStarted","Data":"f9a5c15b28601168b14e254be6882840ca06abb1d99a713f2c511c51886a068c"} Mar 13 14:15:51 crc kubenswrapper[4898]: I0313 14:15:51.136169 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-cw75t" podUID="a3872ed6-e59e-42fe-a774-c457f7118f65" containerName="registry-server" containerID="cri-o://f9a5c15b28601168b14e254be6882840ca06abb1d99a713f2c511c51886a068c" gracePeriod=2 Mar 13 14:15:51 crc kubenswrapper[4898]: I0313 14:15:51.152370 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-9k7p6" podStartSLOduration=2.993428846 podStartE2EDuration="7.152353057s" podCreationTimestamp="2026-03-13 14:15:44 +0000 UTC" firstStartedPulling="2026-03-13 14:15:45.795032638 +0000 UTC m=+1180.796620877" lastFinishedPulling="2026-03-13 14:15:49.953956849 +0000 UTC m=+1184.955545088" observedRunningTime="2026-03-13 14:15:51.145751965 +0000 UTC m=+1186.147340224" watchObservedRunningTime="2026-03-13 14:15:51.152353057 +0000 UTC m=+1186.153941296" Mar 13 14:15:51 crc kubenswrapper[4898]: I0313 14:15:51.164607 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-cw75t" podStartSLOduration=2.839843315 
podStartE2EDuration="11.164591756s" podCreationTimestamp="2026-03-13 14:15:40 +0000 UTC" firstStartedPulling="2026-03-13 14:15:41.622598656 +0000 UTC m=+1176.624186895" lastFinishedPulling="2026-03-13 14:15:49.947347077 +0000 UTC m=+1184.948935336" observedRunningTime="2026-03-13 14:15:51.162111842 +0000 UTC m=+1186.163700101" watchObservedRunningTime="2026-03-13 14:15:51.164591756 +0000 UTC m=+1186.166179995" Mar 13 14:15:51 crc kubenswrapper[4898]: I0313 14:15:51.170407 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-cw75t" Mar 13 14:15:51 crc kubenswrapper[4898]: I0313 14:15:51.556841 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-cw75t" Mar 13 14:15:51 crc kubenswrapper[4898]: I0313 14:15:51.651562 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znkw5\" (UniqueName: \"kubernetes.io/projected/a3872ed6-e59e-42fe-a774-c457f7118f65-kube-api-access-znkw5\") pod \"a3872ed6-e59e-42fe-a774-c457f7118f65\" (UID: \"a3872ed6-e59e-42fe-a774-c457f7118f65\") " Mar 13 14:15:51 crc kubenswrapper[4898]: I0313 14:15:51.657547 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3872ed6-e59e-42fe-a774-c457f7118f65-kube-api-access-znkw5" (OuterVolumeSpecName: "kube-api-access-znkw5") pod "a3872ed6-e59e-42fe-a774-c457f7118f65" (UID: "a3872ed6-e59e-42fe-a774-c457f7118f65"). InnerVolumeSpecName "kube-api-access-znkw5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:15:51 crc kubenswrapper[4898]: I0313 14:15:51.753267 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-znkw5\" (UniqueName: \"kubernetes.io/projected/a3872ed6-e59e-42fe-a774-c457f7118f65-kube-api-access-znkw5\") on node \"crc\" DevicePath \"\"" Mar 13 14:15:52 crc kubenswrapper[4898]: I0313 14:15:52.153949 4898 generic.go:334] "Generic (PLEG): container finished" podID="a3872ed6-e59e-42fe-a774-c457f7118f65" containerID="f9a5c15b28601168b14e254be6882840ca06abb1d99a713f2c511c51886a068c" exitCode=0 Mar 13 14:15:52 crc kubenswrapper[4898]: I0313 14:15:52.154025 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-cw75t" Mar 13 14:15:52 crc kubenswrapper[4898]: I0313 14:15:52.154039 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-cw75t" event={"ID":"a3872ed6-e59e-42fe-a774-c457f7118f65","Type":"ContainerDied","Data":"f9a5c15b28601168b14e254be6882840ca06abb1d99a713f2c511c51886a068c"} Mar 13 14:15:52 crc kubenswrapper[4898]: I0313 14:15:52.154124 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-cw75t" event={"ID":"a3872ed6-e59e-42fe-a774-c457f7118f65","Type":"ContainerDied","Data":"d1eb15a6f8d2097c70c293cd2b58d045d8fb2028cb2810ed0aa67bff167518c8"} Mar 13 14:15:52 crc kubenswrapper[4898]: I0313 14:15:52.154155 4898 scope.go:117] "RemoveContainer" containerID="f9a5c15b28601168b14e254be6882840ca06abb1d99a713f2c511c51886a068c" Mar 13 14:15:52 crc kubenswrapper[4898]: I0313 14:15:52.177789 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-cw75t"] Mar 13 14:15:52 crc kubenswrapper[4898]: I0313 14:15:52.179767 4898 scope.go:117] "RemoveContainer" containerID="f9a5c15b28601168b14e254be6882840ca06abb1d99a713f2c511c51886a068c" Mar 13 14:15:52 crc 
kubenswrapper[4898]: E0313 14:15:52.180112 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9a5c15b28601168b14e254be6882840ca06abb1d99a713f2c511c51886a068c\": container with ID starting with f9a5c15b28601168b14e254be6882840ca06abb1d99a713f2c511c51886a068c not found: ID does not exist" containerID="f9a5c15b28601168b14e254be6882840ca06abb1d99a713f2c511c51886a068c" Mar 13 14:15:52 crc kubenswrapper[4898]: I0313 14:15:52.180139 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9a5c15b28601168b14e254be6882840ca06abb1d99a713f2c511c51886a068c"} err="failed to get container status \"f9a5c15b28601168b14e254be6882840ca06abb1d99a713f2c511c51886a068c\": rpc error: code = NotFound desc = could not find container \"f9a5c15b28601168b14e254be6882840ca06abb1d99a713f2c511c51886a068c\": container with ID starting with f9a5c15b28601168b14e254be6882840ca06abb1d99a713f2c511c51886a068c not found: ID does not exist" Mar 13 14:15:52 crc kubenswrapper[4898]: I0313 14:15:52.183564 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-cw75t"] Mar 13 14:15:53 crc kubenswrapper[4898]: I0313 14:15:53.756046 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3872ed6-e59e-42fe-a774-c457f7118f65" path="/var/lib/kubelet/pods/a3872ed6-e59e-42fe-a774-c457f7118f65/volumes" Mar 13 14:15:55 crc kubenswrapper[4898]: I0313 14:15:55.152520 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-9k7p6" Mar 13 14:15:55 crc kubenswrapper[4898]: I0313 14:15:55.152596 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-9k7p6" Mar 13 14:15:55 crc kubenswrapper[4898]: I0313 14:15:55.208368 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack-operators/openstack-operator-index-9k7p6" Mar 13 14:15:55 crc kubenswrapper[4898]: I0313 14:15:55.239677 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-9k7p6" Mar 13 14:15:56 crc kubenswrapper[4898]: I0313 14:15:56.060250 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-bqmxg" Mar 13 14:16:00 crc kubenswrapper[4898]: I0313 14:16:00.128331 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556856-z8p4g"] Mar 13 14:16:00 crc kubenswrapper[4898]: E0313 14:16:00.129281 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3872ed6-e59e-42fe-a774-c457f7118f65" containerName="registry-server" Mar 13 14:16:00 crc kubenswrapper[4898]: I0313 14:16:00.129296 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3872ed6-e59e-42fe-a774-c457f7118f65" containerName="registry-server" Mar 13 14:16:00 crc kubenswrapper[4898]: I0313 14:16:00.129469 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3872ed6-e59e-42fe-a774-c457f7118f65" containerName="registry-server" Mar 13 14:16:00 crc kubenswrapper[4898]: I0313 14:16:00.130069 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556856-z8p4g" Mar 13 14:16:00 crc kubenswrapper[4898]: I0313 14:16:00.132314 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 14:16:00 crc kubenswrapper[4898]: I0313 14:16:00.132317 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps" Mar 13 14:16:00 crc kubenswrapper[4898]: I0313 14:16:00.132439 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 14:16:00 crc kubenswrapper[4898]: I0313 14:16:00.135858 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556856-z8p4g"] Mar 13 14:16:00 crc kubenswrapper[4898]: I0313 14:16:00.187923 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wclsp\" (UniqueName: \"kubernetes.io/projected/5b81468c-e1ac-4515-837d-993e3c5108c9-kube-api-access-wclsp\") pod \"auto-csr-approver-29556856-z8p4g\" (UID: \"5b81468c-e1ac-4515-837d-993e3c5108c9\") " pod="openshift-infra/auto-csr-approver-29556856-z8p4g" Mar 13 14:16:00 crc kubenswrapper[4898]: I0313 14:16:00.289490 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wclsp\" (UniqueName: \"kubernetes.io/projected/5b81468c-e1ac-4515-837d-993e3c5108c9-kube-api-access-wclsp\") pod \"auto-csr-approver-29556856-z8p4g\" (UID: \"5b81468c-e1ac-4515-837d-993e3c5108c9\") " pod="openshift-infra/auto-csr-approver-29556856-z8p4g" Mar 13 14:16:00 crc kubenswrapper[4898]: I0313 14:16:00.315343 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wclsp\" (UniqueName: \"kubernetes.io/projected/5b81468c-e1ac-4515-837d-993e3c5108c9-kube-api-access-wclsp\") pod \"auto-csr-approver-29556856-z8p4g\" (UID: \"5b81468c-e1ac-4515-837d-993e3c5108c9\") " 
pod="openshift-infra/auto-csr-approver-29556856-z8p4g" Mar 13 14:16:00 crc kubenswrapper[4898]: I0313 14:16:00.462919 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556856-z8p4g" Mar 13 14:16:00 crc kubenswrapper[4898]: I0313 14:16:00.928230 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556856-z8p4g"] Mar 13 14:16:00 crc kubenswrapper[4898]: W0313 14:16:00.931618 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b81468c_e1ac_4515_837d_993e3c5108c9.slice/crio-7e7aa59f8c94fe7e99e47923bf7d8d3fd5b746af74ca704c75a198f4d8a65fd6 WatchSource:0}: Error finding container 7e7aa59f8c94fe7e99e47923bf7d8d3fd5b746af74ca704c75a198f4d8a65fd6: Status 404 returned error can't find the container with id 7e7aa59f8c94fe7e99e47923bf7d8d3fd5b746af74ca704c75a198f4d8a65fd6 Mar 13 14:16:01 crc kubenswrapper[4898]: I0313 14:16:01.220337 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556856-z8p4g" event={"ID":"5b81468c-e1ac-4515-837d-993e3c5108c9","Type":"ContainerStarted","Data":"7e7aa59f8c94fe7e99e47923bf7d8d3fd5b746af74ca704c75a198f4d8a65fd6"} Mar 13 14:16:02 crc kubenswrapper[4898]: I0313 14:16:02.230726 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556856-z8p4g" event={"ID":"5b81468c-e1ac-4515-837d-993e3c5108c9","Type":"ContainerStarted","Data":"309417dd12bdceaad7cc8574de946b3ecc5729e4fa9390a27c026042338454ac"} Mar 13 14:16:02 crc kubenswrapper[4898]: I0313 14:16:02.243128 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556856-z8p4g" podStartSLOduration=1.205896884 podStartE2EDuration="2.243110647s" podCreationTimestamp="2026-03-13 14:16:00 +0000 UTC" firstStartedPulling="2026-03-13 14:16:00.933726654 +0000 UTC 
m=+1195.935314893" lastFinishedPulling="2026-03-13 14:16:01.970940417 +0000 UTC m=+1196.972528656" observedRunningTime="2026-03-13 14:16:02.241633748 +0000 UTC m=+1197.243221987" watchObservedRunningTime="2026-03-13 14:16:02.243110647 +0000 UTC m=+1197.244698896" Mar 13 14:16:03 crc kubenswrapper[4898]: I0313 14:16:03.241297 4898 generic.go:334] "Generic (PLEG): container finished" podID="5b81468c-e1ac-4515-837d-993e3c5108c9" containerID="309417dd12bdceaad7cc8574de946b3ecc5729e4fa9390a27c026042338454ac" exitCode=0 Mar 13 14:16:03 crc kubenswrapper[4898]: I0313 14:16:03.241412 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556856-z8p4g" event={"ID":"5b81468c-e1ac-4515-837d-993e3c5108c9","Type":"ContainerDied","Data":"309417dd12bdceaad7cc8574de946b3ecc5729e4fa9390a27c026042338454ac"} Mar 13 14:16:04 crc kubenswrapper[4898]: I0313 14:16:04.629169 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556856-z8p4g" Mar 13 14:16:04 crc kubenswrapper[4898]: I0313 14:16:04.662392 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wclsp\" (UniqueName: \"kubernetes.io/projected/5b81468c-e1ac-4515-837d-993e3c5108c9-kube-api-access-wclsp\") pod \"5b81468c-e1ac-4515-837d-993e3c5108c9\" (UID: \"5b81468c-e1ac-4515-837d-993e3c5108c9\") " Mar 13 14:16:04 crc kubenswrapper[4898]: I0313 14:16:04.668772 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b81468c-e1ac-4515-837d-993e3c5108c9-kube-api-access-wclsp" (OuterVolumeSpecName: "kube-api-access-wclsp") pod "5b81468c-e1ac-4515-837d-993e3c5108c9" (UID: "5b81468c-e1ac-4515-837d-993e3c5108c9"). InnerVolumeSpecName "kube-api-access-wclsp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:16:04 crc kubenswrapper[4898]: I0313 14:16:04.765501 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wclsp\" (UniqueName: \"kubernetes.io/projected/5b81468c-e1ac-4515-837d-993e3c5108c9-kube-api-access-wclsp\") on node \"crc\" DevicePath \"\"" Mar 13 14:16:05 crc kubenswrapper[4898]: I0313 14:16:05.260773 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556856-z8p4g" event={"ID":"5b81468c-e1ac-4515-837d-993e3c5108c9","Type":"ContainerDied","Data":"7e7aa59f8c94fe7e99e47923bf7d8d3fd5b746af74ca704c75a198f4d8a65fd6"} Mar 13 14:16:05 crc kubenswrapper[4898]: I0313 14:16:05.261149 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e7aa59f8c94fe7e99e47923bf7d8d3fd5b746af74ca704c75a198f4d8a65fd6" Mar 13 14:16:05 crc kubenswrapper[4898]: I0313 14:16:05.260860 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556856-z8p4g" Mar 13 14:16:05 crc kubenswrapper[4898]: I0313 14:16:05.298595 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556850-n2t8z"] Mar 13 14:16:05 crc kubenswrapper[4898]: I0313 14:16:05.304050 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556850-n2t8z"] Mar 13 14:16:05 crc kubenswrapper[4898]: I0313 14:16:05.753453 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a05c0334-f9cf-4640-a763-6d77b983193c" path="/var/lib/kubelet/pods/a05c0334-f9cf-4640-a763-6d77b983193c/volumes" Mar 13 14:16:11 crc kubenswrapper[4898]: I0313 14:16:11.291615 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/fde5bd3580cf5bff9a9984f67cb14ffd9a5909a2de508af4ad53062896dgm57"] Mar 13 14:16:11 crc kubenswrapper[4898]: E0313 14:16:11.292630 4898 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="5b81468c-e1ac-4515-837d-993e3c5108c9" containerName="oc" Mar 13 14:16:11 crc kubenswrapper[4898]: I0313 14:16:11.292646 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b81468c-e1ac-4515-837d-993e3c5108c9" containerName="oc" Mar 13 14:16:11 crc kubenswrapper[4898]: I0313 14:16:11.292852 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b81468c-e1ac-4515-837d-993e3c5108c9" containerName="oc" Mar 13 14:16:11 crc kubenswrapper[4898]: I0313 14:16:11.294314 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/fde5bd3580cf5bff9a9984f67cb14ffd9a5909a2de508af4ad53062896dgm57" Mar 13 14:16:11 crc kubenswrapper[4898]: I0313 14:16:11.297503 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-6x4vb" Mar 13 14:16:11 crc kubenswrapper[4898]: I0313 14:16:11.320068 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/fde5bd3580cf5bff9a9984f67cb14ffd9a5909a2de508af4ad53062896dgm57"] Mar 13 14:16:11 crc kubenswrapper[4898]: I0313 14:16:11.408399 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a57932fc-ce83-4258-95a8-65f29c0cfd5a-util\") pod \"fde5bd3580cf5bff9a9984f67cb14ffd9a5909a2de508af4ad53062896dgm57\" (UID: \"a57932fc-ce83-4258-95a8-65f29c0cfd5a\") " pod="openstack-operators/fde5bd3580cf5bff9a9984f67cb14ffd9a5909a2de508af4ad53062896dgm57" Mar 13 14:16:11 crc kubenswrapper[4898]: I0313 14:16:11.408983 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjqpr\" (UniqueName: \"kubernetes.io/projected/a57932fc-ce83-4258-95a8-65f29c0cfd5a-kube-api-access-xjqpr\") pod \"fde5bd3580cf5bff9a9984f67cb14ffd9a5909a2de508af4ad53062896dgm57\" (UID: \"a57932fc-ce83-4258-95a8-65f29c0cfd5a\") " 
pod="openstack-operators/fde5bd3580cf5bff9a9984f67cb14ffd9a5909a2de508af4ad53062896dgm57" Mar 13 14:16:11 crc kubenswrapper[4898]: I0313 14:16:11.409149 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a57932fc-ce83-4258-95a8-65f29c0cfd5a-bundle\") pod \"fde5bd3580cf5bff9a9984f67cb14ffd9a5909a2de508af4ad53062896dgm57\" (UID: \"a57932fc-ce83-4258-95a8-65f29c0cfd5a\") " pod="openstack-operators/fde5bd3580cf5bff9a9984f67cb14ffd9a5909a2de508af4ad53062896dgm57" Mar 13 14:16:11 crc kubenswrapper[4898]: I0313 14:16:11.510550 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a57932fc-ce83-4258-95a8-65f29c0cfd5a-util\") pod \"fde5bd3580cf5bff9a9984f67cb14ffd9a5909a2de508af4ad53062896dgm57\" (UID: \"a57932fc-ce83-4258-95a8-65f29c0cfd5a\") " pod="openstack-operators/fde5bd3580cf5bff9a9984f67cb14ffd9a5909a2de508af4ad53062896dgm57" Mar 13 14:16:11 crc kubenswrapper[4898]: I0313 14:16:11.510686 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjqpr\" (UniqueName: \"kubernetes.io/projected/a57932fc-ce83-4258-95a8-65f29c0cfd5a-kube-api-access-xjqpr\") pod \"fde5bd3580cf5bff9a9984f67cb14ffd9a5909a2de508af4ad53062896dgm57\" (UID: \"a57932fc-ce83-4258-95a8-65f29c0cfd5a\") " pod="openstack-operators/fde5bd3580cf5bff9a9984f67cb14ffd9a5909a2de508af4ad53062896dgm57" Mar 13 14:16:11 crc kubenswrapper[4898]: I0313 14:16:11.510761 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a57932fc-ce83-4258-95a8-65f29c0cfd5a-bundle\") pod \"fde5bd3580cf5bff9a9984f67cb14ffd9a5909a2de508af4ad53062896dgm57\" (UID: \"a57932fc-ce83-4258-95a8-65f29c0cfd5a\") " pod="openstack-operators/fde5bd3580cf5bff9a9984f67cb14ffd9a5909a2de508af4ad53062896dgm57" Mar 13 14:16:11 crc kubenswrapper[4898]: I0313 
14:16:11.511700 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a57932fc-ce83-4258-95a8-65f29c0cfd5a-util\") pod \"fde5bd3580cf5bff9a9984f67cb14ffd9a5909a2de508af4ad53062896dgm57\" (UID: \"a57932fc-ce83-4258-95a8-65f29c0cfd5a\") " pod="openstack-operators/fde5bd3580cf5bff9a9984f67cb14ffd9a5909a2de508af4ad53062896dgm57" Mar 13 14:16:11 crc kubenswrapper[4898]: I0313 14:16:11.511756 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a57932fc-ce83-4258-95a8-65f29c0cfd5a-bundle\") pod \"fde5bd3580cf5bff9a9984f67cb14ffd9a5909a2de508af4ad53062896dgm57\" (UID: \"a57932fc-ce83-4258-95a8-65f29c0cfd5a\") " pod="openstack-operators/fde5bd3580cf5bff9a9984f67cb14ffd9a5909a2de508af4ad53062896dgm57" Mar 13 14:16:11 crc kubenswrapper[4898]: I0313 14:16:11.538363 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjqpr\" (UniqueName: \"kubernetes.io/projected/a57932fc-ce83-4258-95a8-65f29c0cfd5a-kube-api-access-xjqpr\") pod \"fde5bd3580cf5bff9a9984f67cb14ffd9a5909a2de508af4ad53062896dgm57\" (UID: \"a57932fc-ce83-4258-95a8-65f29c0cfd5a\") " pod="openstack-operators/fde5bd3580cf5bff9a9984f67cb14ffd9a5909a2de508af4ad53062896dgm57" Mar 13 14:16:11 crc kubenswrapper[4898]: I0313 14:16:11.641409 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/fde5bd3580cf5bff9a9984f67cb14ffd9a5909a2de508af4ad53062896dgm57" Mar 13 14:16:12 crc kubenswrapper[4898]: I0313 14:16:12.124147 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/fde5bd3580cf5bff9a9984f67cb14ffd9a5909a2de508af4ad53062896dgm57"] Mar 13 14:16:12 crc kubenswrapper[4898]: I0313 14:16:12.342226 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/fde5bd3580cf5bff9a9984f67cb14ffd9a5909a2de508af4ad53062896dgm57" event={"ID":"a57932fc-ce83-4258-95a8-65f29c0cfd5a","Type":"ContainerStarted","Data":"de0737319c24d387d36033ef3d7fde262ae7e58a7a64d518eb94a500b00ccd7c"} Mar 13 14:16:12 crc kubenswrapper[4898]: I0313 14:16:12.342276 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/fde5bd3580cf5bff9a9984f67cb14ffd9a5909a2de508af4ad53062896dgm57" event={"ID":"a57932fc-ce83-4258-95a8-65f29c0cfd5a","Type":"ContainerStarted","Data":"edd3fbaa00c13c87b979dfe05e5d0b77bdb33d430bf8f77a8a32ca9cb4123a2b"} Mar 13 14:16:13 crc kubenswrapper[4898]: I0313 14:16:13.351925 4898 generic.go:334] "Generic (PLEG): container finished" podID="a57932fc-ce83-4258-95a8-65f29c0cfd5a" containerID="de0737319c24d387d36033ef3d7fde262ae7e58a7a64d518eb94a500b00ccd7c" exitCode=0 Mar 13 14:16:13 crc kubenswrapper[4898]: I0313 14:16:13.352105 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/fde5bd3580cf5bff9a9984f67cb14ffd9a5909a2de508af4ad53062896dgm57" event={"ID":"a57932fc-ce83-4258-95a8-65f29c0cfd5a","Type":"ContainerDied","Data":"de0737319c24d387d36033ef3d7fde262ae7e58a7a64d518eb94a500b00ccd7c"} Mar 13 14:16:14 crc kubenswrapper[4898]: I0313 14:16:14.360462 4898 generic.go:334] "Generic (PLEG): container finished" podID="a57932fc-ce83-4258-95a8-65f29c0cfd5a" containerID="19e101b600e89585fb9349651c696287b7553b5949128b2de6653d748f58d321" exitCode=0 Mar 13 14:16:14 crc kubenswrapper[4898]: I0313 14:16:14.360519 4898 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/fde5bd3580cf5bff9a9984f67cb14ffd9a5909a2de508af4ad53062896dgm57" event={"ID":"a57932fc-ce83-4258-95a8-65f29c0cfd5a","Type":"ContainerDied","Data":"19e101b600e89585fb9349651c696287b7553b5949128b2de6653d748f58d321"} Mar 13 14:16:15 crc kubenswrapper[4898]: I0313 14:16:15.378671 4898 generic.go:334] "Generic (PLEG): container finished" podID="a57932fc-ce83-4258-95a8-65f29c0cfd5a" containerID="9cf0186ece602fa99b331b4c0f59fabd6a8d9ecd8f719509581cd4626f43f447" exitCode=0 Mar 13 14:16:15 crc kubenswrapper[4898]: I0313 14:16:15.378729 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/fde5bd3580cf5bff9a9984f67cb14ffd9a5909a2de508af4ad53062896dgm57" event={"ID":"a57932fc-ce83-4258-95a8-65f29c0cfd5a","Type":"ContainerDied","Data":"9cf0186ece602fa99b331b4c0f59fabd6a8d9ecd8f719509581cd4626f43f447"} Mar 13 14:16:16 crc kubenswrapper[4898]: I0313 14:16:16.649120 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/fde5bd3580cf5bff9a9984f67cb14ffd9a5909a2de508af4ad53062896dgm57"
Mar 13 14:16:16 crc kubenswrapper[4898]: I0313 14:16:16.804436 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a57932fc-ce83-4258-95a8-65f29c0cfd5a-bundle\") pod \"a57932fc-ce83-4258-95a8-65f29c0cfd5a\" (UID: \"a57932fc-ce83-4258-95a8-65f29c0cfd5a\") "
Mar 13 14:16:16 crc kubenswrapper[4898]: I0313 14:16:16.804566 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a57932fc-ce83-4258-95a8-65f29c0cfd5a-util\") pod \"a57932fc-ce83-4258-95a8-65f29c0cfd5a\" (UID: \"a57932fc-ce83-4258-95a8-65f29c0cfd5a\") "
Mar 13 14:16:16 crc kubenswrapper[4898]: I0313 14:16:16.804795 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjqpr\" (UniqueName: \"kubernetes.io/projected/a57932fc-ce83-4258-95a8-65f29c0cfd5a-kube-api-access-xjqpr\") pod \"a57932fc-ce83-4258-95a8-65f29c0cfd5a\" (UID: \"a57932fc-ce83-4258-95a8-65f29c0cfd5a\") "
Mar 13 14:16:16 crc kubenswrapper[4898]: I0313 14:16:16.806006 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a57932fc-ce83-4258-95a8-65f29c0cfd5a-bundle" (OuterVolumeSpecName: "bundle") pod "a57932fc-ce83-4258-95a8-65f29c0cfd5a" (UID: "a57932fc-ce83-4258-95a8-65f29c0cfd5a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 14:16:16 crc kubenswrapper[4898]: I0313 14:16:16.807025 4898 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a57932fc-ce83-4258-95a8-65f29c0cfd5a-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 14:16:16 crc kubenswrapper[4898]: I0313 14:16:16.812649 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a57932fc-ce83-4258-95a8-65f29c0cfd5a-kube-api-access-xjqpr" (OuterVolumeSpecName: "kube-api-access-xjqpr") pod "a57932fc-ce83-4258-95a8-65f29c0cfd5a" (UID: "a57932fc-ce83-4258-95a8-65f29c0cfd5a"). InnerVolumeSpecName "kube-api-access-xjqpr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:16:16 crc kubenswrapper[4898]: I0313 14:16:16.821347 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a57932fc-ce83-4258-95a8-65f29c0cfd5a-util" (OuterVolumeSpecName: "util") pod "a57932fc-ce83-4258-95a8-65f29c0cfd5a" (UID: "a57932fc-ce83-4258-95a8-65f29c0cfd5a"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 14:16:16 crc kubenswrapper[4898]: I0313 14:16:16.908877 4898 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a57932fc-ce83-4258-95a8-65f29c0cfd5a-util\") on node \"crc\" DevicePath \"\""
Mar 13 14:16:16 crc kubenswrapper[4898]: I0313 14:16:16.908936 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjqpr\" (UniqueName: \"kubernetes.io/projected/a57932fc-ce83-4258-95a8-65f29c0cfd5a-kube-api-access-xjqpr\") on node \"crc\" DevicePath \"\""
Mar 13 14:16:17 crc kubenswrapper[4898]: I0313 14:16:17.403796 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/fde5bd3580cf5bff9a9984f67cb14ffd9a5909a2de508af4ad53062896dgm57" event={"ID":"a57932fc-ce83-4258-95a8-65f29c0cfd5a","Type":"ContainerDied","Data":"edd3fbaa00c13c87b979dfe05e5d0b77bdb33d430bf8f77a8a32ca9cb4123a2b"}
Mar 13 14:16:17 crc kubenswrapper[4898]: I0313 14:16:17.403843 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="edd3fbaa00c13c87b979dfe05e5d0b77bdb33d430bf8f77a8a32ca9cb4123a2b"
Mar 13 14:16:17 crc kubenswrapper[4898]: I0313 14:16:17.403993 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/fde5bd3580cf5bff9a9984f67cb14ffd9a5909a2de508af4ad53062896dgm57"
Mar 13 14:16:19 crc kubenswrapper[4898]: I0313 14:16:19.134974 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 14:16:19 crc kubenswrapper[4898]: I0313 14:16:19.135344 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 14:16:19 crc kubenswrapper[4898]: I0313 14:16:19.135406 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj"
Mar 13 14:16:19 crc kubenswrapper[4898]: I0313 14:16:19.136194 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7b5d3972dfd92a1b971338153ac5467cf67b2057ca35cfb382b56be42ddca2ed"} pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 13 14:16:19 crc kubenswrapper[4898]: I0313 14:16:19.136266 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" containerID="cri-o://7b5d3972dfd92a1b971338153ac5467cf67b2057ca35cfb382b56be42ddca2ed" gracePeriod=600
Mar 13 14:16:19 crc kubenswrapper[4898]: I0313 14:16:19.423027 4898 generic.go:334] "Generic (PLEG): container finished" podID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerID="7b5d3972dfd92a1b971338153ac5467cf67b2057ca35cfb382b56be42ddca2ed" exitCode=0
Mar 13 14:16:19 crc kubenswrapper[4898]: I0313 14:16:19.423096 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerDied","Data":"7b5d3972dfd92a1b971338153ac5467cf67b2057ca35cfb382b56be42ddca2ed"}
Mar 13 14:16:19 crc kubenswrapper[4898]: I0313 14:16:19.423137 4898 scope.go:117] "RemoveContainer" containerID="b58828da596890620679d1e69bfdfd0b7cd0cf06254eed4031e215964351d8c6"
Mar 13 14:16:20 crc kubenswrapper[4898]: I0313 14:16:20.437763 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerStarted","Data":"37bdbe6f1a65f1530746827b4e6d1dd1ce95edb9a913051fc8fca9a782787e56"}
Mar 13 14:16:24 crc kubenswrapper[4898]: I0313 14:16:24.268186 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-6b8c6b5df9-kk2gn"]
Mar 13 14:16:24 crc kubenswrapper[4898]: E0313 14:16:24.269084 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a57932fc-ce83-4258-95a8-65f29c0cfd5a" containerName="pull"
Mar 13 14:16:24 crc kubenswrapper[4898]: I0313 14:16:24.269098 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="a57932fc-ce83-4258-95a8-65f29c0cfd5a" containerName="pull"
Mar 13 14:16:24 crc kubenswrapper[4898]: E0313 14:16:24.269121 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a57932fc-ce83-4258-95a8-65f29c0cfd5a" containerName="extract"
Mar 13 14:16:24 crc kubenswrapper[4898]: I0313 14:16:24.269127 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="a57932fc-ce83-4258-95a8-65f29c0cfd5a" containerName="extract"
Mar 13 14:16:24 crc kubenswrapper[4898]: E0313 14:16:24.269136 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a57932fc-ce83-4258-95a8-65f29c0cfd5a" containerName="util"
Mar 13 14:16:24 crc kubenswrapper[4898]: I0313 14:16:24.269141 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="a57932fc-ce83-4258-95a8-65f29c0cfd5a" containerName="util"
Mar 13 14:16:24 crc kubenswrapper[4898]: I0313 14:16:24.269309 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="a57932fc-ce83-4258-95a8-65f29c0cfd5a" containerName="extract"
Mar 13 14:16:24 crc kubenswrapper[4898]: I0313 14:16:24.269869 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6b8c6b5df9-kk2gn"
Mar 13 14:16:24 crc kubenswrapper[4898]: I0313 14:16:24.279320 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-6qc9g"
Mar 13 14:16:24 crc kubenswrapper[4898]: I0313 14:16:24.308863 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6b8c6b5df9-kk2gn"]
Mar 13 14:16:24 crc kubenswrapper[4898]: I0313 14:16:24.459836 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2x57z\" (UniqueName: \"kubernetes.io/projected/7bae49ab-1146-43a2-b436-69838c923f1a-kube-api-access-2x57z\") pod \"openstack-operator-controller-init-6b8c6b5df9-kk2gn\" (UID: \"7bae49ab-1146-43a2-b436-69838c923f1a\") " pod="openstack-operators/openstack-operator-controller-init-6b8c6b5df9-kk2gn"
Mar 13 14:16:24 crc kubenswrapper[4898]: I0313 14:16:24.562756 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2x57z\" (UniqueName: \"kubernetes.io/projected/7bae49ab-1146-43a2-b436-69838c923f1a-kube-api-access-2x57z\") pod \"openstack-operator-controller-init-6b8c6b5df9-kk2gn\" (UID: \"7bae49ab-1146-43a2-b436-69838c923f1a\") " pod="openstack-operators/openstack-operator-controller-init-6b8c6b5df9-kk2gn"
Mar 13 14:16:24 crc kubenswrapper[4898]: I0313 14:16:24.583786 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2x57z\" (UniqueName: \"kubernetes.io/projected/7bae49ab-1146-43a2-b436-69838c923f1a-kube-api-access-2x57z\") pod \"openstack-operator-controller-init-6b8c6b5df9-kk2gn\" (UID: \"7bae49ab-1146-43a2-b436-69838c923f1a\") " pod="openstack-operators/openstack-operator-controller-init-6b8c6b5df9-kk2gn"
Mar 13 14:16:24 crc kubenswrapper[4898]: I0313 14:16:24.604259 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6b8c6b5df9-kk2gn"
Mar 13 14:16:25 crc kubenswrapper[4898]: I0313 14:16:25.136763 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6b8c6b5df9-kk2gn"]
Mar 13 14:16:25 crc kubenswrapper[4898]: I0313 14:16:25.485703 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6b8c6b5df9-kk2gn" event={"ID":"7bae49ab-1146-43a2-b436-69838c923f1a","Type":"ContainerStarted","Data":"80f3b0f287b2875fcd6c6e0a40bc6e8aa4408ca1f5d0f62be3214f49a50f3e18"}
Mar 13 14:16:29 crc kubenswrapper[4898]: I0313 14:16:29.519481 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6b8c6b5df9-kk2gn" event={"ID":"7bae49ab-1146-43a2-b436-69838c923f1a","Type":"ContainerStarted","Data":"d87599acaa104726169cfe37672c0e6b8881bc6eb72714faae25c237e35b25d7"}
Mar 13 14:16:29 crc kubenswrapper[4898]: I0313 14:16:29.520981 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-6b8c6b5df9-kk2gn"
Mar 13 14:16:29 crc kubenswrapper[4898]: I0313 14:16:29.561949 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-6b8c6b5df9-kk2gn" podStartSLOduration=1.975188419 podStartE2EDuration="5.561929275s" podCreationTimestamp="2026-03-13 14:16:24 +0000 UTC" firstStartedPulling="2026-03-13 14:16:25.144314836 +0000 UTC m=+1220.145903075" lastFinishedPulling="2026-03-13 14:16:28.731055692 +0000 UTC m=+1223.732643931" observedRunningTime="2026-03-13 14:16:29.560041205 +0000 UTC m=+1224.561629464" watchObservedRunningTime="2026-03-13 14:16:29.561929275 +0000 UTC m=+1224.563517514"
Mar 13 14:16:32 crc kubenswrapper[4898]: I0313 14:16:32.735669 4898 scope.go:117] "RemoveContainer" containerID="b0888e4d135b3b37fbe96fe16a03f870ba37d4188b89aa723dcdb2298a0e4ed8"
Mar 13 14:16:34 crc kubenswrapper[4898]: I0313 14:16:34.608480 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-6b8c6b5df9-kk2gn"
Mar 13 14:17:02 crc kubenswrapper[4898]: I0313 14:17:02.648690 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-d47688694-gtlps"]
Mar 13 14:17:02 crc kubenswrapper[4898]: I0313 14:17:02.650814 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-d47688694-gtlps"
Mar 13 14:17:02 crc kubenswrapper[4898]: I0313 14:17:02.654402 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-vp785"
Mar 13 14:17:02 crc kubenswrapper[4898]: I0313 14:17:02.664785 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-p9d5v"]
Mar 13 14:17:02 crc kubenswrapper[4898]: I0313 14:17:02.665930 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-p9d5v"
Mar 13 14:17:02 crc kubenswrapper[4898]: I0313 14:17:02.667838 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-nrczs"
Mar 13 14:17:02 crc kubenswrapper[4898]: I0313 14:17:02.670309 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-d47688694-gtlps"]
Mar 13 14:17:02 crc kubenswrapper[4898]: I0313 14:17:02.681414 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-4n5rx"]
Mar 13 14:17:02 crc kubenswrapper[4898]: I0313 14:17:02.682446 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-4n5rx"
Mar 13 14:17:02 crc kubenswrapper[4898]: I0313 14:17:02.685273 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-zjpd9"
Mar 13 14:17:02 crc kubenswrapper[4898]: I0313 14:17:02.701693 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-p9d5v"]
Mar 13 14:17:02 crc kubenswrapper[4898]: I0313 14:17:02.737416 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-4n5rx"]
Mar 13 14:17:02 crc kubenswrapper[4898]: I0313 14:17:02.765246 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-mf8h6"]
Mar 13 14:17:02 crc kubenswrapper[4898]: I0313 14:17:02.766245 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-mf8h6"
Mar 13 14:17:02 crc kubenswrapper[4898]: I0313 14:17:02.770702 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-9fmzd"
Mar 13 14:17:02 crc kubenswrapper[4898]: I0313 14:17:02.813105 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-tqp4b"]
Mar 13 14:17:02 crc kubenswrapper[4898]: I0313 14:17:02.814471 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-tqp4b"
Mar 13 14:17:02 crc kubenswrapper[4898]: I0313 14:17:02.816159 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4l2jg\" (UniqueName: \"kubernetes.io/projected/0d88a5d2-a852-409e-b4bd-939d1c2b9090-kube-api-access-4l2jg\") pod \"cinder-operator-controller-manager-984cd4dcf-p9d5v\" (UID: \"0d88a5d2-a852-409e-b4bd-939d1c2b9090\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-p9d5v"
Mar 13 14:17:02 crc kubenswrapper[4898]: I0313 14:17:02.816221 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5rn7\" (UniqueName: \"kubernetes.io/projected/3c955ebc-98fd-4921-9923-6151a50e8eec-kube-api-access-g5rn7\") pod \"designate-operator-controller-manager-66d56f6ff4-4n5rx\" (UID: \"3c955ebc-98fd-4921-9923-6151a50e8eec\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-4n5rx"
Mar 13 14:17:02 crc kubenswrapper[4898]: I0313 14:17:02.816358 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xm7pb\" (UniqueName: \"kubernetes.io/projected/45efd8ce-26db-4511-bd88-2e7467d02bbb-kube-api-access-xm7pb\") pod \"barbican-operator-controller-manager-d47688694-gtlps\" (UID: \"45efd8ce-26db-4511-bd88-2e7467d02bbb\") " pod="openstack-operators/barbican-operator-controller-manager-d47688694-gtlps"
Mar 13 14:17:02 crc kubenswrapper[4898]: I0313 14:17:02.817442 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-2zg6t"
Mar 13 14:17:02 crc kubenswrapper[4898]: I0313 14:17:02.821207 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-mf8h6"]
Mar 13 14:17:02 crc kubenswrapper[4898]: I0313 14:17:02.837834 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-tqp4b"]
Mar 13 14:17:02 crc kubenswrapper[4898]: I0313 14:17:02.877307 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-jngrl"]
Mar 13 14:17:02 crc kubenswrapper[4898]: I0313 14:17:02.878676 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-jngrl"
Mar 13 14:17:02 crc kubenswrapper[4898]: I0313 14:17:02.886006 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-9kvzj"
Mar 13 14:17:02 crc kubenswrapper[4898]: I0313 14:17:02.892153 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-jngrl"]
Mar 13 14:17:02 crc kubenswrapper[4898]: I0313 14:17:02.930394 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4l2jg\" (UniqueName: \"kubernetes.io/projected/0d88a5d2-a852-409e-b4bd-939d1c2b9090-kube-api-access-4l2jg\") pod \"cinder-operator-controller-manager-984cd4dcf-p9d5v\" (UID: \"0d88a5d2-a852-409e-b4bd-939d1c2b9090\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-p9d5v"
Mar 13 14:17:02 crc kubenswrapper[4898]: I0313 14:17:02.930538 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5rn7\" (UniqueName: \"kubernetes.io/projected/3c955ebc-98fd-4921-9923-6151a50e8eec-kube-api-access-g5rn7\") pod \"designate-operator-controller-manager-66d56f6ff4-4n5rx\" (UID: \"3c955ebc-98fd-4921-9923-6151a50e8eec\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-4n5rx"
Mar 13 14:17:02 crc kubenswrapper[4898]: I0313 14:17:02.930830 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xm7pb\" (UniqueName: \"kubernetes.io/projected/45efd8ce-26db-4511-bd88-2e7467d02bbb-kube-api-access-xm7pb\") pod \"barbican-operator-controller-manager-d47688694-gtlps\" (UID: \"45efd8ce-26db-4511-bd88-2e7467d02bbb\") " pod="openstack-operators/barbican-operator-controller-manager-d47688694-gtlps"
Mar 13 14:17:02 crc kubenswrapper[4898]: I0313 14:17:02.930962 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r94zq\" (UniqueName: \"kubernetes.io/projected/fb7b2f97-fca8-41d2-9be7-d40fac94c171-kube-api-access-r94zq\") pod \"glance-operator-controller-manager-5964f64c48-mf8h6\" (UID: \"fb7b2f97-fca8-41d2-9be7-d40fac94c171\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-mf8h6"
Mar 13 14:17:02 crc kubenswrapper[4898]: I0313 14:17:02.931033 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xlhk\" (UniqueName: \"kubernetes.io/projected/ea0ad033-9a48-4e42-a237-f27cacf03adc-kube-api-access-8xlhk\") pod \"heat-operator-controller-manager-77b6666d85-tqp4b\" (UID: \"ea0ad033-9a48-4e42-a237-f27cacf03adc\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-tqp4b"
Mar 13 14:17:02 crc kubenswrapper[4898]: I0313 14:17:02.951771 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-54dc5b8f8d-8kcsw"]
Mar 13 14:17:02 crc kubenswrapper[4898]: I0313 14:17:02.953431 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-8kcsw"
Mar 13 14:17:02 crc kubenswrapper[4898]: I0313 14:17:02.970481 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-j4b7h"
Mar 13 14:17:02 crc kubenswrapper[4898]: I0313 14:17:02.987163 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert"
Mar 13 14:17:02 crc kubenswrapper[4898]: I0313 14:17:02.988281 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xm7pb\" (UniqueName: \"kubernetes.io/projected/45efd8ce-26db-4511-bd88-2e7467d02bbb-kube-api-access-xm7pb\") pod \"barbican-operator-controller-manager-d47688694-gtlps\" (UID: \"45efd8ce-26db-4511-bd88-2e7467d02bbb\") " pod="openstack-operators/barbican-operator-controller-manager-d47688694-gtlps"
Mar 13 14:17:02 crc kubenswrapper[4898]: I0313 14:17:02.988758 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5rn7\" (UniqueName: \"kubernetes.io/projected/3c955ebc-98fd-4921-9923-6151a50e8eec-kube-api-access-g5rn7\") pod \"designate-operator-controller-manager-66d56f6ff4-4n5rx\" (UID: \"3c955ebc-98fd-4921-9923-6151a50e8eec\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-4n5rx"
Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.003324 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4l2jg\" (UniqueName: \"kubernetes.io/projected/0d88a5d2-a852-409e-b4bd-939d1c2b9090-kube-api-access-4l2jg\") pod \"cinder-operator-controller-manager-984cd4dcf-p9d5v\" (UID: \"0d88a5d2-a852-409e-b4bd-939d1c2b9090\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-p9d5v"
Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.032084 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hnd7\" (UniqueName: \"kubernetes.io/projected/c35de09d-7f21-47d3-aac5-a26b15b0a496-kube-api-access-7hnd7\") pod \"infra-operator-controller-manager-54dc5b8f8d-8kcsw\" (UID: \"c35de09d-7f21-47d3-aac5-a26b15b0a496\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-8kcsw"
Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.032144 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c35de09d-7f21-47d3-aac5-a26b15b0a496-cert\") pod \"infra-operator-controller-manager-54dc5b8f8d-8kcsw\" (UID: \"c35de09d-7f21-47d3-aac5-a26b15b0a496\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-8kcsw"
Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.032189 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r94zq\" (UniqueName: \"kubernetes.io/projected/fb7b2f97-fca8-41d2-9be7-d40fac94c171-kube-api-access-r94zq\") pod \"glance-operator-controller-manager-5964f64c48-mf8h6\" (UID: \"fb7b2f97-fca8-41d2-9be7-d40fac94c171\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-mf8h6"
Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.032207 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbcwc\" (UniqueName: \"kubernetes.io/projected/a80d01d5-0201-4b2e-974c-ac5b42ac8df4-kube-api-access-jbcwc\") pod \"horizon-operator-controller-manager-6d9d6b584d-jngrl\" (UID: \"a80d01d5-0201-4b2e-974c-ac5b42ac8df4\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-jngrl"
Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.032233 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xlhk\" (UniqueName: \"kubernetes.io/projected/ea0ad033-9a48-4e42-a237-f27cacf03adc-kube-api-access-8xlhk\") pod \"heat-operator-controller-manager-77b6666d85-tqp4b\" (UID: \"ea0ad033-9a48-4e42-a237-f27cacf03adc\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-tqp4b"
Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.032848 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-d47688694-gtlps"
Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.033175 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-p9d5v"
Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.039097 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-4n5rx"
Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.042797 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5bc894d9b-v99bm"]
Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.046799 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-v99bm"
Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.049283 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-z4khd"
Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.071461 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-54dc5b8f8d-8kcsw"]
Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.076614 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r94zq\" (UniqueName: \"kubernetes.io/projected/fb7b2f97-fca8-41d2-9be7-d40fac94c171-kube-api-access-r94zq\") pod \"glance-operator-controller-manager-5964f64c48-mf8h6\" (UID: \"fb7b2f97-fca8-41d2-9be7-d40fac94c171\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-mf8h6"
Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.082170 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5bc894d9b-v99bm"]
Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.084454 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xlhk\" (UniqueName: \"kubernetes.io/projected/ea0ad033-9a48-4e42-a237-f27cacf03adc-kube-api-access-8xlhk\") pod \"heat-operator-controller-manager-77b6666d85-tqp4b\" (UID: \"ea0ad033-9a48-4e42-a237-f27cacf03adc\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-tqp4b"
Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.093229 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-s5zh6"]
Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.094258 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-s5zh6"
Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.097188 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-s92xm"
Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.097705 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-mf8h6"
Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.107821 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-57b484b4df-z2gd2"]
Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.109204 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-z2gd2"
Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.115812 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-lzkkk"
Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.126650 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-2lnlc"]
Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.129223 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-2lnlc"
Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.131850 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-z2klm"
Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.137323 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-s5zh6"]
Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.137988 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hnd7\" (UniqueName: \"kubernetes.io/projected/c35de09d-7f21-47d3-aac5-a26b15b0a496-kube-api-access-7hnd7\") pod \"infra-operator-controller-manager-54dc5b8f8d-8kcsw\" (UID: \"c35de09d-7f21-47d3-aac5-a26b15b0a496\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-8kcsw"
Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.138592 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c35de09d-7f21-47d3-aac5-a26b15b0a496-cert\") pod \"infra-operator-controller-manager-54dc5b8f8d-8kcsw\" (UID: \"c35de09d-7f21-47d3-aac5-a26b15b0a496\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-8kcsw"
Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.138649 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbcwc\" (UniqueName: \"kubernetes.io/projected/a80d01d5-0201-4b2e-974c-ac5b42ac8df4-kube-api-access-jbcwc\") pod \"horizon-operator-controller-manager-6d9d6b584d-jngrl\" (UID: \"a80d01d5-0201-4b2e-974c-ac5b42ac8df4\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-jngrl"
Mar 13 14:17:03 crc kubenswrapper[4898]: E0313 14:17:03.139003 4898 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Mar 13 14:17:03 crc kubenswrapper[4898]: E0313 14:17:03.139051 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c35de09d-7f21-47d3-aac5-a26b15b0a496-cert podName:c35de09d-7f21-47d3-aac5-a26b15b0a496 nodeName:}" failed. No retries permitted until 2026-03-13 14:17:03.639035094 +0000 UTC m=+1258.640623333 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c35de09d-7f21-47d3-aac5-a26b15b0a496-cert") pod "infra-operator-controller-manager-54dc5b8f8d-8kcsw" (UID: "c35de09d-7f21-47d3-aac5-a26b15b0a496") : secret "infra-operator-webhook-server-cert" not found
Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.145202 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-57b484b4df-z2gd2"]
Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.157234 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbcwc\" (UniqueName: \"kubernetes.io/projected/a80d01d5-0201-4b2e-974c-ac5b42ac8df4-kube-api-access-jbcwc\") pod \"horizon-operator-controller-manager-6d9d6b584d-jngrl\" (UID: \"a80d01d5-0201-4b2e-974c-ac5b42ac8df4\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-jngrl"
Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.161828 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hnd7\" (UniqueName: \"kubernetes.io/projected/c35de09d-7f21-47d3-aac5-a26b15b0a496-kube-api-access-7hnd7\") pod \"infra-operator-controller-manager-54dc5b8f8d-8kcsw\" (UID: \"c35de09d-7f21-47d3-aac5-a26b15b0a496\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-8kcsw"
Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.165323 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-tqp4b"
Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.194078 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-ntlw6"]
Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.195547 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-ntlw6"
Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.199107 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-46qzh"
Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.207411 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-jngrl"
Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.230173 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-7f84474648-mr4wv"]
Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.231339 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-7f84474648-mr4wv"
Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.239342 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-hh82n"
Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.241034 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ghts\" (UniqueName: \"kubernetes.io/projected/32b5ebfd-38d9-456e-bb21-7332323239d1-kube-api-access-9ghts\") pod \"ironic-operator-controller-manager-5bc894d9b-v99bm\" (UID: \"32b5ebfd-38d9-456e-bb21-7332323239d1\") " pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-v99bm"
Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.241129 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c59lv\" (UniqueName: \"kubernetes.io/projected/d24bb749-0b71-456b-80e4-fdf6dd23ba30-kube-api-access-c59lv\") pod \"keystone-operator-controller-manager-684f77d66d-s5zh6\" (UID: \"d24bb749-0b71-456b-80e4-fdf6dd23ba30\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-s5zh6"
Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.241198 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q57rr\" (UniqueName: \"kubernetes.io/projected/ba56f415-73d5-4301-a25d-0e5d1ba4e3b1-kube-api-access-q57rr\") pod \"mariadb-operator-controller-manager-5b6b6b4c9f-2lnlc\" (UID: \"ba56f415-73d5-4301-a25d-0e5d1ba4e3b1\") " pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-2lnlc"
Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.241224 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnvxx\" (UniqueName: \"kubernetes.io/projected/1df4a7d6-b0c2-4b00-b591-1a612bd319b6-kube-api-access-wnvxx\") pod \"manila-operator-controller-manager-57b484b4df-z2gd2\" (UID: \"1df4a7d6-b0c2-4b00-b591-1a612bd319b6\") " pod="openstack-operators/manila-operator-controller-manager-57b484b4df-z2gd2"
Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.263425 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-2lnlc"]
Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.282045 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-ntlw6"]
Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.287745 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-7f84474648-mr4wv"]
Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.336471 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-s2rdh"]
Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.343282 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c59lv\" (UniqueName: \"kubernetes.io/projected/d24bb749-0b71-456b-80e4-fdf6dd23ba30-kube-api-access-c59lv\") pod \"keystone-operator-controller-manager-684f77d66d-s5zh6\" (UID: \"d24bb749-0b71-456b-80e4-fdf6dd23ba30\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-s5zh6"
Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.343361 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q57rr\" (UniqueName: \"kubernetes.io/projected/ba56f415-73d5-4301-a25d-0e5d1ba4e3b1-kube-api-access-q57rr\") pod \"mariadb-operator-controller-manager-5b6b6b4c9f-2lnlc\" (UID: \"ba56f415-73d5-4301-a25d-0e5d1ba4e3b1\") " pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-2lnlc"
Mar 13
14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.343385 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnvxx\" (UniqueName: \"kubernetes.io/projected/1df4a7d6-b0c2-4b00-b591-1a612bd319b6-kube-api-access-wnvxx\") pod \"manila-operator-controller-manager-57b484b4df-z2gd2\" (UID: \"1df4a7d6-b0c2-4b00-b591-1a612bd319b6\") " pod="openstack-operators/manila-operator-controller-manager-57b484b4df-z2gd2" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.343470 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbj2m\" (UniqueName: \"kubernetes.io/projected/52959483-daae-423a-a3bf-8e3fa7810074-kube-api-access-wbj2m\") pod \"nova-operator-controller-manager-7f84474648-mr4wv\" (UID: \"52959483-daae-423a-a3bf-8e3fa7810074\") " pod="openstack-operators/nova-operator-controller-manager-7f84474648-mr4wv" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.343523 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btrh6\" (UniqueName: \"kubernetes.io/projected/d71982c0-a3d0-4da8-84cd-7494301f589f-kube-api-access-btrh6\") pod \"neutron-operator-controller-manager-776c5696bf-ntlw6\" (UID: \"d71982c0-a3d0-4da8-84cd-7494301f589f\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-ntlw6" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.343550 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ghts\" (UniqueName: \"kubernetes.io/projected/32b5ebfd-38d9-456e-bb21-7332323239d1-kube-api-access-9ghts\") pod \"ironic-operator-controller-manager-5bc894d9b-v99bm\" (UID: \"32b5ebfd-38d9-456e-bb21-7332323239d1\") " pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-v99bm" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.344962 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-s2rdh" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.345641 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-s2rdh"] Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.347090 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-ldprr" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.398798 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv"] Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.401936 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.404669 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-x5ln5" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.418163 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ghts\" (UniqueName: \"kubernetes.io/projected/32b5ebfd-38d9-456e-bb21-7332323239d1-kube-api-access-9ghts\") pod \"ironic-operator-controller-manager-5bc894d9b-v99bm\" (UID: \"32b5ebfd-38d9-456e-bb21-7332323239d1\") " pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-v99bm" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.425029 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.435125 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q57rr\" (UniqueName: 
\"kubernetes.io/projected/ba56f415-73d5-4301-a25d-0e5d1ba4e3b1-kube-api-access-q57rr\") pod \"mariadb-operator-controller-manager-5b6b6b4c9f-2lnlc\" (UID: \"ba56f415-73d5-4301-a25d-0e5d1ba4e3b1\") " pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-2lnlc" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.435490 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnvxx\" (UniqueName: \"kubernetes.io/projected/1df4a7d6-b0c2-4b00-b591-1a612bd319b6-kube-api-access-wnvxx\") pod \"manila-operator-controller-manager-57b484b4df-z2gd2\" (UID: \"1df4a7d6-b0c2-4b00-b591-1a612bd319b6\") " pod="openstack-operators/manila-operator-controller-manager-57b484b4df-z2gd2" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.436066 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c59lv\" (UniqueName: \"kubernetes.io/projected/d24bb749-0b71-456b-80e4-fdf6dd23ba30-kube-api-access-c59lv\") pod \"keystone-operator-controller-manager-684f77d66d-s5zh6\" (UID: \"d24bb749-0b71-456b-80e4-fdf6dd23ba30\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-s5zh6" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.440209 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-wdmrh"] Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.443968 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-wdmrh" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.450877 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-ddhpq" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.456662 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52gjb\" (UniqueName: \"kubernetes.io/projected/d29ce3ee-3d5a-4801-abf9-dfef5b641a74-kube-api-access-52gjb\") pod \"octavia-operator-controller-manager-5f4f55cb5c-s2rdh\" (UID: \"d29ce3ee-3d5a-4801-abf9-dfef5b641a74\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-s2rdh" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.457008 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbj2m\" (UniqueName: \"kubernetes.io/projected/52959483-daae-423a-a3bf-8e3fa7810074-kube-api-access-wbj2m\") pod \"nova-operator-controller-manager-7f84474648-mr4wv\" (UID: \"52959483-daae-423a-a3bf-8e3fa7810074\") " pod="openstack-operators/nova-operator-controller-manager-7f84474648-mr4wv" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.457241 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btrh6\" (UniqueName: \"kubernetes.io/projected/d71982c0-a3d0-4da8-84cd-7494301f589f-kube-api-access-btrh6\") pod \"neutron-operator-controller-manager-776c5696bf-ntlw6\" (UID: \"d71982c0-a3d0-4da8-84cd-7494301f589f\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-ntlw6" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.458957 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-2lnlc" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.460527 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-njsvh"] Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.463726 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-njsvh" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.466347 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-njg9g" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.478216 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-njsvh"] Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.486932 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-7f9cc5dd44-f2t6t"] Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.489387 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbj2m\" (UniqueName: \"kubernetes.io/projected/52959483-daae-423a-a3bf-8e3fa7810074-kube-api-access-wbj2m\") pod \"nova-operator-controller-manager-7f84474648-mr4wv\" (UID: \"52959483-daae-423a-a3bf-8e3fa7810074\") " pod="openstack-operators/nova-operator-controller-manager-7f84474648-mr4wv" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.492718 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-f2t6t" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.493740 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btrh6\" (UniqueName: \"kubernetes.io/projected/d71982c0-a3d0-4da8-84cd-7494301f589f-kube-api-access-btrh6\") pod \"neutron-operator-controller-manager-776c5696bf-ntlw6\" (UID: \"d71982c0-a3d0-4da8-84cd-7494301f589f\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-ntlw6" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.494524 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-v99bm" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.500931 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-wdmrh"] Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.501439 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-dn2hz" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.505012 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-ntlw6" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.525071 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-s5zh6" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.527652 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv"] Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.536166 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-z2gd2" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.542711 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-7f84474648-mr4wv" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.545819 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-7f9cc5dd44-f2t6t"] Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.555846 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5b9fbd87f-s2k96"] Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.557392 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5b9fbd87f-s2k96" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.558989 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qb2ck\" (UniqueName: \"kubernetes.io/projected/da3795a7-363f-4637-afe2-77cb77248f9a-kube-api-access-qb2ck\") pod \"ovn-operator-controller-manager-bbc5b68f9-wdmrh\" (UID: \"da3795a7-363f-4637-afe2-77cb77248f9a\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-wdmrh" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.559087 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ab852e1-fd26-4f76-b758-77896f8e236b-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv\" (UID: \"0ab852e1-fd26-4f76-b758-77896f8e236b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.559181 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tdqh\" (UniqueName: \"kubernetes.io/projected/0ab852e1-fd26-4f76-b758-77896f8e236b-kube-api-access-7tdqh\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv\" (UID: \"0ab852e1-fd26-4f76-b758-77896f8e236b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.559227 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52gjb\" (UniqueName: \"kubernetes.io/projected/d29ce3ee-3d5a-4801-abf9-dfef5b641a74-kube-api-access-52gjb\") pod \"octavia-operator-controller-manager-5f4f55cb5c-s2rdh\" (UID: \"d29ce3ee-3d5a-4801-abf9-dfef5b641a74\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-s2rdh" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.559249 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkdwz\" (UniqueName: \"kubernetes.io/projected/0d7c657b-a701-41fe-9b23-d5bba3302c4f-kube-api-access-dkdwz\") pod \"placement-operator-controller-manager-574d45c66c-njsvh\" (UID: \"0d7c657b-a701-41fe-9b23-d5bba3302c4f\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-njsvh" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.565721 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-gvvbw" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.615800 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52gjb\" (UniqueName: \"kubernetes.io/projected/d29ce3ee-3d5a-4801-abf9-dfef5b641a74-kube-api-access-52gjb\") pod \"octavia-operator-controller-manager-5f4f55cb5c-s2rdh\" (UID: \"d29ce3ee-3d5a-4801-abf9-dfef5b641a74\") " 
pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-s2rdh" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.616654 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5b9fbd87f-s2k96"] Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.661590 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggn24\" (UniqueName: \"kubernetes.io/projected/9ff6f89a-7110-42fb-96b9-8611f280bebe-kube-api-access-ggn24\") pod \"telemetry-operator-controller-manager-5b9fbd87f-s2k96\" (UID: \"9ff6f89a-7110-42fb-96b9-8611f280bebe\") " pod="openstack-operators/telemetry-operator-controller-manager-5b9fbd87f-s2k96" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.661642 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbspp\" (UniqueName: \"kubernetes.io/projected/66a86c31-9ff3-439a-a0f8-96c981014b6f-kube-api-access-kbspp\") pod \"swift-operator-controller-manager-7f9cc5dd44-f2t6t\" (UID: \"66a86c31-9ff3-439a-a0f8-96c981014b6f\") " pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-f2t6t" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.661667 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ab852e1-fd26-4f76-b758-77896f8e236b-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv\" (UID: \"0ab852e1-fd26-4f76-b758-77896f8e236b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.662230 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c35de09d-7f21-47d3-aac5-a26b15b0a496-cert\") pod \"infra-operator-controller-manager-54dc5b8f8d-8kcsw\" (UID: 
\"c35de09d-7f21-47d3-aac5-a26b15b0a496\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-8kcsw" Mar 13 14:17:03 crc kubenswrapper[4898]: E0313 14:17:03.662326 4898 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.662437 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tdqh\" (UniqueName: \"kubernetes.io/projected/0ab852e1-fd26-4f76-b758-77896f8e236b-kube-api-access-7tdqh\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv\" (UID: \"0ab852e1-fd26-4f76-b758-77896f8e236b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv" Mar 13 14:17:03 crc kubenswrapper[4898]: E0313 14:17:03.662444 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c35de09d-7f21-47d3-aac5-a26b15b0a496-cert podName:c35de09d-7f21-47d3-aac5-a26b15b0a496 nodeName:}" failed. No retries permitted until 2026-03-13 14:17:04.662427386 +0000 UTC m=+1259.664015625 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c35de09d-7f21-47d3-aac5-a26b15b0a496-cert") pod "infra-operator-controller-manager-54dc5b8f8d-8kcsw" (UID: "c35de09d-7f21-47d3-aac5-a26b15b0a496") : secret "infra-operator-webhook-server-cert" not found Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.662539 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkdwz\" (UniqueName: \"kubernetes.io/projected/0d7c657b-a701-41fe-9b23-d5bba3302c4f-kube-api-access-dkdwz\") pod \"placement-operator-controller-manager-574d45c66c-njsvh\" (UID: \"0d7c657b-a701-41fe-9b23-d5bba3302c4f\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-njsvh" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.662639 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qb2ck\" (UniqueName: \"kubernetes.io/projected/da3795a7-363f-4637-afe2-77cb77248f9a-kube-api-access-qb2ck\") pod \"ovn-operator-controller-manager-bbc5b68f9-wdmrh\" (UID: \"da3795a7-363f-4637-afe2-77cb77248f9a\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-wdmrh" Mar 13 14:17:03 crc kubenswrapper[4898]: E0313 14:17:03.662785 4898 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 14:17:03 crc kubenswrapper[4898]: E0313 14:17:03.662813 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0ab852e1-fd26-4f76-b758-77896f8e236b-cert podName:0ab852e1-fd26-4f76-b758-77896f8e236b nodeName:}" failed. No retries permitted until 2026-03-13 14:17:04.162805676 +0000 UTC m=+1259.164393915 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0ab852e1-fd26-4f76-b758-77896f8e236b-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv" (UID: "0ab852e1-fd26-4f76-b758-77896f8e236b") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.688702 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qb2ck\" (UniqueName: \"kubernetes.io/projected/da3795a7-363f-4637-afe2-77cb77248f9a-kube-api-access-qb2ck\") pod \"ovn-operator-controller-manager-bbc5b68f9-wdmrh\" (UID: \"da3795a7-363f-4637-afe2-77cb77248f9a\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-wdmrh" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.689798 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tdqh\" (UniqueName: \"kubernetes.io/projected/0ab852e1-fd26-4f76-b758-77896f8e236b-kube-api-access-7tdqh\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv\" (UID: \"0ab852e1-fd26-4f76-b758-77896f8e236b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.694005 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkdwz\" (UniqueName: \"kubernetes.io/projected/0d7c657b-a701-41fe-9b23-d5bba3302c4f-kube-api-access-dkdwz\") pod \"placement-operator-controller-manager-574d45c66c-njsvh\" (UID: \"0d7c657b-a701-41fe-9b23-d5bba3302c4f\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-njsvh" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.694472 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-smdkt"] Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.695484 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-smdkt" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.698442 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-hd4hf" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.709667 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-smdkt"] Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.757782 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-jwrd2"] Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.759659 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-jwrd2" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.764228 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-gnhm7" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.765126 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggn24\" (UniqueName: \"kubernetes.io/projected/9ff6f89a-7110-42fb-96b9-8611f280bebe-kube-api-access-ggn24\") pod \"telemetry-operator-controller-manager-5b9fbd87f-s2k96\" (UID: \"9ff6f89a-7110-42fb-96b9-8611f280bebe\") " pod="openstack-operators/telemetry-operator-controller-manager-5b9fbd87f-s2k96" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.765176 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbspp\" (UniqueName: \"kubernetes.io/projected/66a86c31-9ff3-439a-a0f8-96c981014b6f-kube-api-access-kbspp\") pod \"swift-operator-controller-manager-7f9cc5dd44-f2t6t\" (UID: \"66a86c31-9ff3-439a-a0f8-96c981014b6f\") " 
pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-f2t6t" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.786735 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggn24\" (UniqueName: \"kubernetes.io/projected/9ff6f89a-7110-42fb-96b9-8611f280bebe-kube-api-access-ggn24\") pod \"telemetry-operator-controller-manager-5b9fbd87f-s2k96\" (UID: \"9ff6f89a-7110-42fb-96b9-8611f280bebe\") " pod="openstack-operators/telemetry-operator-controller-manager-5b9fbd87f-s2k96" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.791102 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbspp\" (UniqueName: \"kubernetes.io/projected/66a86c31-9ff3-439a-a0f8-96c981014b6f-kube-api-access-kbspp\") pod \"swift-operator-controller-manager-7f9cc5dd44-f2t6t\" (UID: \"66a86c31-9ff3-439a-a0f8-96c981014b6f\") " pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-f2t6t" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.804019 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-jwrd2"] Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.830595 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5f7dc44db6-9nsrh"] Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.832192 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5f7dc44db6-9nsrh" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.840914 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.843026 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.843254 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-nrmkf" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.859582 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5f7dc44db6-9nsrh"] Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.868198 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rz8wp\" (UniqueName: \"kubernetes.io/projected/19a0f4de-5258-4f2b-9587-71293459378e-kube-api-access-rz8wp\") pod \"test-operator-controller-manager-5c5cb9c4d7-smdkt\" (UID: \"19a0f4de-5258-4f2b-9587-71293459378e\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-smdkt" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.868498 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvjvw\" (UniqueName: \"kubernetes.io/projected/919747b8-a031-4654-999f-3c3928f981b4-kube-api-access-xvjvw\") pod \"watcher-operator-controller-manager-6c4d75f7f9-jwrd2\" (UID: \"919747b8-a031-4654-999f-3c3928f981b4\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-jwrd2" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.880523 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-s2rdh" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.902410 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-82gtc"] Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.904168 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-82gtc" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.909410 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-j47dn" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.916037 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-82gtc"] Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.928971 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-wdmrh" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.951298 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-njsvh" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.970547 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-f2t6t" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.974815 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvjvw\" (UniqueName: \"kubernetes.io/projected/919747b8-a031-4654-999f-3c3928f981b4-kube-api-access-xvjvw\") pod \"watcher-operator-controller-manager-6c4d75f7f9-jwrd2\" (UID: \"919747b8-a031-4654-999f-3c3928f981b4\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-jwrd2" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.975009 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdbxx\" (UniqueName: \"kubernetes.io/projected/3a26728d-85c2-465c-bce4-c74045ea9e0d-kube-api-access-jdbxx\") pod \"openstack-operator-controller-manager-5f7dc44db6-9nsrh\" (UID: \"3a26728d-85c2-465c-bce4-c74045ea9e0d\") " pod="openstack-operators/openstack-operator-controller-manager-5f7dc44db6-9nsrh" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.975037 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3a26728d-85c2-465c-bce4-c74045ea9e0d-metrics-certs\") pod \"openstack-operator-controller-manager-5f7dc44db6-9nsrh\" (UID: \"3a26728d-85c2-465c-bce4-c74045ea9e0d\") " pod="openstack-operators/openstack-operator-controller-manager-5f7dc44db6-9nsrh" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.976287 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rz8wp\" (UniqueName: \"kubernetes.io/projected/19a0f4de-5258-4f2b-9587-71293459378e-kube-api-access-rz8wp\") pod \"test-operator-controller-manager-5c5cb9c4d7-smdkt\" (UID: \"19a0f4de-5258-4f2b-9587-71293459378e\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-smdkt" Mar 13 14:17:03 crc 
kubenswrapper[4898]: I0313 14:17:03.976340 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3a26728d-85c2-465c-bce4-c74045ea9e0d-webhook-certs\") pod \"openstack-operator-controller-manager-5f7dc44db6-9nsrh\" (UID: \"3a26728d-85c2-465c-bce4-c74045ea9e0d\") " pod="openstack-operators/openstack-operator-controller-manager-5f7dc44db6-9nsrh" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.995484 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5b9fbd87f-s2k96" Mar 13 14:17:04 crc kubenswrapper[4898]: I0313 14:17:04.008055 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rz8wp\" (UniqueName: \"kubernetes.io/projected/19a0f4de-5258-4f2b-9587-71293459378e-kube-api-access-rz8wp\") pod \"test-operator-controller-manager-5c5cb9c4d7-smdkt\" (UID: \"19a0f4de-5258-4f2b-9587-71293459378e\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-smdkt" Mar 13 14:17:04 crc kubenswrapper[4898]: I0313 14:17:04.011059 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvjvw\" (UniqueName: \"kubernetes.io/projected/919747b8-a031-4654-999f-3c3928f981b4-kube-api-access-xvjvw\") pod \"watcher-operator-controller-manager-6c4d75f7f9-jwrd2\" (UID: \"919747b8-a031-4654-999f-3c3928f981b4\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-jwrd2" Mar 13 14:17:04 crc kubenswrapper[4898]: I0313 14:17:04.023979 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-smdkt" Mar 13 14:17:04 crc kubenswrapper[4898]: I0313 14:17:04.046641 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-p9d5v"] Mar 13 14:17:04 crc kubenswrapper[4898]: I0313 14:17:04.054427 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-4n5rx"] Mar 13 14:17:04 crc kubenswrapper[4898]: W0313 14:17:04.059132 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d88a5d2_a852_409e_b4bd_939d1c2b9090.slice/crio-793122426ae30c4ff049dfb616beca97752fa5b76927a883ab6fcfb0469dcf21 WatchSource:0}: Error finding container 793122426ae30c4ff049dfb616beca97752fa5b76927a883ab6fcfb0469dcf21: Status 404 returned error can't find the container with id 793122426ae30c4ff049dfb616beca97752fa5b76927a883ab6fcfb0469dcf21 Mar 13 14:17:04 crc kubenswrapper[4898]: W0313 14:17:04.068486 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c955ebc_98fd_4921_9923_6151a50e8eec.slice/crio-c7cc7671bd8023d7055244f8253f5d7396c3c1891269cc38baa7d6692d48b5a3 WatchSource:0}: Error finding container c7cc7671bd8023d7055244f8253f5d7396c3c1891269cc38baa7d6692d48b5a3: Status 404 returned error can't find the container with id c7cc7671bd8023d7055244f8253f5d7396c3c1891269cc38baa7d6692d48b5a3 Mar 13 14:17:04 crc kubenswrapper[4898]: I0313 14:17:04.077715 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhfbv\" (UniqueName: \"kubernetes.io/projected/7b9c0413-5558-43c4-805b-7f035fded9b4-kube-api-access-zhfbv\") pod \"rabbitmq-cluster-operator-manager-668c99d594-82gtc\" (UID: \"7b9c0413-5558-43c4-805b-7f035fded9b4\") " 
pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-82gtc" Mar 13 14:17:04 crc kubenswrapper[4898]: I0313 14:17:04.077992 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdbxx\" (UniqueName: \"kubernetes.io/projected/3a26728d-85c2-465c-bce4-c74045ea9e0d-kube-api-access-jdbxx\") pod \"openstack-operator-controller-manager-5f7dc44db6-9nsrh\" (UID: \"3a26728d-85c2-465c-bce4-c74045ea9e0d\") " pod="openstack-operators/openstack-operator-controller-manager-5f7dc44db6-9nsrh" Mar 13 14:17:04 crc kubenswrapper[4898]: I0313 14:17:04.078089 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3a26728d-85c2-465c-bce4-c74045ea9e0d-metrics-certs\") pod \"openstack-operator-controller-manager-5f7dc44db6-9nsrh\" (UID: \"3a26728d-85c2-465c-bce4-c74045ea9e0d\") " pod="openstack-operators/openstack-operator-controller-manager-5f7dc44db6-9nsrh" Mar 13 14:17:04 crc kubenswrapper[4898]: I0313 14:17:04.078298 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3a26728d-85c2-465c-bce4-c74045ea9e0d-webhook-certs\") pod \"openstack-operator-controller-manager-5f7dc44db6-9nsrh\" (UID: \"3a26728d-85c2-465c-bce4-c74045ea9e0d\") " pod="openstack-operators/openstack-operator-controller-manager-5f7dc44db6-9nsrh" Mar 13 14:17:04 crc kubenswrapper[4898]: E0313 14:17:04.078562 4898 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 13 14:17:04 crc kubenswrapper[4898]: E0313 14:17:04.078668 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a26728d-85c2-465c-bce4-c74045ea9e0d-webhook-certs podName:3a26728d-85c2-465c-bce4-c74045ea9e0d nodeName:}" failed. No retries permitted until 2026-03-13 14:17:04.578653863 +0000 UTC m=+1259.580242102 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/3a26728d-85c2-465c-bce4-c74045ea9e0d-webhook-certs") pod "openstack-operator-controller-manager-5f7dc44db6-9nsrh" (UID: "3a26728d-85c2-465c-bce4-c74045ea9e0d") : secret "webhook-server-cert" not found Mar 13 14:17:04 crc kubenswrapper[4898]: E0313 14:17:04.078832 4898 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 13 14:17:04 crc kubenswrapper[4898]: E0313 14:17:04.078883 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a26728d-85c2-465c-bce4-c74045ea9e0d-metrics-certs podName:3a26728d-85c2-465c-bce4-c74045ea9e0d nodeName:}" failed. No retries permitted until 2026-03-13 14:17:04.578867628 +0000 UTC m=+1259.580455867 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3a26728d-85c2-465c-bce4-c74045ea9e0d-metrics-certs") pod "openstack-operator-controller-manager-5f7dc44db6-9nsrh" (UID: "3a26728d-85c2-465c-bce4-c74045ea9e0d") : secret "metrics-server-cert" not found Mar 13 14:17:04 crc kubenswrapper[4898]: I0313 14:17:04.087884 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-jwrd2" Mar 13 14:17:04 crc kubenswrapper[4898]: I0313 14:17:04.111078 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdbxx\" (UniqueName: \"kubernetes.io/projected/3a26728d-85c2-465c-bce4-c74045ea9e0d-kube-api-access-jdbxx\") pod \"openstack-operator-controller-manager-5f7dc44db6-9nsrh\" (UID: \"3a26728d-85c2-465c-bce4-c74045ea9e0d\") " pod="openstack-operators/openstack-operator-controller-manager-5f7dc44db6-9nsrh" Mar 13 14:17:04 crc kubenswrapper[4898]: I0313 14:17:04.180466 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhfbv\" (UniqueName: \"kubernetes.io/projected/7b9c0413-5558-43c4-805b-7f035fded9b4-kube-api-access-zhfbv\") pod \"rabbitmq-cluster-operator-manager-668c99d594-82gtc\" (UID: \"7b9c0413-5558-43c4-805b-7f035fded9b4\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-82gtc" Mar 13 14:17:04 crc kubenswrapper[4898]: I0313 14:17:04.180643 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ab852e1-fd26-4f76-b758-77896f8e236b-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv\" (UID: \"0ab852e1-fd26-4f76-b758-77896f8e236b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv" Mar 13 14:17:04 crc kubenswrapper[4898]: E0313 14:17:04.180926 4898 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 14:17:04 crc kubenswrapper[4898]: E0313 14:17:04.181022 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0ab852e1-fd26-4f76-b758-77896f8e236b-cert podName:0ab852e1-fd26-4f76-b758-77896f8e236b nodeName:}" failed. 
No retries permitted until 2026-03-13 14:17:05.181001893 +0000 UTC m=+1260.182590132 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0ab852e1-fd26-4f76-b758-77896f8e236b-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv" (UID: "0ab852e1-fd26-4f76-b758-77896f8e236b") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 14:17:04 crc kubenswrapper[4898]: I0313 14:17:04.208284 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhfbv\" (UniqueName: \"kubernetes.io/projected/7b9c0413-5558-43c4-805b-7f035fded9b4-kube-api-access-zhfbv\") pod \"rabbitmq-cluster-operator-manager-668c99d594-82gtc\" (UID: \"7b9c0413-5558-43c4-805b-7f035fded9b4\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-82gtc" Mar 13 14:17:04 crc kubenswrapper[4898]: I0313 14:17:04.268164 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-82gtc" Mar 13 14:17:04 crc kubenswrapper[4898]: I0313 14:17:04.587813 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3a26728d-85c2-465c-bce4-c74045ea9e0d-webhook-certs\") pod \"openstack-operator-controller-manager-5f7dc44db6-9nsrh\" (UID: \"3a26728d-85c2-465c-bce4-c74045ea9e0d\") " pod="openstack-operators/openstack-operator-controller-manager-5f7dc44db6-9nsrh" Mar 13 14:17:04 crc kubenswrapper[4898]: E0313 14:17:04.588212 4898 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 13 14:17:04 crc kubenswrapper[4898]: I0313 14:17:04.588259 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3a26728d-85c2-465c-bce4-c74045ea9e0d-metrics-certs\") pod 
\"openstack-operator-controller-manager-5f7dc44db6-9nsrh\" (UID: \"3a26728d-85c2-465c-bce4-c74045ea9e0d\") " pod="openstack-operators/openstack-operator-controller-manager-5f7dc44db6-9nsrh" Mar 13 14:17:04 crc kubenswrapper[4898]: E0313 14:17:04.588332 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a26728d-85c2-465c-bce4-c74045ea9e0d-webhook-certs podName:3a26728d-85c2-465c-bce4-c74045ea9e0d nodeName:}" failed. No retries permitted until 2026-03-13 14:17:05.588304317 +0000 UTC m=+1260.589892556 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/3a26728d-85c2-465c-bce4-c74045ea9e0d-webhook-certs") pod "openstack-operator-controller-manager-5f7dc44db6-9nsrh" (UID: "3a26728d-85c2-465c-bce4-c74045ea9e0d") : secret "webhook-server-cert" not found Mar 13 14:17:04 crc kubenswrapper[4898]: E0313 14:17:04.588391 4898 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 13 14:17:04 crc kubenswrapper[4898]: E0313 14:17:04.588462 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a26728d-85c2-465c-bce4-c74045ea9e0d-metrics-certs podName:3a26728d-85c2-465c-bce4-c74045ea9e0d nodeName:}" failed. No retries permitted until 2026-03-13 14:17:05.58844598 +0000 UTC m=+1260.590034319 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3a26728d-85c2-465c-bce4-c74045ea9e0d-metrics-certs") pod "openstack-operator-controller-manager-5f7dc44db6-9nsrh" (UID: "3a26728d-85c2-465c-bce4-c74045ea9e0d") : secret "metrics-server-cert" not found Mar 13 14:17:04 crc kubenswrapper[4898]: I0313 14:17:04.690823 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c35de09d-7f21-47d3-aac5-a26b15b0a496-cert\") pod \"infra-operator-controller-manager-54dc5b8f8d-8kcsw\" (UID: \"c35de09d-7f21-47d3-aac5-a26b15b0a496\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-8kcsw" Mar 13 14:17:04 crc kubenswrapper[4898]: E0313 14:17:04.691015 4898 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 13 14:17:04 crc kubenswrapper[4898]: E0313 14:17:04.691099 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c35de09d-7f21-47d3-aac5-a26b15b0a496-cert podName:c35de09d-7f21-47d3-aac5-a26b15b0a496 nodeName:}" failed. No retries permitted until 2026-03-13 14:17:06.691078577 +0000 UTC m=+1261.692666816 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c35de09d-7f21-47d3-aac5-a26b15b0a496-cert") pod "infra-operator-controller-manager-54dc5b8f8d-8kcsw" (UID: "c35de09d-7f21-47d3-aac5-a26b15b0a496") : secret "infra-operator-webhook-server-cert" not found Mar 13 14:17:04 crc kubenswrapper[4898]: I0313 14:17:04.692472 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-mf8h6"] Mar 13 14:17:04 crc kubenswrapper[4898]: I0313 14:17:04.732734 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-tqp4b"] Mar 13 14:17:04 crc kubenswrapper[4898]: I0313 14:17:04.745029 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-jngrl"] Mar 13 14:17:04 crc kubenswrapper[4898]: W0313 14:17:04.750237 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45efd8ce_26db_4511_bd88_2e7467d02bbb.slice/crio-728f77daeb1e6f5737552fe481b2f6f36d274a3ad518796f0d7bf774913f038d WatchSource:0}: Error finding container 728f77daeb1e6f5737552fe481b2f6f36d274a3ad518796f0d7bf774913f038d: Status 404 returned error can't find the container with id 728f77daeb1e6f5737552fe481b2f6f36d274a3ad518796f0d7bf774913f038d Mar 13 14:17:04 crc kubenswrapper[4898]: I0313 14:17:04.753830 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5bc894d9b-v99bm"] Mar 13 14:17:04 crc kubenswrapper[4898]: I0313 14:17:04.760059 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-d47688694-gtlps"] Mar 13 14:17:04 crc kubenswrapper[4898]: I0313 14:17:04.821516 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/glance-operator-controller-manager-5964f64c48-mf8h6" event={"ID":"fb7b2f97-fca8-41d2-9be7-d40fac94c171","Type":"ContainerStarted","Data":"14b49b36578e5baad4dcc89f37c936b6990ccc3cee155f4075ee27ce85de2cef"} Mar 13 14:17:04 crc kubenswrapper[4898]: I0313 14:17:04.822594 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-4n5rx" event={"ID":"3c955ebc-98fd-4921-9923-6151a50e8eec","Type":"ContainerStarted","Data":"c7cc7671bd8023d7055244f8253f5d7396c3c1891269cc38baa7d6692d48b5a3"} Mar 13 14:17:04 crc kubenswrapper[4898]: I0313 14:17:04.823424 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-tqp4b" event={"ID":"ea0ad033-9a48-4e42-a237-f27cacf03adc","Type":"ContainerStarted","Data":"9d98915ef6bf8115270995879993f6b49b8ed969160205e354e2840e366f0fa7"} Mar 13 14:17:04 crc kubenswrapper[4898]: I0313 14:17:04.829741 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-v99bm" event={"ID":"32b5ebfd-38d9-456e-bb21-7332323239d1","Type":"ContainerStarted","Data":"b89241a610af52b17b7373b99bdbd766d6b800dcf44a04d41a8ec184f323a9e8"} Mar 13 14:17:04 crc kubenswrapper[4898]: I0313 14:17:04.830856 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-p9d5v" event={"ID":"0d88a5d2-a852-409e-b4bd-939d1c2b9090","Type":"ContainerStarted","Data":"793122426ae30c4ff049dfb616beca97752fa5b76927a883ab6fcfb0469dcf21"} Mar 13 14:17:04 crc kubenswrapper[4898]: I0313 14:17:04.833095 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-jngrl" event={"ID":"a80d01d5-0201-4b2e-974c-ac5b42ac8df4","Type":"ContainerStarted","Data":"0f4f488da805040bcc6927802f79c7733f3f68ce78f0f8dfc9c29899f39de99c"} Mar 13 14:17:04 crc 
kubenswrapper[4898]: I0313 14:17:04.837167 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-d47688694-gtlps" event={"ID":"45efd8ce-26db-4511-bd88-2e7467d02bbb","Type":"ContainerStarted","Data":"728f77daeb1e6f5737552fe481b2f6f36d274a3ad518796f0d7bf774913f038d"} Mar 13 14:17:04 crc kubenswrapper[4898]: I0313 14:17:04.905842 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-2lnlc"] Mar 13 14:17:05 crc kubenswrapper[4898]: I0313 14:17:05.056342 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-ntlw6"] Mar 13 14:17:05 crc kubenswrapper[4898]: I0313 14:17:05.064671 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-s5zh6"] Mar 13 14:17:05 crc kubenswrapper[4898]: I0313 14:17:05.204436 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ab852e1-fd26-4f76-b758-77896f8e236b-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv\" (UID: \"0ab852e1-fd26-4f76-b758-77896f8e236b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv" Mar 13 14:17:05 crc kubenswrapper[4898]: E0313 14:17:05.204702 4898 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 14:17:05 crc kubenswrapper[4898]: E0313 14:17:05.204873 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0ab852e1-fd26-4f76-b758-77896f8e236b-cert podName:0ab852e1-fd26-4f76-b758-77896f8e236b nodeName:}" failed. No retries permitted until 2026-03-13 14:17:07.204857859 +0000 UTC m=+1262.206446098 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0ab852e1-fd26-4f76-b758-77896f8e236b-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv" (UID: "0ab852e1-fd26-4f76-b758-77896f8e236b") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 14:17:05 crc kubenswrapper[4898]: I0313 14:17:05.583124 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-57b484b4df-z2gd2"] Mar 13 14:17:05 crc kubenswrapper[4898]: I0313 14:17:05.599211 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5b9fbd87f-s2k96"] Mar 13 14:17:05 crc kubenswrapper[4898]: I0313 14:17:05.613566 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3a26728d-85c2-465c-bce4-c74045ea9e0d-metrics-certs\") pod \"openstack-operator-controller-manager-5f7dc44db6-9nsrh\" (UID: \"3a26728d-85c2-465c-bce4-c74045ea9e0d\") " pod="openstack-operators/openstack-operator-controller-manager-5f7dc44db6-9nsrh" Mar 13 14:17:05 crc kubenswrapper[4898]: I0313 14:17:05.613656 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3a26728d-85c2-465c-bce4-c74045ea9e0d-webhook-certs\") pod \"openstack-operator-controller-manager-5f7dc44db6-9nsrh\" (UID: \"3a26728d-85c2-465c-bce4-c74045ea9e0d\") " pod="openstack-operators/openstack-operator-controller-manager-5f7dc44db6-9nsrh" Mar 13 14:17:05 crc kubenswrapper[4898]: E0313 14:17:05.613881 4898 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 13 14:17:05 crc kubenswrapper[4898]: E0313 14:17:05.613948 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a26728d-85c2-465c-bce4-c74045ea9e0d-webhook-certs 
podName:3a26728d-85c2-465c-bce4-c74045ea9e0d nodeName:}" failed. No retries permitted until 2026-03-13 14:17:07.613932979 +0000 UTC m=+1262.615521208 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/3a26728d-85c2-465c-bce4-c74045ea9e0d-webhook-certs") pod "openstack-operator-controller-manager-5f7dc44db6-9nsrh" (UID: "3a26728d-85c2-465c-bce4-c74045ea9e0d") : secret "webhook-server-cert" not found Mar 13 14:17:05 crc kubenswrapper[4898]: E0313 14:17:05.614244 4898 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 13 14:17:05 crc kubenswrapper[4898]: E0313 14:17:05.614273 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a26728d-85c2-465c-bce4-c74045ea9e0d-metrics-certs podName:3a26728d-85c2-465c-bce4-c74045ea9e0d nodeName:}" failed. No retries permitted until 2026-03-13 14:17:07.614266298 +0000 UTC m=+1262.615854537 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3a26728d-85c2-465c-bce4-c74045ea9e0d-metrics-certs") pod "openstack-operator-controller-manager-5f7dc44db6-9nsrh" (UID: "3a26728d-85c2-465c-bce4-c74045ea9e0d") : secret "metrics-server-cert" not found Mar 13 14:17:05 crc kubenswrapper[4898]: I0313 14:17:05.614846 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-7f84474648-mr4wv"] Mar 13 14:17:05 crc kubenswrapper[4898]: I0313 14:17:05.625477 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-njsvh"] Mar 13 14:17:05 crc kubenswrapper[4898]: W0313 14:17:05.638591 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda3795a7_363f_4637_afe2_77cb77248f9a.slice/crio-ca4ecd792b4282a6e252f8ba184b93785f2c8932c7847b41ed1d68d038d89480 WatchSource:0}: Error finding container ca4ecd792b4282a6e252f8ba184b93785f2c8932c7847b41ed1d68d038d89480: Status 404 returned error can't find the container with id ca4ecd792b4282a6e252f8ba184b93785f2c8932c7847b41ed1d68d038d89480 Mar 13 14:17:05 crc kubenswrapper[4898]: W0313 14:17:05.642810 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d7c657b_a701_41fe_9b23_d5bba3302c4f.slice/crio-437e96d1fa812859f5d447beb3509139da83bfb5b6ddd54dc43c69a3b2c8f4e6 WatchSource:0}: Error finding container 437e96d1fa812859f5d447beb3509139da83bfb5b6ddd54dc43c69a3b2c8f4e6: Status 404 returned error can't find the container with id 437e96d1fa812859f5d447beb3509139da83bfb5b6ddd54dc43c69a3b2c8f4e6 Mar 13 14:17:05 crc kubenswrapper[4898]: I0313 14:17:05.650638 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-wdmrh"] Mar 13 14:17:05 crc 
kubenswrapper[4898]: E0313 14:17:05.665237 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xvjvw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6c4d75f7f9-jwrd2_openstack-operators(919747b8-a031-4654-999f-3c3928f981b4): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 13 14:17:05 crc kubenswrapper[4898]: E0313 14:17:05.666854 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-jwrd2" podUID="919747b8-a031-4654-999f-3c3928f981b4" Mar 13 14:17:05 crc kubenswrapper[4898]: E0313 14:17:05.668871 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:18fe6f2f0be7e736db86ff2d600af12a753e14b0a03232ce4f03629a89905571,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-52gjb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-5f4f55cb5c-s2rdh_openstack-operators(d29ce3ee-3d5a-4801-abf9-dfef5b641a74): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 13 14:17:05 crc kubenswrapper[4898]: E0313 14:17:05.670362 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-s2rdh" podUID="d29ce3ee-3d5a-4801-abf9-dfef5b641a74" Mar 13 14:17:05 crc kubenswrapper[4898]: I0313 14:17:05.671698 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-smdkt"] Mar 13 14:17:05 crc kubenswrapper[4898]: I0313 14:17:05.705111 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-7f9cc5dd44-f2t6t"] Mar 13 14:17:05 crc kubenswrapper[4898]: I0313 14:17:05.726294 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-s2rdh"] Mar 13 14:17:05 crc kubenswrapper[4898]: I0313 14:17:05.738909 4898 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-jwrd2"] Mar 13 14:17:05 crc kubenswrapper[4898]: E0313 14:17:05.756173 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zhfbv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-82gtc_openstack-operators(7b9c0413-5558-43c4-805b-7f035fded9b4): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 13 14:17:05 crc kubenswrapper[4898]: E0313 14:17:05.757391 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-82gtc" podUID="7b9c0413-5558-43c4-805b-7f035fded9b4" Mar 13 14:17:05 crc kubenswrapper[4898]: I0313 14:17:05.771945 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-82gtc"] Mar 13 14:17:05 crc kubenswrapper[4898]: I0313 14:17:05.864693 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-s5zh6" event={"ID":"d24bb749-0b71-456b-80e4-fdf6dd23ba30","Type":"ContainerStarted","Data":"52f3b126ec515a3b99aa74016189fe85b1a3830ec642b33f3af093d4ce2d1dd0"} Mar 13 14:17:05 crc kubenswrapper[4898]: I0313 
14:17:05.867903 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-njsvh" event={"ID":"0d7c657b-a701-41fe-9b23-d5bba3302c4f","Type":"ContainerStarted","Data":"437e96d1fa812859f5d447beb3509139da83bfb5b6ddd54dc43c69a3b2c8f4e6"} Mar 13 14:17:05 crc kubenswrapper[4898]: I0313 14:17:05.874960 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-smdkt" event={"ID":"19a0f4de-5258-4f2b-9587-71293459378e","Type":"ContainerStarted","Data":"2485923817c4aa5b2796763c3dead67f36e78a34bcf659b5e705d6ec6d42c8a9"} Mar 13 14:17:05 crc kubenswrapper[4898]: I0313 14:17:05.876246 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-ntlw6" event={"ID":"d71982c0-a3d0-4da8-84cd-7494301f589f","Type":"ContainerStarted","Data":"798dad54b98acf66ff986a32d06d487085a6511f7ca8d4ffa80ba98da7f2b774"} Mar 13 14:17:05 crc kubenswrapper[4898]: I0313 14:17:05.878472 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-jwrd2" event={"ID":"919747b8-a031-4654-999f-3c3928f981b4","Type":"ContainerStarted","Data":"4a2fc19a56feb3383ce71da15b91814fc4fb03e0ff94318935a8623ac8f716c3"} Mar 13 14:17:05 crc kubenswrapper[4898]: I0313 14:17:05.879298 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-7f84474648-mr4wv" event={"ID":"52959483-daae-423a-a3bf-8e3fa7810074","Type":"ContainerStarted","Data":"0501e5bb2341712296e0e6d55c1011023e3dae7005e2b71f056e627bf94f85d8"} Mar 13 14:17:05 crc kubenswrapper[4898]: I0313 14:17:05.880236 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5b9fbd87f-s2k96" 
event={"ID":"9ff6f89a-7110-42fb-96b9-8611f280bebe","Type":"ContainerStarted","Data":"dcf23a3a877f3bc4fa451bfcbbcab4b79c44e506a6525ba6b9de798d32828221"} Mar 13 14:17:05 crc kubenswrapper[4898]: E0313 14:17:05.880512 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-jwrd2" podUID="919747b8-a031-4654-999f-3c3928f981b4" Mar 13 14:17:05 crc kubenswrapper[4898]: I0313 14:17:05.882461 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-wdmrh" event={"ID":"da3795a7-363f-4637-afe2-77cb77248f9a","Type":"ContainerStarted","Data":"ca4ecd792b4282a6e252f8ba184b93785f2c8932c7847b41ed1d68d038d89480"} Mar 13 14:17:05 crc kubenswrapper[4898]: I0313 14:17:05.892971 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-s2rdh" event={"ID":"d29ce3ee-3d5a-4801-abf9-dfef5b641a74","Type":"ContainerStarted","Data":"61ee13192b8406ac5d2138d04ae263f68dc8e87d024f1c72264b4c698a929098"} Mar 13 14:17:05 crc kubenswrapper[4898]: E0313 14:17:05.899161 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:18fe6f2f0be7e736db86ff2d600af12a753e14b0a03232ce4f03629a89905571\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-s2rdh" podUID="d29ce3ee-3d5a-4801-abf9-dfef5b641a74" Mar 13 14:17:05 crc kubenswrapper[4898]: I0313 14:17:05.907310 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-2lnlc" 
event={"ID":"ba56f415-73d5-4301-a25d-0e5d1ba4e3b1","Type":"ContainerStarted","Data":"1eb6a5434eeb176850a728d7ccb5a75898f2d1470a7a23104537baefca873452"} Mar 13 14:17:05 crc kubenswrapper[4898]: I0313 14:17:05.914679 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-z2gd2" event={"ID":"1df4a7d6-b0c2-4b00-b591-1a612bd319b6","Type":"ContainerStarted","Data":"68617652a567261ca08445a27c0e0b16c6c3ae01dec4be2a59e6f76dd4ba045b"} Mar 13 14:17:05 crc kubenswrapper[4898]: I0313 14:17:05.933383 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-f2t6t" event={"ID":"66a86c31-9ff3-439a-a0f8-96c981014b6f","Type":"ContainerStarted","Data":"b80f998b1054ca04bf2753585897e957674d0da0803a2bdb9e0363c50019795b"} Mar 13 14:17:05 crc kubenswrapper[4898]: I0313 14:17:05.945446 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-82gtc" event={"ID":"7b9c0413-5558-43c4-805b-7f035fded9b4","Type":"ContainerStarted","Data":"31208be09479c4b377365ac6f5a4e48f629b1c363f99d79e2adea0deb4658260"} Mar 13 14:17:05 crc kubenswrapper[4898]: E0313 14:17:05.953974 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-82gtc" podUID="7b9c0413-5558-43c4-805b-7f035fded9b4" Mar 13 14:17:06 crc kubenswrapper[4898]: I0313 14:17:06.750864 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c35de09d-7f21-47d3-aac5-a26b15b0a496-cert\") pod \"infra-operator-controller-manager-54dc5b8f8d-8kcsw\" (UID: 
\"c35de09d-7f21-47d3-aac5-a26b15b0a496\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-8kcsw" Mar 13 14:17:06 crc kubenswrapper[4898]: E0313 14:17:06.751039 4898 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 13 14:17:06 crc kubenswrapper[4898]: E0313 14:17:06.751113 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c35de09d-7f21-47d3-aac5-a26b15b0a496-cert podName:c35de09d-7f21-47d3-aac5-a26b15b0a496 nodeName:}" failed. No retries permitted until 2026-03-13 14:17:10.75109632 +0000 UTC m=+1265.752684559 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c35de09d-7f21-47d3-aac5-a26b15b0a496-cert") pod "infra-operator-controller-manager-54dc5b8f8d-8kcsw" (UID: "c35de09d-7f21-47d3-aac5-a26b15b0a496") : secret "infra-operator-webhook-server-cert" not found Mar 13 14:17:06 crc kubenswrapper[4898]: E0313 14:17:06.961458 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-82gtc" podUID="7b9c0413-5558-43c4-805b-7f035fded9b4" Mar 13 14:17:06 crc kubenswrapper[4898]: E0313 14:17:06.961470 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:18fe6f2f0be7e736db86ff2d600af12a753e14b0a03232ce4f03629a89905571\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-s2rdh" podUID="d29ce3ee-3d5a-4801-abf9-dfef5b641a74" Mar 13 14:17:06 crc kubenswrapper[4898]: E0313 
14:17:06.961548 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-jwrd2" podUID="919747b8-a031-4654-999f-3c3928f981b4" Mar 13 14:17:07 crc kubenswrapper[4898]: I0313 14:17:07.260272 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ab852e1-fd26-4f76-b758-77896f8e236b-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv\" (UID: \"0ab852e1-fd26-4f76-b758-77896f8e236b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv" Mar 13 14:17:07 crc kubenswrapper[4898]: E0313 14:17:07.260648 4898 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 14:17:07 crc kubenswrapper[4898]: E0313 14:17:07.260774 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0ab852e1-fd26-4f76-b758-77896f8e236b-cert podName:0ab852e1-fd26-4f76-b758-77896f8e236b nodeName:}" failed. No retries permitted until 2026-03-13 14:17:11.260749014 +0000 UTC m=+1266.262337273 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0ab852e1-fd26-4f76-b758-77896f8e236b-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv" (UID: "0ab852e1-fd26-4f76-b758-77896f8e236b") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 14:17:07 crc kubenswrapper[4898]: I0313 14:17:07.668570 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3a26728d-85c2-465c-bce4-c74045ea9e0d-metrics-certs\") pod \"openstack-operator-controller-manager-5f7dc44db6-9nsrh\" (UID: \"3a26728d-85c2-465c-bce4-c74045ea9e0d\") " pod="openstack-operators/openstack-operator-controller-manager-5f7dc44db6-9nsrh" Mar 13 14:17:07 crc kubenswrapper[4898]: I0313 14:17:07.668652 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3a26728d-85c2-465c-bce4-c74045ea9e0d-webhook-certs\") pod \"openstack-operator-controller-manager-5f7dc44db6-9nsrh\" (UID: \"3a26728d-85c2-465c-bce4-c74045ea9e0d\") " pod="openstack-operators/openstack-operator-controller-manager-5f7dc44db6-9nsrh" Mar 13 14:17:07 crc kubenswrapper[4898]: E0313 14:17:07.668738 4898 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 13 14:17:07 crc kubenswrapper[4898]: E0313 14:17:07.668828 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a26728d-85c2-465c-bce4-c74045ea9e0d-metrics-certs podName:3a26728d-85c2-465c-bce4-c74045ea9e0d nodeName:}" failed. No retries permitted until 2026-03-13 14:17:11.668810288 +0000 UTC m=+1266.670398527 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3a26728d-85c2-465c-bce4-c74045ea9e0d-metrics-certs") pod "openstack-operator-controller-manager-5f7dc44db6-9nsrh" (UID: "3a26728d-85c2-465c-bce4-c74045ea9e0d") : secret "metrics-server-cert" not found Mar 13 14:17:07 crc kubenswrapper[4898]: E0313 14:17:07.668982 4898 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 13 14:17:07 crc kubenswrapper[4898]: E0313 14:17:07.669063 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a26728d-85c2-465c-bce4-c74045ea9e0d-webhook-certs podName:3a26728d-85c2-465c-bce4-c74045ea9e0d nodeName:}" failed. No retries permitted until 2026-03-13 14:17:11.669044914 +0000 UTC m=+1266.670633153 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/3a26728d-85c2-465c-bce4-c74045ea9e0d-webhook-certs") pod "openstack-operator-controller-manager-5f7dc44db6-9nsrh" (UID: "3a26728d-85c2-465c-bce4-c74045ea9e0d") : secret "webhook-server-cert" not found Mar 13 14:17:10 crc kubenswrapper[4898]: I0313 14:17:10.825593 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c35de09d-7f21-47d3-aac5-a26b15b0a496-cert\") pod \"infra-operator-controller-manager-54dc5b8f8d-8kcsw\" (UID: \"c35de09d-7f21-47d3-aac5-a26b15b0a496\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-8kcsw" Mar 13 14:17:10 crc kubenswrapper[4898]: E0313 14:17:10.825772 4898 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 13 14:17:10 crc kubenswrapper[4898]: E0313 14:17:10.826040 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c35de09d-7f21-47d3-aac5-a26b15b0a496-cert 
podName:c35de09d-7f21-47d3-aac5-a26b15b0a496 nodeName:}" failed. No retries permitted until 2026-03-13 14:17:18.82601645 +0000 UTC m=+1273.827604699 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c35de09d-7f21-47d3-aac5-a26b15b0a496-cert") pod "infra-operator-controller-manager-54dc5b8f8d-8kcsw" (UID: "c35de09d-7f21-47d3-aac5-a26b15b0a496") : secret "infra-operator-webhook-server-cert" not found Mar 13 14:17:11 crc kubenswrapper[4898]: I0313 14:17:11.347936 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ab852e1-fd26-4f76-b758-77896f8e236b-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv\" (UID: \"0ab852e1-fd26-4f76-b758-77896f8e236b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv" Mar 13 14:17:11 crc kubenswrapper[4898]: E0313 14:17:11.348107 4898 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 14:17:11 crc kubenswrapper[4898]: E0313 14:17:11.348179 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0ab852e1-fd26-4f76-b758-77896f8e236b-cert podName:0ab852e1-fd26-4f76-b758-77896f8e236b nodeName:}" failed. No retries permitted until 2026-03-13 14:17:19.34816192 +0000 UTC m=+1274.349750149 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0ab852e1-fd26-4f76-b758-77896f8e236b-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv" (UID: "0ab852e1-fd26-4f76-b758-77896f8e236b") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 14:17:11 crc kubenswrapper[4898]: I0313 14:17:11.755970 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3a26728d-85c2-465c-bce4-c74045ea9e0d-metrics-certs\") pod \"openstack-operator-controller-manager-5f7dc44db6-9nsrh\" (UID: \"3a26728d-85c2-465c-bce4-c74045ea9e0d\") " pod="openstack-operators/openstack-operator-controller-manager-5f7dc44db6-9nsrh" Mar 13 14:17:11 crc kubenswrapper[4898]: I0313 14:17:11.756308 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3a26728d-85c2-465c-bce4-c74045ea9e0d-webhook-certs\") pod \"openstack-operator-controller-manager-5f7dc44db6-9nsrh\" (UID: \"3a26728d-85c2-465c-bce4-c74045ea9e0d\") " pod="openstack-operators/openstack-operator-controller-manager-5f7dc44db6-9nsrh" Mar 13 14:17:11 crc kubenswrapper[4898]: E0313 14:17:11.756120 4898 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 13 14:17:11 crc kubenswrapper[4898]: E0313 14:17:11.756626 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a26728d-85c2-465c-bce4-c74045ea9e0d-metrics-certs podName:3a26728d-85c2-465c-bce4-c74045ea9e0d nodeName:}" failed. No retries permitted until 2026-03-13 14:17:19.756600314 +0000 UTC m=+1274.758188593 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3a26728d-85c2-465c-bce4-c74045ea9e0d-metrics-certs") pod "openstack-operator-controller-manager-5f7dc44db6-9nsrh" (UID: "3a26728d-85c2-465c-bce4-c74045ea9e0d") : secret "metrics-server-cert" not found Mar 13 14:17:11 crc kubenswrapper[4898]: E0313 14:17:11.756429 4898 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 13 14:17:11 crc kubenswrapper[4898]: E0313 14:17:11.757042 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a26728d-85c2-465c-bce4-c74045ea9e0d-webhook-certs podName:3a26728d-85c2-465c-bce4-c74045ea9e0d nodeName:}" failed. No retries permitted until 2026-03-13 14:17:19.757025285 +0000 UTC m=+1274.758613544 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/3a26728d-85c2-465c-bce4-c74045ea9e0d-webhook-certs") pod "openstack-operator-controller-manager-5f7dc44db6-9nsrh" (UID: "3a26728d-85c2-465c-bce4-c74045ea9e0d") : secret "webhook-server-cert" not found Mar 13 14:17:18 crc kubenswrapper[4898]: I0313 14:17:18.902441 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c35de09d-7f21-47d3-aac5-a26b15b0a496-cert\") pod \"infra-operator-controller-manager-54dc5b8f8d-8kcsw\" (UID: \"c35de09d-7f21-47d3-aac5-a26b15b0a496\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-8kcsw" Mar 13 14:17:18 crc kubenswrapper[4898]: E0313 14:17:18.903079 4898 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 13 14:17:18 crc kubenswrapper[4898]: E0313 14:17:18.904151 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c35de09d-7f21-47d3-aac5-a26b15b0a496-cert 
podName:c35de09d-7f21-47d3-aac5-a26b15b0a496 nodeName:}" failed. No retries permitted until 2026-03-13 14:17:34.904128249 +0000 UTC m=+1289.905716488 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c35de09d-7f21-47d3-aac5-a26b15b0a496-cert") pod "infra-operator-controller-manager-54dc5b8f8d-8kcsw" (UID: "c35de09d-7f21-47d3-aac5-a26b15b0a496") : secret "infra-operator-webhook-server-cert" not found Mar 13 14:17:19 crc kubenswrapper[4898]: I0313 14:17:19.413512 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ab852e1-fd26-4f76-b758-77896f8e236b-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv\" (UID: \"0ab852e1-fd26-4f76-b758-77896f8e236b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv" Mar 13 14:17:19 crc kubenswrapper[4898]: I0313 14:17:19.418858 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ab852e1-fd26-4f76-b758-77896f8e236b-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv\" (UID: \"0ab852e1-fd26-4f76-b758-77896f8e236b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv" Mar 13 14:17:19 crc kubenswrapper[4898]: I0313 14:17:19.519649 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-x5ln5" Mar 13 14:17:19 crc kubenswrapper[4898]: I0313 14:17:19.527684 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv" Mar 13 14:17:19 crc kubenswrapper[4898]: E0313 14:17:19.715017 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/horizon-operator@sha256:d9bffb59bb7f9f0a6cb103c3986fd2c1bdb13ce6349c39427a690858cbd754d6" Mar 13 14:17:19 crc kubenswrapper[4898]: E0313 14:17:19.715341 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:d9bffb59bb7f9f0a6cb103c3986fd2c1bdb13ce6349c39427a690858cbd754d6,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jbcwc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-6d9d6b584d-jngrl_openstack-operators(a80d01d5-0201-4b2e-974c-ac5b42ac8df4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 14:17:19 crc kubenswrapper[4898]: E0313 14:17:19.716493 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-jngrl" podUID="a80d01d5-0201-4b2e-974c-ac5b42ac8df4" Mar 13 14:17:19 crc kubenswrapper[4898]: I0313 14:17:19.821222 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3a26728d-85c2-465c-bce4-c74045ea9e0d-metrics-certs\") pod 
\"openstack-operator-controller-manager-5f7dc44db6-9nsrh\" (UID: \"3a26728d-85c2-465c-bce4-c74045ea9e0d\") " pod="openstack-operators/openstack-operator-controller-manager-5f7dc44db6-9nsrh" Mar 13 14:17:19 crc kubenswrapper[4898]: I0313 14:17:19.821475 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3a26728d-85c2-465c-bce4-c74045ea9e0d-webhook-certs\") pod \"openstack-operator-controller-manager-5f7dc44db6-9nsrh\" (UID: \"3a26728d-85c2-465c-bce4-c74045ea9e0d\") " pod="openstack-operators/openstack-operator-controller-manager-5f7dc44db6-9nsrh" Mar 13 14:17:19 crc kubenswrapper[4898]: E0313 14:17:19.821749 4898 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 13 14:17:19 crc kubenswrapper[4898]: E0313 14:17:19.821881 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a26728d-85c2-465c-bce4-c74045ea9e0d-webhook-certs podName:3a26728d-85c2-465c-bce4-c74045ea9e0d nodeName:}" failed. No retries permitted until 2026-03-13 14:17:35.821863977 +0000 UTC m=+1290.823452226 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/3a26728d-85c2-465c-bce4-c74045ea9e0d-webhook-certs") pod "openstack-operator-controller-manager-5f7dc44db6-9nsrh" (UID: "3a26728d-85c2-465c-bce4-c74045ea9e0d") : secret "webhook-server-cert" not found Mar 13 14:17:19 crc kubenswrapper[4898]: I0313 14:17:19.829821 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3a26728d-85c2-465c-bce4-c74045ea9e0d-metrics-certs\") pod \"openstack-operator-controller-manager-5f7dc44db6-9nsrh\" (UID: \"3a26728d-85c2-465c-bce4-c74045ea9e0d\") " pod="openstack-operators/openstack-operator-controller-manager-5f7dc44db6-9nsrh" Mar 13 14:17:20 crc kubenswrapper[4898]: E0313 14:17:20.109527 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:d9bffb59bb7f9f0a6cb103c3986fd2c1bdb13ce6349c39427a690858cbd754d6\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-jngrl" podUID="a80d01d5-0201-4b2e-974c-ac5b42ac8df4" Mar 13 14:17:20 crc kubenswrapper[4898]: E0313 14:17:20.652862 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42" Mar 13 14:17:20 crc kubenswrapper[4898]: E0313 14:17:20.653349 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rz8wp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5c5cb9c4d7-smdkt_openstack-operators(19a0f4de-5258-4f2b-9587-71293459378e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 14:17:20 crc kubenswrapper[4898]: E0313 14:17:20.654518 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-smdkt" podUID="19a0f4de-5258-4f2b-9587-71293459378e" Mar 13 14:17:21 crc kubenswrapper[4898]: E0313 14:17:21.120785 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-smdkt" podUID="19a0f4de-5258-4f2b-9587-71293459378e" Mar 13 14:17:23 crc kubenswrapper[4898]: E0313 14:17:23.404271 4898 log.go:32] "PullImage from image service failed" err="rpc 
error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/cinder-operator@sha256:7c0da25380c91ffd1940d75eaa71b6842a6a4cf4056e62d6b0d237897b74e4d9" Mar 13 14:17:23 crc kubenswrapper[4898]: E0313 14:17:23.404834 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:7c0da25380c91ffd1940d75eaa71b6842a6a4cf4056e62d6b0d237897b74e4d9,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4l2jg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-984cd4dcf-p9d5v_openstack-operators(0d88a5d2-a852-409e-b4bd-939d1c2b9090): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 14:17:23 crc kubenswrapper[4898]: E0313 14:17:23.406135 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-p9d5v" podUID="0d88a5d2-a852-409e-b4bd-939d1c2b9090" Mar 13 14:17:24 crc kubenswrapper[4898]: E0313 14:17:24.148757 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/cinder-operator@sha256:7c0da25380c91ffd1940d75eaa71b6842a6a4cf4056e62d6b0d237897b74e4d9\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-p9d5v" podUID="0d88a5d2-a852-409e-b4bd-939d1c2b9090" Mar 13 14:17:24 crc kubenswrapper[4898]: E0313 14:17:24.246581 4898 log.go:32] "PullImage from image service failed" 
err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:a26d062af19b3bc6dc6633171f1eff8eec33e8e925465d4968a0b9a36012a7e7" Mar 13 14:17:24 crc kubenswrapper[4898]: E0313 14:17:24.246800 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:a26d062af19b3bc6dc6633171f1eff8eec33e8e925465d4968a0b9a36012a7e7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-q57rr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-5b6b6b4c9f-2lnlc_openstack-operators(ba56f415-73d5-4301-a25d-0e5d1ba4e3b1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 14:17:24 crc kubenswrapper[4898]: E0313 14:17:24.248038 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-2lnlc" podUID="ba56f415-73d5-4301-a25d-0e5d1ba4e3b1" Mar 13 14:17:24 crc kubenswrapper[4898]: E0313 14:17:24.815076 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ovn-operator@sha256:2f63ddf5c95c6c82f6e04bc9f7f20d56dc003614647726ab00276239eec40b7f" Mar 13 14:17:24 crc kubenswrapper[4898]: E0313 14:17:24.817160 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:2f63ddf5c95c6c82f6e04bc9f7f20d56dc003614647726ab00276239eec40b7f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qb2ck,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-bbc5b68f9-wdmrh_openstack-operators(da3795a7-363f-4637-afe2-77cb77248f9a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 14:17:24 crc kubenswrapper[4898]: E0313 14:17:24.818412 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-wdmrh" podUID="da3795a7-363f-4637-afe2-77cb77248f9a" Mar 13 14:17:25 crc kubenswrapper[4898]: E0313 14:17:25.157810 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:2f63ddf5c95c6c82f6e04bc9f7f20d56dc003614647726ab00276239eec40b7f\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-wdmrh" podUID="da3795a7-363f-4637-afe2-77cb77248f9a" Mar 13 14:17:25 crc kubenswrapper[4898]: E0313 14:17:25.157881 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:a26d062af19b3bc6dc6633171f1eff8eec33e8e925465d4968a0b9a36012a7e7\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-2lnlc" podUID="ba56f415-73d5-4301-a25d-0e5d1ba4e3b1" Mar 13 14:17:29 crc kubenswrapper[4898]: E0313 14:17:29.657395 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/heat-operator@sha256:6c9aef12f50be0b974f5e35b0d69303e7f7b95e6db5d41bcdb2d9d1100e921a6" Mar 13 14:17:29 crc kubenswrapper[4898]: E0313 14:17:29.659042 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:6c9aef12f50be0b974f5e35b0d69303e7f7b95e6db5d41bcdb2d9d1100e921a6,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8xlhk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-77b6666d85-tqp4b_openstack-operators(ea0ad033-9a48-4e42-a237-f27cacf03adc): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 14:17:29 crc kubenswrapper[4898]: E0313 14:17:29.660367 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-tqp4b" podUID="ea0ad033-9a48-4e42-a237-f27cacf03adc" Mar 13 14:17:30 crc kubenswrapper[4898]: E0313 14:17:30.175003 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="38.102.83.36:5001/openstack-k8s-operators/telemetry-operator:8ccfcdb23140e93b912bb23903a7d6fafb754e30" Mar 13 14:17:30 crc kubenswrapper[4898]: E0313 14:17:30.175369 4898 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.36:5001/openstack-k8s-operators/telemetry-operator:8ccfcdb23140e93b912bb23903a7d6fafb754e30" Mar 13 14:17:30 crc kubenswrapper[4898]: E0313 14:17:30.175528 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.36:5001/openstack-k8s-operators/telemetry-operator:8ccfcdb23140e93b912bb23903a7d6fafb754e30,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ggn24,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-5b9fbd87f-s2k96_openstack-operators(9ff6f89a-7110-42fb-96b9-8611f280bebe): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 14:17:30 crc kubenswrapper[4898]: E0313 14:17:30.176801 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/telemetry-operator-controller-manager-5b9fbd87f-s2k96" podUID="9ff6f89a-7110-42fb-96b9-8611f280bebe" Mar 13 14:17:30 crc kubenswrapper[4898]: E0313 14:17:30.203892 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/heat-operator@sha256:6c9aef12f50be0b974f5e35b0d69303e7f7b95e6db5d41bcdb2d9d1100e921a6\\\"\"" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-tqp4b" podUID="ea0ad033-9a48-4e42-a237-f27cacf03adc" Mar 13 14:17:30 crc kubenswrapper[4898]: E0313 14:17:30.204343 4898 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.36:5001/openstack-k8s-operators/telemetry-operator:8ccfcdb23140e93b912bb23903a7d6fafb754e30\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-5b9fbd87f-s2k96" podUID="9ff6f89a-7110-42fb-96b9-8611f280bebe" Mar 13 14:17:31 crc kubenswrapper[4898]: E0313 14:17:31.834811 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/barbican-operator@sha256:47dae162826e2e457bdc34f6dfebcf8f7d56e189fdbeba2e0118991a420a4165" Mar 13 14:17:31 crc kubenswrapper[4898]: E0313 14:17:31.837193 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/barbican-operator@sha256:47dae162826e2e457bdc34f6dfebcf8f7d56e189fdbeba2e0118991a420a4165,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xm7pb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-operator-controller-manager-d47688694-gtlps_openstack-operators(45efd8ce-26db-4511-bd88-2e7467d02bbb): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 14:17:31 crc kubenswrapper[4898]: E0313 14:17:31.839742 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/barbican-operator-controller-manager-d47688694-gtlps" podUID="45efd8ce-26db-4511-bd88-2e7467d02bbb" Mar 13 14:17:32 crc kubenswrapper[4898]: E0313 14:17:32.219976 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/barbican-operator@sha256:47dae162826e2e457bdc34f6dfebcf8f7d56e189fdbeba2e0118991a420a4165\\\"\"" pod="openstack-operators/barbican-operator-controller-manager-d47688694-gtlps" podUID="45efd8ce-26db-4511-bd88-2e7467d02bbb" Mar 13 14:17:32 crc kubenswrapper[4898]: E0313 14:17:32.338386 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:dd62e104225ea255af5a32828af4c21e1dfb50fbdf35cd41d07d1326f9017a40" Mar 13 14:17:32 crc kubenswrapper[4898]: E0313 14:17:32.338596 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:dd62e104225ea255af5a32828af4c21e1dfb50fbdf35cd41d07d1326f9017a40,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wnvxx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-57b484b4df-z2gd2_openstack-operators(1df4a7d6-b0c2-4b00-b591-1a612bd319b6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 14:17:32 crc kubenswrapper[4898]: E0313 14:17:32.339846 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-z2gd2" podUID="1df4a7d6-b0c2-4b00-b591-1a612bd319b6" Mar 13 14:17:33 crc kubenswrapper[4898]: E0313 14:17:33.233729 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/manila-operator@sha256:dd62e104225ea255af5a32828af4c21e1dfb50fbdf35cd41d07d1326f9017a40\\\"\"" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-z2gd2" podUID="1df4a7d6-b0c2-4b00-b591-1a612bd319b6" Mar 13 14:17:34 crc kubenswrapper[4898]: E0313 14:17:34.258286 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/swift-operator@sha256:72db77c98e7bca64d469b4dc316e9c8d329681f825d19ef8f333437fb1c6d3f5" Mar 13 14:17:34 crc kubenswrapper[4898]: E0313 14:17:34.258865 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:72db77c98e7bca64d469b4dc316e9c8d329681f825d19ef8f333437fb1c6d3f5,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kbspp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-7f9cc5dd44-f2t6t_openstack-operators(66a86c31-9ff3-439a-a0f8-96c981014b6f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 14:17:34 crc kubenswrapper[4898]: E0313 14:17:34.260093 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-f2t6t" podUID="66a86c31-9ff3-439a-a0f8-96c981014b6f" Mar 13 14:17:34 crc kubenswrapper[4898]: E0313 14:17:34.763859 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/placement-operator@sha256:e7e865363955c670e41b6c042c4f87abceff78f5495ba5c5c82988baad45c978" Mar 13 14:17:34 crc kubenswrapper[4898]: E0313 14:17:34.764081 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:e7e865363955c670e41b6c042c4f87abceff78f5495ba5c5c82988baad45c978,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dkdwz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-574d45c66c-njsvh_openstack-operators(0d7c657b-a701-41fe-9b23-d5bba3302c4f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 14:17:34 crc kubenswrapper[4898]: E0313 14:17:34.766083 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-njsvh" podUID="0d7c657b-a701-41fe-9b23-d5bba3302c4f" Mar 13 14:17:34 crc kubenswrapper[4898]: I0313 14:17:34.922287 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c35de09d-7f21-47d3-aac5-a26b15b0a496-cert\") pod \"infra-operator-controller-manager-54dc5b8f8d-8kcsw\" (UID: \"c35de09d-7f21-47d3-aac5-a26b15b0a496\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-8kcsw" Mar 13 14:17:34 crc kubenswrapper[4898]: I0313 14:17:34.931911 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/c35de09d-7f21-47d3-aac5-a26b15b0a496-cert\") pod \"infra-operator-controller-manager-54dc5b8f8d-8kcsw\" (UID: \"c35de09d-7f21-47d3-aac5-a26b15b0a496\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-8kcsw" Mar 13 14:17:35 crc kubenswrapper[4898]: I0313 14:17:35.086136 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-j4b7h" Mar 13 14:17:35 crc kubenswrapper[4898]: I0313 14:17:35.095391 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-8kcsw" Mar 13 14:17:35 crc kubenswrapper[4898]: E0313 14:17:35.251433 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:e7e865363955c670e41b6c042c4f87abceff78f5495ba5c5c82988baad45c978\\\"\"" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-njsvh" podUID="0d7c657b-a701-41fe-9b23-d5bba3302c4f" Mar 13 14:17:35 crc kubenswrapper[4898]: E0313 14:17:35.252713 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:72db77c98e7bca64d469b4dc316e9c8d329681f825d19ef8f333437fb1c6d3f5\\\"\"" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-f2t6t" podUID="66a86c31-9ff3-439a-a0f8-96c981014b6f" Mar 13 14:17:35 crc kubenswrapper[4898]: E0313 14:17:35.350813 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:5fe5351a3de5e1267112d52cd81477a01d47f90be713cc5439c76543a4c33721" Mar 13 14:17:35 crc kubenswrapper[4898]: E0313 
14:17:35.351041 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:5fe5351a3de5e1267112d52cd81477a01d47f90be713cc5439c76543a4c33721,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-btrh6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-776c5696bf-ntlw6_openstack-operators(d71982c0-a3d0-4da8-84cd-7494301f589f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 14:17:35 crc kubenswrapper[4898]: E0313 14:17:35.352470 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-ntlw6" podUID="d71982c0-a3d0-4da8-84cd-7494301f589f" Mar 13 14:17:35 crc kubenswrapper[4898]: I0313 14:17:35.835344 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3a26728d-85c2-465c-bce4-c74045ea9e0d-webhook-certs\") pod \"openstack-operator-controller-manager-5f7dc44db6-9nsrh\" (UID: \"3a26728d-85c2-465c-bce4-c74045ea9e0d\") " pod="openstack-operators/openstack-operator-controller-manager-5f7dc44db6-9nsrh" Mar 13 14:17:35 crc kubenswrapper[4898]: I0313 14:17:35.841553 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3a26728d-85c2-465c-bce4-c74045ea9e0d-webhook-certs\") pod \"openstack-operator-controller-manager-5f7dc44db6-9nsrh\" (UID: \"3a26728d-85c2-465c-bce4-c74045ea9e0d\") " pod="openstack-operators/openstack-operator-controller-manager-5f7dc44db6-9nsrh" Mar 13 14:17:35 crc kubenswrapper[4898]: I0313 14:17:35.979460 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-nrmkf" Mar 13 14:17:35 crc kubenswrapper[4898]: I0313 14:17:35.987874 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5f7dc44db6-9nsrh" Mar 13 14:17:36 crc kubenswrapper[4898]: E0313 14:17:36.257848 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:5fe5351a3de5e1267112d52cd81477a01d47f90be713cc5439c76543a4c33721\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-ntlw6" podUID="d71982c0-a3d0-4da8-84cd-7494301f589f" Mar 13 14:17:38 crc kubenswrapper[4898]: E0313 14:17:38.281317 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:40b84319f2f12a1c7ee478fd86a8b1aa5ac2ea8e24f5ce0f1ca78ad879dea8ca" Mar 13 14:17:38 crc kubenswrapper[4898]: E0313 14:17:38.281505 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:40b84319f2f12a1c7ee478fd86a8b1aa5ac2ea8e24f5ce0f1ca78ad879dea8ca,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-c59lv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-684f77d66d-s5zh6_openstack-operators(d24bb749-0b71-456b-80e4-fdf6dd23ba30): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 14:17:38 crc kubenswrapper[4898]: E0313 14:17:38.282834 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-s5zh6" podUID="d24bb749-0b71-456b-80e4-fdf6dd23ba30" Mar 13 14:17:39 crc kubenswrapper[4898]: E0313 14:17:39.285480 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:40b84319f2f12a1c7ee478fd86a8b1aa5ac2ea8e24f5ce0f1ca78ad879dea8ca\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-s5zh6" podUID="d24bb749-0b71-456b-80e4-fdf6dd23ba30" Mar 13 14:17:39 crc kubenswrapper[4898]: E0313 14:17:39.765760 4898 log.go:32] "PullImage from image service 
failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807" Mar 13 14:17:39 crc kubenswrapper[4898]: E0313 14:17:39.765957 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xvjvw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6c4d75f7f9-jwrd2_openstack-operators(919747b8-a031-4654-999f-3c3928f981b4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 14:17:39 crc kubenswrapper[4898]: E0313 14:17:39.767159 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-jwrd2" podUID="919747b8-a031-4654-999f-3c3928f981b4" Mar 13 14:17:41 crc kubenswrapper[4898]: E0313 14:17:41.851343 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:18fe6f2f0be7e736db86ff2d600af12a753e14b0a03232ce4f03629a89905571" Mar 13 14:17:41 crc kubenswrapper[4898]: E0313 14:17:41.852016 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:18fe6f2f0be7e736db86ff2d600af12a753e14b0a03232ce4f03629a89905571,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-52gjb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-5f4f55cb5c-s2rdh_openstack-operators(d29ce3ee-3d5a-4801-abf9-dfef5b641a74): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 14:17:41 crc kubenswrapper[4898]: E0313 14:17:41.853868 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-s2rdh" podUID="d29ce3ee-3d5a-4801-abf9-dfef5b641a74" Mar 13 14:17:44 crc kubenswrapper[4898]: E0313 14:17:44.014659 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:bbe772fa051f782c9dcc3c34ce43495e1116aa9089a760c10068790baa9b25ff" Mar 13 14:17:44 crc kubenswrapper[4898]: E0313 14:17:44.015132 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:bbe772fa051f782c9dcc3c34ce43495e1116aa9089a760c10068790baa9b25ff,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wbj2m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-7f84474648-mr4wv_openstack-operators(52959483-daae-423a-a3bf-8e3fa7810074): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 14:17:44 crc kubenswrapper[4898]: E0313 14:17:44.016479 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-7f84474648-mr4wv" podUID="52959483-daae-423a-a3bf-8e3fa7810074" Mar 13 14:17:44 crc kubenswrapper[4898]: E0313 14:17:44.335832 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:bbe772fa051f782c9dcc3c34ce43495e1116aa9089a760c10068790baa9b25ff\\\"\"" pod="openstack-operators/nova-operator-controller-manager-7f84474648-mr4wv" podUID="52959483-daae-423a-a3bf-8e3fa7810074" Mar 13 14:17:44 crc kubenswrapper[4898]: E0313 14:17:44.412330 4898 log.go:32] "PullImage from image service failed" err="rpc 
error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Mar 13 14:17:44 crc kubenswrapper[4898]: E0313 14:17:44.412512 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zhfbv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-82gtc_openstack-operators(7b9c0413-5558-43c4-805b-7f035fded9b4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 14:17:44 crc kubenswrapper[4898]: E0313 14:17:44.414120 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-82gtc" podUID="7b9c0413-5558-43c4-805b-7f035fded9b4" Mar 13 14:17:44 crc kubenswrapper[4898]: I0313 14:17:44.906659 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv"] Mar 13 14:17:45 crc kubenswrapper[4898]: I0313 14:17:45.000570 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-54dc5b8f8d-8kcsw"] Mar 13 14:17:45 crc kubenswrapper[4898]: W0313 14:17:45.011046 4898 manager.go:1169] Failed to process 
watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc35de09d_7f21_47d3_aac5_a26b15b0a496.slice/crio-f59d5a970d2c0db00459d702f37b694630a1f815ef58dbe10d228ce86e6ff471 WatchSource:0}: Error finding container f59d5a970d2c0db00459d702f37b694630a1f815ef58dbe10d228ce86e6ff471: Status 404 returned error can't find the container with id f59d5a970d2c0db00459d702f37b694630a1f815ef58dbe10d228ce86e6ff471 Mar 13 14:17:45 crc kubenswrapper[4898]: I0313 14:17:45.043318 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5f7dc44db6-9nsrh"] Mar 13 14:17:45 crc kubenswrapper[4898]: I0313 14:17:45.329667 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-p9d5v" event={"ID":"0d88a5d2-a852-409e-b4bd-939d1c2b9090","Type":"ContainerStarted","Data":"9476f49672453c043034bf2b4fc2b059e64c5120f767076fe3286fc76332b438"} Mar 13 14:17:45 crc kubenswrapper[4898]: I0313 14:17:45.330150 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-p9d5v" Mar 13 14:17:45 crc kubenswrapper[4898]: I0313 14:17:45.332888 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-8kcsw" event={"ID":"c35de09d-7f21-47d3-aac5-a26b15b0a496","Type":"ContainerStarted","Data":"f59d5a970d2c0db00459d702f37b694630a1f815ef58dbe10d228ce86e6ff471"} Mar 13 14:17:45 crc kubenswrapper[4898]: I0313 14:17:45.334593 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-tqp4b" event={"ID":"ea0ad033-9a48-4e42-a237-f27cacf03adc","Type":"ContainerStarted","Data":"4ad4efd883403cf99f4ae05d9e5069d476891625501a47b507ecb25aaa59279a"} Mar 13 14:17:45 crc kubenswrapper[4898]: I0313 14:17:45.335344 4898 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-tqp4b" Mar 13 14:17:45 crc kubenswrapper[4898]: I0313 14:17:45.336557 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-2lnlc" event={"ID":"ba56f415-73d5-4301-a25d-0e5d1ba4e3b1","Type":"ContainerStarted","Data":"7779ff9639d753fc0b8d37fac6f107546206f5932652761cf443e5906d099ade"} Mar 13 14:17:45 crc kubenswrapper[4898]: I0313 14:17:45.336933 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-2lnlc" Mar 13 14:17:45 crc kubenswrapper[4898]: I0313 14:17:45.338063 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-v99bm" event={"ID":"32b5ebfd-38d9-456e-bb21-7332323239d1","Type":"ContainerStarted","Data":"cf80e2daa36139cd410b53ece19d163d97bf6b041aba2184079936d40fe56543"} Mar 13 14:17:45 crc kubenswrapper[4898]: I0313 14:17:45.338445 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-v99bm" Mar 13 14:17:45 crc kubenswrapper[4898]: I0313 14:17:45.339588 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-z2gd2" event={"ID":"1df4a7d6-b0c2-4b00-b591-1a612bd319b6","Type":"ContainerStarted","Data":"90c2cd03b5da2efe2b2140a0201f7ff9d1487c261ab13f06a8199c5a9e03c947"} Mar 13 14:17:45 crc kubenswrapper[4898]: I0313 14:17:45.339974 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-z2gd2" Mar 13 14:17:45 crc kubenswrapper[4898]: I0313 14:17:45.340867 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5f7dc44db6-9nsrh" 
event={"ID":"3a26728d-85c2-465c-bce4-c74045ea9e0d","Type":"ContainerStarted","Data":"3cc5304750c080a6debdf409cea68b446d3885f1b82989fc52ec8c5c354595f6"} Mar 13 14:17:45 crc kubenswrapper[4898]: I0313 14:17:45.342031 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-smdkt" event={"ID":"19a0f4de-5258-4f2b-9587-71293459378e","Type":"ContainerStarted","Data":"490b131fcdcfc3c4a0f9eaefa6f19529f359253225ede8f7a9b1add8a23964b5"} Mar 13 14:17:45 crc kubenswrapper[4898]: I0313 14:17:45.342419 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-smdkt" Mar 13 14:17:45 crc kubenswrapper[4898]: I0313 14:17:45.344307 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-jngrl" event={"ID":"a80d01d5-0201-4b2e-974c-ac5b42ac8df4","Type":"ContainerStarted","Data":"d2bed079a6a5a2e515d93d89a9a982d254f51045181d7bcb6277083a331b61e4"} Mar 13 14:17:45 crc kubenswrapper[4898]: I0313 14:17:45.344462 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-jngrl" Mar 13 14:17:45 crc kubenswrapper[4898]: I0313 14:17:45.353958 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv" event={"ID":"0ab852e1-fd26-4f76-b758-77896f8e236b","Type":"ContainerStarted","Data":"d5d0c797c988d2fbc23ad907fce7b101168cdaa8057bb8670fd809bd5d1557bd"} Mar 13 14:17:45 crc kubenswrapper[4898]: I0313 14:17:45.357506 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-mf8h6" event={"ID":"fb7b2f97-fca8-41d2-9be7-d40fac94c171","Type":"ContainerStarted","Data":"9cf8061003f7f4ee16b4b5dd5a9b44b3bd2079d20345c208aa37b8286b144c96"} Mar 13 14:17:45 crc 
kubenswrapper[4898]: I0313 14:17:45.357617 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-mf8h6" Mar 13 14:17:45 crc kubenswrapper[4898]: I0313 14:17:45.359029 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-4n5rx" event={"ID":"3c955ebc-98fd-4921-9923-6151a50e8eec","Type":"ContainerStarted","Data":"ac8819fe2b0b8e3b70c3aac9e58b33895d2a1b842f4a33d3cf71e845cfa1e0d7"} Mar 13 14:17:45 crc kubenswrapper[4898]: I0313 14:17:45.359658 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-4n5rx" Mar 13 14:17:45 crc kubenswrapper[4898]: I0313 14:17:45.365367 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-wdmrh" event={"ID":"da3795a7-363f-4637-afe2-77cb77248f9a","Type":"ContainerStarted","Data":"39244065c035a66d2feea4e8ced4cdbb99d19f12dda86ff67f0baef554cd9c9b"} Mar 13 14:17:45 crc kubenswrapper[4898]: I0313 14:17:45.366688 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-wdmrh" Mar 13 14:17:45 crc kubenswrapper[4898]: I0313 14:17:45.405947 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-p9d5v" podStartSLOduration=3.092980187 podStartE2EDuration="43.405929101s" podCreationTimestamp="2026-03-13 14:17:02 +0000 UTC" firstStartedPulling="2026-03-13 14:17:04.108291376 +0000 UTC m=+1259.109879615" lastFinishedPulling="2026-03-13 14:17:44.42124029 +0000 UTC m=+1299.422828529" observedRunningTime="2026-03-13 14:17:45.398556379 +0000 UTC m=+1300.400144638" watchObservedRunningTime="2026-03-13 14:17:45.405929101 +0000 UTC m=+1300.407517340" Mar 13 14:17:45 crc 
kubenswrapper[4898]: I0313 14:17:45.465900 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-z2gd2" podStartSLOduration=4.51132066 podStartE2EDuration="43.46587655s" podCreationTimestamp="2026-03-13 14:17:02 +0000 UTC" firstStartedPulling="2026-03-13 14:17:05.630521002 +0000 UTC m=+1260.632109241" lastFinishedPulling="2026-03-13 14:17:44.585076892 +0000 UTC m=+1299.586665131" observedRunningTime="2026-03-13 14:17:45.459655079 +0000 UTC m=+1300.461243338" watchObservedRunningTime="2026-03-13 14:17:45.46587655 +0000 UTC m=+1300.467464789" Mar 13 14:17:45 crc kubenswrapper[4898]: I0313 14:17:45.515881 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-mf8h6" podStartSLOduration=8.447947367 podStartE2EDuration="43.515862801s" podCreationTimestamp="2026-03-13 14:17:02 +0000 UTC" firstStartedPulling="2026-03-13 14:17:04.700442582 +0000 UTC m=+1259.702030821" lastFinishedPulling="2026-03-13 14:17:39.768358016 +0000 UTC m=+1294.769946255" observedRunningTime="2026-03-13 14:17:45.512552984 +0000 UTC m=+1300.514141243" watchObservedRunningTime="2026-03-13 14:17:45.515862801 +0000 UTC m=+1300.517451040" Mar 13 14:17:45 crc kubenswrapper[4898]: I0313 14:17:45.584553 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-2lnlc" podStartSLOduration=4.077195436 podStartE2EDuration="43.584532577s" podCreationTimestamp="2026-03-13 14:17:02 +0000 UTC" firstStartedPulling="2026-03-13 14:17:04.913816377 +0000 UTC m=+1259.915404616" lastFinishedPulling="2026-03-13 14:17:44.421153518 +0000 UTC m=+1299.422741757" observedRunningTime="2026-03-13 14:17:45.568747136 +0000 UTC m=+1300.570335395" watchObservedRunningTime="2026-03-13 14:17:45.584532577 +0000 UTC m=+1300.586120816" Mar 13 14:17:45 crc kubenswrapper[4898]: 
I0313 14:17:45.612358 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-wdmrh" podStartSLOduration=4.84424075 podStartE2EDuration="43.61234252s" podCreationTimestamp="2026-03-13 14:17:02 +0000 UTC" firstStartedPulling="2026-03-13 14:17:05.640733718 +0000 UTC m=+1260.642321957" lastFinishedPulling="2026-03-13 14:17:44.408835488 +0000 UTC m=+1299.410423727" observedRunningTime="2026-03-13 14:17:45.608196422 +0000 UTC m=+1300.609784661" watchObservedRunningTime="2026-03-13 14:17:45.61234252 +0000 UTC m=+1300.613930749" Mar 13 14:17:45 crc kubenswrapper[4898]: I0313 14:17:45.739623 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-4n5rx" podStartSLOduration=8.071543881 podStartE2EDuration="43.73960595s" podCreationTimestamp="2026-03-13 14:17:02 +0000 UTC" firstStartedPulling="2026-03-13 14:17:04.091304433 +0000 UTC m=+1259.092892672" lastFinishedPulling="2026-03-13 14:17:39.759366502 +0000 UTC m=+1294.760954741" observedRunningTime="2026-03-13 14:17:45.719362304 +0000 UTC m=+1300.720950553" watchObservedRunningTime="2026-03-13 14:17:45.73960595 +0000 UTC m=+1300.741194189" Mar 13 14:17:45 crc kubenswrapper[4898]: I0313 14:17:45.742526 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-jngrl" podStartSLOduration=4.481504389 podStartE2EDuration="43.742515266s" podCreationTimestamp="2026-03-13 14:17:02 +0000 UTC" firstStartedPulling="2026-03-13 14:17:04.733650518 +0000 UTC m=+1259.735238757" lastFinishedPulling="2026-03-13 14:17:43.994661395 +0000 UTC m=+1298.996249634" observedRunningTime="2026-03-13 14:17:45.679503867 +0000 UTC m=+1300.681092116" watchObservedRunningTime="2026-03-13 14:17:45.742515266 +0000 UTC m=+1300.744103505" Mar 13 14:17:45 crc kubenswrapper[4898]: I0313 14:17:45.770118 
4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-tqp4b" podStartSLOduration=3.856631913 podStartE2EDuration="43.770099693s" podCreationTimestamp="2026-03-13 14:17:02 +0000 UTC" firstStartedPulling="2026-03-13 14:17:04.704510828 +0000 UTC m=+1259.706099067" lastFinishedPulling="2026-03-13 14:17:44.617978608 +0000 UTC m=+1299.619566847" observedRunningTime="2026-03-13 14:17:45.768844101 +0000 UTC m=+1300.770432340" watchObservedRunningTime="2026-03-13 14:17:45.770099693 +0000 UTC m=+1300.771687932" Mar 13 14:17:45 crc kubenswrapper[4898]: I0313 14:17:45.801973 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-smdkt" podStartSLOduration=4.068857375 podStartE2EDuration="42.801950002s" podCreationTimestamp="2026-03-13 14:17:03 +0000 UTC" firstStartedPulling="2026-03-13 14:17:05.664807466 +0000 UTC m=+1260.666395695" lastFinishedPulling="2026-03-13 14:17:44.397900083 +0000 UTC m=+1299.399488322" observedRunningTime="2026-03-13 14:17:45.800605677 +0000 UTC m=+1300.802193926" watchObservedRunningTime="2026-03-13 14:17:45.801950002 +0000 UTC m=+1300.803538261" Mar 13 14:17:45 crc kubenswrapper[4898]: I0313 14:17:45.827647 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-v99bm" podStartSLOduration=4.592013144 podStartE2EDuration="43.827624479s" podCreationTimestamp="2026-03-13 14:17:02 +0000 UTC" firstStartedPulling="2026-03-13 14:17:04.744103191 +0000 UTC m=+1259.745691440" lastFinishedPulling="2026-03-13 14:17:43.979714526 +0000 UTC m=+1298.981302775" observedRunningTime="2026-03-13 14:17:45.822844105 +0000 UTC m=+1300.824432354" watchObservedRunningTime="2026-03-13 14:17:45.827624479 +0000 UTC m=+1300.829212718" Mar 13 14:17:46 crc kubenswrapper[4898]: I0313 14:17:46.393688 4898 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5b9fbd87f-s2k96" event={"ID":"9ff6f89a-7110-42fb-96b9-8611f280bebe","Type":"ContainerStarted","Data":"0633feb076d7a92b8ba593135493cb20da0d09f9e8caebe97d98583e73b5df27"} Mar 13 14:17:46 crc kubenswrapper[4898]: I0313 14:17:46.396000 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-5b9fbd87f-s2k96" Mar 13 14:17:46 crc kubenswrapper[4898]: I0313 14:17:46.401806 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-d47688694-gtlps" event={"ID":"45efd8ce-26db-4511-bd88-2e7467d02bbb","Type":"ContainerStarted","Data":"668a9795062b8fa71ac40f7ae8d277ef40b26eb48113263deda3a652735e7086"} Mar 13 14:17:46 crc kubenswrapper[4898]: I0313 14:17:46.402861 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-d47688694-gtlps" Mar 13 14:17:46 crc kubenswrapper[4898]: I0313 14:17:46.405662 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5f7dc44db6-9nsrh" event={"ID":"3a26728d-85c2-465c-bce4-c74045ea9e0d","Type":"ContainerStarted","Data":"d4370cd3a1919d89fe936f3d3e40cf45a946359d1caa65196258cd9057775ac2"} Mar 13 14:17:46 crc kubenswrapper[4898]: I0313 14:17:46.419963 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-5b9fbd87f-s2k96" podStartSLOduration=3.181471312 podStartE2EDuration="43.419947686s" podCreationTimestamp="2026-03-13 14:17:03 +0000 UTC" firstStartedPulling="2026-03-13 14:17:05.633324925 +0000 UTC m=+1260.634913174" lastFinishedPulling="2026-03-13 14:17:45.871801309 +0000 UTC m=+1300.873389548" observedRunningTime="2026-03-13 14:17:46.419411332 +0000 UTC m=+1301.420999581" 
watchObservedRunningTime="2026-03-13 14:17:46.419947686 +0000 UTC m=+1301.421535925" Mar 13 14:17:46 crc kubenswrapper[4898]: I0313 14:17:46.454748 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-5f7dc44db6-9nsrh" podStartSLOduration=43.454732521 podStartE2EDuration="43.454732521s" podCreationTimestamp="2026-03-13 14:17:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:17:46.452440451 +0000 UTC m=+1301.454028690" watchObservedRunningTime="2026-03-13 14:17:46.454732521 +0000 UTC m=+1301.456320760" Mar 13 14:17:46 crc kubenswrapper[4898]: I0313 14:17:46.482816 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-d47688694-gtlps" podStartSLOduration=3.5852143 podStartE2EDuration="44.482796111s" podCreationTimestamp="2026-03-13 14:17:02 +0000 UTC" firstStartedPulling="2026-03-13 14:17:04.754298367 +0000 UTC m=+1259.755886596" lastFinishedPulling="2026-03-13 14:17:45.651880178 +0000 UTC m=+1300.653468407" observedRunningTime="2026-03-13 14:17:46.479775872 +0000 UTC m=+1301.481364121" watchObservedRunningTime="2026-03-13 14:17:46.482796111 +0000 UTC m=+1301.484384360" Mar 13 14:17:47 crc kubenswrapper[4898]: I0313 14:17:47.412708 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-5f7dc44db6-9nsrh" Mar 13 14:17:49 crc kubenswrapper[4898]: I0313 14:17:49.427797 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-njsvh" event={"ID":"0d7c657b-a701-41fe-9b23-d5bba3302c4f","Type":"ContainerStarted","Data":"e9ef8c7a7570b9f18555eb7a145b577cbda97d59d879abac51522a7ef7563575"} Mar 13 14:17:49 crc kubenswrapper[4898]: I0313 14:17:49.428419 4898 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-njsvh" Mar 13 14:17:49 crc kubenswrapper[4898]: I0313 14:17:49.429827 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv" event={"ID":"0ab852e1-fd26-4f76-b758-77896f8e236b","Type":"ContainerStarted","Data":"5517ef51ccfa8b2748099ff5aaad46312299246fde947335af38f7f6b1623c69"} Mar 13 14:17:49 crc kubenswrapper[4898]: I0313 14:17:49.429933 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv" Mar 13 14:17:49 crc kubenswrapper[4898]: I0313 14:17:49.431189 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-8kcsw" event={"ID":"c35de09d-7f21-47d3-aac5-a26b15b0a496","Type":"ContainerStarted","Data":"439b97cf887edefb45f2da268f25ca7e59b0c9383815a8f46efa2d00e888c19e"} Mar 13 14:17:49 crc kubenswrapper[4898]: I0313 14:17:49.431356 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-8kcsw" Mar 13 14:17:49 crc kubenswrapper[4898]: I0313 14:17:49.450618 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-njsvh" podStartSLOduration=4.109701948 podStartE2EDuration="47.450599532s" podCreationTimestamp="2026-03-13 14:17:02 +0000 UTC" firstStartedPulling="2026-03-13 14:17:05.663147733 +0000 UTC m=+1260.664735972" lastFinishedPulling="2026-03-13 14:17:49.004045317 +0000 UTC m=+1304.005633556" observedRunningTime="2026-03-13 14:17:49.446765762 +0000 UTC m=+1304.448354011" watchObservedRunningTime="2026-03-13 14:17:49.450599532 +0000 UTC m=+1304.452187771" Mar 13 14:17:49 crc kubenswrapper[4898]: I0313 
14:17:49.473047 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-8kcsw" podStartSLOduration=43.479212738 podStartE2EDuration="47.473021555s" podCreationTimestamp="2026-03-13 14:17:02 +0000 UTC" firstStartedPulling="2026-03-13 14:17:45.012846388 +0000 UTC m=+1300.014434627" lastFinishedPulling="2026-03-13 14:17:49.006655195 +0000 UTC m=+1304.008243444" observedRunningTime="2026-03-13 14:17:49.466629799 +0000 UTC m=+1304.468218058" watchObservedRunningTime="2026-03-13 14:17:49.473021555 +0000 UTC m=+1304.474609794" Mar 13 14:17:49 crc kubenswrapper[4898]: I0313 14:17:49.506002 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv" podStartSLOduration=43.425376398 podStartE2EDuration="47.505971752s" podCreationTimestamp="2026-03-13 14:17:02 +0000 UTC" firstStartedPulling="2026-03-13 14:17:44.92179555 +0000 UTC m=+1299.923383789" lastFinishedPulling="2026-03-13 14:17:49.002390894 +0000 UTC m=+1304.003979143" observedRunningTime="2026-03-13 14:17:49.495821278 +0000 UTC m=+1304.497409527" watchObservedRunningTime="2026-03-13 14:17:49.505971752 +0000 UTC m=+1304.507559991" Mar 13 14:17:51 crc kubenswrapper[4898]: I0313 14:17:51.450276 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-s5zh6" event={"ID":"d24bb749-0b71-456b-80e4-fdf6dd23ba30","Type":"ContainerStarted","Data":"0b4e7587bc426e2c72edd1a3cd1c6a6f006317555be494e20de07b4d792e8cb0"} Mar 13 14:17:51 crc kubenswrapper[4898]: I0313 14:17:51.451106 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-s5zh6" Mar 13 14:17:51 crc kubenswrapper[4898]: I0313 14:17:51.453506 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-f2t6t" event={"ID":"66a86c31-9ff3-439a-a0f8-96c981014b6f","Type":"ContainerStarted","Data":"88fb60edbde24ac17c310036220e101cce0c1eccdce27094156356847f85f3fc"} Mar 13 14:17:51 crc kubenswrapper[4898]: I0313 14:17:51.453723 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-f2t6t" Mar 13 14:17:51 crc kubenswrapper[4898]: I0313 14:17:51.455208 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-ntlw6" event={"ID":"d71982c0-a3d0-4da8-84cd-7494301f589f","Type":"ContainerStarted","Data":"0e4e20e0e7044647f870045b4f25b0cd67f173d8704e1fb7a86a40357a37793b"} Mar 13 14:17:51 crc kubenswrapper[4898]: I0313 14:17:51.455415 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-ntlw6" Mar 13 14:17:51 crc kubenswrapper[4898]: I0313 14:17:51.472894 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-s5zh6" podStartSLOduration=3.374802329 podStartE2EDuration="49.472875352s" podCreationTimestamp="2026-03-13 14:17:02 +0000 UTC" firstStartedPulling="2026-03-13 14:17:05.066081199 +0000 UTC m=+1260.067669438" lastFinishedPulling="2026-03-13 14:17:51.164154222 +0000 UTC m=+1306.165742461" observedRunningTime="2026-03-13 14:17:51.471713551 +0000 UTC m=+1306.473301800" watchObservedRunningTime="2026-03-13 14:17:51.472875352 +0000 UTC m=+1306.474463591" Mar 13 14:17:51 crc kubenswrapper[4898]: I0313 14:17:51.492316 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-ntlw6" podStartSLOduration=3.393455715 podStartE2EDuration="49.492297007s" podCreationTimestamp="2026-03-13 14:17:02 +0000 UTC" 
firstStartedPulling="2026-03-13 14:17:05.066407338 +0000 UTC m=+1260.067995577" lastFinishedPulling="2026-03-13 14:17:51.16524862 +0000 UTC m=+1306.166836869" observedRunningTime="2026-03-13 14:17:51.488370835 +0000 UTC m=+1306.489959094" watchObservedRunningTime="2026-03-13 14:17:51.492297007 +0000 UTC m=+1306.493885246" Mar 13 14:17:51 crc kubenswrapper[4898]: I0313 14:17:51.502131 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-f2t6t" podStartSLOduration=3.970020268 podStartE2EDuration="49.502109012s" podCreationTimestamp="2026-03-13 14:17:02 +0000 UTC" firstStartedPulling="2026-03-13 14:17:05.704705837 +0000 UTC m=+1260.706294076" lastFinishedPulling="2026-03-13 14:17:51.236794581 +0000 UTC m=+1306.238382820" observedRunningTime="2026-03-13 14:17:51.500704575 +0000 UTC m=+1306.502292824" watchObservedRunningTime="2026-03-13 14:17:51.502109012 +0000 UTC m=+1306.503697251" Mar 13 14:17:53 crc kubenswrapper[4898]: I0313 14:17:53.037309 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-d47688694-gtlps" Mar 13 14:17:53 crc kubenswrapper[4898]: I0313 14:17:53.038041 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-p9d5v" Mar 13 14:17:53 crc kubenswrapper[4898]: I0313 14:17:53.042850 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-4n5rx" Mar 13 14:17:53 crc kubenswrapper[4898]: I0313 14:17:53.100264 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-mf8h6" Mar 13 14:17:53 crc kubenswrapper[4898]: I0313 14:17:53.169442 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/heat-operator-controller-manager-77b6666d85-tqp4b" Mar 13 14:17:53 crc kubenswrapper[4898]: I0313 14:17:53.214748 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-jngrl" Mar 13 14:17:53 crc kubenswrapper[4898]: I0313 14:17:53.461670 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-2lnlc" Mar 13 14:17:53 crc kubenswrapper[4898]: I0313 14:17:53.497169 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-v99bm" Mar 13 14:17:53 crc kubenswrapper[4898]: I0313 14:17:53.539568 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-z2gd2" Mar 13 14:17:53 crc kubenswrapper[4898]: E0313 14:17:53.741676 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-jwrd2" podUID="919747b8-a031-4654-999f-3c3928f981b4" Mar 13 14:17:53 crc kubenswrapper[4898]: I0313 14:17:53.932464 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-wdmrh" Mar 13 14:17:54 crc kubenswrapper[4898]: I0313 14:17:54.005853 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-5b9fbd87f-s2k96" Mar 13 14:17:54 crc kubenswrapper[4898]: I0313 14:17:54.027668 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-smdkt" Mar 13 14:17:54 crc kubenswrapper[4898]: E0313 14:17:54.743146 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:18fe6f2f0be7e736db86ff2d600af12a753e14b0a03232ce4f03629a89905571\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-s2rdh" podUID="d29ce3ee-3d5a-4801-abf9-dfef5b641a74" Mar 13 14:17:55 crc kubenswrapper[4898]: I0313 14:17:55.102232 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-8kcsw" Mar 13 14:17:55 crc kubenswrapper[4898]: I0313 14:17:55.995117 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-5f7dc44db6-9nsrh" Mar 13 14:17:57 crc kubenswrapper[4898]: I0313 14:17:57.504119 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-7f84474648-mr4wv" event={"ID":"52959483-daae-423a-a3bf-8e3fa7810074","Type":"ContainerStarted","Data":"ce8ee2b018c365558145424246baad3faea16352a1e7f13624ea91bdda3e4f6f"} Mar 13 14:17:57 crc kubenswrapper[4898]: I0313 14:17:57.504598 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-7f84474648-mr4wv" Mar 13 14:17:57 crc kubenswrapper[4898]: I0313 14:17:57.524381 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-7f84474648-mr4wv" podStartSLOduration=3.974593632 podStartE2EDuration="55.524361969s" podCreationTimestamp="2026-03-13 14:17:02 +0000 UTC" firstStartedPulling="2026-03-13 14:17:05.638536501 +0000 UTC m=+1260.640124740" lastFinishedPulling="2026-03-13 14:17:57.188304838 
+0000 UTC m=+1312.189893077" observedRunningTime="2026-03-13 14:17:57.518837685 +0000 UTC m=+1312.520425964" watchObservedRunningTime="2026-03-13 14:17:57.524361969 +0000 UTC m=+1312.525950208" Mar 13 14:17:57 crc kubenswrapper[4898]: E0313 14:17:57.742077 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-82gtc" podUID="7b9c0413-5558-43c4-805b-7f035fded9b4" Mar 13 14:17:59 crc kubenswrapper[4898]: I0313 14:17:59.537267 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv" Mar 13 14:18:00 crc kubenswrapper[4898]: I0313 14:18:00.140838 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556858-vdbkv"] Mar 13 14:18:00 crc kubenswrapper[4898]: I0313 14:18:00.142214 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556858-vdbkv" Mar 13 14:18:00 crc kubenswrapper[4898]: I0313 14:18:00.145575 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 14:18:00 crc kubenswrapper[4898]: I0313 14:18:00.146015 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 14:18:00 crc kubenswrapper[4898]: I0313 14:18:00.146200 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps" Mar 13 14:18:00 crc kubenswrapper[4898]: I0313 14:18:00.151099 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556858-vdbkv"] Mar 13 14:18:00 crc kubenswrapper[4898]: I0313 14:18:00.207119 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgkbn\" (UniqueName: \"kubernetes.io/projected/1b0610af-1f13-4f43-9249-8d50a0dcbc14-kube-api-access-qgkbn\") pod \"auto-csr-approver-29556858-vdbkv\" (UID: \"1b0610af-1f13-4f43-9249-8d50a0dcbc14\") " pod="openshift-infra/auto-csr-approver-29556858-vdbkv" Mar 13 14:18:00 crc kubenswrapper[4898]: I0313 14:18:00.308163 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgkbn\" (UniqueName: \"kubernetes.io/projected/1b0610af-1f13-4f43-9249-8d50a0dcbc14-kube-api-access-qgkbn\") pod \"auto-csr-approver-29556858-vdbkv\" (UID: \"1b0610af-1f13-4f43-9249-8d50a0dcbc14\") " pod="openshift-infra/auto-csr-approver-29556858-vdbkv" Mar 13 14:18:00 crc kubenswrapper[4898]: I0313 14:18:00.336182 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgkbn\" (UniqueName: \"kubernetes.io/projected/1b0610af-1f13-4f43-9249-8d50a0dcbc14-kube-api-access-qgkbn\") pod \"auto-csr-approver-29556858-vdbkv\" (UID: \"1b0610af-1f13-4f43-9249-8d50a0dcbc14\") " 
pod="openshift-infra/auto-csr-approver-29556858-vdbkv" Mar 13 14:18:00 crc kubenswrapper[4898]: I0313 14:18:00.461882 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556858-vdbkv" Mar 13 14:18:00 crc kubenswrapper[4898]: I0313 14:18:00.920672 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556858-vdbkv"] Mar 13 14:18:01 crc kubenswrapper[4898]: I0313 14:18:01.546845 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556858-vdbkv" event={"ID":"1b0610af-1f13-4f43-9249-8d50a0dcbc14","Type":"ContainerStarted","Data":"265ce71be2ddaa7e76a9da2460c089ad22f9323bae84430d6127920e01c0a1f4"} Mar 13 14:18:02 crc kubenswrapper[4898]: I0313 14:18:02.555183 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556858-vdbkv" event={"ID":"1b0610af-1f13-4f43-9249-8d50a0dcbc14","Type":"ContainerStarted","Data":"c8c6599b57d68b7830c9784f9ac2322559fa7b500359ca53fa26e39b23292ec4"} Mar 13 14:18:02 crc kubenswrapper[4898]: I0313 14:18:02.567550 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556858-vdbkv" podStartSLOduration=1.333563935 podStartE2EDuration="2.567529161s" podCreationTimestamp="2026-03-13 14:18:00 +0000 UTC" firstStartedPulling="2026-03-13 14:18:00.92582374 +0000 UTC m=+1315.927411979" lastFinishedPulling="2026-03-13 14:18:02.159788956 +0000 UTC m=+1317.161377205" observedRunningTime="2026-03-13 14:18:02.565790246 +0000 UTC m=+1317.567378485" watchObservedRunningTime="2026-03-13 14:18:02.567529161 +0000 UTC m=+1317.569117400" Mar 13 14:18:03 crc kubenswrapper[4898]: I0313 14:18:03.507829 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-ntlw6" Mar 13 14:18:03 crc kubenswrapper[4898]: I0313 14:18:03.546061 4898 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-s5zh6" Mar 13 14:18:03 crc kubenswrapper[4898]: I0313 14:18:03.553727 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-7f84474648-mr4wv" Mar 13 14:18:03 crc kubenswrapper[4898]: I0313 14:18:03.566999 4898 generic.go:334] "Generic (PLEG): container finished" podID="1b0610af-1f13-4f43-9249-8d50a0dcbc14" containerID="c8c6599b57d68b7830c9784f9ac2322559fa7b500359ca53fa26e39b23292ec4" exitCode=0 Mar 13 14:18:03 crc kubenswrapper[4898]: I0313 14:18:03.567043 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556858-vdbkv" event={"ID":"1b0610af-1f13-4f43-9249-8d50a0dcbc14","Type":"ContainerDied","Data":"c8c6599b57d68b7830c9784f9ac2322559fa7b500359ca53fa26e39b23292ec4"} Mar 13 14:18:03 crc kubenswrapper[4898]: I0313 14:18:03.956694 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-njsvh" Mar 13 14:18:03 crc kubenswrapper[4898]: I0313 14:18:03.982360 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-f2t6t" Mar 13 14:18:04 crc kubenswrapper[4898]: I0313 14:18:04.911158 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556858-vdbkv" Mar 13 14:18:05 crc kubenswrapper[4898]: I0313 14:18:05.094128 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qgkbn\" (UniqueName: \"kubernetes.io/projected/1b0610af-1f13-4f43-9249-8d50a0dcbc14-kube-api-access-qgkbn\") pod \"1b0610af-1f13-4f43-9249-8d50a0dcbc14\" (UID: \"1b0610af-1f13-4f43-9249-8d50a0dcbc14\") " Mar 13 14:18:05 crc kubenswrapper[4898]: I0313 14:18:05.100309 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b0610af-1f13-4f43-9249-8d50a0dcbc14-kube-api-access-qgkbn" (OuterVolumeSpecName: "kube-api-access-qgkbn") pod "1b0610af-1f13-4f43-9249-8d50a0dcbc14" (UID: "1b0610af-1f13-4f43-9249-8d50a0dcbc14"). InnerVolumeSpecName "kube-api-access-qgkbn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:18:05 crc kubenswrapper[4898]: I0313 14:18:05.202075 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qgkbn\" (UniqueName: \"kubernetes.io/projected/1b0610af-1f13-4f43-9249-8d50a0dcbc14-kube-api-access-qgkbn\") on node \"crc\" DevicePath \"\"" Mar 13 14:18:05 crc kubenswrapper[4898]: I0313 14:18:05.584705 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556858-vdbkv" event={"ID":"1b0610af-1f13-4f43-9249-8d50a0dcbc14","Type":"ContainerDied","Data":"265ce71be2ddaa7e76a9da2460c089ad22f9323bae84430d6127920e01c0a1f4"} Mar 13 14:18:05 crc kubenswrapper[4898]: I0313 14:18:05.584757 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="265ce71be2ddaa7e76a9da2460c089ad22f9323bae84430d6127920e01c0a1f4" Mar 13 14:18:05 crc kubenswrapper[4898]: I0313 14:18:05.584786 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556858-vdbkv" Mar 13 14:18:05 crc kubenswrapper[4898]: I0313 14:18:05.636961 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556852-z8l5q"] Mar 13 14:18:05 crc kubenswrapper[4898]: I0313 14:18:05.644371 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556852-z8l5q"] Mar 13 14:18:05 crc kubenswrapper[4898]: I0313 14:18:05.751470 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da054881-deef-4491-9685-5f35ee9fc45f" path="/var/lib/kubelet/pods/da054881-deef-4491-9685-5f35ee9fc45f/volumes" Mar 13 14:18:07 crc kubenswrapper[4898]: I0313 14:18:07.602417 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-jwrd2" event={"ID":"919747b8-a031-4654-999f-3c3928f981b4","Type":"ContainerStarted","Data":"2a63b3120bf0b6fde648902ea96aff3bdb84041187c2d2f323a87c8815000e30"} Mar 13 14:18:07 crc kubenswrapper[4898]: I0313 14:18:07.603069 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-jwrd2" Mar 13 14:18:07 crc kubenswrapper[4898]: I0313 14:18:07.619728 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-jwrd2" podStartSLOduration=3.099097083 podStartE2EDuration="1m4.619706527s" podCreationTimestamp="2026-03-13 14:17:03 +0000 UTC" firstStartedPulling="2026-03-13 14:17:05.665048103 +0000 UTC m=+1260.666636332" lastFinishedPulling="2026-03-13 14:18:07.185657537 +0000 UTC m=+1322.187245776" observedRunningTime="2026-03-13 14:18:07.615508968 +0000 UTC m=+1322.617097247" watchObservedRunningTime="2026-03-13 14:18:07.619706527 +0000 UTC m=+1322.621294766" Mar 13 14:18:11 crc kubenswrapper[4898]: I0313 14:18:11.650395 4898 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-s2rdh" event={"ID":"d29ce3ee-3d5a-4801-abf9-dfef5b641a74","Type":"ContainerStarted","Data":"f59d39668fd5f01a9f140fcfa1fd92ce8c2ea3f200013f1b1e08b5b0e679790d"} Mar 13 14:18:11 crc kubenswrapper[4898]: I0313 14:18:11.652025 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-s2rdh" Mar 13 14:18:11 crc kubenswrapper[4898]: I0313 14:18:11.675223 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-s2rdh" podStartSLOduration=4.816303157 podStartE2EDuration="1m9.67520369s" podCreationTimestamp="2026-03-13 14:17:02 +0000 UTC" firstStartedPulling="2026-03-13 14:17:05.66877209 +0000 UTC m=+1260.670360329" lastFinishedPulling="2026-03-13 14:18:10.527672623 +0000 UTC m=+1325.529260862" observedRunningTime="2026-03-13 14:18:11.668086105 +0000 UTC m=+1326.669674354" watchObservedRunningTime="2026-03-13 14:18:11.67520369 +0000 UTC m=+1326.676791929" Mar 13 14:18:12 crc kubenswrapper[4898]: I0313 14:18:12.669101 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-82gtc" event={"ID":"7b9c0413-5558-43c4-805b-7f035fded9b4","Type":"ContainerStarted","Data":"e5c2d747b8dc9a1a0b04a2711ae93e1c3479dbb6eb71a1261f593abef3d31048"} Mar 13 14:18:12 crc kubenswrapper[4898]: I0313 14:18:12.687191 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-82gtc" podStartSLOduration=3.701077665 podStartE2EDuration="1m9.687169771s" podCreationTimestamp="2026-03-13 14:17:03 +0000 UTC" firstStartedPulling="2026-03-13 14:17:05.756015854 +0000 UTC m=+1260.757604093" lastFinishedPulling="2026-03-13 14:18:11.74210796 +0000 UTC m=+1326.743696199" observedRunningTime="2026-03-13 
14:18:12.681804622 +0000 UTC m=+1327.683392871" watchObservedRunningTime="2026-03-13 14:18:12.687169771 +0000 UTC m=+1327.688758020" Mar 13 14:18:14 crc kubenswrapper[4898]: I0313 14:18:14.091380 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-jwrd2" Mar 13 14:18:19 crc kubenswrapper[4898]: I0313 14:18:19.134138 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 14:18:19 crc kubenswrapper[4898]: I0313 14:18:19.134736 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 14:18:23 crc kubenswrapper[4898]: I0313 14:18:23.884201 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-s2rdh" Mar 13 14:18:32 crc kubenswrapper[4898]: I0313 14:18:32.915589 4898 scope.go:117] "RemoveContainer" containerID="5010d4732869bc8d7e0532b7c193085ea8336efd3a5d0f8cd686f3caf758e4d9" Mar 13 14:18:42 crc kubenswrapper[4898]: I0313 14:18:42.399523 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-npnz4"] Mar 13 14:18:42 crc kubenswrapper[4898]: E0313 14:18:42.400535 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b0610af-1f13-4f43-9249-8d50a0dcbc14" containerName="oc" Mar 13 14:18:42 crc kubenswrapper[4898]: I0313 14:18:42.400549 4898 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1b0610af-1f13-4f43-9249-8d50a0dcbc14" containerName="oc" Mar 13 14:18:42 crc kubenswrapper[4898]: I0313 14:18:42.400782 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b0610af-1f13-4f43-9249-8d50a0dcbc14" containerName="oc" Mar 13 14:18:42 crc kubenswrapper[4898]: I0313 14:18:42.401856 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-npnz4" Mar 13 14:18:42 crc kubenswrapper[4898]: I0313 14:18:42.404261 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-wwqwh" Mar 13 14:18:42 crc kubenswrapper[4898]: I0313 14:18:42.410295 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 13 14:18:42 crc kubenswrapper[4898]: I0313 14:18:42.410401 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 13 14:18:42 crc kubenswrapper[4898]: I0313 14:18:42.410641 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 13 14:18:42 crc kubenswrapper[4898]: I0313 14:18:42.411890 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5005de8-b440-45e8-a1a7-7943f68bff2f-config\") pod \"dnsmasq-dns-675f4bcbfc-npnz4\" (UID: \"b5005de8-b440-45e8-a1a7-7943f68bff2f\") " pod="openstack/dnsmasq-dns-675f4bcbfc-npnz4" Mar 13 14:18:42 crc kubenswrapper[4898]: I0313 14:18:42.412012 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glpmk\" (UniqueName: \"kubernetes.io/projected/b5005de8-b440-45e8-a1a7-7943f68bff2f-kube-api-access-glpmk\") pod \"dnsmasq-dns-675f4bcbfc-npnz4\" (UID: \"b5005de8-b440-45e8-a1a7-7943f68bff2f\") " pod="openstack/dnsmasq-dns-675f4bcbfc-npnz4" Mar 13 14:18:42 crc kubenswrapper[4898]: I0313 14:18:42.424759 4898 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-npnz4"] Mar 13 14:18:42 crc kubenswrapper[4898]: I0313 14:18:42.465149 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-5jhct"] Mar 13 14:18:42 crc kubenswrapper[4898]: I0313 14:18:42.466921 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-5jhct" Mar 13 14:18:42 crc kubenswrapper[4898]: I0313 14:18:42.469413 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 13 14:18:42 crc kubenswrapper[4898]: I0313 14:18:42.479968 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-5jhct"] Mar 13 14:18:42 crc kubenswrapper[4898]: I0313 14:18:42.513619 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glpmk\" (UniqueName: \"kubernetes.io/projected/b5005de8-b440-45e8-a1a7-7943f68bff2f-kube-api-access-glpmk\") pod \"dnsmasq-dns-675f4bcbfc-npnz4\" (UID: \"b5005de8-b440-45e8-a1a7-7943f68bff2f\") " pod="openstack/dnsmasq-dns-675f4bcbfc-npnz4" Mar 13 14:18:42 crc kubenswrapper[4898]: I0313 14:18:42.513790 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5005de8-b440-45e8-a1a7-7943f68bff2f-config\") pod \"dnsmasq-dns-675f4bcbfc-npnz4\" (UID: \"b5005de8-b440-45e8-a1a7-7943f68bff2f\") " pod="openstack/dnsmasq-dns-675f4bcbfc-npnz4" Mar 13 14:18:42 crc kubenswrapper[4898]: I0313 14:18:42.514836 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5005de8-b440-45e8-a1a7-7943f68bff2f-config\") pod \"dnsmasq-dns-675f4bcbfc-npnz4\" (UID: \"b5005de8-b440-45e8-a1a7-7943f68bff2f\") " pod="openstack/dnsmasq-dns-675f4bcbfc-npnz4" Mar 13 14:18:42 crc kubenswrapper[4898]: I0313 14:18:42.550792 4898 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-glpmk\" (UniqueName: \"kubernetes.io/projected/b5005de8-b440-45e8-a1a7-7943f68bff2f-kube-api-access-glpmk\") pod \"dnsmasq-dns-675f4bcbfc-npnz4\" (UID: \"b5005de8-b440-45e8-a1a7-7943f68bff2f\") " pod="openstack/dnsmasq-dns-675f4bcbfc-npnz4" Mar 13 14:18:42 crc kubenswrapper[4898]: I0313 14:18:42.616814 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70dc5baf-6ae1-41b4-9454-8ff891570f8b-config\") pod \"dnsmasq-dns-78dd6ddcc-5jhct\" (UID: \"70dc5baf-6ae1-41b4-9454-8ff891570f8b\") " pod="openstack/dnsmasq-dns-78dd6ddcc-5jhct" Mar 13 14:18:42 crc kubenswrapper[4898]: I0313 14:18:42.616881 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6xth\" (UniqueName: \"kubernetes.io/projected/70dc5baf-6ae1-41b4-9454-8ff891570f8b-kube-api-access-h6xth\") pod \"dnsmasq-dns-78dd6ddcc-5jhct\" (UID: \"70dc5baf-6ae1-41b4-9454-8ff891570f8b\") " pod="openstack/dnsmasq-dns-78dd6ddcc-5jhct" Mar 13 14:18:42 crc kubenswrapper[4898]: I0313 14:18:42.616954 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/70dc5baf-6ae1-41b4-9454-8ff891570f8b-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-5jhct\" (UID: \"70dc5baf-6ae1-41b4-9454-8ff891570f8b\") " pod="openstack/dnsmasq-dns-78dd6ddcc-5jhct" Mar 13 14:18:42 crc kubenswrapper[4898]: I0313 14:18:42.718609 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70dc5baf-6ae1-41b4-9454-8ff891570f8b-config\") pod \"dnsmasq-dns-78dd6ddcc-5jhct\" (UID: \"70dc5baf-6ae1-41b4-9454-8ff891570f8b\") " pod="openstack/dnsmasq-dns-78dd6ddcc-5jhct" Mar 13 14:18:42 crc kubenswrapper[4898]: I0313 14:18:42.718917 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-h6xth\" (UniqueName: \"kubernetes.io/projected/70dc5baf-6ae1-41b4-9454-8ff891570f8b-kube-api-access-h6xth\") pod \"dnsmasq-dns-78dd6ddcc-5jhct\" (UID: \"70dc5baf-6ae1-41b4-9454-8ff891570f8b\") " pod="openstack/dnsmasq-dns-78dd6ddcc-5jhct" Mar 13 14:18:42 crc kubenswrapper[4898]: I0313 14:18:42.719048 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/70dc5baf-6ae1-41b4-9454-8ff891570f8b-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-5jhct\" (UID: \"70dc5baf-6ae1-41b4-9454-8ff891570f8b\") " pod="openstack/dnsmasq-dns-78dd6ddcc-5jhct" Mar 13 14:18:42 crc kubenswrapper[4898]: I0313 14:18:42.719527 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70dc5baf-6ae1-41b4-9454-8ff891570f8b-config\") pod \"dnsmasq-dns-78dd6ddcc-5jhct\" (UID: \"70dc5baf-6ae1-41b4-9454-8ff891570f8b\") " pod="openstack/dnsmasq-dns-78dd6ddcc-5jhct" Mar 13 14:18:42 crc kubenswrapper[4898]: I0313 14:18:42.720443 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/70dc5baf-6ae1-41b4-9454-8ff891570f8b-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-5jhct\" (UID: \"70dc5baf-6ae1-41b4-9454-8ff891570f8b\") " pod="openstack/dnsmasq-dns-78dd6ddcc-5jhct" Mar 13 14:18:42 crc kubenswrapper[4898]: I0313 14:18:42.725755 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-npnz4" Mar 13 14:18:42 crc kubenswrapper[4898]: I0313 14:18:42.737566 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6xth\" (UniqueName: \"kubernetes.io/projected/70dc5baf-6ae1-41b4-9454-8ff891570f8b-kube-api-access-h6xth\") pod \"dnsmasq-dns-78dd6ddcc-5jhct\" (UID: \"70dc5baf-6ae1-41b4-9454-8ff891570f8b\") " pod="openstack/dnsmasq-dns-78dd6ddcc-5jhct" Mar 13 14:18:42 crc kubenswrapper[4898]: I0313 14:18:42.797770 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-5jhct" Mar 13 14:18:43 crc kubenswrapper[4898]: I0313 14:18:43.212548 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-npnz4"] Mar 13 14:18:43 crc kubenswrapper[4898]: W0313 14:18:43.214543 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5005de8_b440_45e8_a1a7_7943f68bff2f.slice/crio-64fa491215cb47b1ae2d24aaa439a281949404c6099493e0af3976068bb840f1 WatchSource:0}: Error finding container 64fa491215cb47b1ae2d24aaa439a281949404c6099493e0af3976068bb840f1: Status 404 returned error can't find the container with id 64fa491215cb47b1ae2d24aaa439a281949404c6099493e0af3976068bb840f1 Mar 13 14:18:43 crc kubenswrapper[4898]: I0313 14:18:43.311037 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-5jhct"] Mar 13 14:18:43 crc kubenswrapper[4898]: W0313 14:18:43.311613 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod70dc5baf_6ae1_41b4_9454_8ff891570f8b.slice/crio-41cc016dc80d4b785612e7bb953345c49daec5b78f73d536749225e85b77217d WatchSource:0}: Error finding container 41cc016dc80d4b785612e7bb953345c49daec5b78f73d536749225e85b77217d: Status 404 returned error can't find the container with id 
41cc016dc80d4b785612e7bb953345c49daec5b78f73d536749225e85b77217d Mar 13 14:18:43 crc kubenswrapper[4898]: I0313 14:18:43.987287 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-npnz4" event={"ID":"b5005de8-b440-45e8-a1a7-7943f68bff2f","Type":"ContainerStarted","Data":"64fa491215cb47b1ae2d24aaa439a281949404c6099493e0af3976068bb840f1"} Mar 13 14:18:43 crc kubenswrapper[4898]: I0313 14:18:43.989015 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-5jhct" event={"ID":"70dc5baf-6ae1-41b4-9454-8ff891570f8b","Type":"ContainerStarted","Data":"41cc016dc80d4b785612e7bb953345c49daec5b78f73d536749225e85b77217d"} Mar 13 14:18:45 crc kubenswrapper[4898]: I0313 14:18:45.306275 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-npnz4"] Mar 13 14:18:45 crc kubenswrapper[4898]: I0313 14:18:45.355025 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-nmlp6"] Mar 13 14:18:45 crc kubenswrapper[4898]: I0313 14:18:45.356597 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-nmlp6" Mar 13 14:18:45 crc kubenswrapper[4898]: I0313 14:18:45.375480 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-nmlp6"] Mar 13 14:18:45 crc kubenswrapper[4898]: I0313 14:18:45.385104 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5ll2\" (UniqueName: \"kubernetes.io/projected/c17db307-7a8a-4585-9696-a9ef96b6ba0b-kube-api-access-v5ll2\") pod \"dnsmasq-dns-5ccc8479f9-nmlp6\" (UID: \"c17db307-7a8a-4585-9696-a9ef96b6ba0b\") " pod="openstack/dnsmasq-dns-5ccc8479f9-nmlp6" Mar 13 14:18:45 crc kubenswrapper[4898]: I0313 14:18:45.385184 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c17db307-7a8a-4585-9696-a9ef96b6ba0b-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-nmlp6\" (UID: \"c17db307-7a8a-4585-9696-a9ef96b6ba0b\") " pod="openstack/dnsmasq-dns-5ccc8479f9-nmlp6" Mar 13 14:18:45 crc kubenswrapper[4898]: I0313 14:18:45.385336 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c17db307-7a8a-4585-9696-a9ef96b6ba0b-config\") pod \"dnsmasq-dns-5ccc8479f9-nmlp6\" (UID: \"c17db307-7a8a-4585-9696-a9ef96b6ba0b\") " pod="openstack/dnsmasq-dns-5ccc8479f9-nmlp6" Mar 13 14:18:45 crc kubenswrapper[4898]: I0313 14:18:45.487706 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5ll2\" (UniqueName: \"kubernetes.io/projected/c17db307-7a8a-4585-9696-a9ef96b6ba0b-kube-api-access-v5ll2\") pod \"dnsmasq-dns-5ccc8479f9-nmlp6\" (UID: \"c17db307-7a8a-4585-9696-a9ef96b6ba0b\") " pod="openstack/dnsmasq-dns-5ccc8479f9-nmlp6" Mar 13 14:18:45 crc kubenswrapper[4898]: I0313 14:18:45.487775 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c17db307-7a8a-4585-9696-a9ef96b6ba0b-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-nmlp6\" (UID: \"c17db307-7a8a-4585-9696-a9ef96b6ba0b\") " pod="openstack/dnsmasq-dns-5ccc8479f9-nmlp6" Mar 13 14:18:45 crc kubenswrapper[4898]: I0313 14:18:45.487808 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c17db307-7a8a-4585-9696-a9ef96b6ba0b-config\") pod \"dnsmasq-dns-5ccc8479f9-nmlp6\" (UID: \"c17db307-7a8a-4585-9696-a9ef96b6ba0b\") " pod="openstack/dnsmasq-dns-5ccc8479f9-nmlp6" Mar 13 14:18:45 crc kubenswrapper[4898]: I0313 14:18:45.488889 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c17db307-7a8a-4585-9696-a9ef96b6ba0b-config\") pod \"dnsmasq-dns-5ccc8479f9-nmlp6\" (UID: \"c17db307-7a8a-4585-9696-a9ef96b6ba0b\") " pod="openstack/dnsmasq-dns-5ccc8479f9-nmlp6" Mar 13 14:18:45 crc kubenswrapper[4898]: I0313 14:18:45.488929 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c17db307-7a8a-4585-9696-a9ef96b6ba0b-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-nmlp6\" (UID: \"c17db307-7a8a-4585-9696-a9ef96b6ba0b\") " pod="openstack/dnsmasq-dns-5ccc8479f9-nmlp6" Mar 13 14:18:45 crc kubenswrapper[4898]: I0313 14:18:45.512623 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5ll2\" (UniqueName: \"kubernetes.io/projected/c17db307-7a8a-4585-9696-a9ef96b6ba0b-kube-api-access-v5ll2\") pod \"dnsmasq-dns-5ccc8479f9-nmlp6\" (UID: \"c17db307-7a8a-4585-9696-a9ef96b6ba0b\") " pod="openstack/dnsmasq-dns-5ccc8479f9-nmlp6" Mar 13 14:18:45 crc kubenswrapper[4898]: I0313 14:18:45.664301 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-5jhct"] Mar 13 14:18:45 crc kubenswrapper[4898]: I0313 14:18:45.683911 4898 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-nmlp6" Mar 13 14:18:45 crc kubenswrapper[4898]: I0313 14:18:45.702007 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-hk9w4"] Mar 13 14:18:45 crc kubenswrapper[4898]: I0313 14:18:45.703667 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-hk9w4" Mar 13 14:18:45 crc kubenswrapper[4898]: I0313 14:18:45.708839 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-hk9w4"] Mar 13 14:18:45 crc kubenswrapper[4898]: I0313 14:18:45.793702 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e544d1f-357e-4751-88bb-5108430b52cb-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-hk9w4\" (UID: \"9e544d1f-357e-4751-88bb-5108430b52cb\") " pod="openstack/dnsmasq-dns-57d769cc4f-hk9w4" Mar 13 14:18:45 crc kubenswrapper[4898]: I0313 14:18:45.793753 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e544d1f-357e-4751-88bb-5108430b52cb-config\") pod \"dnsmasq-dns-57d769cc4f-hk9w4\" (UID: \"9e544d1f-357e-4751-88bb-5108430b52cb\") " pod="openstack/dnsmasq-dns-57d769cc4f-hk9w4" Mar 13 14:18:45 crc kubenswrapper[4898]: I0313 14:18:45.793919 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sn79l\" (UniqueName: \"kubernetes.io/projected/9e544d1f-357e-4751-88bb-5108430b52cb-kube-api-access-sn79l\") pod \"dnsmasq-dns-57d769cc4f-hk9w4\" (UID: \"9e544d1f-357e-4751-88bb-5108430b52cb\") " pod="openstack/dnsmasq-dns-57d769cc4f-hk9w4" Mar 13 14:18:45 crc kubenswrapper[4898]: I0313 14:18:45.895525 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sn79l\" (UniqueName: 
\"kubernetes.io/projected/9e544d1f-357e-4751-88bb-5108430b52cb-kube-api-access-sn79l\") pod \"dnsmasq-dns-57d769cc4f-hk9w4\" (UID: \"9e544d1f-357e-4751-88bb-5108430b52cb\") " pod="openstack/dnsmasq-dns-57d769cc4f-hk9w4" Mar 13 14:18:45 crc kubenswrapper[4898]: I0313 14:18:45.895590 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e544d1f-357e-4751-88bb-5108430b52cb-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-hk9w4\" (UID: \"9e544d1f-357e-4751-88bb-5108430b52cb\") " pod="openstack/dnsmasq-dns-57d769cc4f-hk9w4" Mar 13 14:18:45 crc kubenswrapper[4898]: I0313 14:18:45.895623 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e544d1f-357e-4751-88bb-5108430b52cb-config\") pod \"dnsmasq-dns-57d769cc4f-hk9w4\" (UID: \"9e544d1f-357e-4751-88bb-5108430b52cb\") " pod="openstack/dnsmasq-dns-57d769cc4f-hk9w4" Mar 13 14:18:45 crc kubenswrapper[4898]: I0313 14:18:45.897181 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e544d1f-357e-4751-88bb-5108430b52cb-config\") pod \"dnsmasq-dns-57d769cc4f-hk9w4\" (UID: \"9e544d1f-357e-4751-88bb-5108430b52cb\") " pod="openstack/dnsmasq-dns-57d769cc4f-hk9w4" Mar 13 14:18:45 crc kubenswrapper[4898]: I0313 14:18:45.897268 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e544d1f-357e-4751-88bb-5108430b52cb-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-hk9w4\" (UID: \"9e544d1f-357e-4751-88bb-5108430b52cb\") " pod="openstack/dnsmasq-dns-57d769cc4f-hk9w4" Mar 13 14:18:45 crc kubenswrapper[4898]: I0313 14:18:45.915476 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sn79l\" (UniqueName: \"kubernetes.io/projected/9e544d1f-357e-4751-88bb-5108430b52cb-kube-api-access-sn79l\") pod \"dnsmasq-dns-57d769cc4f-hk9w4\" 
(UID: \"9e544d1f-357e-4751-88bb-5108430b52cb\") " pod="openstack/dnsmasq-dns-57d769cc4f-hk9w4" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.167478 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-hk9w4" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.347991 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-nmlp6"] Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.527746 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.529384 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.532105 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.532357 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-4m6nk" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.532497 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.533003 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.534125 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.534588 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.534619 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 
13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.539500 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.608757 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d56bd826-4f42-409d-ae41-9bfc70d1e038-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.608806 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d56bd826-4f42-409d-ae41-9bfc70d1e038-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.608825 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d56bd826-4f42-409d-ae41-9bfc70d1e038-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.608872 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d56bd826-4f42-409d-ae41-9bfc70d1e038-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.608890 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/d56bd826-4f42-409d-ae41-9bfc70d1e038-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.608930 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d56bd826-4f42-409d-ae41-9bfc70d1e038-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.608959 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d56bd826-4f42-409d-ae41-9bfc70d1e038-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.608986 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-e3ef05d8-9def-4ed4-8424-3de1bece7b2d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e3ef05d8-9def-4ed4-8424-3de1bece7b2d\") pod \"rabbitmq-cell1-server-0\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.609022 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d56bd826-4f42-409d-ae41-9bfc70d1e038-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.609039 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d56bd826-4f42-409d-ae41-9bfc70d1e038-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.609071 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xk5mq\" (UniqueName: \"kubernetes.io/projected/d56bd826-4f42-409d-ae41-9bfc70d1e038-kube-api-access-xk5mq\") pod \"rabbitmq-cell1-server-0\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.681799 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-hk9w4"] Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.710599 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d56bd826-4f42-409d-ae41-9bfc70d1e038-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.710658 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d56bd826-4f42-409d-ae41-9bfc70d1e038-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.710734 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xk5mq\" (UniqueName: \"kubernetes.io/projected/d56bd826-4f42-409d-ae41-9bfc70d1e038-kube-api-access-xk5mq\") pod \"rabbitmq-cell1-server-0\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:18:46 crc 
kubenswrapper[4898]: I0313 14:18:46.710766 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d56bd826-4f42-409d-ae41-9bfc70d1e038-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.710805 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d56bd826-4f42-409d-ae41-9bfc70d1e038-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.710831 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d56bd826-4f42-409d-ae41-9bfc70d1e038-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.710911 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d56bd826-4f42-409d-ae41-9bfc70d1e038-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.710941 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d56bd826-4f42-409d-ae41-9bfc70d1e038-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.710976 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d56bd826-4f42-409d-ae41-9bfc70d1e038-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.711009 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d56bd826-4f42-409d-ae41-9bfc70d1e038-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.711491 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d56bd826-4f42-409d-ae41-9bfc70d1e038-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.711650 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-e3ef05d8-9def-4ed4-8424-3de1bece7b2d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e3ef05d8-9def-4ed4-8424-3de1bece7b2d\") pod \"rabbitmq-cell1-server-0\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.711721 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d56bd826-4f42-409d-ae41-9bfc70d1e038-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.712271 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" 
(UniqueName: \"kubernetes.io/configmap/d56bd826-4f42-409d-ae41-9bfc70d1e038-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.713006 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d56bd826-4f42-409d-ae41-9bfc70d1e038-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.713022 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d56bd826-4f42-409d-ae41-9bfc70d1e038-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.715556 4898 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.715580 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-e3ef05d8-9def-4ed4-8424-3de1bece7b2d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e3ef05d8-9def-4ed4-8424-3de1bece7b2d\") pod \"rabbitmq-cell1-server-0\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/3111f327615e010747f22a13f9378eff3b7d96c403da97ea4361402b1c85d196/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.716040 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d56bd826-4f42-409d-ae41-9bfc70d1e038-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.716064 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d56bd826-4f42-409d-ae41-9bfc70d1e038-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.717459 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d56bd826-4f42-409d-ae41-9bfc70d1e038-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.719583 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d56bd826-4f42-409d-ae41-9bfc70d1e038-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.732984 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xk5mq\" (UniqueName: \"kubernetes.io/projected/d56bd826-4f42-409d-ae41-9bfc70d1e038-kube-api-access-xk5mq\") pod \"rabbitmq-cell1-server-0\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.765717 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-e3ef05d8-9def-4ed4-8424-3de1bece7b2d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e3ef05d8-9def-4ed4-8424-3de1bece7b2d\") pod \"rabbitmq-cell1-server-0\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.790673 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.794726 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.797948 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-tpvjl" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.799859 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.800002 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.800106 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.800213 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.800411 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.800540 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.833009 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.842463 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-2"] Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.844086 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-2" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.857742 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-1"] Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.859539 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-1" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.882438 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-2"] Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.890460 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.893989 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-1"] Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.914298 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.914352 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.914398 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8xbw\" (UniqueName: \"kubernetes.io/projected/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-kube-api-access-q8xbw\") pod \"rabbitmq-server-0\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.914436 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-plugins-conf\") pod 
\"rabbitmq-server-0\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.914485 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-config-data\") pod \"rabbitmq-server-0\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.914515 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.914549 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-96b86561-77fa-478a-bf61-f7beca9d80fe\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-96b86561-77fa-478a-bf61-f7beca9d80fe\") pod \"rabbitmq-server-0\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.914566 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.914582 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-erlang-cookie-secret\") pod 
\"rabbitmq-server-0\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.914597 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.914631 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.016197 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.016452 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " pod="openstack/rabbitmq-server-2" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.016478 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") " pod="openstack/rabbitmq-server-0" Mar 13 
14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.016503 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " pod="openstack/rabbitmq-server-2" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.016521 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/818e3f41-30c4-4a49-b490-0d868fc2b2b8-server-conf\") pod \"rabbitmq-server-1\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " pod="openstack/rabbitmq-server-1" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.016540 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8xbw\" (UniqueName: \"kubernetes.io/projected/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-kube-api-access-q8xbw\") pod \"rabbitmq-server-0\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.016564 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.016594 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/818e3f41-30c4-4a49-b490-0d868fc2b2b8-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " pod="openstack/rabbitmq-server-1" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.016626 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-config-data\") pod \"rabbitmq-server-0\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.016648 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/818e3f41-30c4-4a49-b490-0d868fc2b2b8-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " pod="openstack/rabbitmq-server-1" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.016665 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " pod="openstack/rabbitmq-server-2" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.016682 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " pod="openstack/rabbitmq-server-2" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.016701 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/818e3f41-30c4-4a49-b490-0d868fc2b2b8-pod-info\") pod \"rabbitmq-server-1\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " pod="openstack/rabbitmq-server-1" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.016719 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"pvc-10c025d7-e381-4716-bf38-98f5cf86aede\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10c025d7-e381-4716-bf38-98f5cf86aede\") pod \"rabbitmq-server-2\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " pod="openstack/rabbitmq-server-2" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.016738 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.016761 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/818e3f41-30c4-4a49-b490-0d868fc2b2b8-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " pod="openstack/rabbitmq-server-1" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.016784 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/818e3f41-30c4-4a49-b490-0d868fc2b2b8-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " pod="openstack/rabbitmq-server-1" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.016803 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a6f1f63f-28ac-4fb1-bd87-ea037a28d6cd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6f1f63f-28ac-4fb1-bd87-ea037a28d6cd\") pod \"rabbitmq-server-1\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " pod="openstack/rabbitmq-server-1" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.016821 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-96b86561-77fa-478a-bf61-f7beca9d80fe\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-96b86561-77fa-478a-bf61-f7beca9d80fe\") pod \"rabbitmq-server-0\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.016840 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/818e3f41-30c4-4a49-b490-0d868fc2b2b8-config-data\") pod \"rabbitmq-server-1\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " pod="openstack/rabbitmq-server-1" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.016857 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.016874 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.016892 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.016925 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47mbl\" (UniqueName: 
\"kubernetes.io/projected/818e3f41-30c4-4a49-b490-0d868fc2b2b8-kube-api-access-47mbl\") pod \"rabbitmq-server-1\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " pod="openstack/rabbitmq-server-1" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.017493 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.017944 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.018135 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-config-data\") pod \"rabbitmq-server-0\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.018264 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-server-conf\") pod \"rabbitmq-server-2\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " pod="openstack/rabbitmq-server-2" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.018610 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-pod-info\") pod \"rabbitmq-server-2\" (UID: 
\"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " pod="openstack/rabbitmq-server-2" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.018694 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-config-data\") pod \"rabbitmq-server-2\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " pod="openstack/rabbitmq-server-2" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.018811 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.019181 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.019305 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " pod="openstack/rabbitmq-server-2" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.019343 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgsn2\" (UniqueName: \"kubernetes.io/projected/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-kube-api-access-kgsn2\") pod \"rabbitmq-server-2\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " pod="openstack/rabbitmq-server-2" Mar 13 14:18:47 crc 
kubenswrapper[4898]: I0313 14:18:47.019387 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/818e3f41-30c4-4a49-b490-0d868fc2b2b8-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " pod="openstack/rabbitmq-server-1" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.019435 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.019443 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/818e3f41-30c4-4a49-b490-0d868fc2b2b8-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " pod="openstack/rabbitmq-server-1" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.019500 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " pod="openstack/rabbitmq-server-2" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.022072 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.022462 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.024178 4898 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.024198 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-96b86561-77fa-478a-bf61-f7beca9d80fe\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-96b86561-77fa-478a-bf61-f7beca9d80fe\") pod \"rabbitmq-server-0\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/235b7df56c251cb078c850d3b743a7085fdda6b090aa4cee8a1308b947278440/globalmount\"" pod="openstack/rabbitmq-server-0" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.024644 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.026092 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.034921 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8xbw\" (UniqueName: \"kubernetes.io/projected/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-kube-api-access-q8xbw\") pod 
\"rabbitmq-server-0\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.066768 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-96b86561-77fa-478a-bf61-f7beca9d80fe\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-96b86561-77fa-478a-bf61-f7beca9d80fe\") pod \"rabbitmq-server-0\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.083140 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-hk9w4" event={"ID":"9e544d1f-357e-4751-88bb-5108430b52cb","Type":"ContainerStarted","Data":"f4e0bf3960a9198c7dbd49808ca83e6770f6cbaf7ea545029dc2173d4eb03419"} Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.090232 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-nmlp6" event={"ID":"c17db307-7a8a-4585-9696-a9ef96b6ba0b","Type":"ContainerStarted","Data":"98527ac55b34245c21c2fc19c06bf03fb117bb3cf7f7b539444d59d7d2dad50b"} Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.120838 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/818e3f41-30c4-4a49-b490-0d868fc2b2b8-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " pod="openstack/rabbitmq-server-1" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.120930 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/818e3f41-30c4-4a49-b490-0d868fc2b2b8-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " pod="openstack/rabbitmq-server-1" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.120956 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " pod="openstack/rabbitmq-server-2" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.121298 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " pod="openstack/rabbitmq-server-2" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.121325 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/818e3f41-30c4-4a49-b490-0d868fc2b2b8-pod-info\") pod \"rabbitmq-server-1\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " pod="openstack/rabbitmq-server-1" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.121347 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-10c025d7-e381-4716-bf38-98f5cf86aede\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10c025d7-e381-4716-bf38-98f5cf86aede\") pod \"rabbitmq-server-2\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " pod="openstack/rabbitmq-server-2" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.121385 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/818e3f41-30c4-4a49-b490-0d868fc2b2b8-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " pod="openstack/rabbitmq-server-1" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.121417 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/818e3f41-30c4-4a49-b490-0d868fc2b2b8-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " pod="openstack/rabbitmq-server-1" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.121446 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a6f1f63f-28ac-4fb1-bd87-ea037a28d6cd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6f1f63f-28ac-4fb1-bd87-ea037a28d6cd\") pod \"rabbitmq-server-1\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " pod="openstack/rabbitmq-server-1" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.121472 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/818e3f41-30c4-4a49-b490-0d868fc2b2b8-config-data\") pod \"rabbitmq-server-1\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " pod="openstack/rabbitmq-server-1" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.121497 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47mbl\" (UniqueName: \"kubernetes.io/projected/818e3f41-30c4-4a49-b490-0d868fc2b2b8-kube-api-access-47mbl\") pod \"rabbitmq-server-1\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " pod="openstack/rabbitmq-server-1" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.121514 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-server-conf\") pod \"rabbitmq-server-2\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " pod="openstack/rabbitmq-server-2" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.121541 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-pod-info\") pod \"rabbitmq-server-2\" (UID: 
\"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " pod="openstack/rabbitmq-server-2" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.121560 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-config-data\") pod \"rabbitmq-server-2\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " pod="openstack/rabbitmq-server-2" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.121588 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " pod="openstack/rabbitmq-server-2" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.121604 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgsn2\" (UniqueName: \"kubernetes.io/projected/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-kube-api-access-kgsn2\") pod \"rabbitmq-server-2\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " pod="openstack/rabbitmq-server-2" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.122237 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/818e3f41-30c4-4a49-b490-0d868fc2b2b8-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " pod="openstack/rabbitmq-server-1" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.122587 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/818e3f41-30c4-4a49-b490-0d868fc2b2b8-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " pod="openstack/rabbitmq-server-1" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.122615 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/818e3f41-30c4-4a49-b490-0d868fc2b2b8-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " pod="openstack/rabbitmq-server-1" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.122636 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " pod="openstack/rabbitmq-server-2" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.122649 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/818e3f41-30c4-4a49-b490-0d868fc2b2b8-config-data\") pod \"rabbitmq-server-1\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " pod="openstack/rabbitmq-server-1" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.122683 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " pod="openstack/rabbitmq-server-2" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.122795 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " pod="openstack/rabbitmq-server-2" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.122814 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/818e3f41-30c4-4a49-b490-0d868fc2b2b8-server-conf\") pod \"rabbitmq-server-1\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " pod="openstack/rabbitmq-server-1" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.123147 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-config-data\") pod \"rabbitmq-server-2\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " pod="openstack/rabbitmq-server-2" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.123505 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " pod="openstack/rabbitmq-server-2" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.123645 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-server-conf\") pod \"rabbitmq-server-2\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " pod="openstack/rabbitmq-server-2" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.124418 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " pod="openstack/rabbitmq-server-2" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.124542 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/818e3f41-30c4-4a49-b490-0d868fc2b2b8-server-conf\") pod \"rabbitmq-server-1\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " pod="openstack/rabbitmq-server-1" Mar 13 14:18:47 crc 
kubenswrapper[4898]: I0313 14:18:47.124781 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " pod="openstack/rabbitmq-server-2" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.125150 4898 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.125172 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a6f1f63f-28ac-4fb1-bd87-ea037a28d6cd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6f1f63f-28ac-4fb1-bd87-ea037a28d6cd\") pod \"rabbitmq-server-1\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/33865dbdc5fe61694c30892e6300309b59f04bdd0b35aa3fd0f17da3ba922194/globalmount\"" pod="openstack/rabbitmq-server-1" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.126286 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/818e3f41-30c4-4a49-b490-0d868fc2b2b8-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " pod="openstack/rabbitmq-server-1" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.127074 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/818e3f41-30c4-4a49-b490-0d868fc2b2b8-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " pod="openstack/rabbitmq-server-1" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.128317 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/818e3f41-30c4-4a49-b490-0d868fc2b2b8-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " pod="openstack/rabbitmq-server-1" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.128607 4898 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.128631 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-10c025d7-e381-4716-bf38-98f5cf86aede\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10c025d7-e381-4716-bf38-98f5cf86aede\") pod \"rabbitmq-server-2\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b5b057b78b5a76d291625b9af6af2e0e662115b1b100b445e2e40d0ac02a65c7/globalmount\"" pod="openstack/rabbitmq-server-2" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.128910 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.132714 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " pod="openstack/rabbitmq-server-2" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.132758 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-pod-info\") pod \"rabbitmq-server-2\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " pod="openstack/rabbitmq-server-2" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.132868 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/818e3f41-30c4-4a49-b490-0d868fc2b2b8-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " pod="openstack/rabbitmq-server-1" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.133427 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/818e3f41-30c4-4a49-b490-0d868fc2b2b8-pod-info\") pod \"rabbitmq-server-1\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " pod="openstack/rabbitmq-server-1" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.136116 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/818e3f41-30c4-4a49-b490-0d868fc2b2b8-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " pod="openstack/rabbitmq-server-1" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.137004 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" 
(UniqueName: \"kubernetes.io/secret/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " pod="openstack/rabbitmq-server-2" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.142789 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " pod="openstack/rabbitmq-server-2" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.148632 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47mbl\" (UniqueName: \"kubernetes.io/projected/818e3f41-30c4-4a49-b490-0d868fc2b2b8-kube-api-access-47mbl\") pod \"rabbitmq-server-1\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " pod="openstack/rabbitmq-server-1" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.159802 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgsn2\" (UniqueName: \"kubernetes.io/projected/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-kube-api-access-kgsn2\") pod \"rabbitmq-server-2\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " pod="openstack/rabbitmq-server-2" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.181662 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-10c025d7-e381-4716-bf38-98f5cf86aede\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10c025d7-e381-4716-bf38-98f5cf86aede\") pod \"rabbitmq-server-2\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " pod="openstack/rabbitmq-server-2" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.210878 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-2" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.223968 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a6f1f63f-28ac-4fb1-bd87-ea037a28d6cd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6f1f63f-28ac-4fb1-bd87-ea037a28d6cd\") pod \"rabbitmq-server-1\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " pod="openstack/rabbitmq-server-1" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.271499 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-1" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.504428 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.729973 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.886525 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-2"] Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.989344 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.991046 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.993951 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.994407 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-jw2rr" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.998261 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 13 14:18:48 crc kubenswrapper[4898]: I0313 14:18:48.000748 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 13 14:18:48 crc kubenswrapper[4898]: I0313 14:18:48.001163 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 13 14:18:48 crc kubenswrapper[4898]: I0313 14:18:48.003751 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 13 14:18:48 crc kubenswrapper[4898]: I0313 14:18:48.008329 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-1"] Mar 13 14:18:48 crc kubenswrapper[4898]: I0313 14:18:48.155862 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gk6v5\" (UniqueName: \"kubernetes.io/projected/e5d53cf3-113e-4391-b3a9-4e1f81836e26-kube-api-access-gk6v5\") pod \"openstack-galera-0\" (UID: \"e5d53cf3-113e-4391-b3a9-4e1f81836e26\") " pod="openstack/openstack-galera-0" Mar 13 14:18:48 crc kubenswrapper[4898]: I0313 14:18:48.155928 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5d53cf3-113e-4391-b3a9-4e1f81836e26-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"e5d53cf3-113e-4391-b3a9-4e1f81836e26\") " 
pod="openstack/openstack-galera-0" Mar 13 14:18:48 crc kubenswrapper[4898]: I0313 14:18:48.155975 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5d53cf3-113e-4391-b3a9-4e1f81836e26-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"e5d53cf3-113e-4391-b3a9-4e1f81836e26\") " pod="openstack/openstack-galera-0" Mar 13 14:18:48 crc kubenswrapper[4898]: I0313 14:18:48.156020 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e5d53cf3-113e-4391-b3a9-4e1f81836e26-config-data-generated\") pod \"openstack-galera-0\" (UID: \"e5d53cf3-113e-4391-b3a9-4e1f81836e26\") " pod="openstack/openstack-galera-0" Mar 13 14:18:48 crc kubenswrapper[4898]: I0313 14:18:48.156049 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e5d53cf3-113e-4391-b3a9-4e1f81836e26-kolla-config\") pod \"openstack-galera-0\" (UID: \"e5d53cf3-113e-4391-b3a9-4e1f81836e26\") " pod="openstack/openstack-galera-0" Mar 13 14:18:48 crc kubenswrapper[4898]: I0313 14:18:48.156075 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e5d53cf3-113e-4391-b3a9-4e1f81836e26-config-data-default\") pod \"openstack-galera-0\" (UID: \"e5d53cf3-113e-4391-b3a9-4e1f81836e26\") " pod="openstack/openstack-galera-0" Mar 13 14:18:48 crc kubenswrapper[4898]: I0313 14:18:48.156089 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5d53cf3-113e-4391-b3a9-4e1f81836e26-operator-scripts\") pod \"openstack-galera-0\" (UID: \"e5d53cf3-113e-4391-b3a9-4e1f81836e26\") " pod="openstack/openstack-galera-0" 
Mar 13 14:18:48 crc kubenswrapper[4898]: I0313 14:18:48.156118 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-528ccd43-eea9-4215-9d1e-4da3f303db3c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-528ccd43-eea9-4215-9d1e-4da3f303db3c\") pod \"openstack-galera-0\" (UID: \"e5d53cf3-113e-4391-b3a9-4e1f81836e26\") " pod="openstack/openstack-galera-0" Mar 13 14:18:48 crc kubenswrapper[4898]: I0313 14:18:48.258357 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gk6v5\" (UniqueName: \"kubernetes.io/projected/e5d53cf3-113e-4391-b3a9-4e1f81836e26-kube-api-access-gk6v5\") pod \"openstack-galera-0\" (UID: \"e5d53cf3-113e-4391-b3a9-4e1f81836e26\") " pod="openstack/openstack-galera-0" Mar 13 14:18:48 crc kubenswrapper[4898]: I0313 14:18:48.258413 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5d53cf3-113e-4391-b3a9-4e1f81836e26-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"e5d53cf3-113e-4391-b3a9-4e1f81836e26\") " pod="openstack/openstack-galera-0" Mar 13 14:18:48 crc kubenswrapper[4898]: I0313 14:18:48.258457 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5d53cf3-113e-4391-b3a9-4e1f81836e26-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"e5d53cf3-113e-4391-b3a9-4e1f81836e26\") " pod="openstack/openstack-galera-0" Mar 13 14:18:48 crc kubenswrapper[4898]: I0313 14:18:48.258501 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e5d53cf3-113e-4391-b3a9-4e1f81836e26-config-data-generated\") pod \"openstack-galera-0\" (UID: \"e5d53cf3-113e-4391-b3a9-4e1f81836e26\") " pod="openstack/openstack-galera-0" Mar 13 14:18:48 crc kubenswrapper[4898]: I0313 
14:18:48.258544 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e5d53cf3-113e-4391-b3a9-4e1f81836e26-kolla-config\") pod \"openstack-galera-0\" (UID: \"e5d53cf3-113e-4391-b3a9-4e1f81836e26\") " pod="openstack/openstack-galera-0" Mar 13 14:18:48 crc kubenswrapper[4898]: I0313 14:18:48.258576 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e5d53cf3-113e-4391-b3a9-4e1f81836e26-config-data-default\") pod \"openstack-galera-0\" (UID: \"e5d53cf3-113e-4391-b3a9-4e1f81836e26\") " pod="openstack/openstack-galera-0" Mar 13 14:18:48 crc kubenswrapper[4898]: I0313 14:18:48.258598 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5d53cf3-113e-4391-b3a9-4e1f81836e26-operator-scripts\") pod \"openstack-galera-0\" (UID: \"e5d53cf3-113e-4391-b3a9-4e1f81836e26\") " pod="openstack/openstack-galera-0" Mar 13 14:18:48 crc kubenswrapper[4898]: I0313 14:18:48.258625 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-528ccd43-eea9-4215-9d1e-4da3f303db3c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-528ccd43-eea9-4215-9d1e-4da3f303db3c\") pod \"openstack-galera-0\" (UID: \"e5d53cf3-113e-4391-b3a9-4e1f81836e26\") " pod="openstack/openstack-galera-0" Mar 13 14:18:48 crc kubenswrapper[4898]: I0313 14:18:48.260984 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e5d53cf3-113e-4391-b3a9-4e1f81836e26-config-data-default\") pod \"openstack-galera-0\" (UID: \"e5d53cf3-113e-4391-b3a9-4e1f81836e26\") " pod="openstack/openstack-galera-0" Mar 13 14:18:48 crc kubenswrapper[4898]: I0313 14:18:48.261019 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e5d53cf3-113e-4391-b3a9-4e1f81836e26-config-data-generated\") pod \"openstack-galera-0\" (UID: \"e5d53cf3-113e-4391-b3a9-4e1f81836e26\") " pod="openstack/openstack-galera-0" Mar 13 14:18:48 crc kubenswrapper[4898]: I0313 14:18:48.263685 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5d53cf3-113e-4391-b3a9-4e1f81836e26-operator-scripts\") pod \"openstack-galera-0\" (UID: \"e5d53cf3-113e-4391-b3a9-4e1f81836e26\") " pod="openstack/openstack-galera-0" Mar 13 14:18:48 crc kubenswrapper[4898]: I0313 14:18:48.264553 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e5d53cf3-113e-4391-b3a9-4e1f81836e26-kolla-config\") pod \"openstack-galera-0\" (UID: \"e5d53cf3-113e-4391-b3a9-4e1f81836e26\") " pod="openstack/openstack-galera-0" Mar 13 14:18:48 crc kubenswrapper[4898]: I0313 14:18:48.272383 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5d53cf3-113e-4391-b3a9-4e1f81836e26-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"e5d53cf3-113e-4391-b3a9-4e1f81836e26\") " pod="openstack/openstack-galera-0" Mar 13 14:18:48 crc kubenswrapper[4898]: I0313 14:18:48.276516 4898 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 13 14:18:48 crc kubenswrapper[4898]: I0313 14:18:48.276558 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-528ccd43-eea9-4215-9d1e-4da3f303db3c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-528ccd43-eea9-4215-9d1e-4da3f303db3c\") pod \"openstack-galera-0\" (UID: \"e5d53cf3-113e-4391-b3a9-4e1f81836e26\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/89901677853a048b74c0117bd23de07724e79b5f56f64f8503c2bc2692c943e7/globalmount\"" pod="openstack/openstack-galera-0" Mar 13 14:18:48 crc kubenswrapper[4898]: I0313 14:18:48.289198 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5d53cf3-113e-4391-b3a9-4e1f81836e26-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"e5d53cf3-113e-4391-b3a9-4e1f81836e26\") " pod="openstack/openstack-galera-0" Mar 13 14:18:48 crc kubenswrapper[4898]: I0313 14:18:48.294246 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gk6v5\" (UniqueName: \"kubernetes.io/projected/e5d53cf3-113e-4391-b3a9-4e1f81836e26-kube-api-access-gk6v5\") pod \"openstack-galera-0\" (UID: \"e5d53cf3-113e-4391-b3a9-4e1f81836e26\") " pod="openstack/openstack-galera-0" Mar 13 14:18:48 crc kubenswrapper[4898]: I0313 14:18:48.325298 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-528ccd43-eea9-4215-9d1e-4da3f303db3c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-528ccd43-eea9-4215-9d1e-4da3f303db3c\") pod \"openstack-galera-0\" (UID: \"e5d53cf3-113e-4391-b3a9-4e1f81836e26\") " pod="openstack/openstack-galera-0" Mar 13 14:18:48 crc kubenswrapper[4898]: I0313 14:18:48.617430 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.134180 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.134264 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.507032 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.511926 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.515464 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.516159 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.516459 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-lmh2w" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.516774 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.518615 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.596109 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f\") " pod="openstack/openstack-cell1-galera-0" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.596194 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f\") " pod="openstack/openstack-cell1-galera-0" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.596222 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f\") " pod="openstack/openstack-cell1-galera-0" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.596260 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfbqp\" (UniqueName: \"kubernetes.io/projected/6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f-kube-api-access-dfbqp\") pod \"openstack-cell1-galera-0\" (UID: \"6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f\") " pod="openstack/openstack-cell1-galera-0" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.596300 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f\") " pod="openstack/openstack-cell1-galera-0" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.596384 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-225af9fc-6e90-4484-ba03-b804922e0132\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-225af9fc-6e90-4484-ba03-b804922e0132\") pod \"openstack-cell1-galera-0\" (UID: \"6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f\") " pod="openstack/openstack-cell1-galera-0" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.596413 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f\") " pod="openstack/openstack-cell1-galera-0" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.596436 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f\") " pod="openstack/openstack-cell1-galera-0" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.698285 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-225af9fc-6e90-4484-ba03-b804922e0132\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-225af9fc-6e90-4484-ba03-b804922e0132\") pod \"openstack-cell1-galera-0\" (UID: \"6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f\") " pod="openstack/openstack-cell1-galera-0" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.698348 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f\") " pod="openstack/openstack-cell1-galera-0" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.698375 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f\") " pod="openstack/openstack-cell1-galera-0" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.698476 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f\") " pod="openstack/openstack-cell1-galera-0" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.698530 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f\") " pod="openstack/openstack-cell1-galera-0" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.698554 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f\") " pod="openstack/openstack-cell1-galera-0" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.698609 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfbqp\" (UniqueName: \"kubernetes.io/projected/6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f-kube-api-access-dfbqp\") pod \"openstack-cell1-galera-0\" (UID: \"6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f\") " pod="openstack/openstack-cell1-galera-0" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.698656 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f\") " pod="openstack/openstack-cell1-galera-0" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.700290 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f\") " pod="openstack/openstack-cell1-galera-0" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.701613 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f\") " pod="openstack/openstack-cell1-galera-0" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.706624 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f\") " pod="openstack/openstack-cell1-galera-0" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.707829 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f\") " pod="openstack/openstack-cell1-galera-0" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.709101 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f\") " pod="openstack/openstack-cell1-galera-0" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.709642 4898 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.709690 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-225af9fc-6e90-4484-ba03-b804922e0132\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-225af9fc-6e90-4484-ba03-b804922e0132\") pod \"openstack-cell1-galera-0\" (UID: \"6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/94535bbcb82a4e5c35e582e7068f21e1d0f0f4fb265f745dad2e0267fea0e923/globalmount\"" pod="openstack/openstack-cell1-galera-0" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.711913 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f\") " pod="openstack/openstack-cell1-galera-0" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.734844 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfbqp\" (UniqueName: \"kubernetes.io/projected/6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f-kube-api-access-dfbqp\") pod \"openstack-cell1-galera-0\" (UID: \"6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f\") " pod="openstack/openstack-cell1-galera-0" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.798093 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-225af9fc-6e90-4484-ba03-b804922e0132\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-225af9fc-6e90-4484-ba03-b804922e0132\") pod \"openstack-cell1-galera-0\" (UID: \"6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f\") " pod="openstack/openstack-cell1-galera-0" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.824915 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.827892 4898 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.833825 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.834206 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.834944 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-6q4ph" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.836673 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.845557 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.909998 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/67ef28b0-acc3-400e-8296-a541fc3b89f0-memcached-tls-certs\") pod \"memcached-0\" (UID: \"67ef28b0-acc3-400e-8296-a541fc3b89f0\") " pod="openstack/memcached-0" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.910073 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/67ef28b0-acc3-400e-8296-a541fc3b89f0-kolla-config\") pod \"memcached-0\" (UID: \"67ef28b0-acc3-400e-8296-a541fc3b89f0\") " pod="openstack/memcached-0" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.910090 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/67ef28b0-acc3-400e-8296-a541fc3b89f0-config-data\") pod \"memcached-0\" (UID: 
\"67ef28b0-acc3-400e-8296-a541fc3b89f0\") " pod="openstack/memcached-0" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.910108 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdjbw\" (UniqueName: \"kubernetes.io/projected/67ef28b0-acc3-400e-8296-a541fc3b89f0-kube-api-access-pdjbw\") pod \"memcached-0\" (UID: \"67ef28b0-acc3-400e-8296-a541fc3b89f0\") " pod="openstack/memcached-0" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.910128 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67ef28b0-acc3-400e-8296-a541fc3b89f0-combined-ca-bundle\") pod \"memcached-0\" (UID: \"67ef28b0-acc3-400e-8296-a541fc3b89f0\") " pod="openstack/memcached-0" Mar 13 14:18:50 crc kubenswrapper[4898]: I0313 14:18:50.012814 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/67ef28b0-acc3-400e-8296-a541fc3b89f0-memcached-tls-certs\") pod \"memcached-0\" (UID: \"67ef28b0-acc3-400e-8296-a541fc3b89f0\") " pod="openstack/memcached-0" Mar 13 14:18:50 crc kubenswrapper[4898]: I0313 14:18:50.012995 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/67ef28b0-acc3-400e-8296-a541fc3b89f0-kolla-config\") pod \"memcached-0\" (UID: \"67ef28b0-acc3-400e-8296-a541fc3b89f0\") " pod="openstack/memcached-0" Mar 13 14:18:50 crc kubenswrapper[4898]: I0313 14:18:50.013023 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/67ef28b0-acc3-400e-8296-a541fc3b89f0-config-data\") pod \"memcached-0\" (UID: \"67ef28b0-acc3-400e-8296-a541fc3b89f0\") " pod="openstack/memcached-0" Mar 13 14:18:50 crc kubenswrapper[4898]: I0313 14:18:50.013051 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-pdjbw\" (UniqueName: \"kubernetes.io/projected/67ef28b0-acc3-400e-8296-a541fc3b89f0-kube-api-access-pdjbw\") pod \"memcached-0\" (UID: \"67ef28b0-acc3-400e-8296-a541fc3b89f0\") " pod="openstack/memcached-0" Mar 13 14:18:50 crc kubenswrapper[4898]: I0313 14:18:50.014098 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/67ef28b0-acc3-400e-8296-a541fc3b89f0-kolla-config\") pod \"memcached-0\" (UID: \"67ef28b0-acc3-400e-8296-a541fc3b89f0\") " pod="openstack/memcached-0" Mar 13 14:18:50 crc kubenswrapper[4898]: I0313 14:18:50.014197 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/67ef28b0-acc3-400e-8296-a541fc3b89f0-config-data\") pod \"memcached-0\" (UID: \"67ef28b0-acc3-400e-8296-a541fc3b89f0\") " pod="openstack/memcached-0" Mar 13 14:18:50 crc kubenswrapper[4898]: I0313 14:18:50.014240 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67ef28b0-acc3-400e-8296-a541fc3b89f0-combined-ca-bundle\") pod \"memcached-0\" (UID: \"67ef28b0-acc3-400e-8296-a541fc3b89f0\") " pod="openstack/memcached-0" Mar 13 14:18:50 crc kubenswrapper[4898]: I0313 14:18:50.037871 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/67ef28b0-acc3-400e-8296-a541fc3b89f0-memcached-tls-certs\") pod \"memcached-0\" (UID: \"67ef28b0-acc3-400e-8296-a541fc3b89f0\") " pod="openstack/memcached-0" Mar 13 14:18:50 crc kubenswrapper[4898]: I0313 14:18:50.038044 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67ef28b0-acc3-400e-8296-a541fc3b89f0-combined-ca-bundle\") pod \"memcached-0\" (UID: \"67ef28b0-acc3-400e-8296-a541fc3b89f0\") " 
pod="openstack/memcached-0" Mar 13 14:18:50 crc kubenswrapper[4898]: I0313 14:18:50.047259 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdjbw\" (UniqueName: \"kubernetes.io/projected/67ef28b0-acc3-400e-8296-a541fc3b89f0-kube-api-access-pdjbw\") pod \"memcached-0\" (UID: \"67ef28b0-acc3-400e-8296-a541fc3b89f0\") " pod="openstack/memcached-0" Mar 13 14:18:50 crc kubenswrapper[4898]: I0313 14:18:50.165339 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 13 14:18:52 crc kubenswrapper[4898]: I0313 14:18:52.058306 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 13 14:18:52 crc kubenswrapper[4898]: I0313 14:18:52.066367 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 13 14:18:52 crc kubenswrapper[4898]: I0313 14:18:52.068405 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-k9zs2" Mar 13 14:18:52 crc kubenswrapper[4898]: I0313 14:18:52.073687 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 13 14:18:52 crc kubenswrapper[4898]: I0313 14:18:52.182920 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5p58m\" (UniqueName: \"kubernetes.io/projected/4e010381-921d-4328-9027-ddb9a54a08bd-kube-api-access-5p58m\") pod \"kube-state-metrics-0\" (UID: \"4e010381-921d-4328-9027-ddb9a54a08bd\") " pod="openstack/kube-state-metrics-0" Mar 13 14:18:52 crc kubenswrapper[4898]: I0313 14:18:52.289985 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5p58m\" (UniqueName: \"kubernetes.io/projected/4e010381-921d-4328-9027-ddb9a54a08bd-kube-api-access-5p58m\") pod \"kube-state-metrics-0\" (UID: \"4e010381-921d-4328-9027-ddb9a54a08bd\") " 
pod="openstack/kube-state-metrics-0" Mar 13 14:18:52 crc kubenswrapper[4898]: I0313 14:18:52.319705 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5p58m\" (UniqueName: \"kubernetes.io/projected/4e010381-921d-4328-9027-ddb9a54a08bd-kube-api-access-5p58m\") pod \"kube-state-metrics-0\" (UID: \"4e010381-921d-4328-9027-ddb9a54a08bd\") " pod="openstack/kube-state-metrics-0" Mar 13 14:18:52 crc kubenswrapper[4898]: I0313 14:18:52.394115 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 13 14:18:52 crc kubenswrapper[4898]: W0313 14:18:52.662204 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0fa7fb2f_de19_48b1_8226_d7f85a5f8f2b.slice/crio-6d987febc3dce3076fcb022da351b81dbf39766acfe2182896067f106a438fc2 WatchSource:0}: Error finding container 6d987febc3dce3076fcb022da351b81dbf39766acfe2182896067f106a438fc2: Status 404 returned error can't find the container with id 6d987febc3dce3076fcb022da351b81dbf39766acfe2182896067f106a438fc2 Mar 13 14:18:52 crc kubenswrapper[4898]: W0313 14:18:52.675918 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee084354_4d32_4d3c_96a4_1e4e7eef5d85.slice/crio-f525c8a2018341f2e27494e3687eb7a4563181188f5faaa51225478656a928a1 WatchSource:0}: Error finding container f525c8a2018341f2e27494e3687eb7a4563181188f5faaa51225478656a928a1: Status 404 returned error can't find the container with id f525c8a2018341f2e27494e3687eb7a4563181188f5faaa51225478656a928a1 Mar 13 14:18:52 crc kubenswrapper[4898]: I0313 14:18:52.965891 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-ui-dashboards-66cbf594b5-gpj8b"] Mar 13 14:18:52 crc kubenswrapper[4898]: I0313 14:18:52.967337 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-ui-dashboards-66cbf594b5-gpj8b" Mar 13 14:18:52 crc kubenswrapper[4898]: I0313 14:18:52.973650 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-ui-dashboards" Mar 13 14:18:52 crc kubenswrapper[4898]: I0313 14:18:52.974350 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-ui-dashboards-sa-dockercfg-782s7" Mar 13 14:18:52 crc kubenswrapper[4898]: I0313 14:18:52.979127 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-ui-dashboards-66cbf594b5-gpj8b"] Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.106723 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad052248-8fcd-4ef6-9969-5023b87bbbf9-serving-cert\") pod \"observability-ui-dashboards-66cbf594b5-gpj8b\" (UID: \"ad052248-8fcd-4ef6-9969-5023b87bbbf9\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-gpj8b" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.107054 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6qck\" (UniqueName: \"kubernetes.io/projected/ad052248-8fcd-4ef6-9969-5023b87bbbf9-kube-api-access-h6qck\") pod \"observability-ui-dashboards-66cbf594b5-gpj8b\" (UID: \"ad052248-8fcd-4ef6-9969-5023b87bbbf9\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-gpj8b" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.208346 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad052248-8fcd-4ef6-9969-5023b87bbbf9-serving-cert\") pod \"observability-ui-dashboards-66cbf594b5-gpj8b\" (UID: \"ad052248-8fcd-4ef6-9969-5023b87bbbf9\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-gpj8b" 
Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.208410 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6qck\" (UniqueName: \"kubernetes.io/projected/ad052248-8fcd-4ef6-9969-5023b87bbbf9-kube-api-access-h6qck\") pod \"observability-ui-dashboards-66cbf594b5-gpj8b\" (UID: \"ad052248-8fcd-4ef6-9969-5023b87bbbf9\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-gpj8b" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.212227 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"ee084354-4d32-4d3c-96a4-1e4e7eef5d85","Type":"ContainerStarted","Data":"f525c8a2018341f2e27494e3687eb7a4563181188f5faaa51225478656a928a1"} Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.220241 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad052248-8fcd-4ef6-9969-5023b87bbbf9-serving-cert\") pod \"observability-ui-dashboards-66cbf594b5-gpj8b\" (UID: \"ad052248-8fcd-4ef6-9969-5023b87bbbf9\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-gpj8b" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.221139 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b","Type":"ContainerStarted","Data":"6d987febc3dce3076fcb022da351b81dbf39766acfe2182896067f106a438fc2"} Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.224444 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"818e3f41-30c4-4a49-b490-0d868fc2b2b8","Type":"ContainerStarted","Data":"6fe4bdbf2db945955ec1dd2e86e519172f05f6c43d7d6ac216668fd59e9bda42"} Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.228199 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"d56bd826-4f42-409d-ae41-9bfc70d1e038","Type":"ContainerStarted","Data":"8cd6b4a73f7f67c36783e2cd3de871dd93389c4f889e74a44de4a7253a7e9a9c"} Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.240447 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6qck\" (UniqueName: \"kubernetes.io/projected/ad052248-8fcd-4ef6-9969-5023b87bbbf9-kube-api-access-h6qck\") pod \"observability-ui-dashboards-66cbf594b5-gpj8b\" (UID: \"ad052248-8fcd-4ef6-9969-5023b87bbbf9\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-gpj8b" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.285245 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-ui-dashboards-66cbf594b5-gpj8b" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.302565 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-699d95d586-ds75f"] Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.303855 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-699d95d586-ds75f" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.310606 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-699d95d586-ds75f"] Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.353865 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.367762 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.373146 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.373450 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-g7dw2" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.373555 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.373590 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.373290 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.380984 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.389250 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.405221 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.414161 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f526abbc-e646-48b4-afa8-7f95f4a607a0-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " pod="openstack/prometheus-metric-storage-0" Mar 13 
14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.414221 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f526abbc-e646-48b4-afa8-7f95f4a607a0-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.414245 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f526abbc-e646-48b4-afa8-7f95f4a607a0-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.414269 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ab8664f8-1960-4442-9fdd-9711ec963e1f-console-serving-cert\") pod \"console-699d95d586-ds75f\" (UID: \"ab8664f8-1960-4442-9fdd-9711ec963e1f\") " pod="openshift-console/console-699d95d586-ds75f" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.414320 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/f526abbc-e646-48b4-afa8-7f95f4a607a0-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.414354 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k845s\" (UniqueName: \"kubernetes.io/projected/ab8664f8-1960-4442-9fdd-9711ec963e1f-kube-api-access-k845s\") pod \"console-699d95d586-ds75f\" (UID: 
\"ab8664f8-1960-4442-9fdd-9711ec963e1f\") " pod="openshift-console/console-699d95d586-ds75f" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.414409 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ab8664f8-1960-4442-9fdd-9711ec963e1f-trusted-ca-bundle\") pod \"console-699d95d586-ds75f\" (UID: \"ab8664f8-1960-4442-9fdd-9711ec963e1f\") " pod="openshift-console/console-699d95d586-ds75f" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.414458 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68mzv\" (UniqueName: \"kubernetes.io/projected/f526abbc-e646-48b4-afa8-7f95f4a607a0-kube-api-access-68mzv\") pod \"prometheus-metric-storage-0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.414496 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/f526abbc-e646-48b4-afa8-7f95f4a607a0-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.414518 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f526abbc-e646-48b4-afa8-7f95f4a607a0-config\") pod \"prometheus-metric-storage-0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.414551 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/ab8664f8-1960-4442-9fdd-9711ec963e1f-console-config\") pod \"console-699d95d586-ds75f\" (UID: \"ab8664f8-1960-4442-9fdd-9711ec963e1f\") " pod="openshift-console/console-699d95d586-ds75f" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.414574 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f526abbc-e646-48b4-afa8-7f95f4a607a0-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.414611 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ab8664f8-1960-4442-9fdd-9711ec963e1f-service-ca\") pod \"console-699d95d586-ds75f\" (UID: \"ab8664f8-1960-4442-9fdd-9711ec963e1f\") " pod="openshift-console/console-699d95d586-ds75f" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.414631 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f526abbc-e646-48b4-afa8-7f95f4a607a0-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.414656 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ab8664f8-1960-4442-9fdd-9711ec963e1f-console-oauth-config\") pod \"console-699d95d586-ds75f\" (UID: \"ab8664f8-1960-4442-9fdd-9711ec963e1f\") " pod="openshift-console/console-699d95d586-ds75f" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.414683 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ab8664f8-1960-4442-9fdd-9711ec963e1f-oauth-serving-cert\") pod \"console-699d95d586-ds75f\" (UID: \"ab8664f8-1960-4442-9fdd-9711ec963e1f\") " pod="openshift-console/console-699d95d586-ds75f" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.414717 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-537e992e-0c7e-4e28-8105-b535a72a793c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-537e992e-0c7e-4e28-8105-b535a72a793c\") pod \"prometheus-metric-storage-0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.415113 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.515955 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ab8664f8-1960-4442-9fdd-9711ec963e1f-trusted-ca-bundle\") pod \"console-699d95d586-ds75f\" (UID: \"ab8664f8-1960-4442-9fdd-9711ec963e1f\") " pod="openshift-console/console-699d95d586-ds75f" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.516009 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68mzv\" (UniqueName: \"kubernetes.io/projected/f526abbc-e646-48b4-afa8-7f95f4a607a0-kube-api-access-68mzv\") pod \"prometheus-metric-storage-0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.516046 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: 
\"kubernetes.io/configmap/f526abbc-e646-48b4-afa8-7f95f4a607a0-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.516069 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f526abbc-e646-48b4-afa8-7f95f4a607a0-config\") pod \"prometheus-metric-storage-0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.516099 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ab8664f8-1960-4442-9fdd-9711ec963e1f-console-config\") pod \"console-699d95d586-ds75f\" (UID: \"ab8664f8-1960-4442-9fdd-9711ec963e1f\") " pod="openshift-console/console-699d95d586-ds75f" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.516117 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f526abbc-e646-48b4-afa8-7f95f4a607a0-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.516148 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ab8664f8-1960-4442-9fdd-9711ec963e1f-service-ca\") pod \"console-699d95d586-ds75f\" (UID: \"ab8664f8-1960-4442-9fdd-9711ec963e1f\") " pod="openshift-console/console-699d95d586-ds75f" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.516168 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/f526abbc-e646-48b4-afa8-7f95f4a607a0-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.516185 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ab8664f8-1960-4442-9fdd-9711ec963e1f-console-oauth-config\") pod \"console-699d95d586-ds75f\" (UID: \"ab8664f8-1960-4442-9fdd-9711ec963e1f\") " pod="openshift-console/console-699d95d586-ds75f" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.516207 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ab8664f8-1960-4442-9fdd-9711ec963e1f-oauth-serving-cert\") pod \"console-699d95d586-ds75f\" (UID: \"ab8664f8-1960-4442-9fdd-9711ec963e1f\") " pod="openshift-console/console-699d95d586-ds75f" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.516235 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-537e992e-0c7e-4e28-8105-b535a72a793c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-537e992e-0c7e-4e28-8105-b535a72a793c\") pod \"prometheus-metric-storage-0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.516259 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f526abbc-e646-48b4-afa8-7f95f4a607a0-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.516280 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"web-config\" (UniqueName: \"kubernetes.io/secret/f526abbc-e646-48b4-afa8-7f95f4a607a0-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.516295 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f526abbc-e646-48b4-afa8-7f95f4a607a0-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.516312 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ab8664f8-1960-4442-9fdd-9711ec963e1f-console-serving-cert\") pod \"console-699d95d586-ds75f\" (UID: \"ab8664f8-1960-4442-9fdd-9711ec963e1f\") " pod="openshift-console/console-699d95d586-ds75f" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.516346 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/f526abbc-e646-48b4-afa8-7f95f4a607a0-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.516375 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k845s\" (UniqueName: \"kubernetes.io/projected/ab8664f8-1960-4442-9fdd-9711ec963e1f-kube-api-access-k845s\") pod \"console-699d95d586-ds75f\" (UID: \"ab8664f8-1960-4442-9fdd-9711ec963e1f\") " pod="openshift-console/console-699d95d586-ds75f" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.517570 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ab8664f8-1960-4442-9fdd-9711ec963e1f-trusted-ca-bundle\") pod \"console-699d95d586-ds75f\" (UID: \"ab8664f8-1960-4442-9fdd-9711ec963e1f\") " pod="openshift-console/console-699d95d586-ds75f" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.518175 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/f526abbc-e646-48b4-afa8-7f95f4a607a0-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.519062 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ab8664f8-1960-4442-9fdd-9711ec963e1f-service-ca\") pod \"console-699d95d586-ds75f\" (UID: \"ab8664f8-1960-4442-9fdd-9711ec963e1f\") " pod="openshift-console/console-699d95d586-ds75f" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.519536 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ab8664f8-1960-4442-9fdd-9711ec963e1f-oauth-serving-cert\") pod \"console-699d95d586-ds75f\" (UID: \"ab8664f8-1960-4442-9fdd-9711ec963e1f\") " pod="openshift-console/console-699d95d586-ds75f" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.519586 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ab8664f8-1960-4442-9fdd-9711ec963e1f-console-config\") pod \"console-699d95d586-ds75f\" (UID: \"ab8664f8-1960-4442-9fdd-9711ec963e1f\") " pod="openshift-console/console-699d95d586-ds75f" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.520007 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" 
(UniqueName: \"kubernetes.io/configmap/f526abbc-e646-48b4-afa8-7f95f4a607a0-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.520397 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/f526abbc-e646-48b4-afa8-7f95f4a607a0-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.520827 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f526abbc-e646-48b4-afa8-7f95f4a607a0-config\") pod \"prometheus-metric-storage-0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.521717 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f526abbc-e646-48b4-afa8-7f95f4a607a0-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.522577 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f526abbc-e646-48b4-afa8-7f95f4a607a0-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.522741 4898 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.522765 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-537e992e-0c7e-4e28-8105-b535a72a793c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-537e992e-0c7e-4e28-8105-b535a72a793c\") pod \"prometheus-metric-storage-0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7123a2111c2c1fcd673a2fa4cbaef2c14fcdb159a9a269edbe99c5cdea18ee2d/globalmount\"" pod="openstack/prometheus-metric-storage-0" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.523743 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f526abbc-e646-48b4-afa8-7f95f4a607a0-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.525443 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ab8664f8-1960-4442-9fdd-9711ec963e1f-console-oauth-config\") pod \"console-699d95d586-ds75f\" (UID: \"ab8664f8-1960-4442-9fdd-9711ec963e1f\") " pod="openshift-console/console-699d95d586-ds75f" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.534481 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ab8664f8-1960-4442-9fdd-9711ec963e1f-console-serving-cert\") pod \"console-699d95d586-ds75f\" (UID: \"ab8664f8-1960-4442-9fdd-9711ec963e1f\") " pod="openshift-console/console-699d95d586-ds75f" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.538289 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k845s\" (UniqueName: 
\"kubernetes.io/projected/ab8664f8-1960-4442-9fdd-9711ec963e1f-kube-api-access-k845s\") pod \"console-699d95d586-ds75f\" (UID: \"ab8664f8-1960-4442-9fdd-9711ec963e1f\") " pod="openshift-console/console-699d95d586-ds75f" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.539946 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68mzv\" (UniqueName: \"kubernetes.io/projected/f526abbc-e646-48b4-afa8-7f95f4a607a0-kube-api-access-68mzv\") pod \"prometheus-metric-storage-0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.545655 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f526abbc-e646-48b4-afa8-7f95f4a607a0-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.578624 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-537e992e-0c7e-4e28-8105-b535a72a793c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-537e992e-0c7e-4e28-8105-b535a72a793c\") pod \"prometheus-metric-storage-0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.624045 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-699d95d586-ds75f" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.684808 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 13 14:18:54 crc kubenswrapper[4898]: I0313 14:18:54.676658 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 13 14:18:54 crc kubenswrapper[4898]: I0313 14:18:54.834635 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-j79bj"] Mar 13 14:18:54 crc kubenswrapper[4898]: I0313 14:18:54.836239 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-j79bj" Mar 13 14:18:54 crc kubenswrapper[4898]: I0313 14:18:54.838572 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 13 14:18:54 crc kubenswrapper[4898]: I0313 14:18:54.838594 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-74z8j" Mar 13 14:18:54 crc kubenswrapper[4898]: I0313 14:18:54.838937 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 13 14:18:54 crc kubenswrapper[4898]: I0313 14:18:54.843037 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-j79bj"] Mar 13 14:18:54 crc kubenswrapper[4898]: I0313 14:18:54.890317 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-r9tmf"] Mar 13 14:18:54 crc kubenswrapper[4898]: I0313 14:18:54.894742 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-r9tmf" Mar 13 14:18:54 crc kubenswrapper[4898]: I0313 14:18:54.908845 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-r9tmf"] Mar 13 14:18:54 crc kubenswrapper[4898]: I0313 14:18:54.960557 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a506ef1a-354a-49c8-b63d-4db4b9ecdcfe-scripts\") pod \"ovn-controller-j79bj\" (UID: \"a506ef1a-354a-49c8-b63d-4db4b9ecdcfe\") " pod="openstack/ovn-controller-j79bj" Mar 13 14:18:54 crc kubenswrapper[4898]: I0313 14:18:54.960618 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a506ef1a-354a-49c8-b63d-4db4b9ecdcfe-var-log-ovn\") pod \"ovn-controller-j79bj\" (UID: \"a506ef1a-354a-49c8-b63d-4db4b9ecdcfe\") " pod="openstack/ovn-controller-j79bj" Mar 13 14:18:54 crc kubenswrapper[4898]: I0313 14:18:54.960651 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5kxv\" (UniqueName: \"kubernetes.io/projected/a506ef1a-354a-49c8-b63d-4db4b9ecdcfe-kube-api-access-s5kxv\") pod \"ovn-controller-j79bj\" (UID: \"a506ef1a-354a-49c8-b63d-4db4b9ecdcfe\") " pod="openstack/ovn-controller-j79bj" Mar 13 14:18:54 crc kubenswrapper[4898]: I0313 14:18:54.960670 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f71b72a8-f179-454c-8d2e-4ac829842622-var-run\") pod \"ovn-controller-ovs-r9tmf\" (UID: \"f71b72a8-f179-454c-8d2e-4ac829842622\") " pod="openstack/ovn-controller-ovs-r9tmf" Mar 13 14:18:54 crc kubenswrapper[4898]: I0313 14:18:54.960699 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: 
\"kubernetes.io/host-path/f71b72a8-f179-454c-8d2e-4ac829842622-etc-ovs\") pod \"ovn-controller-ovs-r9tmf\" (UID: \"f71b72a8-f179-454c-8d2e-4ac829842622\") " pod="openstack/ovn-controller-ovs-r9tmf" Mar 13 14:18:54 crc kubenswrapper[4898]: I0313 14:18:54.960776 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a506ef1a-354a-49c8-b63d-4db4b9ecdcfe-var-run-ovn\") pod \"ovn-controller-j79bj\" (UID: \"a506ef1a-354a-49c8-b63d-4db4b9ecdcfe\") " pod="openstack/ovn-controller-j79bj" Mar 13 14:18:54 crc kubenswrapper[4898]: I0313 14:18:54.960796 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/a506ef1a-354a-49c8-b63d-4db4b9ecdcfe-ovn-controller-tls-certs\") pod \"ovn-controller-j79bj\" (UID: \"a506ef1a-354a-49c8-b63d-4db4b9ecdcfe\") " pod="openstack/ovn-controller-j79bj" Mar 13 14:18:54 crc kubenswrapper[4898]: I0313 14:18:54.960825 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f71b72a8-f179-454c-8d2e-4ac829842622-var-log\") pod \"ovn-controller-ovs-r9tmf\" (UID: \"f71b72a8-f179-454c-8d2e-4ac829842622\") " pod="openstack/ovn-controller-ovs-r9tmf" Mar 13 14:18:54 crc kubenswrapper[4898]: I0313 14:18:54.960839 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4f6dm\" (UniqueName: \"kubernetes.io/projected/f71b72a8-f179-454c-8d2e-4ac829842622-kube-api-access-4f6dm\") pod \"ovn-controller-ovs-r9tmf\" (UID: \"f71b72a8-f179-454c-8d2e-4ac829842622\") " pod="openstack/ovn-controller-ovs-r9tmf" Mar 13 14:18:54 crc kubenswrapper[4898]: I0313 14:18:54.960874 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/f71b72a8-f179-454c-8d2e-4ac829842622-scripts\") pod \"ovn-controller-ovs-r9tmf\" (UID: \"f71b72a8-f179-454c-8d2e-4ac829842622\") " pod="openstack/ovn-controller-ovs-r9tmf" Mar 13 14:18:54 crc kubenswrapper[4898]: I0313 14:18:54.960982 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a506ef1a-354a-49c8-b63d-4db4b9ecdcfe-combined-ca-bundle\") pod \"ovn-controller-j79bj\" (UID: \"a506ef1a-354a-49c8-b63d-4db4b9ecdcfe\") " pod="openstack/ovn-controller-j79bj" Mar 13 14:18:54 crc kubenswrapper[4898]: I0313 14:18:54.961495 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/f71b72a8-f179-454c-8d2e-4ac829842622-var-lib\") pod \"ovn-controller-ovs-r9tmf\" (UID: \"f71b72a8-f179-454c-8d2e-4ac829842622\") " pod="openstack/ovn-controller-ovs-r9tmf" Mar 13 14:18:54 crc kubenswrapper[4898]: I0313 14:18:54.961513 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a506ef1a-354a-49c8-b63d-4db4b9ecdcfe-var-run\") pod \"ovn-controller-j79bj\" (UID: \"a506ef1a-354a-49c8-b63d-4db4b9ecdcfe\") " pod="openstack/ovn-controller-j79bj" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.063877 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f71b72a8-f179-454c-8d2e-4ac829842622-var-log\") pod \"ovn-controller-ovs-r9tmf\" (UID: \"f71b72a8-f179-454c-8d2e-4ac829842622\") " pod="openstack/ovn-controller-ovs-r9tmf" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.063955 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4f6dm\" (UniqueName: \"kubernetes.io/projected/f71b72a8-f179-454c-8d2e-4ac829842622-kube-api-access-4f6dm\") 
pod \"ovn-controller-ovs-r9tmf\" (UID: \"f71b72a8-f179-454c-8d2e-4ac829842622\") " pod="openstack/ovn-controller-ovs-r9tmf" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.064024 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f71b72a8-f179-454c-8d2e-4ac829842622-scripts\") pod \"ovn-controller-ovs-r9tmf\" (UID: \"f71b72a8-f179-454c-8d2e-4ac829842622\") " pod="openstack/ovn-controller-ovs-r9tmf" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.064069 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a506ef1a-354a-49c8-b63d-4db4b9ecdcfe-combined-ca-bundle\") pod \"ovn-controller-j79bj\" (UID: \"a506ef1a-354a-49c8-b63d-4db4b9ecdcfe\") " pod="openstack/ovn-controller-j79bj" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.064170 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/f71b72a8-f179-454c-8d2e-4ac829842622-var-lib\") pod \"ovn-controller-ovs-r9tmf\" (UID: \"f71b72a8-f179-454c-8d2e-4ac829842622\") " pod="openstack/ovn-controller-ovs-r9tmf" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.064196 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a506ef1a-354a-49c8-b63d-4db4b9ecdcfe-var-run\") pod \"ovn-controller-j79bj\" (UID: \"a506ef1a-354a-49c8-b63d-4db4b9ecdcfe\") " pod="openstack/ovn-controller-j79bj" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.064263 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a506ef1a-354a-49c8-b63d-4db4b9ecdcfe-scripts\") pod \"ovn-controller-j79bj\" (UID: \"a506ef1a-354a-49c8-b63d-4db4b9ecdcfe\") " pod="openstack/ovn-controller-j79bj" Mar 13 14:18:55 crc kubenswrapper[4898]: 
I0313 14:18:55.064308 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a506ef1a-354a-49c8-b63d-4db4b9ecdcfe-var-log-ovn\") pod \"ovn-controller-j79bj\" (UID: \"a506ef1a-354a-49c8-b63d-4db4b9ecdcfe\") " pod="openstack/ovn-controller-j79bj" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.064348 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5kxv\" (UniqueName: \"kubernetes.io/projected/a506ef1a-354a-49c8-b63d-4db4b9ecdcfe-kube-api-access-s5kxv\") pod \"ovn-controller-j79bj\" (UID: \"a506ef1a-354a-49c8-b63d-4db4b9ecdcfe\") " pod="openstack/ovn-controller-j79bj" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.064378 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f71b72a8-f179-454c-8d2e-4ac829842622-var-run\") pod \"ovn-controller-ovs-r9tmf\" (UID: \"f71b72a8-f179-454c-8d2e-4ac829842622\") " pod="openstack/ovn-controller-ovs-r9tmf" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.064418 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/f71b72a8-f179-454c-8d2e-4ac829842622-etc-ovs\") pod \"ovn-controller-ovs-r9tmf\" (UID: \"f71b72a8-f179-454c-8d2e-4ac829842622\") " pod="openstack/ovn-controller-ovs-r9tmf" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.064454 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a506ef1a-354a-49c8-b63d-4db4b9ecdcfe-var-run-ovn\") pod \"ovn-controller-j79bj\" (UID: \"a506ef1a-354a-49c8-b63d-4db4b9ecdcfe\") " pod="openstack/ovn-controller-j79bj" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.064473 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/a506ef1a-354a-49c8-b63d-4db4b9ecdcfe-ovn-controller-tls-certs\") pod \"ovn-controller-j79bj\" (UID: \"a506ef1a-354a-49c8-b63d-4db4b9ecdcfe\") " pod="openstack/ovn-controller-j79bj" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.066675 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a506ef1a-354a-49c8-b63d-4db4b9ecdcfe-var-run\") pod \"ovn-controller-j79bj\" (UID: \"a506ef1a-354a-49c8-b63d-4db4b9ecdcfe\") " pod="openstack/ovn-controller-j79bj" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.068591 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f71b72a8-f179-454c-8d2e-4ac829842622-scripts\") pod \"ovn-controller-ovs-r9tmf\" (UID: \"f71b72a8-f179-454c-8d2e-4ac829842622\") " pod="openstack/ovn-controller-ovs-r9tmf" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.068758 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f71b72a8-f179-454c-8d2e-4ac829842622-var-log\") pod \"ovn-controller-ovs-r9tmf\" (UID: \"f71b72a8-f179-454c-8d2e-4ac829842622\") " pod="openstack/ovn-controller-ovs-r9tmf" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.069195 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/f71b72a8-f179-454c-8d2e-4ac829842622-var-lib\") pod \"ovn-controller-ovs-r9tmf\" (UID: \"f71b72a8-f179-454c-8d2e-4ac829842622\") " pod="openstack/ovn-controller-ovs-r9tmf" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.071317 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a506ef1a-354a-49c8-b63d-4db4b9ecdcfe-combined-ca-bundle\") pod \"ovn-controller-j79bj\" (UID: \"a506ef1a-354a-49c8-b63d-4db4b9ecdcfe\") " 
pod="openstack/ovn-controller-j79bj" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.071446 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a506ef1a-354a-49c8-b63d-4db4b9ecdcfe-scripts\") pod \"ovn-controller-j79bj\" (UID: \"a506ef1a-354a-49c8-b63d-4db4b9ecdcfe\") " pod="openstack/ovn-controller-j79bj" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.071479 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/f71b72a8-f179-454c-8d2e-4ac829842622-etc-ovs\") pod \"ovn-controller-ovs-r9tmf\" (UID: \"f71b72a8-f179-454c-8d2e-4ac829842622\") " pod="openstack/ovn-controller-ovs-r9tmf" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.071525 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f71b72a8-f179-454c-8d2e-4ac829842622-var-run\") pod \"ovn-controller-ovs-r9tmf\" (UID: \"f71b72a8-f179-454c-8d2e-4ac829842622\") " pod="openstack/ovn-controller-ovs-r9tmf" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.071549 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a506ef1a-354a-49c8-b63d-4db4b9ecdcfe-var-log-ovn\") pod \"ovn-controller-j79bj\" (UID: \"a506ef1a-354a-49c8-b63d-4db4b9ecdcfe\") " pod="openstack/ovn-controller-j79bj" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.071597 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a506ef1a-354a-49c8-b63d-4db4b9ecdcfe-var-run-ovn\") pod \"ovn-controller-j79bj\" (UID: \"a506ef1a-354a-49c8-b63d-4db4b9ecdcfe\") " pod="openstack/ovn-controller-j79bj" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.092977 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/a506ef1a-354a-49c8-b63d-4db4b9ecdcfe-ovn-controller-tls-certs\") pod \"ovn-controller-j79bj\" (UID: \"a506ef1a-354a-49c8-b63d-4db4b9ecdcfe\") " pod="openstack/ovn-controller-j79bj" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.099328 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5kxv\" (UniqueName: \"kubernetes.io/projected/a506ef1a-354a-49c8-b63d-4db4b9ecdcfe-kube-api-access-s5kxv\") pod \"ovn-controller-j79bj\" (UID: \"a506ef1a-354a-49c8-b63d-4db4b9ecdcfe\") " pod="openstack/ovn-controller-j79bj" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.100627 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4f6dm\" (UniqueName: \"kubernetes.io/projected/f71b72a8-f179-454c-8d2e-4ac829842622-kube-api-access-4f6dm\") pod \"ovn-controller-ovs-r9tmf\" (UID: \"f71b72a8-f179-454c-8d2e-4ac829842622\") " pod="openstack/ovn-controller-ovs-r9tmf" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.169053 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-j79bj" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.216971 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-r9tmf" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.440814 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.443049 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.445758 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-fkzxt" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.445840 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.445850 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.445770 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.446295 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.448976 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.575025 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-61cc1b1a-16a4-4a15-a961-1c115f8ab82c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-61cc1b1a-16a4-4a15-a961-1c115f8ab82c\") pod \"ovsdbserver-nb-0\" (UID: \"280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10\") " pod="openstack/ovsdbserver-nb-0" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.575158 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbn2v\" (UniqueName: \"kubernetes.io/projected/280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10-kube-api-access-vbn2v\") pod \"ovsdbserver-nb-0\" (UID: \"280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10\") " pod="openstack/ovsdbserver-nb-0" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.575182 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10-config\") pod \"ovsdbserver-nb-0\" (UID: \"280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10\") " pod="openstack/ovsdbserver-nb-0" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.575212 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10\") " pod="openstack/ovsdbserver-nb-0" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.575227 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10\") " pod="openstack/ovsdbserver-nb-0" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.575259 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10\") " pod="openstack/ovsdbserver-nb-0" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.575322 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10\") " pod="openstack/ovsdbserver-nb-0" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.575353 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10\") " pod="openstack/ovsdbserver-nb-0" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.676688 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10\") " pod="openstack/ovsdbserver-nb-0" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.676783 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10\") " pod="openstack/ovsdbserver-nb-0" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.676812 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10\") " pod="openstack/ovsdbserver-nb-0" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.676871 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-61cc1b1a-16a4-4a15-a961-1c115f8ab82c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-61cc1b1a-16a4-4a15-a961-1c115f8ab82c\") pod \"ovsdbserver-nb-0\" (UID: \"280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10\") " pod="openstack/ovsdbserver-nb-0" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.676992 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbn2v\" (UniqueName: 
\"kubernetes.io/projected/280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10-kube-api-access-vbn2v\") pod \"ovsdbserver-nb-0\" (UID: \"280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10\") " pod="openstack/ovsdbserver-nb-0" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.677016 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10-config\") pod \"ovsdbserver-nb-0\" (UID: \"280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10\") " pod="openstack/ovsdbserver-nb-0" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.677050 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10\") " pod="openstack/ovsdbserver-nb-0" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.677077 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10\") " pod="openstack/ovsdbserver-nb-0" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.678082 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10\") " pod="openstack/ovsdbserver-nb-0" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.678463 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10-config\") pod \"ovsdbserver-nb-0\" (UID: \"280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10\") " pod="openstack/ovsdbserver-nb-0" Mar 13 14:18:55 
crc kubenswrapper[4898]: I0313 14:18:55.678754 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10\") " pod="openstack/ovsdbserver-nb-0" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.681143 4898 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.681172 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-61cc1b1a-16a4-4a15-a961-1c115f8ab82c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-61cc1b1a-16a4-4a15-a961-1c115f8ab82c\") pod \"ovsdbserver-nb-0\" (UID: \"280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f945661a94c6edc4b169d60f552ff5af0e79f3b05828be23f5404777cfe64975/globalmount\"" pod="openstack/ovsdbserver-nb-0" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.681654 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10\") " pod="openstack/ovsdbserver-nb-0" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.683098 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10\") " pod="openstack/ovsdbserver-nb-0" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.686476 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10\") " pod="openstack/ovsdbserver-nb-0" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.699386 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbn2v\" (UniqueName: \"kubernetes.io/projected/280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10-kube-api-access-vbn2v\") pod \"ovsdbserver-nb-0\" (UID: \"280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10\") " pod="openstack/ovsdbserver-nb-0" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.721769 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-61cc1b1a-16a4-4a15-a961-1c115f8ab82c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-61cc1b1a-16a4-4a15-a961-1c115f8ab82c\") pod \"ovsdbserver-nb-0\" (UID: \"280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10\") " pod="openstack/ovsdbserver-nb-0" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.773119 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 13 14:18:59 crc kubenswrapper[4898]: I0313 14:18:59.094883 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 13 14:18:59 crc kubenswrapper[4898]: I0313 14:18:59.098264 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 13 14:18:59 crc kubenswrapper[4898]: I0313 14:18:59.100561 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-7xr2x" Mar 13 14:18:59 crc kubenswrapper[4898]: I0313 14:18:59.103185 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 13 14:18:59 crc kubenswrapper[4898]: I0313 14:18:59.103483 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 13 14:18:59 crc kubenswrapper[4898]: I0313 14:18:59.103654 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 13 14:18:59 crc kubenswrapper[4898]: I0313 14:18:59.115467 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 13 14:18:59 crc kubenswrapper[4898]: I0313 14:18:59.152117 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/111bf23f-be00-46ab-97fe-a36465735164-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"111bf23f-be00-46ab-97fe-a36465735164\") " pod="openstack/ovsdbserver-sb-0" Mar 13 14:18:59 crc kubenswrapper[4898]: I0313 14:18:59.152158 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/111bf23f-be00-46ab-97fe-a36465735164-config\") pod \"ovsdbserver-sb-0\" (UID: \"111bf23f-be00-46ab-97fe-a36465735164\") " pod="openstack/ovsdbserver-sb-0" Mar 13 14:18:59 crc kubenswrapper[4898]: I0313 14:18:59.152187 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-b47f8b75-85d2-4ff3-8129-97e8e65a1c3b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b47f8b75-85d2-4ff3-8129-97e8e65a1c3b\") pod 
\"ovsdbserver-sb-0\" (UID: \"111bf23f-be00-46ab-97fe-a36465735164\") " pod="openstack/ovsdbserver-sb-0" Mar 13 14:18:59 crc kubenswrapper[4898]: I0313 14:18:59.152211 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/111bf23f-be00-46ab-97fe-a36465735164-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"111bf23f-be00-46ab-97fe-a36465735164\") " pod="openstack/ovsdbserver-sb-0" Mar 13 14:18:59 crc kubenswrapper[4898]: I0313 14:18:59.152558 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/111bf23f-be00-46ab-97fe-a36465735164-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"111bf23f-be00-46ab-97fe-a36465735164\") " pod="openstack/ovsdbserver-sb-0" Mar 13 14:18:59 crc kubenswrapper[4898]: I0313 14:18:59.152894 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/111bf23f-be00-46ab-97fe-a36465735164-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"111bf23f-be00-46ab-97fe-a36465735164\") " pod="openstack/ovsdbserver-sb-0" Mar 13 14:18:59 crc kubenswrapper[4898]: I0313 14:18:59.153203 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k66bs\" (UniqueName: \"kubernetes.io/projected/111bf23f-be00-46ab-97fe-a36465735164-kube-api-access-k66bs\") pod \"ovsdbserver-sb-0\" (UID: \"111bf23f-be00-46ab-97fe-a36465735164\") " pod="openstack/ovsdbserver-sb-0" Mar 13 14:18:59 crc kubenswrapper[4898]: I0313 14:18:59.153340 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/111bf23f-be00-46ab-97fe-a36465735164-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: 
\"111bf23f-be00-46ab-97fe-a36465735164\") " pod="openstack/ovsdbserver-sb-0" Mar 13 14:18:59 crc kubenswrapper[4898]: I0313 14:18:59.254205 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/111bf23f-be00-46ab-97fe-a36465735164-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"111bf23f-be00-46ab-97fe-a36465735164\") " pod="openstack/ovsdbserver-sb-0" Mar 13 14:18:59 crc kubenswrapper[4898]: I0313 14:18:59.254257 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/111bf23f-be00-46ab-97fe-a36465735164-config\") pod \"ovsdbserver-sb-0\" (UID: \"111bf23f-be00-46ab-97fe-a36465735164\") " pod="openstack/ovsdbserver-sb-0" Mar 13 14:18:59 crc kubenswrapper[4898]: I0313 14:18:59.254293 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-b47f8b75-85d2-4ff3-8129-97e8e65a1c3b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b47f8b75-85d2-4ff3-8129-97e8e65a1c3b\") pod \"ovsdbserver-sb-0\" (UID: \"111bf23f-be00-46ab-97fe-a36465735164\") " pod="openstack/ovsdbserver-sb-0" Mar 13 14:18:59 crc kubenswrapper[4898]: I0313 14:18:59.254327 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/111bf23f-be00-46ab-97fe-a36465735164-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"111bf23f-be00-46ab-97fe-a36465735164\") " pod="openstack/ovsdbserver-sb-0" Mar 13 14:18:59 crc kubenswrapper[4898]: I0313 14:18:59.254391 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/111bf23f-be00-46ab-97fe-a36465735164-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"111bf23f-be00-46ab-97fe-a36465735164\") " pod="openstack/ovsdbserver-sb-0" Mar 13 14:18:59 crc kubenswrapper[4898]: I0313 
14:18:59.254454 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/111bf23f-be00-46ab-97fe-a36465735164-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"111bf23f-be00-46ab-97fe-a36465735164\") " pod="openstack/ovsdbserver-sb-0" Mar 13 14:18:59 crc kubenswrapper[4898]: I0313 14:18:59.254491 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k66bs\" (UniqueName: \"kubernetes.io/projected/111bf23f-be00-46ab-97fe-a36465735164-kube-api-access-k66bs\") pod \"ovsdbserver-sb-0\" (UID: \"111bf23f-be00-46ab-97fe-a36465735164\") " pod="openstack/ovsdbserver-sb-0" Mar 13 14:18:59 crc kubenswrapper[4898]: I0313 14:18:59.254591 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/111bf23f-be00-46ab-97fe-a36465735164-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"111bf23f-be00-46ab-97fe-a36465735164\") " pod="openstack/ovsdbserver-sb-0" Mar 13 14:18:59 crc kubenswrapper[4898]: I0313 14:18:59.255937 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/111bf23f-be00-46ab-97fe-a36465735164-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"111bf23f-be00-46ab-97fe-a36465735164\") " pod="openstack/ovsdbserver-sb-0" Mar 13 14:18:59 crc kubenswrapper[4898]: I0313 14:18:59.256964 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/111bf23f-be00-46ab-97fe-a36465735164-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"111bf23f-be00-46ab-97fe-a36465735164\") " pod="openstack/ovsdbserver-sb-0" Mar 13 14:18:59 crc kubenswrapper[4898]: I0313 14:18:59.258759 4898 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 13 14:18:59 crc kubenswrapper[4898]: I0313 14:18:59.258792 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b47f8b75-85d2-4ff3-8129-97e8e65a1c3b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b47f8b75-85d2-4ff3-8129-97e8e65a1c3b\") pod \"ovsdbserver-sb-0\" (UID: \"111bf23f-be00-46ab-97fe-a36465735164\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/5552e6bada886dd6f04199b2714e9f5be58976ce3ffc4ce0948de79ca5058217/globalmount\"" pod="openstack/ovsdbserver-sb-0" Mar 13 14:18:59 crc kubenswrapper[4898]: I0313 14:18:59.262577 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/111bf23f-be00-46ab-97fe-a36465735164-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"111bf23f-be00-46ab-97fe-a36465735164\") " pod="openstack/ovsdbserver-sb-0" Mar 13 14:18:59 crc kubenswrapper[4898]: I0313 14:18:59.264248 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/111bf23f-be00-46ab-97fe-a36465735164-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"111bf23f-be00-46ab-97fe-a36465735164\") " pod="openstack/ovsdbserver-sb-0" Mar 13 14:18:59 crc kubenswrapper[4898]: I0313 14:18:59.264499 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/111bf23f-be00-46ab-97fe-a36465735164-config\") pod \"ovsdbserver-sb-0\" (UID: \"111bf23f-be00-46ab-97fe-a36465735164\") " pod="openstack/ovsdbserver-sb-0" Mar 13 14:18:59 crc kubenswrapper[4898]: I0313 14:18:59.270441 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/111bf23f-be00-46ab-97fe-a36465735164-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"111bf23f-be00-46ab-97fe-a36465735164\") " 
pod="openstack/ovsdbserver-sb-0" Mar 13 14:18:59 crc kubenswrapper[4898]: I0313 14:18:59.283740 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k66bs\" (UniqueName: \"kubernetes.io/projected/111bf23f-be00-46ab-97fe-a36465735164-kube-api-access-k66bs\") pod \"ovsdbserver-sb-0\" (UID: \"111bf23f-be00-46ab-97fe-a36465735164\") " pod="openstack/ovsdbserver-sb-0" Mar 13 14:18:59 crc kubenswrapper[4898]: I0313 14:18:59.305451 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b47f8b75-85d2-4ff3-8129-97e8e65a1c3b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b47f8b75-85d2-4ff3-8129-97e8e65a1c3b\") pod \"ovsdbserver-sb-0\" (UID: \"111bf23f-be00-46ab-97fe-a36465735164\") " pod="openstack/ovsdbserver-sb-0" Mar 13 14:18:59 crc kubenswrapper[4898]: I0313 14:18:59.415250 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 13 14:19:03 crc kubenswrapper[4898]: I0313 14:19:03.558864 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 13 14:19:03 crc kubenswrapper[4898]: I0313 14:19:03.677585 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 13 14:19:07 crc kubenswrapper[4898]: E0313 14:19:07.702250 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 13 14:19:07 crc kubenswrapper[4898]: E0313 14:19:07.702738 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 
--log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sn79l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-hk9w4_openstack(9e544d1f-357e-4751-88bb-5108430b52cb): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 14:19:07 crc 
kubenswrapper[4898]: E0313 14:19:07.704406 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-hk9w4" podUID="9e544d1f-357e-4751-88bb-5108430b52cb" Mar 13 14:19:07 crc kubenswrapper[4898]: E0313 14:19:07.707409 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 13 14:19:07 crc kubenswrapper[4898]: E0313 14:19:07.707563 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v5ll2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5ccc8479f9-nmlp6_openstack(c17db307-7a8a-4585-9696-a9ef96b6ba0b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 14:19:07 crc kubenswrapper[4898]: E0313 14:19:07.708825 4898 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-5ccc8479f9-nmlp6" podUID="c17db307-7a8a-4585-9696-a9ef96b6ba0b" Mar 13 14:19:07 crc kubenswrapper[4898]: E0313 14:19:07.709309 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 13 14:19:07 crc kubenswrapper[4898]: E0313 14:19:07.709420 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-glpmk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-npnz4_openstack(b5005de8-b440-45e8-a1a7-7943f68bff2f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 14:19:07 crc kubenswrapper[4898]: E0313 14:19:07.710674 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-npnz4" podUID="b5005de8-b440-45e8-a1a7-7943f68bff2f" Mar 13 14:19:08 crc kubenswrapper[4898]: E0313 14:19:08.377027 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-5ccc8479f9-nmlp6" podUID="c17db307-7a8a-4585-9696-a9ef96b6ba0b" Mar 13 14:19:08 crc kubenswrapper[4898]: E0313 14:19:08.377033 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-hk9w4" podUID="9e544d1f-357e-4751-88bb-5108430b52cb" Mar 13 14:19:08 crc kubenswrapper[4898]: W0313 14:19:08.845656 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4e010381_921d_4328_9027_ddb9a54a08bd.slice/crio-a514287f4abd02abdb35f5efc576784d286f154281545d6e3b18397fdacfa325 WatchSource:0}: Error finding container a514287f4abd02abdb35f5efc576784d286f154281545d6e3b18397fdacfa325: Status 404 returned error can't find the container with id a514287f4abd02abdb35f5efc576784d286f154281545d6e3b18397fdacfa325 Mar 13 14:19:08 crc kubenswrapper[4898]: E0313 14:19:08.877381 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 13 14:19:08 crc kubenswrapper[4898]: E0313 14:19:08.877865 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* 
--conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h6xth,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
dnsmasq-dns-78dd6ddcc-5jhct_openstack(70dc5baf-6ae1-41b4-9454-8ff891570f8b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 14:19:08 crc kubenswrapper[4898]: E0313 14:19:08.879869 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-5jhct" podUID="70dc5baf-6ae1-41b4-9454-8ff891570f8b" Mar 13 14:19:09 crc kubenswrapper[4898]: I0313 14:19:09.133506 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-npnz4" Mar 13 14:19:09 crc kubenswrapper[4898]: I0313 14:19:09.213367 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-glpmk\" (UniqueName: \"kubernetes.io/projected/b5005de8-b440-45e8-a1a7-7943f68bff2f-kube-api-access-glpmk\") pod \"b5005de8-b440-45e8-a1a7-7943f68bff2f\" (UID: \"b5005de8-b440-45e8-a1a7-7943f68bff2f\") " Mar 13 14:19:09 crc kubenswrapper[4898]: I0313 14:19:09.213529 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5005de8-b440-45e8-a1a7-7943f68bff2f-config\") pod \"b5005de8-b440-45e8-a1a7-7943f68bff2f\" (UID: \"b5005de8-b440-45e8-a1a7-7943f68bff2f\") " Mar 13 14:19:09 crc kubenswrapper[4898]: I0313 14:19:09.215073 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5005de8-b440-45e8-a1a7-7943f68bff2f-config" (OuterVolumeSpecName: "config") pod "b5005de8-b440-45e8-a1a7-7943f68bff2f" (UID: "b5005de8-b440-45e8-a1a7-7943f68bff2f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:19:09 crc kubenswrapper[4898]: I0313 14:19:09.219596 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5005de8-b440-45e8-a1a7-7943f68bff2f-kube-api-access-glpmk" (OuterVolumeSpecName: "kube-api-access-glpmk") pod "b5005de8-b440-45e8-a1a7-7943f68bff2f" (UID: "b5005de8-b440-45e8-a1a7-7943f68bff2f"). InnerVolumeSpecName "kube-api-access-glpmk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:19:09 crc kubenswrapper[4898]: I0313 14:19:09.316133 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-glpmk\" (UniqueName: \"kubernetes.io/projected/b5005de8-b440-45e8-a1a7-7943f68bff2f-kube-api-access-glpmk\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:09 crc kubenswrapper[4898]: I0313 14:19:09.316436 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5005de8-b440-45e8-a1a7-7943f68bff2f-config\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:09 crc kubenswrapper[4898]: I0313 14:19:09.384652 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e5d53cf3-113e-4391-b3a9-4e1f81836e26","Type":"ContainerStarted","Data":"6ec3991f9b81a553bddc7e8dd5637b2e9d2c74118a175bf76255584779d5faf2"} Mar 13 14:19:09 crc kubenswrapper[4898]: I0313 14:19:09.385890 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-npnz4" event={"ID":"b5005de8-b440-45e8-a1a7-7943f68bff2f","Type":"ContainerDied","Data":"64fa491215cb47b1ae2d24aaa439a281949404c6099493e0af3976068bb840f1"} Mar 13 14:19:09 crc kubenswrapper[4898]: I0313 14:19:09.385954 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-npnz4" Mar 13 14:19:09 crc kubenswrapper[4898]: I0313 14:19:09.388887 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4e010381-921d-4328-9027-ddb9a54a08bd","Type":"ContainerStarted","Data":"a514287f4abd02abdb35f5efc576784d286f154281545d6e3b18397fdacfa325"} Mar 13 14:19:09 crc kubenswrapper[4898]: I0313 14:19:09.465602 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-npnz4"] Mar 13 14:19:09 crc kubenswrapper[4898]: I0313 14:19:09.473839 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-npnz4"] Mar 13 14:19:09 crc kubenswrapper[4898]: I0313 14:19:09.754935 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5005de8-b440-45e8-a1a7-7943f68bff2f" path="/var/lib/kubelet/pods/b5005de8-b440-45e8-a1a7-7943f68bff2f/volumes" Mar 13 14:19:10 crc kubenswrapper[4898]: I0313 14:19:10.222825 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-699d95d586-ds75f"] Mar 13 14:19:10 crc kubenswrapper[4898]: I0313 14:19:10.238539 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 13 14:19:10 crc kubenswrapper[4898]: I0313 14:19:10.462575 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 13 14:19:10 crc kubenswrapper[4898]: I0313 14:19:10.484940 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-ui-dashboards-66cbf594b5-gpj8b"] Mar 13 14:19:10 crc kubenswrapper[4898]: W0313 14:19:10.534309 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab8664f8_1960_4442_9fdd_9711ec963e1f.slice/crio-75730ddce867cc2a105b1b1dc663ae54eddf209d10b2609fd0662d3aa6f813d4 WatchSource:0}: Error finding container 
75730ddce867cc2a105b1b1dc663ae54eddf209d10b2609fd0662d3aa6f813d4: Status 404 returned error can't find the container with id 75730ddce867cc2a105b1b1dc663ae54eddf209d10b2609fd0662d3aa6f813d4 Mar 13 14:19:10 crc kubenswrapper[4898]: W0313 14:19:10.641804 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf526abbc_e646_48b4_afa8_7f95f4a607a0.slice/crio-e971de8bfe5dba96efe9215fb1b5480439450e667cc052958c51747ab17b2279 WatchSource:0}: Error finding container e971de8bfe5dba96efe9215fb1b5480439450e667cc052958c51747ab17b2279: Status 404 returned error can't find the container with id e971de8bfe5dba96efe9215fb1b5480439450e667cc052958c51747ab17b2279 Mar 13 14:19:10 crc kubenswrapper[4898]: I0313 14:19:10.838176 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 13 14:19:10 crc kubenswrapper[4898]: I0313 14:19:10.853760 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 13 14:19:10 crc kubenswrapper[4898]: I0313 14:19:10.861734 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-j79bj"] Mar 13 14:19:10 crc kubenswrapper[4898]: I0313 14:19:10.938394 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 13 14:19:11 crc kubenswrapper[4898]: I0313 14:19:11.232197 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-5jhct" Mar 13 14:19:11 crc kubenswrapper[4898]: I0313 14:19:11.377465 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6xth\" (UniqueName: \"kubernetes.io/projected/70dc5baf-6ae1-41b4-9454-8ff891570f8b-kube-api-access-h6xth\") pod \"70dc5baf-6ae1-41b4-9454-8ff891570f8b\" (UID: \"70dc5baf-6ae1-41b4-9454-8ff891570f8b\") " Mar 13 14:19:11 crc kubenswrapper[4898]: I0313 14:19:11.377557 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/70dc5baf-6ae1-41b4-9454-8ff891570f8b-dns-svc\") pod \"70dc5baf-6ae1-41b4-9454-8ff891570f8b\" (UID: \"70dc5baf-6ae1-41b4-9454-8ff891570f8b\") " Mar 13 14:19:11 crc kubenswrapper[4898]: I0313 14:19:11.377650 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70dc5baf-6ae1-41b4-9454-8ff891570f8b-config\") pod \"70dc5baf-6ae1-41b4-9454-8ff891570f8b\" (UID: \"70dc5baf-6ae1-41b4-9454-8ff891570f8b\") " Mar 13 14:19:11 crc kubenswrapper[4898]: I0313 14:19:11.378733 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70dc5baf-6ae1-41b4-9454-8ff891570f8b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "70dc5baf-6ae1-41b4-9454-8ff891570f8b" (UID: "70dc5baf-6ae1-41b4-9454-8ff891570f8b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:19:11 crc kubenswrapper[4898]: I0313 14:19:11.379224 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70dc5baf-6ae1-41b4-9454-8ff891570f8b-config" (OuterVolumeSpecName: "config") pod "70dc5baf-6ae1-41b4-9454-8ff891570f8b" (UID: "70dc5baf-6ae1-41b4-9454-8ff891570f8b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:19:11 crc kubenswrapper[4898]: I0313 14:19:11.386177 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70dc5baf-6ae1-41b4-9454-8ff891570f8b-kube-api-access-h6xth" (OuterVolumeSpecName: "kube-api-access-h6xth") pod "70dc5baf-6ae1-41b4-9454-8ff891570f8b" (UID: "70dc5baf-6ae1-41b4-9454-8ff891570f8b"). InnerVolumeSpecName "kube-api-access-h6xth". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:19:11 crc kubenswrapper[4898]: I0313 14:19:11.410339 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"111bf23f-be00-46ab-97fe-a36465735164","Type":"ContainerStarted","Data":"c49368ae5e9c18d425c8b1f12cab5cd1fe934aeb23fb0840878f1b31b1ed9ad0"} Mar 13 14:19:11 crc kubenswrapper[4898]: I0313 14:19:11.411731 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10","Type":"ContainerStarted","Data":"4267217032c3f6604cd9f5db4299e357d2c991800cdea8a6a36a9dcfc0d8c5b4"} Mar 13 14:19:11 crc kubenswrapper[4898]: I0313 14:19:11.412762 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"67ef28b0-acc3-400e-8296-a541fc3b89f0","Type":"ContainerStarted","Data":"ae3b6555dcb0c381cf3215392eba070575db3d273cf0ed579abe9ea6ca84b2d1"} Mar 13 14:19:11 crc kubenswrapper[4898]: I0313 14:19:11.413975 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"ee084354-4d32-4d3c-96a4-1e4e7eef5d85","Type":"ContainerStarted","Data":"319d11416db34d4c2bde21b35bf9b79fc6c55b22cfe14271a9be5dde11f3c078"} Mar 13 14:19:11 crc kubenswrapper[4898]: I0313 14:19:11.416929 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b","Type":"ContainerStarted","Data":"8e18090ad1757c0b15ba6a519121358ec8fea5c9816c6426d3dd165832b431af"} Mar 13 14:19:11 crc kubenswrapper[4898]: I0313 14:19:11.418098 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-699d95d586-ds75f" event={"ID":"ab8664f8-1960-4442-9fdd-9711ec963e1f","Type":"ContainerStarted","Data":"75730ddce867cc2a105b1b1dc663ae54eddf209d10b2609fd0662d3aa6f813d4"} Mar 13 14:19:11 crc kubenswrapper[4898]: I0313 14:19:11.449597 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-j79bj" event={"ID":"a506ef1a-354a-49c8-b63d-4db4b9ecdcfe","Type":"ContainerStarted","Data":"288e2eb84d62aa584721a32f93bb793b9f73c64641090a579d0f6582ae88a0dc"} Mar 13 14:19:11 crc kubenswrapper[4898]: I0313 14:19:11.452133 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f","Type":"ContainerStarted","Data":"70416c4b1d2425f9af76478ec404b3bc01a480bde401890ed595660f8f4ec3f7"} Mar 13 14:19:11 crc kubenswrapper[4898]: I0313 14:19:11.471498 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"818e3f41-30c4-4a49-b490-0d868fc2b2b8","Type":"ContainerStarted","Data":"6ac94c751f27a4d12d02923377c883f4669b7b2f835e8c6d8eb98e37f2b620ef"} Mar 13 14:19:11 crc kubenswrapper[4898]: I0313 14:19:11.476671 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f526abbc-e646-48b4-afa8-7f95f4a607a0","Type":"ContainerStarted","Data":"e971de8bfe5dba96efe9215fb1b5480439450e667cc052958c51747ab17b2279"} Mar 13 14:19:11 crc kubenswrapper[4898]: I0313 14:19:11.479650 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6xth\" (UniqueName: \"kubernetes.io/projected/70dc5baf-6ae1-41b4-9454-8ff891570f8b-kube-api-access-h6xth\") on node \"crc\" DevicePath \"\"" Mar 13 
14:19:11 crc kubenswrapper[4898]: I0313 14:19:11.479675 4898 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/70dc5baf-6ae1-41b4-9454-8ff891570f8b-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:11 crc kubenswrapper[4898]: I0313 14:19:11.479683 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70dc5baf-6ae1-41b4-9454-8ff891570f8b-config\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:11 crc kubenswrapper[4898]: I0313 14:19:11.481603 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d56bd826-4f42-409d-ae41-9bfc70d1e038","Type":"ContainerStarted","Data":"cb002d235371a7e7beebe07dd448307d31c6dae66e8fbd1dd6c0c499e634cca9"} Mar 13 14:19:11 crc kubenswrapper[4898]: I0313 14:19:11.484631 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-5jhct" event={"ID":"70dc5baf-6ae1-41b4-9454-8ff891570f8b","Type":"ContainerDied","Data":"41cc016dc80d4b785612e7bb953345c49daec5b78f73d536749225e85b77217d"} Mar 13 14:19:11 crc kubenswrapper[4898]: I0313 14:19:11.484754 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-5jhct" Mar 13 14:19:11 crc kubenswrapper[4898]: I0313 14:19:11.495448 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-ui-dashboards-66cbf594b5-gpj8b" event={"ID":"ad052248-8fcd-4ef6-9969-5023b87bbbf9","Type":"ContainerStarted","Data":"802713d4b4783f0e385efee7ea2662a5e8b02ad36e51a0e58b61695a8eb8808e"} Mar 13 14:19:11 crc kubenswrapper[4898]: I0313 14:19:11.533101 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-r9tmf"] Mar 13 14:19:11 crc kubenswrapper[4898]: I0313 14:19:11.589428 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-5jhct"] Mar 13 14:19:11 crc kubenswrapper[4898]: I0313 14:19:11.596928 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-5jhct"] Mar 13 14:19:11 crc kubenswrapper[4898]: I0313 14:19:11.750190 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70dc5baf-6ae1-41b4-9454-8ff891570f8b" path="/var/lib/kubelet/pods/70dc5baf-6ae1-41b4-9454-8ff891570f8b/volumes" Mar 13 14:19:13 crc kubenswrapper[4898]: W0313 14:19:13.365175 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf71b72a8_f179_454c_8d2e_4ac829842622.slice/crio-8a049a1d82afb7e4921f4fb9b45ea37c110c2288d9057d078bdbde6ee61f93e1 WatchSource:0}: Error finding container 8a049a1d82afb7e4921f4fb9b45ea37c110c2288d9057d078bdbde6ee61f93e1: Status 404 returned error can't find the container with id 8a049a1d82afb7e4921f4fb9b45ea37c110c2288d9057d078bdbde6ee61f93e1 Mar 13 14:19:13 crc kubenswrapper[4898]: I0313 14:19:13.536874 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-r9tmf" event={"ID":"f71b72a8-f179-454c-8d2e-4ac829842622","Type":"ContainerStarted","Data":"8a049a1d82afb7e4921f4fb9b45ea37c110c2288d9057d078bdbde6ee61f93e1"} 
Mar 13 14:19:14 crc kubenswrapper[4898]: I0313 14:19:14.552031 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-699d95d586-ds75f" event={"ID":"ab8664f8-1960-4442-9fdd-9711ec963e1f","Type":"ContainerStarted","Data":"c65020321e952c46cfee20714212dd17d9bd0026593d0fedff27cbf44cc73c5e"} Mar 13 14:19:14 crc kubenswrapper[4898]: I0313 14:19:14.574411 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-699d95d586-ds75f" podStartSLOduration=21.574393774 podStartE2EDuration="21.574393774s" podCreationTimestamp="2026-03-13 14:18:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:19:14.56849509 +0000 UTC m=+1389.570083339" watchObservedRunningTime="2026-03-13 14:19:14.574393774 +0000 UTC m=+1389.575982013" Mar 13 14:19:18 crc kubenswrapper[4898]: I0313 14:19:18.587727 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e5d53cf3-113e-4391-b3a9-4e1f81836e26","Type":"ContainerStarted","Data":"6dc134313604b68c8ab10e2c392729441b9bc5e45ec849a53ceed430e93e429c"} Mar 13 14:19:18 crc kubenswrapper[4898]: I0313 14:19:18.590204 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"67ef28b0-acc3-400e-8296-a541fc3b89f0","Type":"ContainerStarted","Data":"622d6a713b41e4b6a008d0e3e85be32b17a6ccd982c87f9e945074ffe15804c7"} Mar 13 14:19:18 crc kubenswrapper[4898]: I0313 14:19:18.590311 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 13 14:19:18 crc kubenswrapper[4898]: I0313 14:19:18.592553 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f","Type":"ContainerStarted","Data":"54cdc8923dc9b1d51b7601a430097ec42b0a894296bfedbe8fc5fc74a3adf43a"} Mar 13 14:19:18 crc 
kubenswrapper[4898]: I0313 14:19:18.678096 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=23.33474511 podStartE2EDuration="29.678064569s" podCreationTimestamp="2026-03-13 14:18:49 +0000 UTC" firstStartedPulling="2026-03-13 14:19:11.131392552 +0000 UTC m=+1386.132980791" lastFinishedPulling="2026-03-13 14:19:17.474712001 +0000 UTC m=+1392.476300250" observedRunningTime="2026-03-13 14:19:18.671501619 +0000 UTC m=+1393.673089978" watchObservedRunningTime="2026-03-13 14:19:18.678064569 +0000 UTC m=+1393.679652818" Mar 13 14:19:19 crc kubenswrapper[4898]: I0313 14:19:19.134368 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 14:19:19 crc kubenswrapper[4898]: I0313 14:19:19.134633 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 14:19:19 crc kubenswrapper[4898]: I0313 14:19:19.134671 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" Mar 13 14:19:19 crc kubenswrapper[4898]: I0313 14:19:19.135208 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"37bdbe6f1a65f1530746827b4e6d1dd1ce95edb9a913051fc8fca9a782787e56"} pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 14:19:19 crc 
kubenswrapper[4898]: I0313 14:19:19.135258 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" containerID="cri-o://37bdbe6f1a65f1530746827b4e6d1dd1ce95edb9a913051fc8fca9a782787e56" gracePeriod=600 Mar 13 14:19:19 crc kubenswrapper[4898]: I0313 14:19:19.602959 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4e010381-921d-4328-9027-ddb9a54a08bd","Type":"ContainerStarted","Data":"71740b094889fce6ef9ef07ec41cbfdf46a3a1807d9a456b2458bac02fa10682"} Mar 13 14:19:19 crc kubenswrapper[4898]: I0313 14:19:19.603311 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 13 14:19:19 crc kubenswrapper[4898]: I0313 14:19:19.605335 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-j79bj" event={"ID":"a506ef1a-354a-49c8-b63d-4db4b9ecdcfe","Type":"ContainerStarted","Data":"b8b9af01ef6e79d983db7f1a267b8c0ddee0b64921a80f09baaea6eeb244cf39"} Mar 13 14:19:19 crc kubenswrapper[4898]: I0313 14:19:19.605477 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-j79bj" Mar 13 14:19:19 crc kubenswrapper[4898]: I0313 14:19:19.611522 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-ui-dashboards-66cbf594b5-gpj8b" event={"ID":"ad052248-8fcd-4ef6-9969-5023b87bbbf9","Type":"ContainerStarted","Data":"715044d0f82cba4ad9ee7589454dd020782348cdb3c4d3ef7b2ac338f04494fc"} Mar 13 14:19:19 crc kubenswrapper[4898]: I0313 14:19:19.624026 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"111bf23f-be00-46ab-97fe-a36465735164","Type":"ContainerStarted","Data":"1bdc27d4998e62d32f402f9a321a4b390dbc37da79acbfdca74ec1b98d4b82b8"} Mar 13 14:19:19 crc 
kubenswrapper[4898]: I0313 14:19:19.626796 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=19.021975356 podStartE2EDuration="27.626777686s" podCreationTimestamp="2026-03-13 14:18:52 +0000 UTC" firstStartedPulling="2026-03-13 14:19:08.870378584 +0000 UTC m=+1383.871966823" lastFinishedPulling="2026-03-13 14:19:17.475180914 +0000 UTC m=+1392.476769153" observedRunningTime="2026-03-13 14:19:19.616885308 +0000 UTC m=+1394.618473557" watchObservedRunningTime="2026-03-13 14:19:19.626777686 +0000 UTC m=+1394.628365925" Mar 13 14:19:19 crc kubenswrapper[4898]: I0313 14:19:19.629164 4898 generic.go:334] "Generic (PLEG): container finished" podID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerID="37bdbe6f1a65f1530746827b4e6d1dd1ce95edb9a913051fc8fca9a782787e56" exitCode=0 Mar 13 14:19:19 crc kubenswrapper[4898]: I0313 14:19:19.629220 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerDied","Data":"37bdbe6f1a65f1530746827b4e6d1dd1ce95edb9a913051fc8fca9a782787e56"} Mar 13 14:19:19 crc kubenswrapper[4898]: I0313 14:19:19.629244 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerStarted","Data":"31ae8593afc62c01205d22940ebe7136407da3a6c051010917ba949bb52866cc"} Mar 13 14:19:19 crc kubenswrapper[4898]: I0313 14:19:19.629264 4898 scope.go:117] "RemoveContainer" containerID="7b5d3972dfd92a1b971338153ac5467cf67b2057ca35cfb382b56be42ddca2ed" Mar 13 14:19:19 crc kubenswrapper[4898]: I0313 14:19:19.632555 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" 
event={"ID":"280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10","Type":"ContainerStarted","Data":"67f45ce6b97f65b84a75fb6f6c2bea4e76e105439c28870beec523190f47a249"} Mar 13 14:19:19 crc kubenswrapper[4898]: I0313 14:19:19.641172 4898 generic.go:334] "Generic (PLEG): container finished" podID="f71b72a8-f179-454c-8d2e-4ac829842622" containerID="8a94a2d331eb5334e6859d4ae92c26ca4ea0f2be05f7a83373164a2e1e6a9044" exitCode=0 Mar 13 14:19:19 crc kubenswrapper[4898]: I0313 14:19:19.641304 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-r9tmf" event={"ID":"f71b72a8-f179-454c-8d2e-4ac829842622","Type":"ContainerDied","Data":"8a94a2d331eb5334e6859d4ae92c26ca4ea0f2be05f7a83373164a2e1e6a9044"} Mar 13 14:19:19 crc kubenswrapper[4898]: I0313 14:19:19.664348 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-j79bj" podStartSLOduration=19.28630987 podStartE2EDuration="25.664301821s" podCreationTimestamp="2026-03-13 14:18:54 +0000 UTC" firstStartedPulling="2026-03-13 14:19:11.141188777 +0000 UTC m=+1386.142777016" lastFinishedPulling="2026-03-13 14:19:17.519180728 +0000 UTC m=+1392.520768967" observedRunningTime="2026-03-13 14:19:19.632631558 +0000 UTC m=+1394.634219817" watchObservedRunningTime="2026-03-13 14:19:19.664301821 +0000 UTC m=+1394.665890060" Mar 13 14:19:19 crc kubenswrapper[4898]: I0313 14:19:19.671977 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-ui-dashboards-66cbf594b5-gpj8b" podStartSLOduration=21.31782317 podStartE2EDuration="27.671961901s" podCreationTimestamp="2026-03-13 14:18:52 +0000 UTC" firstStartedPulling="2026-03-13 14:19:11.131346921 +0000 UTC m=+1386.132935160" lastFinishedPulling="2026-03-13 14:19:17.485485652 +0000 UTC m=+1392.487073891" observedRunningTime="2026-03-13 14:19:19.645846061 +0000 UTC m=+1394.647434320" watchObservedRunningTime="2026-03-13 14:19:19.671961901 +0000 UTC m=+1394.673550140" Mar 13 
14:19:20 crc kubenswrapper[4898]: I0313 14:19:20.654490 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-r9tmf" event={"ID":"f71b72a8-f179-454c-8d2e-4ac829842622","Type":"ContainerStarted","Data":"30c5b975cfcebf4558f9c261801ff9f8e434560b7c84e19be8f58b704459d95f"} Mar 13 14:19:21 crc kubenswrapper[4898]: I0313 14:19:21.667981 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f526abbc-e646-48b4-afa8-7f95f4a607a0","Type":"ContainerStarted","Data":"17895f997a596c3c89bff64ca8cbbbc2b8be5cd3c6e6642232e8f78d56b48759"} Mar 13 14:19:21 crc kubenswrapper[4898]: I0313 14:19:21.668113 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="f526abbc-e646-48b4-afa8-7f95f4a607a0" containerName="init-config-reloader" containerID="cri-o://17895f997a596c3c89bff64ca8cbbbc2b8be5cd3c6e6642232e8f78d56b48759" gracePeriod=600 Mar 13 14:19:22 crc kubenswrapper[4898]: I0313 14:19:22.682945 4898 generic.go:334] "Generic (PLEG): container finished" podID="9e544d1f-357e-4751-88bb-5108430b52cb" containerID="721483b50e427e53b8e6dda1265ac58f303bb7f78b9d071fabeacfc95fd162d8" exitCode=0 Mar 13 14:19:22 crc kubenswrapper[4898]: I0313 14:19:22.683004 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-hk9w4" event={"ID":"9e544d1f-357e-4751-88bb-5108430b52cb","Type":"ContainerDied","Data":"721483b50e427e53b8e6dda1265ac58f303bb7f78b9d071fabeacfc95fd162d8"} Mar 13 14:19:22 crc kubenswrapper[4898]: I0313 14:19:22.686840 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-r9tmf" event={"ID":"f71b72a8-f179-454c-8d2e-4ac829842622","Type":"ContainerStarted","Data":"9b97742be64da9f855d3a6747c7c0629c74485bfa790a873f4b95940c778cd26"} Mar 13 14:19:22 crc kubenswrapper[4898]: I0313 14:19:22.687088 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/ovn-controller-ovs-r9tmf"
Mar 13 14:19:22 crc kubenswrapper[4898]: I0313 14:19:22.691164 4898 generic.go:334] "Generic (PLEG): container finished" podID="6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f" containerID="54cdc8923dc9b1d51b7601a430097ec42b0a894296bfedbe8fc5fc74a3adf43a" exitCode=0
Mar 13 14:19:22 crc kubenswrapper[4898]: I0313 14:19:22.691257 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f","Type":"ContainerDied","Data":"54cdc8923dc9b1d51b7601a430097ec42b0a894296bfedbe8fc5fc74a3adf43a"}
Mar 13 14:19:22 crc kubenswrapper[4898]: I0313 14:19:22.694579 4898 generic.go:334] "Generic (PLEG): container finished" podID="e5d53cf3-113e-4391-b3a9-4e1f81836e26" containerID="6dc134313604b68c8ab10e2c392729441b9bc5e45ec849a53ceed430e93e429c" exitCode=0
Mar 13 14:19:22 crc kubenswrapper[4898]: I0313 14:19:22.694675 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e5d53cf3-113e-4391-b3a9-4e1f81836e26","Type":"ContainerDied","Data":"6dc134313604b68c8ab10e2c392729441b9bc5e45ec849a53ceed430e93e429c"}
Mar 13 14:19:22 crc kubenswrapper[4898]: I0313 14:19:22.696700 4898 generic.go:334] "Generic (PLEG): container finished" podID="c17db307-7a8a-4585-9696-a9ef96b6ba0b" containerID="74dc3b0f08d38afbb847328da234ec143498d235c31e1d23da8c5f86f5ed4459" exitCode=0
Mar 13 14:19:22 crc kubenswrapper[4898]: I0313 14:19:22.696752 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-nmlp6" event={"ID":"c17db307-7a8a-4585-9696-a9ef96b6ba0b","Type":"ContainerDied","Data":"74dc3b0f08d38afbb847328da234ec143498d235c31e1d23da8c5f86f5ed4459"}
Mar 13 14:19:22 crc kubenswrapper[4898]: I0313 14:19:22.702380 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"111bf23f-be00-46ab-97fe-a36465735164","Type":"ContainerStarted","Data":"533a46a37109aea2ff8c47c6024b08c433252c4df0234d2dfe06d45bb30ec92e"}
Mar 13 14:19:22 crc kubenswrapper[4898]: I0313 14:19:22.707703 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10","Type":"ContainerStarted","Data":"39bfdffc1e0fb88186d147830302dd4ca12b23b3d4541787d52288ae77f2c1f5"}
Mar 13 14:19:22 crc kubenswrapper[4898]: I0313 14:19:22.763853 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-r9tmf" podStartSLOduration=24.646436727 podStartE2EDuration="28.76383075s" podCreationTimestamp="2026-03-13 14:18:54 +0000 UTC" firstStartedPulling="2026-03-13 14:19:13.367597956 +0000 UTC m=+1388.369186185" lastFinishedPulling="2026-03-13 14:19:17.484991969 +0000 UTC m=+1392.486580208" observedRunningTime="2026-03-13 14:19:22.750944815 +0000 UTC m=+1397.752533084" watchObservedRunningTime="2026-03-13 14:19:22.76383075 +0000 UTC m=+1397.765418989"
Mar 13 14:19:22 crc kubenswrapper[4898]: I0313 14:19:22.775508 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0"
Mar 13 14:19:22 crc kubenswrapper[4898]: I0313 14:19:22.789283 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=17.934631386 podStartE2EDuration="28.789262422s" podCreationTimestamp="2026-03-13 14:18:54 +0000 UTC" firstStartedPulling="2026-03-13 14:19:11.141478495 +0000 UTC m=+1386.143066734" lastFinishedPulling="2026-03-13 14:19:21.996109541 +0000 UTC m=+1396.997697770" observedRunningTime="2026-03-13 14:19:22.776563981 +0000 UTC m=+1397.778152240" watchObservedRunningTime="2026-03-13 14:19:22.789262422 +0000 UTC m=+1397.790850671"
Mar 13 14:19:22 crc kubenswrapper[4898]: I0313 14:19:22.845812 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=13.934942373 podStartE2EDuration="24.845794872s" podCreationTimestamp="2026-03-13 14:18:58 +0000 UTC" firstStartedPulling="2026-03-13 14:19:11.145293434 +0000 UTC m=+1386.146881673" lastFinishedPulling="2026-03-13 14:19:22.056145933 +0000 UTC m=+1397.057734172" observedRunningTime="2026-03-13 14:19:22.83610698 +0000 UTC m=+1397.837695239" watchObservedRunningTime="2026-03-13 14:19:22.845794872 +0000 UTC m=+1397.847383111"
Mar 13 14:19:22 crc kubenswrapper[4898]: I0313 14:19:22.848858 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0"
Mar 13 14:19:23 crc kubenswrapper[4898]: I0313 14:19:23.416207 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0"
Mar 13 14:19:23 crc kubenswrapper[4898]: I0313 14:19:23.472464 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0"
Mar 13 14:19:23 crc kubenswrapper[4898]: I0313 14:19:23.625226 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-699d95d586-ds75f"
Mar 13 14:19:23 crc kubenswrapper[4898]: I0313 14:19:23.625285 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-699d95d586-ds75f"
Mar 13 14:19:23 crc kubenswrapper[4898]: I0313 14:19:23.629590 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-699d95d586-ds75f"
Mar 13 14:19:23 crc kubenswrapper[4898]: I0313 14:19:23.720493 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f","Type":"ContainerStarted","Data":"5603fe1e36b001c885a705c95d9e330d44a889a46589d0677416ea907bf21af1"}
Mar 13 14:19:23 crc kubenswrapper[4898]: I0313 14:19:23.722379 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e5d53cf3-113e-4391-b3a9-4e1f81836e26","Type":"ContainerStarted","Data":"754d039d0115d17f3272d5d60732b4f70ed1599a25b6aa63fdbeafe8a92c023c"}
Mar 13 14:19:23 crc kubenswrapper[4898]: I0313 14:19:23.724550 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-nmlp6" event={"ID":"c17db307-7a8a-4585-9696-a9ef96b6ba0b","Type":"ContainerStarted","Data":"16a8ad8751e35be84c16e519cb59d98833bb09657571eebe84f43f80ba363a11"}
Mar 13 14:19:23 crc kubenswrapper[4898]: I0313 14:19:23.724782 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5ccc8479f9-nmlp6"
Mar 13 14:19:23 crc kubenswrapper[4898]: I0313 14:19:23.728059 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-hk9w4" event={"ID":"9e544d1f-357e-4751-88bb-5108430b52cb","Type":"ContainerStarted","Data":"11c41f241d98f2324ec56eb2b8e01ca41655cf9c167ff144591e45aba4793e63"}
Mar 13 14:19:23 crc kubenswrapper[4898]: I0313 14:19:23.728684 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0"
Mar 13 14:19:23 crc kubenswrapper[4898]: I0313 14:19:23.728721 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-r9tmf"
Mar 13 14:19:23 crc kubenswrapper[4898]: I0313 14:19:23.728744 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0"
Mar 13 14:19:23 crc kubenswrapper[4898]: I0313 14:19:23.734077 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-699d95d586-ds75f"
Mar 13 14:19:23 crc kubenswrapper[4898]: I0313 14:19:23.753195 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=30.257275855 podStartE2EDuration="35.753165493s" podCreationTimestamp="2026-03-13 14:18:48 +0000 UTC" firstStartedPulling="2026-03-13 14:19:11.135811237 +0000 UTC m=+1386.137399476" lastFinishedPulling="2026-03-13 14:19:16.631700875 +0000 UTC m=+1391.633289114" observedRunningTime="2026-03-13 14:19:23.752962558 +0000 UTC m=+1398.754550837" watchObservedRunningTime="2026-03-13 14:19:23.753165493 +0000 UTC m=+1398.754753732"
Mar 13 14:19:23 crc kubenswrapper[4898]: I0313 14:19:23.783421 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=32.029049091 podStartE2EDuration="37.783405s" podCreationTimestamp="2026-03-13 14:18:46 +0000 UTC" firstStartedPulling="2026-03-13 14:19:08.869949643 +0000 UTC m=+1383.871537882" lastFinishedPulling="2026-03-13 14:19:14.624305552 +0000 UTC m=+1389.625893791" observedRunningTime="2026-03-13 14:19:23.777330812 +0000 UTC m=+1398.778919081" watchObservedRunningTime="2026-03-13 14:19:23.783405 +0000 UTC m=+1398.784993229"
Mar 13 14:19:23 crc kubenswrapper[4898]: I0313 14:19:23.787531 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0"
Mar 13 14:19:23 crc kubenswrapper[4898]: I0313 14:19:23.790449 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0"
Mar 13 14:19:23 crc kubenswrapper[4898]: I0313 14:19:23.842573 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-hk9w4" podStartSLOduration=3.833183793 podStartE2EDuration="38.842550078s" podCreationTimestamp="2026-03-13 14:18:45 +0000 UTC" firstStartedPulling="2026-03-13 14:18:46.691640601 +0000 UTC m=+1361.693228840" lastFinishedPulling="2026-03-13 14:19:21.701006886 +0000 UTC m=+1396.702595125" observedRunningTime="2026-03-13 14:19:23.814743565 +0000 UTC m=+1398.816331814" watchObservedRunningTime="2026-03-13 14:19:23.842550078 +0000 UTC m=+1398.844138307"
Mar 13 14:19:23 crc kubenswrapper[4898]: I0313 14:19:23.861953 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6ddbb5776b-mx8sz"]
Mar 13 14:19:23 crc kubenswrapper[4898]: I0313 14:19:23.863541 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5ccc8479f9-nmlp6" podStartSLOduration=-9223371997.991253 podStartE2EDuration="38.863523034s" podCreationTimestamp="2026-03-13 14:18:45 +0000 UTC" firstStartedPulling="2026-03-13 14:18:46.354460331 +0000 UTC m=+1361.356048570" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:19:23.845835753 +0000 UTC m=+1398.847424002" watchObservedRunningTime="2026-03-13 14:19:23.863523034 +0000 UTC m=+1398.865111273"
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.136338 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-hk9w4"]
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.191451 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-8mxxb"]
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.193047 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-8mxxb"
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.195212 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config"
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.202441 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-gm2pz"]
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.204076 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-gm2pz"
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.214007 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb"
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.214610 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-8mxxb"]
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.228717 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-gm2pz"]
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.330596 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-nmlp6"]
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.343989 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cddd3df9-e505-4f25-988d-8cba87eaefbe-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-gm2pz\" (UID: \"cddd3df9-e505-4f25-988d-8cba87eaefbe\") " pod="openstack/dnsmasq-dns-7fd796d7df-gm2pz"
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.344082 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/515cda05-1d7b-4252-94fc-056b38ec502a-ovn-rundir\") pod \"ovn-controller-metrics-8mxxb\" (UID: \"515cda05-1d7b-4252-94fc-056b38ec502a\") " pod="openstack/ovn-controller-metrics-8mxxb"
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.344111 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdcst\" (UniqueName: \"kubernetes.io/projected/cddd3df9-e505-4f25-988d-8cba87eaefbe-kube-api-access-cdcst\") pod \"dnsmasq-dns-7fd796d7df-gm2pz\" (UID: \"cddd3df9-e505-4f25-988d-8cba87eaefbe\") " pod="openstack/dnsmasq-dns-7fd796d7df-gm2pz"
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.344155 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cddd3df9-e505-4f25-988d-8cba87eaefbe-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-gm2pz\" (UID: \"cddd3df9-e505-4f25-988d-8cba87eaefbe\") " pod="openstack/dnsmasq-dns-7fd796d7df-gm2pz"
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.344180 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cddd3df9-e505-4f25-988d-8cba87eaefbe-config\") pod \"dnsmasq-dns-7fd796d7df-gm2pz\" (UID: \"cddd3df9-e505-4f25-988d-8cba87eaefbe\") " pod="openstack/dnsmasq-dns-7fd796d7df-gm2pz"
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.344224 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zn2hr\" (UniqueName: \"kubernetes.io/projected/515cda05-1d7b-4252-94fc-056b38ec502a-kube-api-access-zn2hr\") pod \"ovn-controller-metrics-8mxxb\" (UID: \"515cda05-1d7b-4252-94fc-056b38ec502a\") " pod="openstack/ovn-controller-metrics-8mxxb"
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.344262 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/515cda05-1d7b-4252-94fc-056b38ec502a-ovs-rundir\") pod \"ovn-controller-metrics-8mxxb\" (UID: \"515cda05-1d7b-4252-94fc-056b38ec502a\") " pod="openstack/ovn-controller-metrics-8mxxb"
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.344280 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/515cda05-1d7b-4252-94fc-056b38ec502a-config\") pod \"ovn-controller-metrics-8mxxb\" (UID: \"515cda05-1d7b-4252-94fc-056b38ec502a\") " pod="openstack/ovn-controller-metrics-8mxxb"
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.344298 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/515cda05-1d7b-4252-94fc-056b38ec502a-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-8mxxb\" (UID: \"515cda05-1d7b-4252-94fc-056b38ec502a\") " pod="openstack/ovn-controller-metrics-8mxxb"
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.344326 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/515cda05-1d7b-4252-94fc-056b38ec502a-combined-ca-bundle\") pod \"ovn-controller-metrics-8mxxb\" (UID: \"515cda05-1d7b-4252-94fc-056b38ec502a\") " pod="openstack/ovn-controller-metrics-8mxxb"
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.381072 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-x9tcr"]
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.382949 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-x9tcr"
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.385488 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb"
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.397013 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-x9tcr"]
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.424748 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"]
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.431868 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.436100 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-nwbfn"
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.436262 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config"
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.436414 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs"
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.446242 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts"
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.447123 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/515cda05-1d7b-4252-94fc-056b38ec502a-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-8mxxb\" (UID: \"515cda05-1d7b-4252-94fc-056b38ec502a\") " pod="openstack/ovn-controller-metrics-8mxxb"
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.447194 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00d79476-a8c0-4bad-81ae-6b50afea8601-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-x9tcr\" (UID: \"00d79476-a8c0-4bad-81ae-6b50afea8601\") " pod="openstack/dnsmasq-dns-86db49b7ff-x9tcr"
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.447226 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/515cda05-1d7b-4252-94fc-056b38ec502a-combined-ca-bundle\") pod \"ovn-controller-metrics-8mxxb\" (UID: \"515cda05-1d7b-4252-94fc-056b38ec502a\") " pod="openstack/ovn-controller-metrics-8mxxb"
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.447283 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cddd3df9-e505-4f25-988d-8cba87eaefbe-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-gm2pz\" (UID: \"cddd3df9-e505-4f25-988d-8cba87eaefbe\") " pod="openstack/dnsmasq-dns-7fd796d7df-gm2pz"
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.447310 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00d79476-a8c0-4bad-81ae-6b50afea8601-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-x9tcr\" (UID: \"00d79476-a8c0-4bad-81ae-6b50afea8601\") " pod="openstack/dnsmasq-dns-86db49b7ff-x9tcr"
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.447344 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00d79476-a8c0-4bad-81ae-6b50afea8601-config\") pod \"dnsmasq-dns-86db49b7ff-x9tcr\" (UID: \"00d79476-a8c0-4bad-81ae-6b50afea8601\") " pod="openstack/dnsmasq-dns-86db49b7ff-x9tcr"
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.447367 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/515cda05-1d7b-4252-94fc-056b38ec502a-ovn-rundir\") pod \"ovn-controller-metrics-8mxxb\" (UID: \"515cda05-1d7b-4252-94fc-056b38ec502a\") " pod="openstack/ovn-controller-metrics-8mxxb"
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.447391 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/00d79476-a8c0-4bad-81ae-6b50afea8601-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-x9tcr\" (UID: \"00d79476-a8c0-4bad-81ae-6b50afea8601\") " pod="openstack/dnsmasq-dns-86db49b7ff-x9tcr"
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.447415 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdcst\" (UniqueName: \"kubernetes.io/projected/cddd3df9-e505-4f25-988d-8cba87eaefbe-kube-api-access-cdcst\") pod \"dnsmasq-dns-7fd796d7df-gm2pz\" (UID: \"cddd3df9-e505-4f25-988d-8cba87eaefbe\") " pod="openstack/dnsmasq-dns-7fd796d7df-gm2pz"
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.448068 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cddd3df9-e505-4f25-988d-8cba87eaefbe-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-gm2pz\" (UID: \"cddd3df9-e505-4f25-988d-8cba87eaefbe\") " pod="openstack/dnsmasq-dns-7fd796d7df-gm2pz"
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.448132 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cddd3df9-e505-4f25-988d-8cba87eaefbe-config\") pod \"dnsmasq-dns-7fd796d7df-gm2pz\" (UID: \"cddd3df9-e505-4f25-988d-8cba87eaefbe\") " pod="openstack/dnsmasq-dns-7fd796d7df-gm2pz"
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.448185 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zn2hr\" (UniqueName: \"kubernetes.io/projected/515cda05-1d7b-4252-94fc-056b38ec502a-kube-api-access-zn2hr\") pod \"ovn-controller-metrics-8mxxb\" (UID: \"515cda05-1d7b-4252-94fc-056b38ec502a\") " pod="openstack/ovn-controller-metrics-8mxxb"
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.448228 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dz9f\" (UniqueName: \"kubernetes.io/projected/00d79476-a8c0-4bad-81ae-6b50afea8601-kube-api-access-9dz9f\") pod \"dnsmasq-dns-86db49b7ff-x9tcr\" (UID: \"00d79476-a8c0-4bad-81ae-6b50afea8601\") " pod="openstack/dnsmasq-dns-86db49b7ff-x9tcr"
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.448272 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/515cda05-1d7b-4252-94fc-056b38ec502a-ovs-rundir\") pod \"ovn-controller-metrics-8mxxb\" (UID: \"515cda05-1d7b-4252-94fc-056b38ec502a\") " pod="openstack/ovn-controller-metrics-8mxxb"
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.448294 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/515cda05-1d7b-4252-94fc-056b38ec502a-config\") pod \"ovn-controller-metrics-8mxxb\" (UID: \"515cda05-1d7b-4252-94fc-056b38ec502a\") " pod="openstack/ovn-controller-metrics-8mxxb"
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.449081 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/515cda05-1d7b-4252-94fc-056b38ec502a-config\") pod \"ovn-controller-metrics-8mxxb\" (UID: \"515cda05-1d7b-4252-94fc-056b38ec502a\") " pod="openstack/ovn-controller-metrics-8mxxb"
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.449489 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/515cda05-1d7b-4252-94fc-056b38ec502a-ovs-rundir\") pod \"ovn-controller-metrics-8mxxb\" (UID: \"515cda05-1d7b-4252-94fc-056b38ec502a\") " pod="openstack/ovn-controller-metrics-8mxxb"
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.449827 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/515cda05-1d7b-4252-94fc-056b38ec502a-ovn-rundir\") pod \"ovn-controller-metrics-8mxxb\" (UID: \"515cda05-1d7b-4252-94fc-056b38ec502a\") " pod="openstack/ovn-controller-metrics-8mxxb"
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.450558 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cddd3df9-e505-4f25-988d-8cba87eaefbe-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-gm2pz\" (UID: \"cddd3df9-e505-4f25-988d-8cba87eaefbe\") " pod="openstack/dnsmasq-dns-7fd796d7df-gm2pz"
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.450962 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cddd3df9-e505-4f25-988d-8cba87eaefbe-config\") pod \"dnsmasq-dns-7fd796d7df-gm2pz\" (UID: \"cddd3df9-e505-4f25-988d-8cba87eaefbe\") " pod="openstack/dnsmasq-dns-7fd796d7df-gm2pz"
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.452452 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cddd3df9-e505-4f25-988d-8cba87eaefbe-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-gm2pz\" (UID: \"cddd3df9-e505-4f25-988d-8cba87eaefbe\") " pod="openstack/dnsmasq-dns-7fd796d7df-gm2pz"
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.454331 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/515cda05-1d7b-4252-94fc-056b38ec502a-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-8mxxb\" (UID: \"515cda05-1d7b-4252-94fc-056b38ec502a\") " pod="openstack/ovn-controller-metrics-8mxxb"
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.466024 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/515cda05-1d7b-4252-94fc-056b38ec502a-combined-ca-bundle\") pod \"ovn-controller-metrics-8mxxb\" (UID: \"515cda05-1d7b-4252-94fc-056b38ec502a\") " pod="openstack/ovn-controller-metrics-8mxxb"
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.469692 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.470393 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdcst\" (UniqueName: \"kubernetes.io/projected/cddd3df9-e505-4f25-988d-8cba87eaefbe-kube-api-access-cdcst\") pod \"dnsmasq-dns-7fd796d7df-gm2pz\" (UID: \"cddd3df9-e505-4f25-988d-8cba87eaefbe\") " pod="openstack/dnsmasq-dns-7fd796d7df-gm2pz"
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.484568 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zn2hr\" (UniqueName: \"kubernetes.io/projected/515cda05-1d7b-4252-94fc-056b38ec502a-kube-api-access-zn2hr\") pod \"ovn-controller-metrics-8mxxb\" (UID: \"515cda05-1d7b-4252-94fc-056b38ec502a\") " pod="openstack/ovn-controller-metrics-8mxxb"
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.522122 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-8mxxb"
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.533453 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-gm2pz"
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.549609 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/902753c9-2101-4509-9283-55070ac3787e-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"902753c9-2101-4509-9283-55070ac3787e\") " pod="openstack/ovn-northd-0"
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.549655 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dz9f\" (UniqueName: \"kubernetes.io/projected/00d79476-a8c0-4bad-81ae-6b50afea8601-kube-api-access-9dz9f\") pod \"dnsmasq-dns-86db49b7ff-x9tcr\" (UID: \"00d79476-a8c0-4bad-81ae-6b50afea8601\") " pod="openstack/dnsmasq-dns-86db49b7ff-x9tcr"
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.549688 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/902753c9-2101-4509-9283-55070ac3787e-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"902753c9-2101-4509-9283-55070ac3787e\") " pod="openstack/ovn-northd-0"
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.549716 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/902753c9-2101-4509-9283-55070ac3787e-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"902753c9-2101-4509-9283-55070ac3787e\") " pod="openstack/ovn-northd-0"
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.549734 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00d79476-a8c0-4bad-81ae-6b50afea8601-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-x9tcr\" (UID: \"00d79476-a8c0-4bad-81ae-6b50afea8601\") " pod="openstack/dnsmasq-dns-86db49b7ff-x9tcr"
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.549979 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00d79476-a8c0-4bad-81ae-6b50afea8601-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-x9tcr\" (UID: \"00d79476-a8c0-4bad-81ae-6b50afea8601\") " pod="openstack/dnsmasq-dns-86db49b7ff-x9tcr"
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.550121 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00d79476-a8c0-4bad-81ae-6b50afea8601-config\") pod \"dnsmasq-dns-86db49b7ff-x9tcr\" (UID: \"00d79476-a8c0-4bad-81ae-6b50afea8601\") " pod="openstack/dnsmasq-dns-86db49b7ff-x9tcr"
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.550175 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/902753c9-2101-4509-9283-55070ac3787e-scripts\") pod \"ovn-northd-0\" (UID: \"902753c9-2101-4509-9283-55070ac3787e\") " pod="openstack/ovn-northd-0"
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.550245 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/00d79476-a8c0-4bad-81ae-6b50afea8601-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-x9tcr\" (UID: \"00d79476-a8c0-4bad-81ae-6b50afea8601\") " pod="openstack/dnsmasq-dns-86db49b7ff-x9tcr"
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.550299 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/902753c9-2101-4509-9283-55070ac3787e-config\") pod \"ovn-northd-0\" (UID: \"902753c9-2101-4509-9283-55070ac3787e\") " pod="openstack/ovn-northd-0"
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.550394 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-px22g\" (UniqueName: \"kubernetes.io/projected/902753c9-2101-4509-9283-55070ac3787e-kube-api-access-px22g\") pod \"ovn-northd-0\" (UID: \"902753c9-2101-4509-9283-55070ac3787e\") " pod="openstack/ovn-northd-0"
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.550420 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00d79476-a8c0-4bad-81ae-6b50afea8601-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-x9tcr\" (UID: \"00d79476-a8c0-4bad-81ae-6b50afea8601\") " pod="openstack/dnsmasq-dns-86db49b7ff-x9tcr"
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.550979 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00d79476-a8c0-4bad-81ae-6b50afea8601-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-x9tcr\" (UID: \"00d79476-a8c0-4bad-81ae-6b50afea8601\") " pod="openstack/dnsmasq-dns-86db49b7ff-x9tcr"
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.551215 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/00d79476-a8c0-4bad-81ae-6b50afea8601-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-x9tcr\" (UID: \"00d79476-a8c0-4bad-81ae-6b50afea8601\") " pod="openstack/dnsmasq-dns-86db49b7ff-x9tcr"
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.551278 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/902753c9-2101-4509-9283-55070ac3787e-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"902753c9-2101-4509-9283-55070ac3787e\") " pod="openstack/ovn-northd-0"
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.551485 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00d79476-a8c0-4bad-81ae-6b50afea8601-config\") pod \"dnsmasq-dns-86db49b7ff-x9tcr\" (UID: \"00d79476-a8c0-4bad-81ae-6b50afea8601\") " pod="openstack/dnsmasq-dns-86db49b7ff-x9tcr"
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.569103 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dz9f\" (UniqueName: \"kubernetes.io/projected/00d79476-a8c0-4bad-81ae-6b50afea8601-kube-api-access-9dz9f\") pod \"dnsmasq-dns-86db49b7ff-x9tcr\" (UID: \"00d79476-a8c0-4bad-81ae-6b50afea8601\") " pod="openstack/dnsmasq-dns-86db49b7ff-x9tcr"
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.652650 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-px22g\" (UniqueName: \"kubernetes.io/projected/902753c9-2101-4509-9283-55070ac3787e-kube-api-access-px22g\") pod \"ovn-northd-0\" (UID: \"902753c9-2101-4509-9283-55070ac3787e\") " pod="openstack/ovn-northd-0"
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.652731 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/902753c9-2101-4509-9283-55070ac3787e-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"902753c9-2101-4509-9283-55070ac3787e\") " pod="openstack/ovn-northd-0"
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.652800 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/902753c9-2101-4509-9283-55070ac3787e-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"902753c9-2101-4509-9283-55070ac3787e\") " pod="openstack/ovn-northd-0"
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.652845 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/902753c9-2101-4509-9283-55070ac3787e-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"902753c9-2101-4509-9283-55070ac3787e\") " pod="openstack/ovn-northd-0"
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.652880 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/902753c9-2101-4509-9283-55070ac3787e-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"902753c9-2101-4509-9283-55070ac3787e\") " pod="openstack/ovn-northd-0"
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.653015 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/902753c9-2101-4509-9283-55070ac3787e-scripts\") pod \"ovn-northd-0\" (UID: \"902753c9-2101-4509-9283-55070ac3787e\") " pod="openstack/ovn-northd-0"
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.653077 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/902753c9-2101-4509-9283-55070ac3787e-config\") pod \"ovn-northd-0\" (UID: \"902753c9-2101-4509-9283-55070ac3787e\") " pod="openstack/ovn-northd-0"
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.654213 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/902753c9-2101-4509-9283-55070ac3787e-config\") pod \"ovn-northd-0\" (UID: \"902753c9-2101-4509-9283-55070ac3787e\") " pod="openstack/ovn-northd-0"
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.654494 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/902753c9-2101-4509-9283-55070ac3787e-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"902753c9-2101-4509-9283-55070ac3787e\") " pod="openstack/ovn-northd-0"
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.657432 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/902753c9-2101-4509-9283-55070ac3787e-scripts\") pod \"ovn-northd-0\" (UID: \"902753c9-2101-4509-9283-55070ac3787e\") " pod="openstack/ovn-northd-0"
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.658016 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/902753c9-2101-4509-9283-55070ac3787e-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"902753c9-2101-4509-9283-55070ac3787e\") " pod="openstack/ovn-northd-0"
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.658509 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/902753c9-2101-4509-9283-55070ac3787e-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"902753c9-2101-4509-9283-55070ac3787e\") " pod="openstack/ovn-northd-0"
Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.662166 4898 operation_generator.go:637]
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/902753c9-2101-4509-9283-55070ac3787e-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"902753c9-2101-4509-9283-55070ac3787e\") " pod="openstack/ovn-northd-0" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.697615 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-px22g\" (UniqueName: \"kubernetes.io/projected/902753c9-2101-4509-9283-55070ac3787e-kube-api-access-px22g\") pod \"ovn-northd-0\" (UID: \"902753c9-2101-4509-9283-55070ac3787e\") " pod="openstack/ovn-northd-0" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.716333 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-x9tcr" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.746684 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-hk9w4" podUID="9e544d1f-357e-4751-88bb-5108430b52cb" containerName="dnsmasq-dns" containerID="cri-o://11c41f241d98f2324ec56eb2b8e01ca41655cf9c167ff144591e45aba4793e63" gracePeriod=10 Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.748317 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-hk9w4" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.757474 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 13 14:19:25 crc kubenswrapper[4898]: I0313 14:19:25.094032 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-8mxxb"] Mar 13 14:19:25 crc kubenswrapper[4898]: W0313 14:19:25.101025 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod515cda05_1d7b_4252_94fc_056b38ec502a.slice/crio-9c740788f5ec1b4344a55a31830a61a598669277be8884a983f0c8bc3cb2564b WatchSource:0}: Error finding container 9c740788f5ec1b4344a55a31830a61a598669277be8884a983f0c8bc3cb2564b: Status 404 returned error can't find the container with id 9c740788f5ec1b4344a55a31830a61a598669277be8884a983f0c8bc3cb2564b Mar 13 14:19:25 crc kubenswrapper[4898]: I0313 14:19:25.110136 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-gm2pz"] Mar 13 14:19:25 crc kubenswrapper[4898]: W0313 14:19:25.119693 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcddd3df9_e505_4f25_988d_8cba87eaefbe.slice/crio-e6844355e00ec9e8aa5161a17239acb337d4e49d9f07d0b8c54a762df1aef8dd WatchSource:0}: Error finding container e6844355e00ec9e8aa5161a17239acb337d4e49d9f07d0b8c54a762df1aef8dd: Status 404 returned error can't find the container with id e6844355e00ec9e8aa5161a17239acb337d4e49d9f07d0b8c54a762df1aef8dd Mar 13 14:19:25 crc kubenswrapper[4898]: I0313 14:19:25.174879 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 13 14:19:25 crc kubenswrapper[4898]: I0313 14:19:25.283526 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-x9tcr"] Mar 13 14:19:25 crc kubenswrapper[4898]: W0313 14:19:25.298145 4898 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod00d79476_a8c0_4bad_81ae_6b50afea8601.slice/crio-feb144c4ab58e7daafcb4fa68f1d3e330823c649be75d07c3fb9eb354883c85d WatchSource:0}: Error finding container feb144c4ab58e7daafcb4fa68f1d3e330823c649be75d07c3fb9eb354883c85d: Status 404 returned error can't find the container with id feb144c4ab58e7daafcb4fa68f1d3e330823c649be75d07c3fb9eb354883c85d Mar 13 14:19:25 crc kubenswrapper[4898]: I0313 14:19:25.353525 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-hk9w4" Mar 13 14:19:25 crc kubenswrapper[4898]: I0313 14:19:25.470880 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 13 14:19:25 crc kubenswrapper[4898]: I0313 14:19:25.484389 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e544d1f-357e-4751-88bb-5108430b52cb-config\") pod \"9e544d1f-357e-4751-88bb-5108430b52cb\" (UID: \"9e544d1f-357e-4751-88bb-5108430b52cb\") " Mar 13 14:19:25 crc kubenswrapper[4898]: W0313 14:19:25.484484 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod902753c9_2101_4509_9283_55070ac3787e.slice/crio-d03fba021bd551e23b93891ec34c8044d3000501b33c6bdfc21ec4cabe76a104 WatchSource:0}: Error finding container d03fba021bd551e23b93891ec34c8044d3000501b33c6bdfc21ec4cabe76a104: Status 404 returned error can't find the container with id d03fba021bd551e23b93891ec34c8044d3000501b33c6bdfc21ec4cabe76a104 Mar 13 14:19:25 crc kubenswrapper[4898]: I0313 14:19:25.484523 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e544d1f-357e-4751-88bb-5108430b52cb-dns-svc\") pod \"9e544d1f-357e-4751-88bb-5108430b52cb\" (UID: \"9e544d1f-357e-4751-88bb-5108430b52cb\") " Mar 13 14:19:25 crc 
kubenswrapper[4898]: I0313 14:19:25.484638 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sn79l\" (UniqueName: \"kubernetes.io/projected/9e544d1f-357e-4751-88bb-5108430b52cb-kube-api-access-sn79l\") pod \"9e544d1f-357e-4751-88bb-5108430b52cb\" (UID: \"9e544d1f-357e-4751-88bb-5108430b52cb\") " Mar 13 14:19:25 crc kubenswrapper[4898]: I0313 14:19:25.488350 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e544d1f-357e-4751-88bb-5108430b52cb-kube-api-access-sn79l" (OuterVolumeSpecName: "kube-api-access-sn79l") pod "9e544d1f-357e-4751-88bb-5108430b52cb" (UID: "9e544d1f-357e-4751-88bb-5108430b52cb"). InnerVolumeSpecName "kube-api-access-sn79l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:19:25 crc kubenswrapper[4898]: I0313 14:19:25.540134 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e544d1f-357e-4751-88bb-5108430b52cb-config" (OuterVolumeSpecName: "config") pod "9e544d1f-357e-4751-88bb-5108430b52cb" (UID: "9e544d1f-357e-4751-88bb-5108430b52cb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:19:25 crc kubenswrapper[4898]: I0313 14:19:25.543923 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e544d1f-357e-4751-88bb-5108430b52cb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9e544d1f-357e-4751-88bb-5108430b52cb" (UID: "9e544d1f-357e-4751-88bb-5108430b52cb"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:19:25 crc kubenswrapper[4898]: I0313 14:19:25.586863 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e544d1f-357e-4751-88bb-5108430b52cb-config\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:25 crc kubenswrapper[4898]: I0313 14:19:25.586912 4898 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e544d1f-357e-4751-88bb-5108430b52cb-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:25 crc kubenswrapper[4898]: I0313 14:19:25.586925 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sn79l\" (UniqueName: \"kubernetes.io/projected/9e544d1f-357e-4751-88bb-5108430b52cb-kube-api-access-sn79l\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:25 crc kubenswrapper[4898]: I0313 14:19:25.754978 4898 generic.go:334] "Generic (PLEG): container finished" podID="cddd3df9-e505-4f25-988d-8cba87eaefbe" containerID="31ee06ecd554c7204a7ea9b6ed5158bbdc38532b41e7043447da0dc27f024036" exitCode=0 Mar 13 14:19:25 crc kubenswrapper[4898]: I0313 14:19:25.755071 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-gm2pz" event={"ID":"cddd3df9-e505-4f25-988d-8cba87eaefbe","Type":"ContainerDied","Data":"31ee06ecd554c7204a7ea9b6ed5158bbdc38532b41e7043447da0dc27f024036"} Mar 13 14:19:25 crc kubenswrapper[4898]: I0313 14:19:25.755125 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-gm2pz" event={"ID":"cddd3df9-e505-4f25-988d-8cba87eaefbe","Type":"ContainerStarted","Data":"e6844355e00ec9e8aa5161a17239acb337d4e49d9f07d0b8c54a762df1aef8dd"} Mar 13 14:19:25 crc kubenswrapper[4898]: I0313 14:19:25.759949 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-8mxxb" 
event={"ID":"515cda05-1d7b-4252-94fc-056b38ec502a","Type":"ContainerStarted","Data":"ccd672988c388ccd2ea73f95bc83004ef499dd26465d5aa436d1c9ff89369cce"} Mar 13 14:19:25 crc kubenswrapper[4898]: I0313 14:19:25.760037 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-8mxxb" event={"ID":"515cda05-1d7b-4252-94fc-056b38ec502a","Type":"ContainerStarted","Data":"9c740788f5ec1b4344a55a31830a61a598669277be8884a983f0c8bc3cb2564b"} Mar 13 14:19:25 crc kubenswrapper[4898]: I0313 14:19:25.762309 4898 generic.go:334] "Generic (PLEG): container finished" podID="00d79476-a8c0-4bad-81ae-6b50afea8601" containerID="994a2410c9adb8d8fd6c4f99ea789f0c24f7ab61b1131418ac256a7ab707b9be" exitCode=0 Mar 13 14:19:25 crc kubenswrapper[4898]: I0313 14:19:25.762434 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-x9tcr" event={"ID":"00d79476-a8c0-4bad-81ae-6b50afea8601","Type":"ContainerDied","Data":"994a2410c9adb8d8fd6c4f99ea789f0c24f7ab61b1131418ac256a7ab707b9be"} Mar 13 14:19:25 crc kubenswrapper[4898]: I0313 14:19:25.762482 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-x9tcr" event={"ID":"00d79476-a8c0-4bad-81ae-6b50afea8601","Type":"ContainerStarted","Data":"feb144c4ab58e7daafcb4fa68f1d3e330823c649be75d07c3fb9eb354883c85d"} Mar 13 14:19:25 crc kubenswrapper[4898]: I0313 14:19:25.765529 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"902753c9-2101-4509-9283-55070ac3787e","Type":"ContainerStarted","Data":"d03fba021bd551e23b93891ec34c8044d3000501b33c6bdfc21ec4cabe76a104"} Mar 13 14:19:25 crc kubenswrapper[4898]: I0313 14:19:25.768891 4898 generic.go:334] "Generic (PLEG): container finished" podID="9e544d1f-357e-4751-88bb-5108430b52cb" containerID="11c41f241d98f2324ec56eb2b8e01ca41655cf9c167ff144591e45aba4793e63" exitCode=0 Mar 13 14:19:25 crc kubenswrapper[4898]: I0313 14:19:25.769934 4898 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-hk9w4" event={"ID":"9e544d1f-357e-4751-88bb-5108430b52cb","Type":"ContainerDied","Data":"11c41f241d98f2324ec56eb2b8e01ca41655cf9c167ff144591e45aba4793e63"} Mar 13 14:19:25 crc kubenswrapper[4898]: I0313 14:19:25.769999 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-hk9w4" event={"ID":"9e544d1f-357e-4751-88bb-5108430b52cb","Type":"ContainerDied","Data":"f4e0bf3960a9198c7dbd49808ca83e6770f6cbaf7ea545029dc2173d4eb03419"} Mar 13 14:19:25 crc kubenswrapper[4898]: I0313 14:19:25.770025 4898 scope.go:117] "RemoveContainer" containerID="11c41f241d98f2324ec56eb2b8e01ca41655cf9c167ff144591e45aba4793e63" Mar 13 14:19:25 crc kubenswrapper[4898]: I0313 14:19:25.770223 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-hk9w4" Mar 13 14:19:25 crc kubenswrapper[4898]: I0313 14:19:25.771491 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5ccc8479f9-nmlp6" podUID="c17db307-7a8a-4585-9696-a9ef96b6ba0b" containerName="dnsmasq-dns" containerID="cri-o://16a8ad8751e35be84c16e519cb59d98833bb09657571eebe84f43f80ba363a11" gracePeriod=10 Mar 13 14:19:25 crc kubenswrapper[4898]: I0313 14:19:25.959324 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-8mxxb" podStartSLOduration=1.959293324 podStartE2EDuration="1.959293324s" podCreationTimestamp="2026-03-13 14:19:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:19:25.902508727 +0000 UTC m=+1400.904096966" watchObservedRunningTime="2026-03-13 14:19:25.959293324 +0000 UTC m=+1400.960881563" Mar 13 14:19:25 crc kubenswrapper[4898]: I0313 14:19:25.991099 4898 scope.go:117] "RemoveContainer" 
containerID="721483b50e427e53b8e6dda1265ac58f303bb7f78b9d071fabeacfc95fd162d8" Mar 13 14:19:26 crc kubenswrapper[4898]: I0313 14:19:26.002985 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-hk9w4"] Mar 13 14:19:26 crc kubenswrapper[4898]: I0313 14:19:26.011034 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-hk9w4"] Mar 13 14:19:26 crc kubenswrapper[4898]: I0313 14:19:26.077363 4898 scope.go:117] "RemoveContainer" containerID="11c41f241d98f2324ec56eb2b8e01ca41655cf9c167ff144591e45aba4793e63" Mar 13 14:19:26 crc kubenswrapper[4898]: E0313 14:19:26.078217 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11c41f241d98f2324ec56eb2b8e01ca41655cf9c167ff144591e45aba4793e63\": container with ID starting with 11c41f241d98f2324ec56eb2b8e01ca41655cf9c167ff144591e45aba4793e63 not found: ID does not exist" containerID="11c41f241d98f2324ec56eb2b8e01ca41655cf9c167ff144591e45aba4793e63" Mar 13 14:19:26 crc kubenswrapper[4898]: I0313 14:19:26.078260 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11c41f241d98f2324ec56eb2b8e01ca41655cf9c167ff144591e45aba4793e63"} err="failed to get container status \"11c41f241d98f2324ec56eb2b8e01ca41655cf9c167ff144591e45aba4793e63\": rpc error: code = NotFound desc = could not find container \"11c41f241d98f2324ec56eb2b8e01ca41655cf9c167ff144591e45aba4793e63\": container with ID starting with 11c41f241d98f2324ec56eb2b8e01ca41655cf9c167ff144591e45aba4793e63 not found: ID does not exist" Mar 13 14:19:26 crc kubenswrapper[4898]: I0313 14:19:26.078286 4898 scope.go:117] "RemoveContainer" containerID="721483b50e427e53b8e6dda1265ac58f303bb7f78b9d071fabeacfc95fd162d8" Mar 13 14:19:26 crc kubenswrapper[4898]: E0313 14:19:26.078701 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not 
find container \"721483b50e427e53b8e6dda1265ac58f303bb7f78b9d071fabeacfc95fd162d8\": container with ID starting with 721483b50e427e53b8e6dda1265ac58f303bb7f78b9d071fabeacfc95fd162d8 not found: ID does not exist" containerID="721483b50e427e53b8e6dda1265ac58f303bb7f78b9d071fabeacfc95fd162d8" Mar 13 14:19:26 crc kubenswrapper[4898]: I0313 14:19:26.078727 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"721483b50e427e53b8e6dda1265ac58f303bb7f78b9d071fabeacfc95fd162d8"} err="failed to get container status \"721483b50e427e53b8e6dda1265ac58f303bb7f78b9d071fabeacfc95fd162d8\": rpc error: code = NotFound desc = could not find container \"721483b50e427e53b8e6dda1265ac58f303bb7f78b9d071fabeacfc95fd162d8\": container with ID starting with 721483b50e427e53b8e6dda1265ac58f303bb7f78b9d071fabeacfc95fd162d8 not found: ID does not exist" Mar 13 14:19:26 crc kubenswrapper[4898]: I0313 14:19:26.293411 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-nmlp6" Mar 13 14:19:26 crc kubenswrapper[4898]: I0313 14:19:26.405469 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c17db307-7a8a-4585-9696-a9ef96b6ba0b-config\") pod \"c17db307-7a8a-4585-9696-a9ef96b6ba0b\" (UID: \"c17db307-7a8a-4585-9696-a9ef96b6ba0b\") " Mar 13 14:19:26 crc kubenswrapper[4898]: I0313 14:19:26.405572 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5ll2\" (UniqueName: \"kubernetes.io/projected/c17db307-7a8a-4585-9696-a9ef96b6ba0b-kube-api-access-v5ll2\") pod \"c17db307-7a8a-4585-9696-a9ef96b6ba0b\" (UID: \"c17db307-7a8a-4585-9696-a9ef96b6ba0b\") " Mar 13 14:19:26 crc kubenswrapper[4898]: I0313 14:19:26.405676 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c17db307-7a8a-4585-9696-a9ef96b6ba0b-dns-svc\") pod \"c17db307-7a8a-4585-9696-a9ef96b6ba0b\" (UID: \"c17db307-7a8a-4585-9696-a9ef96b6ba0b\") " Mar 13 14:19:26 crc kubenswrapper[4898]: I0313 14:19:26.409865 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c17db307-7a8a-4585-9696-a9ef96b6ba0b-kube-api-access-v5ll2" (OuterVolumeSpecName: "kube-api-access-v5ll2") pod "c17db307-7a8a-4585-9696-a9ef96b6ba0b" (UID: "c17db307-7a8a-4585-9696-a9ef96b6ba0b"). InnerVolumeSpecName "kube-api-access-v5ll2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:19:26 crc kubenswrapper[4898]: I0313 14:19:26.452144 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c17db307-7a8a-4585-9696-a9ef96b6ba0b-config" (OuterVolumeSpecName: "config") pod "c17db307-7a8a-4585-9696-a9ef96b6ba0b" (UID: "c17db307-7a8a-4585-9696-a9ef96b6ba0b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:19:26 crc kubenswrapper[4898]: I0313 14:19:26.452194 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c17db307-7a8a-4585-9696-a9ef96b6ba0b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c17db307-7a8a-4585-9696-a9ef96b6ba0b" (UID: "c17db307-7a8a-4585-9696-a9ef96b6ba0b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:19:26 crc kubenswrapper[4898]: I0313 14:19:26.507986 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5ll2\" (UniqueName: \"kubernetes.io/projected/c17db307-7a8a-4585-9696-a9ef96b6ba0b-kube-api-access-v5ll2\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:26 crc kubenswrapper[4898]: I0313 14:19:26.508330 4898 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c17db307-7a8a-4585-9696-a9ef96b6ba0b-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:26 crc kubenswrapper[4898]: I0313 14:19:26.508345 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c17db307-7a8a-4585-9696-a9ef96b6ba0b-config\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:26 crc kubenswrapper[4898]: I0313 14:19:26.789060 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-gm2pz" event={"ID":"cddd3df9-e505-4f25-988d-8cba87eaefbe","Type":"ContainerStarted","Data":"7b56acf87143f75eb70ec0471ca713a46b1d9ecd5c78e828b6dff4507ed29234"} Mar 13 14:19:26 crc kubenswrapper[4898]: I0313 14:19:26.789145 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7fd796d7df-gm2pz" Mar 13 14:19:26 crc kubenswrapper[4898]: I0313 14:19:26.800006 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-x9tcr" 
event={"ID":"00d79476-a8c0-4bad-81ae-6b50afea8601","Type":"ContainerStarted","Data":"47f7b7782ee0db0b141a3f7ddac8cf1c7a3089fed6ebef010a3382327d07522d"} Mar 13 14:19:26 crc kubenswrapper[4898]: I0313 14:19:26.800883 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-x9tcr" Mar 13 14:19:26 crc kubenswrapper[4898]: I0313 14:19:26.803261 4898 generic.go:334] "Generic (PLEG): container finished" podID="c17db307-7a8a-4585-9696-a9ef96b6ba0b" containerID="16a8ad8751e35be84c16e519cb59d98833bb09657571eebe84f43f80ba363a11" exitCode=0 Mar 13 14:19:26 crc kubenswrapper[4898]: I0313 14:19:26.803305 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-nmlp6" Mar 13 14:19:26 crc kubenswrapper[4898]: I0313 14:19:26.803356 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-nmlp6" event={"ID":"c17db307-7a8a-4585-9696-a9ef96b6ba0b","Type":"ContainerDied","Data":"16a8ad8751e35be84c16e519cb59d98833bb09657571eebe84f43f80ba363a11"} Mar 13 14:19:26 crc kubenswrapper[4898]: I0313 14:19:26.803399 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-nmlp6" event={"ID":"c17db307-7a8a-4585-9696-a9ef96b6ba0b","Type":"ContainerDied","Data":"98527ac55b34245c21c2fc19c06bf03fb117bb3cf7f7b539444d59d7d2dad50b"} Mar 13 14:19:26 crc kubenswrapper[4898]: I0313 14:19:26.803420 4898 scope.go:117] "RemoveContainer" containerID="16a8ad8751e35be84c16e519cb59d98833bb09657571eebe84f43f80ba363a11" Mar 13 14:19:26 crc kubenswrapper[4898]: I0313 14:19:26.815171 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7fd796d7df-gm2pz" podStartSLOduration=2.815149755 podStartE2EDuration="2.815149755s" podCreationTimestamp="2026-03-13 14:19:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-13 14:19:26.807729612 +0000 UTC m=+1401.809317881" watchObservedRunningTime="2026-03-13 14:19:26.815149755 +0000 UTC m=+1401.816737994" Mar 13 14:19:26 crc kubenswrapper[4898]: I0313 14:19:26.834928 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-x9tcr" podStartSLOduration=2.834891948 podStartE2EDuration="2.834891948s" podCreationTimestamp="2026-03-13 14:19:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:19:26.833642646 +0000 UTC m=+1401.835230885" watchObservedRunningTime="2026-03-13 14:19:26.834891948 +0000 UTC m=+1401.836480187" Mar 13 14:19:26 crc kubenswrapper[4898]: I0313 14:19:26.863073 4898 scope.go:117] "RemoveContainer" containerID="74dc3b0f08d38afbb847328da234ec143498d235c31e1d23da8c5f86f5ed4459" Mar 13 14:19:26 crc kubenswrapper[4898]: I0313 14:19:26.863749 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-nmlp6"] Mar 13 14:19:26 crc kubenswrapper[4898]: I0313 14:19:26.872726 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-nmlp6"] Mar 13 14:19:26 crc kubenswrapper[4898]: I0313 14:19:26.883056 4898 scope.go:117] "RemoveContainer" containerID="16a8ad8751e35be84c16e519cb59d98833bb09657571eebe84f43f80ba363a11" Mar 13 14:19:26 crc kubenswrapper[4898]: E0313 14:19:26.883362 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16a8ad8751e35be84c16e519cb59d98833bb09657571eebe84f43f80ba363a11\": container with ID starting with 16a8ad8751e35be84c16e519cb59d98833bb09657571eebe84f43f80ba363a11 not found: ID does not exist" containerID="16a8ad8751e35be84c16e519cb59d98833bb09657571eebe84f43f80ba363a11" Mar 13 14:19:26 crc kubenswrapper[4898]: I0313 14:19:26.883401 4898 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"16a8ad8751e35be84c16e519cb59d98833bb09657571eebe84f43f80ba363a11"} err="failed to get container status \"16a8ad8751e35be84c16e519cb59d98833bb09657571eebe84f43f80ba363a11\": rpc error: code = NotFound desc = could not find container \"16a8ad8751e35be84c16e519cb59d98833bb09657571eebe84f43f80ba363a11\": container with ID starting with 16a8ad8751e35be84c16e519cb59d98833bb09657571eebe84f43f80ba363a11 not found: ID does not exist" Mar 13 14:19:26 crc kubenswrapper[4898]: I0313 14:19:26.883425 4898 scope.go:117] "RemoveContainer" containerID="74dc3b0f08d38afbb847328da234ec143498d235c31e1d23da8c5f86f5ed4459" Mar 13 14:19:26 crc kubenswrapper[4898]: E0313 14:19:26.883698 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74dc3b0f08d38afbb847328da234ec143498d235c31e1d23da8c5f86f5ed4459\": container with ID starting with 74dc3b0f08d38afbb847328da234ec143498d235c31e1d23da8c5f86f5ed4459 not found: ID does not exist" containerID="74dc3b0f08d38afbb847328da234ec143498d235c31e1d23da8c5f86f5ed4459" Mar 13 14:19:26 crc kubenswrapper[4898]: I0313 14:19:26.883792 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74dc3b0f08d38afbb847328da234ec143498d235c31e1d23da8c5f86f5ed4459"} err="failed to get container status \"74dc3b0f08d38afbb847328da234ec143498d235c31e1d23da8c5f86f5ed4459\": rpc error: code = NotFound desc = could not find container \"74dc3b0f08d38afbb847328da234ec143498d235c31e1d23da8c5f86f5ed4459\": container with ID starting with 74dc3b0f08d38afbb847328da234ec143498d235c31e1d23da8c5f86f5ed4459 not found: ID does not exist" Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.749776 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e544d1f-357e-4751-88bb-5108430b52cb" path="/var/lib/kubelet/pods/9e544d1f-357e-4751-88bb-5108430b52cb/volumes" Mar 13 14:19:27 crc 
kubenswrapper[4898]: I0313 14:19:27.750934 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c17db307-7a8a-4585-9696-a9ef96b6ba0b" path="/var/lib/kubelet/pods/c17db307-7a8a-4585-9696-a9ef96b6ba0b/volumes" Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.803033 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.823786 4898 generic.go:334] "Generic (PLEG): container finished" podID="f526abbc-e646-48b4-afa8-7f95f4a607a0" containerID="17895f997a596c3c89bff64ca8cbbbc2b8be5cd3c6e6642232e8f78d56b48759" exitCode=0 Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.823852 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f526abbc-e646-48b4-afa8-7f95f4a607a0","Type":"ContainerDied","Data":"17895f997a596c3c89bff64ca8cbbbc2b8be5cd3c6e6642232e8f78d56b48759"} Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.823877 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f526abbc-e646-48b4-afa8-7f95f4a607a0","Type":"ContainerDied","Data":"e971de8bfe5dba96efe9215fb1b5480439450e667cc052958c51747ab17b2279"} Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.823892 4898 scope.go:117] "RemoveContainer" containerID="17895f997a596c3c89bff64ca8cbbbc2b8be5cd3c6e6642232e8f78d56b48759" Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.824072 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.829998 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"902753c9-2101-4509-9283-55070ac3787e","Type":"ContainerStarted","Data":"1bdae4bc751b2513b85e4ba091ea769293b3833999c8d6cb0b7db96c8bc2c834"} Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.830231 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"902753c9-2101-4509-9283-55070ac3787e","Type":"ContainerStarted","Data":"5488e748631ff0ff74dc6fc92126c866c6ea137b2852a96488c1bb4a5888ebfa"} Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.830735 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.836020 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68mzv\" (UniqueName: \"kubernetes.io/projected/f526abbc-e646-48b4-afa8-7f95f4a607a0-kube-api-access-68mzv\") pod \"f526abbc-e646-48b4-afa8-7f95f4a607a0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.836092 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f526abbc-e646-48b4-afa8-7f95f4a607a0-config\") pod \"f526abbc-e646-48b4-afa8-7f95f4a607a0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.836261 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f526abbc-e646-48b4-afa8-7f95f4a607a0-web-config\") pod \"f526abbc-e646-48b4-afa8-7f95f4a607a0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.836326 4898 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f526abbc-e646-48b4-afa8-7f95f4a607a0-config-out\") pod \"f526abbc-e646-48b4-afa8-7f95f4a607a0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.836395 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f526abbc-e646-48b4-afa8-7f95f4a607a0-thanos-prometheus-http-client-file\") pod \"f526abbc-e646-48b4-afa8-7f95f4a607a0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.837766 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/f526abbc-e646-48b4-afa8-7f95f4a607a0-prometheus-metric-storage-rulefiles-1\") pod \"f526abbc-e646-48b4-afa8-7f95f4a607a0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.837798 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f526abbc-e646-48b4-afa8-7f95f4a607a0-tls-assets\") pod \"f526abbc-e646-48b4-afa8-7f95f4a607a0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.837953 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-537e992e-0c7e-4e28-8105-b535a72a793c\") pod \"f526abbc-e646-48b4-afa8-7f95f4a607a0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.837989 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/f526abbc-e646-48b4-afa8-7f95f4a607a0-prometheus-metric-storage-rulefiles-0\") pod \"f526abbc-e646-48b4-afa8-7f95f4a607a0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.838018 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/f526abbc-e646-48b4-afa8-7f95f4a607a0-prometheus-metric-storage-rulefiles-2\") pod \"f526abbc-e646-48b4-afa8-7f95f4a607a0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.840256 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f526abbc-e646-48b4-afa8-7f95f4a607a0-kube-api-access-68mzv" (OuterVolumeSpecName: "kube-api-access-68mzv") pod "f526abbc-e646-48b4-afa8-7f95f4a607a0" (UID: "f526abbc-e646-48b4-afa8-7f95f4a607a0"). InnerVolumeSpecName "kube-api-access-68mzv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.845013 4898 scope.go:117] "RemoveContainer" containerID="17895f997a596c3c89bff64ca8cbbbc2b8be5cd3c6e6642232e8f78d56b48759" Mar 13 14:19:27 crc kubenswrapper[4898]: E0313 14:19:27.847422 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17895f997a596c3c89bff64ca8cbbbc2b8be5cd3c6e6642232e8f78d56b48759\": container with ID starting with 17895f997a596c3c89bff64ca8cbbbc2b8be5cd3c6e6642232e8f78d56b48759 not found: ID does not exist" containerID="17895f997a596c3c89bff64ca8cbbbc2b8be5cd3c6e6642232e8f78d56b48759" Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.847463 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17895f997a596c3c89bff64ca8cbbbc2b8be5cd3c6e6642232e8f78d56b48759"} err="failed to get container status \"17895f997a596c3c89bff64ca8cbbbc2b8be5cd3c6e6642232e8f78d56b48759\": rpc error: code = NotFound desc = could not find container \"17895f997a596c3c89bff64ca8cbbbc2b8be5cd3c6e6642232e8f78d56b48759\": container with ID starting with 17895f997a596c3c89bff64ca8cbbbc2b8be5cd3c6e6642232e8f78d56b48759 not found: ID does not exist" Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.848743 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68mzv\" (UniqueName: \"kubernetes.io/projected/f526abbc-e646-48b4-afa8-7f95f4a607a0-kube-api-access-68mzv\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.851407 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f526abbc-e646-48b4-afa8-7f95f4a607a0-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "f526abbc-e646-48b4-afa8-7f95f4a607a0" (UID: "f526abbc-e646-48b4-afa8-7f95f4a607a0"). 
InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.853164 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f526abbc-e646-48b4-afa8-7f95f4a607a0-web-config" (OuterVolumeSpecName: "web-config") pod "f526abbc-e646-48b4-afa8-7f95f4a607a0" (UID: "f526abbc-e646-48b4-afa8-7f95f4a607a0"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.854741 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f526abbc-e646-48b4-afa8-7f95f4a607a0-config-out" (OuterVolumeSpecName: "config-out") pod "f526abbc-e646-48b4-afa8-7f95f4a607a0" (UID: "f526abbc-e646-48b4-afa8-7f95f4a607a0"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.855463 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f526abbc-e646-48b4-afa8-7f95f4a607a0-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "f526abbc-e646-48b4-afa8-7f95f4a607a0" (UID: "f526abbc-e646-48b4-afa8-7f95f4a607a0"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.856383 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f526abbc-e646-48b4-afa8-7f95f4a607a0-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "f526abbc-e646-48b4-afa8-7f95f4a607a0" (UID: "f526abbc-e646-48b4-afa8-7f95f4a607a0"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.856471 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f526abbc-e646-48b4-afa8-7f95f4a607a0-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "f526abbc-e646-48b4-afa8-7f95f4a607a0" (UID: "f526abbc-e646-48b4-afa8-7f95f4a607a0"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.872261 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f526abbc-e646-48b4-afa8-7f95f4a607a0-config" (OuterVolumeSpecName: "config") pod "f526abbc-e646-48b4-afa8-7f95f4a607a0" (UID: "f526abbc-e646-48b4-afa8-7f95f4a607a0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.876298 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f526abbc-e646-48b4-afa8-7f95f4a607a0-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "f526abbc-e646-48b4-afa8-7f95f4a607a0" (UID: "f526abbc-e646-48b4-afa8-7f95f4a607a0"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.885611 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-537e992e-0c7e-4e28-8105-b535a72a793c" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "f526abbc-e646-48b4-afa8-7f95f4a607a0" (UID: "f526abbc-e646-48b4-afa8-7f95f4a607a0"). InnerVolumeSpecName "pvc-537e992e-0c7e-4e28-8105-b535a72a793c". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.888596 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.491753353 podStartE2EDuration="3.888457141s" podCreationTimestamp="2026-03-13 14:19:24 +0000 UTC" firstStartedPulling="2026-03-13 14:19:25.488972651 +0000 UTC m=+1400.490560890" lastFinishedPulling="2026-03-13 14:19:26.885676439 +0000 UTC m=+1401.887264678" observedRunningTime="2026-03-13 14:19:27.865300879 +0000 UTC m=+1402.866889118" watchObservedRunningTime="2026-03-13 14:19:27.888457141 +0000 UTC m=+1402.890045380" Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.951496 4898 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f526abbc-e646-48b4-afa8-7f95f4a607a0-config-out\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.951529 4898 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f526abbc-e646-48b4-afa8-7f95f4a607a0-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.951559 4898 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/f526abbc-e646-48b4-afa8-7f95f4a607a0-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.951571 4898 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f526abbc-e646-48b4-afa8-7f95f4a607a0-tls-assets\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.951594 4898 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-537e992e-0c7e-4e28-8105-b535a72a793c\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-537e992e-0c7e-4e28-8105-b535a72a793c\") on node \"crc\" " Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.951605 4898 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f526abbc-e646-48b4-afa8-7f95f4a607a0-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.951615 4898 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/f526abbc-e646-48b4-afa8-7f95f4a607a0-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.951624 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/f526abbc-e646-48b4-afa8-7f95f4a607a0-config\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.951634 4898 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f526abbc-e646-48b4-afa8-7f95f4a607a0-web-config\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.976087 4898 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.976257 4898 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-537e992e-0c7e-4e28-8105-b535a72a793c" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-537e992e-0c7e-4e28-8105-b535a72a793c") on node "crc" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.053742 4898 reconciler_common.go:293] "Volume detached for volume \"pvc-537e992e-0c7e-4e28-8105-b535a72a793c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-537e992e-0c7e-4e28-8105-b535a72a793c\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.220975 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.224957 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.254874 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 13 14:19:28 crc kubenswrapper[4898]: E0313 14:19:28.258431 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c17db307-7a8a-4585-9696-a9ef96b6ba0b" containerName="init" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.258458 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="c17db307-7a8a-4585-9696-a9ef96b6ba0b" containerName="init" Mar 13 14:19:28 crc kubenswrapper[4898]: E0313 14:19:28.258467 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e544d1f-357e-4751-88bb-5108430b52cb" containerName="dnsmasq-dns" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.258474 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e544d1f-357e-4751-88bb-5108430b52cb" containerName="dnsmasq-dns" Mar 13 14:19:28 crc kubenswrapper[4898]: E0313 14:19:28.258501 4898 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c17db307-7a8a-4585-9696-a9ef96b6ba0b" containerName="dnsmasq-dns" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.258508 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="c17db307-7a8a-4585-9696-a9ef96b6ba0b" containerName="dnsmasq-dns" Mar 13 14:19:28 crc kubenswrapper[4898]: E0313 14:19:28.258523 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e544d1f-357e-4751-88bb-5108430b52cb" containerName="init" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.258529 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e544d1f-357e-4751-88bb-5108430b52cb" containerName="init" Mar 13 14:19:28 crc kubenswrapper[4898]: E0313 14:19:28.258542 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f526abbc-e646-48b4-afa8-7f95f4a607a0" containerName="init-config-reloader" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.258548 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="f526abbc-e646-48b4-afa8-7f95f4a607a0" containerName="init-config-reloader" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.258727 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e544d1f-357e-4751-88bb-5108430b52cb" containerName="dnsmasq-dns" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.258747 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="f526abbc-e646-48b4-afa8-7f95f4a607a0" containerName="init-config-reloader" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.258757 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="c17db307-7a8a-4585-9696-a9ef96b6ba0b" containerName="dnsmasq-dns" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.260487 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.268158 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.268237 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-g7dw2" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.268258 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.268791 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.268946 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.269049 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.268870 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.274183 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.290956 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.361003 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: 
\"kubernetes.io/configmap/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.361379 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jxl2\" (UniqueName: \"kubernetes.io/projected/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-kube-api-access-5jxl2\") pod \"prometheus-metric-storage-0\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.361415 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.361444 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-537e992e-0c7e-4e28-8105-b535a72a793c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-537e992e-0c7e-4e28-8105-b535a72a793c\") pod \"prometheus-metric-storage-0\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.361474 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.361620 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.361650 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-config\") pod \"prometheus-metric-storage-0\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.361674 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.361726 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.361751 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") " 
pod="openstack/prometheus-metric-storage-0" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.463522 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.463569 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-config\") pod \"prometheus-metric-storage-0\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.463590 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.463620 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.463642 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: 
\"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.463731 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.463759 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jxl2\" (UniqueName: \"kubernetes.io/projected/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-kube-api-access-5jxl2\") pod \"prometheus-metric-storage-0\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.463782 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.463804 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-537e992e-0c7e-4e28-8105-b535a72a793c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-537e992e-0c7e-4e28-8105-b535a72a793c\") pod \"prometheus-metric-storage-0\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.463827 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-config-out\") pod 
\"prometheus-metric-storage-0\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.464733 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.465063 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.465214 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.467970 4898 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.468016 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-537e992e-0c7e-4e28-8105-b535a72a793c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-537e992e-0c7e-4e28-8105-b535a72a793c\") pod \"prometheus-metric-storage-0\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7123a2111c2c1fcd673a2fa4cbaef2c14fcdb159a9a269edbe99c5cdea18ee2d/globalmount\"" pod="openstack/prometheus-metric-storage-0" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.470580 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.471043 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-config\") pod \"prometheus-metric-storage-0\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.471940 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.472006 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-web-config\") pod 
\"prometheus-metric-storage-0\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.472290 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.487643 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jxl2\" (UniqueName: \"kubernetes.io/projected/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-kube-api-access-5jxl2\") pod \"prometheus-metric-storage-0\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.514016 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-537e992e-0c7e-4e28-8105-b535a72a793c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-537e992e-0c7e-4e28-8105-b535a72a793c\") pod \"prometheus-metric-storage-0\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.578476 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.618447 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.618676 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 13 14:19:29 crc kubenswrapper[4898]: W0313 14:19:29.171591 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e6f6f0d_db24_4fdb_a872_ce2c527a791b.slice/crio-0b0a8b36ebbdd732871a36ff41874dd5aa3cc4a1c6906bb4c6b8440e7e4c245d WatchSource:0}: Error finding container 0b0a8b36ebbdd732871a36ff41874dd5aa3cc4a1c6906bb4c6b8440e7e4c245d: Status 404 returned error can't find the container with id 0b0a8b36ebbdd732871a36ff41874dd5aa3cc4a1c6906bb4c6b8440e7e4c245d Mar 13 14:19:29 crc kubenswrapper[4898]: I0313 14:19:29.172343 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 13 14:19:29 crc kubenswrapper[4898]: I0313 14:19:29.751396 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f526abbc-e646-48b4-afa8-7f95f4a607a0" path="/var/lib/kubelet/pods/f526abbc-e646-48b4-afa8-7f95f4a607a0/volumes" Mar 13 14:19:29 crc kubenswrapper[4898]: I0313 14:19:29.846317 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 13 14:19:29 crc kubenswrapper[4898]: I0313 14:19:29.846363 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 13 14:19:29 crc kubenswrapper[4898]: I0313 14:19:29.852239 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"1e6f6f0d-db24-4fdb-a872-ce2c527a791b","Type":"ContainerStarted","Data":"0b0a8b36ebbdd732871a36ff41874dd5aa3cc4a1c6906bb4c6b8440e7e4c245d"} Mar 13 14:19:29 crc kubenswrapper[4898]: I0313 14:19:29.937825 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 13 14:19:30 crc kubenswrapper[4898]: I0313 14:19:30.937852 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 13 14:19:31 crc kubenswrapper[4898]: I0313 14:19:31.261077 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 13 14:19:31 crc kubenswrapper[4898]: I0313 14:19:31.396401 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.423367 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-gm2pz"] Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.423607 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7fd796d7df-gm2pz" podUID="cddd3df9-e505-4f25-988d-8cba87eaefbe" containerName="dnsmasq-dns" containerID="cri-o://7b56acf87143f75eb70ec0471ca713a46b1d9ecd5c78e828b6dff4507ed29234" gracePeriod=10 Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.428997 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7fd796d7df-gm2pz" Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.460589 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-baf6-account-create-update-xhptm"] Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.461836 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-baf6-account-create-update-xhptm" Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.467163 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-openstack-db-secret" Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.472343 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-n7qmc"] Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.473751 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-n7qmc" Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.488844 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-n7qmc"] Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.497945 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-baf6-account-create-update-xhptm"] Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.519038 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.583108 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81f4ee1a-c4d2-415d-9021-6503f03f8441-operator-scripts\") pod \"mysqld-exporter-baf6-account-create-update-xhptm\" (UID: \"81f4ee1a-c4d2-415d-9021-6503f03f8441\") " pod="openstack/mysqld-exporter-baf6-account-create-update-xhptm" Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.583229 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45215dff-dfeb-4b68-bc5c-d36aba0ea6b8-operator-scripts\") pod \"mysqld-exporter-openstack-db-create-n7qmc\" (UID: 
\"45215dff-dfeb-4b68-bc5c-d36aba0ea6b8\") " pod="openstack/mysqld-exporter-openstack-db-create-n7qmc" Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.583303 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9b8hp\" (UniqueName: \"kubernetes.io/projected/81f4ee1a-c4d2-415d-9021-6503f03f8441-kube-api-access-9b8hp\") pod \"mysqld-exporter-baf6-account-create-update-xhptm\" (UID: \"81f4ee1a-c4d2-415d-9021-6503f03f8441\") " pod="openstack/mysqld-exporter-baf6-account-create-update-xhptm" Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.583333 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qgsl\" (UniqueName: \"kubernetes.io/projected/45215dff-dfeb-4b68-bc5c-d36aba0ea6b8-kube-api-access-8qgsl\") pod \"mysqld-exporter-openstack-db-create-n7qmc\" (UID: \"45215dff-dfeb-4b68-bc5c-d36aba0ea6b8\") " pod="openstack/mysqld-exporter-openstack-db-create-n7qmc" Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.585976 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-2sp5q"] Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.589285 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-2sp5q" Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.637017 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-2sp5q"] Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.684731 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a37db268-4fcb-45a7-a7bf-fae19a514257-dns-svc\") pod \"dnsmasq-dns-698758b865-2sp5q\" (UID: \"a37db268-4fcb-45a7-a7bf-fae19a514257\") " pod="openstack/dnsmasq-dns-698758b865-2sp5q" Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.684797 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a37db268-4fcb-45a7-a7bf-fae19a514257-config\") pod \"dnsmasq-dns-698758b865-2sp5q\" (UID: \"a37db268-4fcb-45a7-a7bf-fae19a514257\") " pod="openstack/dnsmasq-dns-698758b865-2sp5q" Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.684859 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45215dff-dfeb-4b68-bc5c-d36aba0ea6b8-operator-scripts\") pod \"mysqld-exporter-openstack-db-create-n7qmc\" (UID: \"45215dff-dfeb-4b68-bc5c-d36aba0ea6b8\") " pod="openstack/mysqld-exporter-openstack-db-create-n7qmc" Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.684907 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a37db268-4fcb-45a7-a7bf-fae19a514257-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-2sp5q\" (UID: \"a37db268-4fcb-45a7-a7bf-fae19a514257\") " pod="openstack/dnsmasq-dns-698758b865-2sp5q" Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.684938 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-9b8hp\" (UniqueName: \"kubernetes.io/projected/81f4ee1a-c4d2-415d-9021-6503f03f8441-kube-api-access-9b8hp\") pod \"mysqld-exporter-baf6-account-create-update-xhptm\" (UID: \"81f4ee1a-c4d2-415d-9021-6503f03f8441\") " pod="openstack/mysqld-exporter-baf6-account-create-update-xhptm" Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.684964 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qgsl\" (UniqueName: \"kubernetes.io/projected/45215dff-dfeb-4b68-bc5c-d36aba0ea6b8-kube-api-access-8qgsl\") pod \"mysqld-exporter-openstack-db-create-n7qmc\" (UID: \"45215dff-dfeb-4b68-bc5c-d36aba0ea6b8\") " pod="openstack/mysqld-exporter-openstack-db-create-n7qmc" Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.685011 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a37db268-4fcb-45a7-a7bf-fae19a514257-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-2sp5q\" (UID: \"a37db268-4fcb-45a7-a7bf-fae19a514257\") " pod="openstack/dnsmasq-dns-698758b865-2sp5q" Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.685047 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81f4ee1a-c4d2-415d-9021-6503f03f8441-operator-scripts\") pod \"mysqld-exporter-baf6-account-create-update-xhptm\" (UID: \"81f4ee1a-c4d2-415d-9021-6503f03f8441\") " pod="openstack/mysqld-exporter-baf6-account-create-update-xhptm" Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.685069 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bw842\" (UniqueName: \"kubernetes.io/projected/a37db268-4fcb-45a7-a7bf-fae19a514257-kube-api-access-bw842\") pod \"dnsmasq-dns-698758b865-2sp5q\" (UID: \"a37db268-4fcb-45a7-a7bf-fae19a514257\") " pod="openstack/dnsmasq-dns-698758b865-2sp5q" Mar 13 
14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.685780 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45215dff-dfeb-4b68-bc5c-d36aba0ea6b8-operator-scripts\") pod \"mysqld-exporter-openstack-db-create-n7qmc\" (UID: \"45215dff-dfeb-4b68-bc5c-d36aba0ea6b8\") " pod="openstack/mysqld-exporter-openstack-db-create-n7qmc" Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.686651 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81f4ee1a-c4d2-415d-9021-6503f03f8441-operator-scripts\") pod \"mysqld-exporter-baf6-account-create-update-xhptm\" (UID: \"81f4ee1a-c4d2-415d-9021-6503f03f8441\") " pod="openstack/mysqld-exporter-baf6-account-create-update-xhptm" Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.707605 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qgsl\" (UniqueName: \"kubernetes.io/projected/45215dff-dfeb-4b68-bc5c-d36aba0ea6b8-kube-api-access-8qgsl\") pod \"mysqld-exporter-openstack-db-create-n7qmc\" (UID: \"45215dff-dfeb-4b68-bc5c-d36aba0ea6b8\") " pod="openstack/mysqld-exporter-openstack-db-create-n7qmc" Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.718476 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9b8hp\" (UniqueName: \"kubernetes.io/projected/81f4ee1a-c4d2-415d-9021-6503f03f8441-kube-api-access-9b8hp\") pod \"mysqld-exporter-baf6-account-create-update-xhptm\" (UID: \"81f4ee1a-c4d2-415d-9021-6503f03f8441\") " pod="openstack/mysqld-exporter-baf6-account-create-update-xhptm" Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.786533 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a37db268-4fcb-45a7-a7bf-fae19a514257-dns-svc\") pod \"dnsmasq-dns-698758b865-2sp5q\" (UID: 
\"a37db268-4fcb-45a7-a7bf-fae19a514257\") " pod="openstack/dnsmasq-dns-698758b865-2sp5q" Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.786607 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a37db268-4fcb-45a7-a7bf-fae19a514257-config\") pod \"dnsmasq-dns-698758b865-2sp5q\" (UID: \"a37db268-4fcb-45a7-a7bf-fae19a514257\") " pod="openstack/dnsmasq-dns-698758b865-2sp5q" Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.786690 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a37db268-4fcb-45a7-a7bf-fae19a514257-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-2sp5q\" (UID: \"a37db268-4fcb-45a7-a7bf-fae19a514257\") " pod="openstack/dnsmasq-dns-698758b865-2sp5q" Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.786796 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a37db268-4fcb-45a7-a7bf-fae19a514257-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-2sp5q\" (UID: \"a37db268-4fcb-45a7-a7bf-fae19a514257\") " pod="openstack/dnsmasq-dns-698758b865-2sp5q" Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.786844 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bw842\" (UniqueName: \"kubernetes.io/projected/a37db268-4fcb-45a7-a7bf-fae19a514257-kube-api-access-bw842\") pod \"dnsmasq-dns-698758b865-2sp5q\" (UID: \"a37db268-4fcb-45a7-a7bf-fae19a514257\") " pod="openstack/dnsmasq-dns-698758b865-2sp5q" Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.787629 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a37db268-4fcb-45a7-a7bf-fae19a514257-dns-svc\") pod \"dnsmasq-dns-698758b865-2sp5q\" (UID: \"a37db268-4fcb-45a7-a7bf-fae19a514257\") " 
pod="openstack/dnsmasq-dns-698758b865-2sp5q" Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.787644 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a37db268-4fcb-45a7-a7bf-fae19a514257-config\") pod \"dnsmasq-dns-698758b865-2sp5q\" (UID: \"a37db268-4fcb-45a7-a7bf-fae19a514257\") " pod="openstack/dnsmasq-dns-698758b865-2sp5q" Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.787892 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a37db268-4fcb-45a7-a7bf-fae19a514257-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-2sp5q\" (UID: \"a37db268-4fcb-45a7-a7bf-fae19a514257\") " pod="openstack/dnsmasq-dns-698758b865-2sp5q" Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.788068 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a37db268-4fcb-45a7-a7bf-fae19a514257-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-2sp5q\" (UID: \"a37db268-4fcb-45a7-a7bf-fae19a514257\") " pod="openstack/dnsmasq-dns-698758b865-2sp5q" Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.796932 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-baf6-account-create-update-xhptm" Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.807704 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bw842\" (UniqueName: \"kubernetes.io/projected/a37db268-4fcb-45a7-a7bf-fae19a514257-kube-api-access-bw842\") pod \"dnsmasq-dns-698758b865-2sp5q\" (UID: \"a37db268-4fcb-45a7-a7bf-fae19a514257\") " pod="openstack/dnsmasq-dns-698758b865-2sp5q" Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.833386 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-n7qmc" Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.918837 4898 generic.go:334] "Generic (PLEG): container finished" podID="cddd3df9-e505-4f25-988d-8cba87eaefbe" containerID="7b56acf87143f75eb70ec0471ca713a46b1d9ecd5c78e828b6dff4507ed29234" exitCode=0 Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.919144 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-gm2pz" event={"ID":"cddd3df9-e505-4f25-988d-8cba87eaefbe","Type":"ContainerDied","Data":"7b56acf87143f75eb70ec0471ca713a46b1d9ecd5c78e828b6dff4507ed29234"} Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.959276 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-2sp5q" Mar 13 14:19:33 crc kubenswrapper[4898]: W0313 14:19:33.354499 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81f4ee1a_c4d2_415d_9021_6503f03f8441.slice/crio-fc5174c2bf5424ca7939a87e54c12b86416a2de52f15aa0500110576232c34fd WatchSource:0}: Error finding container fc5174c2bf5424ca7939a87e54c12b86416a2de52f15aa0500110576232c34fd: Status 404 returned error can't find the container with id fc5174c2bf5424ca7939a87e54c12b86416a2de52f15aa0500110576232c34fd Mar 13 14:19:33 crc kubenswrapper[4898]: I0313 14:19:33.354689 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-baf6-account-create-update-xhptm"] Mar 13 14:19:33 crc kubenswrapper[4898]: I0313 14:19:33.505028 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-n7qmc"] Mar 13 14:19:33 crc kubenswrapper[4898]: W0313 14:19:33.507168 4898 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45215dff_dfeb_4b68_bc5c_d36aba0ea6b8.slice/crio-950bfa55031ccf3b1124b837e6461f7251be7748323abc050c86691a891901f7 WatchSource:0}: Error finding container 950bfa55031ccf3b1124b837e6461f7251be7748323abc050c86691a891901f7: Status 404 returned error can't find the container with id 950bfa55031ccf3b1124b837e6461f7251be7748323abc050c86691a891901f7 Mar 13 14:19:33 crc kubenswrapper[4898]: I0313 14:19:33.549663 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-2sp5q"] Mar 13 14:19:33 crc kubenswrapper[4898]: W0313 14:19:33.553214 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda37db268_4fcb_45a7_a7bf_fae19a514257.slice/crio-da5d3ec372689f6af3cb9e875471efd89602096089c49f7ef2acb131bc222cb5 WatchSource:0}: Error finding container da5d3ec372689f6af3cb9e875471efd89602096089c49f7ef2acb131bc222cb5: Status 404 returned error can't find the container with id da5d3ec372689f6af3cb9e875471efd89602096089c49f7ef2acb131bc222cb5 Mar 13 14:19:33 crc kubenswrapper[4898]: I0313 14:19:33.564161 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Mar 13 14:19:33 crc kubenswrapper[4898]: I0313 14:19:33.570031 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 13 14:19:33 crc kubenswrapper[4898]: I0313 14:19:33.574965 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 13 14:19:33 crc kubenswrapper[4898]: I0313 14:19:33.581223 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Mar 13 14:19:33 crc kubenswrapper[4898]: I0313 14:19:33.581465 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-22jvh" Mar 13 14:19:33 crc kubenswrapper[4898]: I0313 14:19:33.591272 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 13 14:19:33 crc kubenswrapper[4898]: I0313 14:19:33.636489 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 13 14:19:33 crc kubenswrapper[4898]: I0313 14:19:33.716761 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c67178b7-226e-4a46-a7b2-f53e47faeb2b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c67178b7-226e-4a46-a7b2-f53e47faeb2b\") pod \"swift-storage-0\" (UID: \"794bd82b-e289-4b31-b0cf-f1285452e783\") " pod="openstack/swift-storage-0" Mar 13 14:19:33 crc kubenswrapper[4898]: I0313 14:19:33.717015 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75f7z\" (UniqueName: \"kubernetes.io/projected/794bd82b-e289-4b31-b0cf-f1285452e783-kube-api-access-75f7z\") pod \"swift-storage-0\" (UID: \"794bd82b-e289-4b31-b0cf-f1285452e783\") " pod="openstack/swift-storage-0" Mar 13 14:19:33 crc kubenswrapper[4898]: I0313 14:19:33.717185 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/794bd82b-e289-4b31-b0cf-f1285452e783-lock\") pod \"swift-storage-0\" (UID: 
\"794bd82b-e289-4b31-b0cf-f1285452e783\") " pod="openstack/swift-storage-0" Mar 13 14:19:33 crc kubenswrapper[4898]: I0313 14:19:33.717451 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/794bd82b-e289-4b31-b0cf-f1285452e783-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"794bd82b-e289-4b31-b0cf-f1285452e783\") " pod="openstack/swift-storage-0" Mar 13 14:19:33 crc kubenswrapper[4898]: I0313 14:19:33.717555 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/794bd82b-e289-4b31-b0cf-f1285452e783-etc-swift\") pod \"swift-storage-0\" (UID: \"794bd82b-e289-4b31-b0cf-f1285452e783\") " pod="openstack/swift-storage-0" Mar 13 14:19:33 crc kubenswrapper[4898]: I0313 14:19:33.717738 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/794bd82b-e289-4b31-b0cf-f1285452e783-cache\") pod \"swift-storage-0\" (UID: \"794bd82b-e289-4b31-b0cf-f1285452e783\") " pod="openstack/swift-storage-0" Mar 13 14:19:33 crc kubenswrapper[4898]: I0313 14:19:33.821472 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c67178b7-226e-4a46-a7b2-f53e47faeb2b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c67178b7-226e-4a46-a7b2-f53e47faeb2b\") pod \"swift-storage-0\" (UID: \"794bd82b-e289-4b31-b0cf-f1285452e783\") " pod="openstack/swift-storage-0" Mar 13 14:19:33 crc kubenswrapper[4898]: I0313 14:19:33.821527 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75f7z\" (UniqueName: \"kubernetes.io/projected/794bd82b-e289-4b31-b0cf-f1285452e783-kube-api-access-75f7z\") pod \"swift-storage-0\" (UID: \"794bd82b-e289-4b31-b0cf-f1285452e783\") " pod="openstack/swift-storage-0" Mar 13 14:19:33 
crc kubenswrapper[4898]: I0313 14:19:33.821567 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/794bd82b-e289-4b31-b0cf-f1285452e783-lock\") pod \"swift-storage-0\" (UID: \"794bd82b-e289-4b31-b0cf-f1285452e783\") " pod="openstack/swift-storage-0" Mar 13 14:19:33 crc kubenswrapper[4898]: I0313 14:19:33.821623 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/794bd82b-e289-4b31-b0cf-f1285452e783-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"794bd82b-e289-4b31-b0cf-f1285452e783\") " pod="openstack/swift-storage-0" Mar 13 14:19:33 crc kubenswrapper[4898]: I0313 14:19:33.821656 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/794bd82b-e289-4b31-b0cf-f1285452e783-etc-swift\") pod \"swift-storage-0\" (UID: \"794bd82b-e289-4b31-b0cf-f1285452e783\") " pod="openstack/swift-storage-0" Mar 13 14:19:33 crc kubenswrapper[4898]: I0313 14:19:33.821701 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/794bd82b-e289-4b31-b0cf-f1285452e783-cache\") pod \"swift-storage-0\" (UID: \"794bd82b-e289-4b31-b0cf-f1285452e783\") " pod="openstack/swift-storage-0" Mar 13 14:19:33 crc kubenswrapper[4898]: I0313 14:19:33.822514 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/794bd82b-e289-4b31-b0cf-f1285452e783-lock\") pod \"swift-storage-0\" (UID: \"794bd82b-e289-4b31-b0cf-f1285452e783\") " pod="openstack/swift-storage-0" Mar 13 14:19:33 crc kubenswrapper[4898]: E0313 14:19:33.822956 4898 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 13 14:19:33 crc kubenswrapper[4898]: E0313 14:19:33.822982 4898 projected.go:194] Error 
preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 13 14:19:33 crc kubenswrapper[4898]: E0313 14:19:33.823031 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/794bd82b-e289-4b31-b0cf-f1285452e783-etc-swift podName:794bd82b-e289-4b31-b0cf-f1285452e783 nodeName:}" failed. No retries permitted until 2026-03-13 14:19:34.323013767 +0000 UTC m=+1409.324602076 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/794bd82b-e289-4b31-b0cf-f1285452e783-etc-swift") pod "swift-storage-0" (UID: "794bd82b-e289-4b31-b0cf-f1285452e783") : configmap "swift-ring-files" not found Mar 13 14:19:33 crc kubenswrapper[4898]: I0313 14:19:33.823545 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/794bd82b-e289-4b31-b0cf-f1285452e783-cache\") pod \"swift-storage-0\" (UID: \"794bd82b-e289-4b31-b0cf-f1285452e783\") " pod="openstack/swift-storage-0" Mar 13 14:19:33 crc kubenswrapper[4898]: I0313 14:19:33.836800 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/794bd82b-e289-4b31-b0cf-f1285452e783-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"794bd82b-e289-4b31-b0cf-f1285452e783\") " pod="openstack/swift-storage-0" Mar 13 14:19:33 crc kubenswrapper[4898]: I0313 14:19:33.856080 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75f7z\" (UniqueName: \"kubernetes.io/projected/794bd82b-e289-4b31-b0cf-f1285452e783-kube-api-access-75f7z\") pod \"swift-storage-0\" (UID: \"794bd82b-e289-4b31-b0cf-f1285452e783\") " pod="openstack/swift-storage-0" Mar 13 14:19:33 crc kubenswrapper[4898]: I0313 14:19:33.884053 4898 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. 
Skipping MountDevice... Mar 13 14:19:33 crc kubenswrapper[4898]: I0313 14:19:33.884103 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c67178b7-226e-4a46-a7b2-f53e47faeb2b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c67178b7-226e-4a46-a7b2-f53e47faeb2b\") pod \"swift-storage-0\" (UID: \"794bd82b-e289-4b31-b0cf-f1285452e783\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/9562655cbd9f5053ff4fcbaf6bf6208908fded8dd99047d18c74ed262e26381a/globalmount\"" pod="openstack/swift-storage-0" Mar 13 14:19:33 crc kubenswrapper[4898]: I0313 14:19:33.937303 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-2sp5q" event={"ID":"a37db268-4fcb-45a7-a7bf-fae19a514257","Type":"ContainerStarted","Data":"da5d3ec372689f6af3cb9e875471efd89602096089c49f7ef2acb131bc222cb5"} Mar 13 14:19:33 crc kubenswrapper[4898]: I0313 14:19:33.938921 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-n7qmc" event={"ID":"45215dff-dfeb-4b68-bc5c-d36aba0ea6b8","Type":"ContainerStarted","Data":"950bfa55031ccf3b1124b837e6461f7251be7748323abc050c86691a891901f7"} Mar 13 14:19:33 crc kubenswrapper[4898]: I0313 14:19:33.940246 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-baf6-account-create-update-xhptm" event={"ID":"81f4ee1a-c4d2-415d-9021-6503f03f8441","Type":"ContainerStarted","Data":"fc5174c2bf5424ca7939a87e54c12b86416a2de52f15aa0500110576232c34fd"} Mar 13 14:19:33 crc kubenswrapper[4898]: I0313 14:19:33.941622 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1e6f6f0d-db24-4fdb-a872-ce2c527a791b","Type":"ContainerStarted","Data":"285742e6a5783e0185985fab65a301e652357df458cfc070967a14f0e7b5987b"} Mar 13 14:19:33 crc kubenswrapper[4898]: I0313 14:19:33.959735 4898 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"pvc-c67178b7-226e-4a46-a7b2-f53e47faeb2b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c67178b7-226e-4a46-a7b2-f53e47faeb2b\") pod \"swift-storage-0\" (UID: \"794bd82b-e289-4b31-b0cf-f1285452e783\") " pod="openstack/swift-storage-0" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.135556 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-m9wx7"] Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.137174 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-m9wx7" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.139159 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.139340 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.139635 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.168760 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-m9wx7"] Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.183097 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-ztbp9"] Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.185590 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-ztbp9" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.200534 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-m9wx7"] Mar 13 14:19:34 crc kubenswrapper[4898]: E0313 14:19:34.201461 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-q4bwg ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/swift-ring-rebalance-m9wx7" podUID="28c184e1-ac85-4c3b-b138-3b728eb97ca3" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.212706 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-ztbp9"] Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.233617 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/28c184e1-ac85-4c3b-b138-3b728eb97ca3-ring-data-devices\") pod \"swift-ring-rebalance-m9wx7\" (UID: \"28c184e1-ac85-4c3b-b138-3b728eb97ca3\") " pod="openstack/swift-ring-rebalance-m9wx7" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.233690 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28c184e1-ac85-4c3b-b138-3b728eb97ca3-combined-ca-bundle\") pod \"swift-ring-rebalance-m9wx7\" (UID: \"28c184e1-ac85-4c3b-b138-3b728eb97ca3\") " pod="openstack/swift-ring-rebalance-m9wx7" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.233975 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/28c184e1-ac85-4c3b-b138-3b728eb97ca3-scripts\") pod \"swift-ring-rebalance-m9wx7\" (UID: \"28c184e1-ac85-4c3b-b138-3b728eb97ca3\") " 
pod="openstack/swift-ring-rebalance-m9wx7" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.234063 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4bwg\" (UniqueName: \"kubernetes.io/projected/28c184e1-ac85-4c3b-b138-3b728eb97ca3-kube-api-access-q4bwg\") pod \"swift-ring-rebalance-m9wx7\" (UID: \"28c184e1-ac85-4c3b-b138-3b728eb97ca3\") " pod="openstack/swift-ring-rebalance-m9wx7" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.234187 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/28c184e1-ac85-4c3b-b138-3b728eb97ca3-swiftconf\") pod \"swift-ring-rebalance-m9wx7\" (UID: \"28c184e1-ac85-4c3b-b138-3b728eb97ca3\") " pod="openstack/swift-ring-rebalance-m9wx7" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.234278 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/28c184e1-ac85-4c3b-b138-3b728eb97ca3-dispersionconf\") pod \"swift-ring-rebalance-m9wx7\" (UID: \"28c184e1-ac85-4c3b-b138-3b728eb97ca3\") " pod="openstack/swift-ring-rebalance-m9wx7" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.234393 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/28c184e1-ac85-4c3b-b138-3b728eb97ca3-etc-swift\") pod \"swift-ring-rebalance-m9wx7\" (UID: \"28c184e1-ac85-4c3b-b138-3b728eb97ca3\") " pod="openstack/swift-ring-rebalance-m9wx7" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.336151 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/28c184e1-ac85-4c3b-b138-3b728eb97ca3-swiftconf\") pod \"swift-ring-rebalance-m9wx7\" (UID: \"28c184e1-ac85-4c3b-b138-3b728eb97ca3\") " 
pod="openstack/swift-ring-rebalance-m9wx7" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.336226 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/28c184e1-ac85-4c3b-b138-3b728eb97ca3-dispersionconf\") pod \"swift-ring-rebalance-m9wx7\" (UID: \"28c184e1-ac85-4c3b-b138-3b728eb97ca3\") " pod="openstack/swift-ring-rebalance-m9wx7" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.336276 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-swiftconf\") pod \"swift-ring-rebalance-ztbp9\" (UID: \"4a6f0bfb-5db5-440c-a93f-0d6fe159401d\") " pod="openstack/swift-ring-rebalance-ztbp9" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.336294 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/28c184e1-ac85-4c3b-b138-3b728eb97ca3-etc-swift\") pod \"swift-ring-rebalance-m9wx7\" (UID: \"28c184e1-ac85-4c3b-b138-3b728eb97ca3\") " pod="openstack/swift-ring-rebalance-m9wx7" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.336329 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-scripts\") pod \"swift-ring-rebalance-ztbp9\" (UID: \"4a6f0bfb-5db5-440c-a93f-0d6fe159401d\") " pod="openstack/swift-ring-rebalance-ztbp9" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.336361 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-etc-swift\") pod \"swift-ring-rebalance-ztbp9\" (UID: \"4a6f0bfb-5db5-440c-a93f-0d6fe159401d\") " pod="openstack/swift-ring-rebalance-ztbp9" Mar 13 14:19:34 crc 
kubenswrapper[4898]: I0313 14:19:34.336379 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-combined-ca-bundle\") pod \"swift-ring-rebalance-ztbp9\" (UID: \"4a6f0bfb-5db5-440c-a93f-0d6fe159401d\") " pod="openstack/swift-ring-rebalance-ztbp9" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.336418 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/28c184e1-ac85-4c3b-b138-3b728eb97ca3-ring-data-devices\") pod \"swift-ring-rebalance-m9wx7\" (UID: \"28c184e1-ac85-4c3b-b138-3b728eb97ca3\") " pod="openstack/swift-ring-rebalance-m9wx7" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.336442 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28c184e1-ac85-4c3b-b138-3b728eb97ca3-combined-ca-bundle\") pod \"swift-ring-rebalance-m9wx7\" (UID: \"28c184e1-ac85-4c3b-b138-3b728eb97ca3\") " pod="openstack/swift-ring-rebalance-m9wx7" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.336467 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/794bd82b-e289-4b31-b0cf-f1285452e783-etc-swift\") pod \"swift-storage-0\" (UID: \"794bd82b-e289-4b31-b0cf-f1285452e783\") " pod="openstack/swift-storage-0" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.336495 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmzv9\" (UniqueName: \"kubernetes.io/projected/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-kube-api-access-gmzv9\") pod \"swift-ring-rebalance-ztbp9\" (UID: \"4a6f0bfb-5db5-440c-a93f-0d6fe159401d\") " pod="openstack/swift-ring-rebalance-ztbp9" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 
14:19:34.336569 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/28c184e1-ac85-4c3b-b138-3b728eb97ca3-scripts\") pod \"swift-ring-rebalance-m9wx7\" (UID: \"28c184e1-ac85-4c3b-b138-3b728eb97ca3\") " pod="openstack/swift-ring-rebalance-m9wx7" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.336596 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-ring-data-devices\") pod \"swift-ring-rebalance-ztbp9\" (UID: \"4a6f0bfb-5db5-440c-a93f-0d6fe159401d\") " pod="openstack/swift-ring-rebalance-ztbp9" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.336624 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4bwg\" (UniqueName: \"kubernetes.io/projected/28c184e1-ac85-4c3b-b138-3b728eb97ca3-kube-api-access-q4bwg\") pod \"swift-ring-rebalance-m9wx7\" (UID: \"28c184e1-ac85-4c3b-b138-3b728eb97ca3\") " pod="openstack/swift-ring-rebalance-m9wx7" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.336658 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-dispersionconf\") pod \"swift-ring-rebalance-ztbp9\" (UID: \"4a6f0bfb-5db5-440c-a93f-0d6fe159401d\") " pod="openstack/swift-ring-rebalance-ztbp9" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.337135 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/28c184e1-ac85-4c3b-b138-3b728eb97ca3-etc-swift\") pod \"swift-ring-rebalance-m9wx7\" (UID: \"28c184e1-ac85-4c3b-b138-3b728eb97ca3\") " pod="openstack/swift-ring-rebalance-m9wx7" Mar 13 14:19:34 crc kubenswrapper[4898]: E0313 14:19:34.337265 4898 projected.go:288] Couldn't get 
configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 13 14:19:34 crc kubenswrapper[4898]: E0313 14:19:34.337288 4898 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 13 14:19:34 crc kubenswrapper[4898]: E0313 14:19:34.337335 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/794bd82b-e289-4b31-b0cf-f1285452e783-etc-swift podName:794bd82b-e289-4b31-b0cf-f1285452e783 nodeName:}" failed. No retries permitted until 2026-03-13 14:19:35.337316954 +0000 UTC m=+1410.338905273 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/794bd82b-e289-4b31-b0cf-f1285452e783-etc-swift") pod "swift-storage-0" (UID: "794bd82b-e289-4b31-b0cf-f1285452e783") : configmap "swift-ring-files" not found Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.338098 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/28c184e1-ac85-4c3b-b138-3b728eb97ca3-scripts\") pod \"swift-ring-rebalance-m9wx7\" (UID: \"28c184e1-ac85-4c3b-b138-3b728eb97ca3\") " pod="openstack/swift-ring-rebalance-m9wx7" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.338226 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/28c184e1-ac85-4c3b-b138-3b728eb97ca3-ring-data-devices\") pod \"swift-ring-rebalance-m9wx7\" (UID: \"28c184e1-ac85-4c3b-b138-3b728eb97ca3\") " pod="openstack/swift-ring-rebalance-m9wx7" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.341414 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28c184e1-ac85-4c3b-b138-3b728eb97ca3-combined-ca-bundle\") pod \"swift-ring-rebalance-m9wx7\" (UID: \"28c184e1-ac85-4c3b-b138-3b728eb97ca3\") " 
pod="openstack/swift-ring-rebalance-m9wx7" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.341777 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/28c184e1-ac85-4c3b-b138-3b728eb97ca3-swiftconf\") pod \"swift-ring-rebalance-m9wx7\" (UID: \"28c184e1-ac85-4c3b-b138-3b728eb97ca3\") " pod="openstack/swift-ring-rebalance-m9wx7" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.342070 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/28c184e1-ac85-4c3b-b138-3b728eb97ca3-dispersionconf\") pod \"swift-ring-rebalance-m9wx7\" (UID: \"28c184e1-ac85-4c3b-b138-3b728eb97ca3\") " pod="openstack/swift-ring-rebalance-m9wx7" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.357212 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4bwg\" (UniqueName: \"kubernetes.io/projected/28c184e1-ac85-4c3b-b138-3b728eb97ca3-kube-api-access-q4bwg\") pod \"swift-ring-rebalance-m9wx7\" (UID: \"28c184e1-ac85-4c3b-b138-3b728eb97ca3\") " pod="openstack/swift-ring-rebalance-m9wx7" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.439442 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-swiftconf\") pod \"swift-ring-rebalance-ztbp9\" (UID: \"4a6f0bfb-5db5-440c-a93f-0d6fe159401d\") " pod="openstack/swift-ring-rebalance-ztbp9" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.439510 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-scripts\") pod \"swift-ring-rebalance-ztbp9\" (UID: \"4a6f0bfb-5db5-440c-a93f-0d6fe159401d\") " pod="openstack/swift-ring-rebalance-ztbp9" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.439547 4898 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-etc-swift\") pod \"swift-ring-rebalance-ztbp9\" (UID: \"4a6f0bfb-5db5-440c-a93f-0d6fe159401d\") " pod="openstack/swift-ring-rebalance-ztbp9" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.439570 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-combined-ca-bundle\") pod \"swift-ring-rebalance-ztbp9\" (UID: \"4a6f0bfb-5db5-440c-a93f-0d6fe159401d\") " pod="openstack/swift-ring-rebalance-ztbp9" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.439630 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmzv9\" (UniqueName: \"kubernetes.io/projected/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-kube-api-access-gmzv9\") pod \"swift-ring-rebalance-ztbp9\" (UID: \"4a6f0bfb-5db5-440c-a93f-0d6fe159401d\") " pod="openstack/swift-ring-rebalance-ztbp9" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.439695 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-ring-data-devices\") pod \"swift-ring-rebalance-ztbp9\" (UID: \"4a6f0bfb-5db5-440c-a93f-0d6fe159401d\") " pod="openstack/swift-ring-rebalance-ztbp9" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.439727 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-dispersionconf\") pod \"swift-ring-rebalance-ztbp9\" (UID: \"4a6f0bfb-5db5-440c-a93f-0d6fe159401d\") " pod="openstack/swift-ring-rebalance-ztbp9" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.442770 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-etc-swift\") pod \"swift-ring-rebalance-ztbp9\" (UID: \"4a6f0bfb-5db5-440c-a93f-0d6fe159401d\") " pod="openstack/swift-ring-rebalance-ztbp9" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.443884 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-scripts\") pod \"swift-ring-rebalance-ztbp9\" (UID: \"4a6f0bfb-5db5-440c-a93f-0d6fe159401d\") " pod="openstack/swift-ring-rebalance-ztbp9" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.444383 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-ring-data-devices\") pod \"swift-ring-rebalance-ztbp9\" (UID: \"4a6f0bfb-5db5-440c-a93f-0d6fe159401d\") " pod="openstack/swift-ring-rebalance-ztbp9" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.445713 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-dispersionconf\") pod \"swift-ring-rebalance-ztbp9\" (UID: \"4a6f0bfb-5db5-440c-a93f-0d6fe159401d\") " pod="openstack/swift-ring-rebalance-ztbp9" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.451212 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-combined-ca-bundle\") pod \"swift-ring-rebalance-ztbp9\" (UID: \"4a6f0bfb-5db5-440c-a93f-0d6fe159401d\") " pod="openstack/swift-ring-rebalance-ztbp9" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.452069 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-swiftconf\") pod 
\"swift-ring-rebalance-ztbp9\" (UID: \"4a6f0bfb-5db5-440c-a93f-0d6fe159401d\") " pod="openstack/swift-ring-rebalance-ztbp9" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.471672 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmzv9\" (UniqueName: \"kubernetes.io/projected/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-kube-api-access-gmzv9\") pod \"swift-ring-rebalance-ztbp9\" (UID: \"4a6f0bfb-5db5-440c-a93f-0d6fe159401d\") " pod="openstack/swift-ring-rebalance-ztbp9" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.543503 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-ztbp9" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.719166 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-x9tcr" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.759999 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-gm2pz" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.856363 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cdcst\" (UniqueName: \"kubernetes.io/projected/cddd3df9-e505-4f25-988d-8cba87eaefbe-kube-api-access-cdcst\") pod \"cddd3df9-e505-4f25-988d-8cba87eaefbe\" (UID: \"cddd3df9-e505-4f25-988d-8cba87eaefbe\") " Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.856649 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cddd3df9-e505-4f25-988d-8cba87eaefbe-config\") pod \"cddd3df9-e505-4f25-988d-8cba87eaefbe\" (UID: \"cddd3df9-e505-4f25-988d-8cba87eaefbe\") " Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.856757 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/cddd3df9-e505-4f25-988d-8cba87eaefbe-ovsdbserver-nb\") pod \"cddd3df9-e505-4f25-988d-8cba87eaefbe\" (UID: \"cddd3df9-e505-4f25-988d-8cba87eaefbe\") " Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.857123 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cddd3df9-e505-4f25-988d-8cba87eaefbe-dns-svc\") pod \"cddd3df9-e505-4f25-988d-8cba87eaefbe\" (UID: \"cddd3df9-e505-4f25-988d-8cba87eaefbe\") " Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.871303 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cddd3df9-e505-4f25-988d-8cba87eaefbe-kube-api-access-cdcst" (OuterVolumeSpecName: "kube-api-access-cdcst") pod "cddd3df9-e505-4f25-988d-8cba87eaefbe" (UID: "cddd3df9-e505-4f25-988d-8cba87eaefbe"). InnerVolumeSpecName "kube-api-access-cdcst". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.912977 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cddd3df9-e505-4f25-988d-8cba87eaefbe-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cddd3df9-e505-4f25-988d-8cba87eaefbe" (UID: "cddd3df9-e505-4f25-988d-8cba87eaefbe"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.932720 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cddd3df9-e505-4f25-988d-8cba87eaefbe-config" (OuterVolumeSpecName: "config") pod "cddd3df9-e505-4f25-988d-8cba87eaefbe" (UID: "cddd3df9-e505-4f25-988d-8cba87eaefbe"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.949307 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cddd3df9-e505-4f25-988d-8cba87eaefbe-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cddd3df9-e505-4f25-988d-8cba87eaefbe" (UID: "cddd3df9-e505-4f25-988d-8cba87eaefbe"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.957455 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-gm2pz" event={"ID":"cddd3df9-e505-4f25-988d-8cba87eaefbe","Type":"ContainerDied","Data":"e6844355e00ec9e8aa5161a17239acb337d4e49d9f07d0b8c54a762df1aef8dd"} Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.957524 4898 scope.go:117] "RemoveContainer" containerID="7b56acf87143f75eb70ec0471ca713a46b1d9ecd5c78e828b6dff4507ed29234" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.957692 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-gm2pz" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.962013 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cdcst\" (UniqueName: \"kubernetes.io/projected/cddd3df9-e505-4f25-988d-8cba87eaefbe-kube-api-access-cdcst\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.962097 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cddd3df9-e505-4f25-988d-8cba87eaefbe-config\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.962236 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cddd3df9-e505-4f25-988d-8cba87eaefbe-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.962361 4898 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cddd3df9-e505-4f25-988d-8cba87eaefbe-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.962372 4898 generic.go:334] "Generic (PLEG): container finished" podID="a37db268-4fcb-45a7-a7bf-fae19a514257" containerID="de23e3ccb82eedd2170e1cd3b17cd796af8186141a4e3a6a0df25cf87c1ac689" exitCode=0 Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.962392 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-2sp5q" event={"ID":"a37db268-4fcb-45a7-a7bf-fae19a514257","Type":"ContainerDied","Data":"de23e3ccb82eedd2170e1cd3b17cd796af8186141a4e3a6a0df25cf87c1ac689"} Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.971587 4898 generic.go:334] "Generic (PLEG): container finished" podID="45215dff-dfeb-4b68-bc5c-d36aba0ea6b8" containerID="190741a9e70699bd53ad4219ca7d5f504afce181f13fef8f81fc17d9e1a70095" exitCode=0 Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 
14:19:34.971667 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-n7qmc" event={"ID":"45215dff-dfeb-4b68-bc5c-d36aba0ea6b8","Type":"ContainerDied","Data":"190741a9e70699bd53ad4219ca7d5f504afce181f13fef8f81fc17d9e1a70095"} Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.973408 4898 generic.go:334] "Generic (PLEG): container finished" podID="81f4ee1a-c4d2-415d-9021-6503f03f8441" containerID="e6b8ff442a61a4fcf1565b308b6597fe09fb7264763f211d7540bb2a249e6c54" exitCode=0 Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.973477 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-m9wx7" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.974213 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-baf6-account-create-update-xhptm" event={"ID":"81f4ee1a-c4d2-415d-9021-6503f03f8441","Type":"ContainerDied","Data":"e6b8ff442a61a4fcf1565b308b6597fe09fb7264763f211d7540bb2a249e6c54"} Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.082918 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-m9wx7" Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.092860 4898 scope.go:117] "RemoveContainer" containerID="31ee06ecd554c7204a7ea9b6ed5158bbdc38532b41e7043447da0dc27f024036" Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.131938 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-ztbp9"] Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.141413 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-gm2pz"] Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.151642 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-gm2pz"] Mar 13 14:19:35 crc kubenswrapper[4898]: W0313 14:19:35.154293 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a6f0bfb_5db5_440c_a93f_0d6fe159401d.slice/crio-7e54bc8fd0c558fc45ec5afd29c72815086cd68050f8ecbef97610bde93e64ce WatchSource:0}: Error finding container 7e54bc8fd0c558fc45ec5afd29c72815086cd68050f8ecbef97610bde93e64ce: Status 404 returned error can't find the container with id 7e54bc8fd0c558fc45ec5afd29c72815086cd68050f8ecbef97610bde93e64ce Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.169360 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/28c184e1-ac85-4c3b-b138-3b728eb97ca3-dispersionconf\") pod \"28c184e1-ac85-4c3b-b138-3b728eb97ca3\" (UID: \"28c184e1-ac85-4c3b-b138-3b728eb97ca3\") " Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.169486 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28c184e1-ac85-4c3b-b138-3b728eb97ca3-combined-ca-bundle\") pod \"28c184e1-ac85-4c3b-b138-3b728eb97ca3\" (UID: \"28c184e1-ac85-4c3b-b138-3b728eb97ca3\") " Mar 13 
14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.169593 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/28c184e1-ac85-4c3b-b138-3b728eb97ca3-ring-data-devices\") pod \"28c184e1-ac85-4c3b-b138-3b728eb97ca3\" (UID: \"28c184e1-ac85-4c3b-b138-3b728eb97ca3\") " Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.169661 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/28c184e1-ac85-4c3b-b138-3b728eb97ca3-swiftconf\") pod \"28c184e1-ac85-4c3b-b138-3b728eb97ca3\" (UID: \"28c184e1-ac85-4c3b-b138-3b728eb97ca3\") " Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.169752 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/28c184e1-ac85-4c3b-b138-3b728eb97ca3-etc-swift\") pod \"28c184e1-ac85-4c3b-b138-3b728eb97ca3\" (UID: \"28c184e1-ac85-4c3b-b138-3b728eb97ca3\") " Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.169890 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4bwg\" (UniqueName: \"kubernetes.io/projected/28c184e1-ac85-4c3b-b138-3b728eb97ca3-kube-api-access-q4bwg\") pod \"28c184e1-ac85-4c3b-b138-3b728eb97ca3\" (UID: \"28c184e1-ac85-4c3b-b138-3b728eb97ca3\") " Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.170022 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/28c184e1-ac85-4c3b-b138-3b728eb97ca3-scripts\") pod \"28c184e1-ac85-4c3b-b138-3b728eb97ca3\" (UID: \"28c184e1-ac85-4c3b-b138-3b728eb97ca3\") " Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.170626 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28c184e1-ac85-4c3b-b138-3b728eb97ca3-scripts" (OuterVolumeSpecName: "scripts") pod 
"28c184e1-ac85-4c3b-b138-3b728eb97ca3" (UID: "28c184e1-ac85-4c3b-b138-3b728eb97ca3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.170662 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28c184e1-ac85-4c3b-b138-3b728eb97ca3-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "28c184e1-ac85-4c3b-b138-3b728eb97ca3" (UID: "28c184e1-ac85-4c3b-b138-3b728eb97ca3"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.172566 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28c184e1-ac85-4c3b-b138-3b728eb97ca3-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "28c184e1-ac85-4c3b-b138-3b728eb97ca3" (UID: "28c184e1-ac85-4c3b-b138-3b728eb97ca3"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.174455 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28c184e1-ac85-4c3b-b138-3b728eb97ca3-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "28c184e1-ac85-4c3b-b138-3b728eb97ca3" (UID: "28c184e1-ac85-4c3b-b138-3b728eb97ca3"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.175108 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28c184e1-ac85-4c3b-b138-3b728eb97ca3-kube-api-access-q4bwg" (OuterVolumeSpecName: "kube-api-access-q4bwg") pod "28c184e1-ac85-4c3b-b138-3b728eb97ca3" (UID: "28c184e1-ac85-4c3b-b138-3b728eb97ca3"). InnerVolumeSpecName "kube-api-access-q4bwg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.175576 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28c184e1-ac85-4c3b-b138-3b728eb97ca3-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "28c184e1-ac85-4c3b-b138-3b728eb97ca3" (UID: "28c184e1-ac85-4c3b-b138-3b728eb97ca3"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.176289 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28c184e1-ac85-4c3b-b138-3b728eb97ca3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "28c184e1-ac85-4c3b-b138-3b728eb97ca3" (UID: "28c184e1-ac85-4c3b-b138-3b728eb97ca3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.272038 4898 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/28c184e1-ac85-4c3b-b138-3b728eb97ca3-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.272075 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28c184e1-ac85-4c3b-b138-3b728eb97ca3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.272091 4898 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/28c184e1-ac85-4c3b-b138-3b728eb97ca3-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.272103 4898 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/28c184e1-ac85-4c3b-b138-3b728eb97ca3-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 13 
14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.272114 4898 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/28c184e1-ac85-4c3b-b138-3b728eb97ca3-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.272126 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4bwg\" (UniqueName: \"kubernetes.io/projected/28c184e1-ac85-4c3b-b138-3b728eb97ca3-kube-api-access-q4bwg\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.272136 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/28c184e1-ac85-4c3b-b138-3b728eb97ca3-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.373326 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-cdnq7"] Mar 13 14:19:35 crc kubenswrapper[4898]: E0313 14:19:35.373853 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cddd3df9-e505-4f25-988d-8cba87eaefbe" containerName="init" Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.373876 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="cddd3df9-e505-4f25-988d-8cba87eaefbe" containerName="init" Mar 13 14:19:35 crc kubenswrapper[4898]: E0313 14:19:35.373950 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cddd3df9-e505-4f25-988d-8cba87eaefbe" containerName="dnsmasq-dns" Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.373969 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="cddd3df9-e505-4f25-988d-8cba87eaefbe" containerName="dnsmasq-dns" Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.374225 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="cddd3df9-e505-4f25-988d-8cba87eaefbe" containerName="dnsmasq-dns" Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.374302 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/794bd82b-e289-4b31-b0cf-f1285452e783-etc-swift\") pod \"swift-storage-0\" (UID: \"794bd82b-e289-4b31-b0cf-f1285452e783\") " pod="openstack/swift-storage-0" Mar 13 14:19:35 crc kubenswrapper[4898]: E0313 14:19:35.374522 4898 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 13 14:19:35 crc kubenswrapper[4898]: E0313 14:19:35.374558 4898 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 13 14:19:35 crc kubenswrapper[4898]: E0313 14:19:35.374622 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/794bd82b-e289-4b31-b0cf-f1285452e783-etc-swift podName:794bd82b-e289-4b31-b0cf-f1285452e783 nodeName:}" failed. No retries permitted until 2026-03-13 14:19:37.374602394 +0000 UTC m=+1412.376190633 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/794bd82b-e289-4b31-b0cf-f1285452e783-etc-swift") pod "swift-storage-0" (UID: "794bd82b-e289-4b31-b0cf-f1285452e783") : configmap "swift-ring-files" not found Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.375131 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-cdnq7" Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.383842 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-cdnq7"] Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.476077 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59bdafe7-9c43-4acc-a212-864bdf38d5b4-operator-scripts\") pod \"glance-db-create-cdnq7\" (UID: \"59bdafe7-9c43-4acc-a212-864bdf38d5b4\") " pod="openstack/glance-db-create-cdnq7" Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.476202 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68d6j\" (UniqueName: \"kubernetes.io/projected/59bdafe7-9c43-4acc-a212-864bdf38d5b4-kube-api-access-68d6j\") pod \"glance-db-create-cdnq7\" (UID: \"59bdafe7-9c43-4acc-a212-864bdf38d5b4\") " pod="openstack/glance-db-create-cdnq7" Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.481836 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-621e-account-create-update-dksd9"] Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.483602 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-621e-account-create-update-dksd9" Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.485627 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.492283 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-621e-account-create-update-dksd9"] Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.578506 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwcdp\" (UniqueName: \"kubernetes.io/projected/a8c46fcc-fd9b-4073-99e6-28aadcdd823e-kube-api-access-rwcdp\") pod \"glance-621e-account-create-update-dksd9\" (UID: \"a8c46fcc-fd9b-4073-99e6-28aadcdd823e\") " pod="openstack/glance-621e-account-create-update-dksd9" Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.578559 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59bdafe7-9c43-4acc-a212-864bdf38d5b4-operator-scripts\") pod \"glance-db-create-cdnq7\" (UID: \"59bdafe7-9c43-4acc-a212-864bdf38d5b4\") " pod="openstack/glance-db-create-cdnq7" Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.578678 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68d6j\" (UniqueName: \"kubernetes.io/projected/59bdafe7-9c43-4acc-a212-864bdf38d5b4-kube-api-access-68d6j\") pod \"glance-db-create-cdnq7\" (UID: \"59bdafe7-9c43-4acc-a212-864bdf38d5b4\") " pod="openstack/glance-db-create-cdnq7" Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.578706 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8c46fcc-fd9b-4073-99e6-28aadcdd823e-operator-scripts\") pod \"glance-621e-account-create-update-dksd9\" (UID: 
\"a8c46fcc-fd9b-4073-99e6-28aadcdd823e\") " pod="openstack/glance-621e-account-create-update-dksd9" Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.579235 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59bdafe7-9c43-4acc-a212-864bdf38d5b4-operator-scripts\") pod \"glance-db-create-cdnq7\" (UID: \"59bdafe7-9c43-4acc-a212-864bdf38d5b4\") " pod="openstack/glance-db-create-cdnq7" Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.597090 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68d6j\" (UniqueName: \"kubernetes.io/projected/59bdafe7-9c43-4acc-a212-864bdf38d5b4-kube-api-access-68d6j\") pod \"glance-db-create-cdnq7\" (UID: \"59bdafe7-9c43-4acc-a212-864bdf38d5b4\") " pod="openstack/glance-db-create-cdnq7" Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.681019 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwcdp\" (UniqueName: \"kubernetes.io/projected/a8c46fcc-fd9b-4073-99e6-28aadcdd823e-kube-api-access-rwcdp\") pod \"glance-621e-account-create-update-dksd9\" (UID: \"a8c46fcc-fd9b-4073-99e6-28aadcdd823e\") " pod="openstack/glance-621e-account-create-update-dksd9" Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.681178 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8c46fcc-fd9b-4073-99e6-28aadcdd823e-operator-scripts\") pod \"glance-621e-account-create-update-dksd9\" (UID: \"a8c46fcc-fd9b-4073-99e6-28aadcdd823e\") " pod="openstack/glance-621e-account-create-update-dksd9" Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.681945 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8c46fcc-fd9b-4073-99e6-28aadcdd823e-operator-scripts\") pod \"glance-621e-account-create-update-dksd9\" 
(UID: \"a8c46fcc-fd9b-4073-99e6-28aadcdd823e\") " pod="openstack/glance-621e-account-create-update-dksd9" Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.698929 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwcdp\" (UniqueName: \"kubernetes.io/projected/a8c46fcc-fd9b-4073-99e6-28aadcdd823e-kube-api-access-rwcdp\") pod \"glance-621e-account-create-update-dksd9\" (UID: \"a8c46fcc-fd9b-4073-99e6-28aadcdd823e\") " pod="openstack/glance-621e-account-create-update-dksd9" Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.709090 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-cdnq7" Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.763354 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cddd3df9-e505-4f25-988d-8cba87eaefbe" path="/var/lib/kubelet/pods/cddd3df9-e505-4f25-988d-8cba87eaefbe/volumes" Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.803762 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-621e-account-create-update-dksd9" Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.996224 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-ztbp9" event={"ID":"4a6f0bfb-5db5-440c-a93f-0d6fe159401d","Type":"ContainerStarted","Data":"7e54bc8fd0c558fc45ec5afd29c72815086cd68050f8ecbef97610bde93e64ce"} Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.999213 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-2sp5q" event={"ID":"a37db268-4fcb-45a7-a7bf-fae19a514257","Type":"ContainerStarted","Data":"50431c87d4fda7b6d1207e4343981bd67d4f748124f89954c845f5c4fb0d25f7"} Mar 13 14:19:36 crc kubenswrapper[4898]: I0313 14:19:35.999420 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-m9wx7" Mar 13 14:19:36 crc kubenswrapper[4898]: I0313 14:19:35.999589 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-2sp5q" Mar 13 14:19:36 crc kubenswrapper[4898]: I0313 14:19:36.020693 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-2sp5q" podStartSLOduration=4.020668958 podStartE2EDuration="4.020668958s" podCreationTimestamp="2026-03-13 14:19:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:19:36.014108898 +0000 UTC m=+1411.015697157" watchObservedRunningTime="2026-03-13 14:19:36.020668958 +0000 UTC m=+1411.022257197" Mar 13 14:19:36 crc kubenswrapper[4898]: I0313 14:19:36.079734 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-m9wx7"] Mar 13 14:19:36 crc kubenswrapper[4898]: I0313 14:19:36.094919 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-m9wx7"] Mar 13 14:19:36 crc kubenswrapper[4898]: I0313 14:19:36.202259 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-cdnq7"] Mar 13 14:19:36 crc kubenswrapper[4898]: I0313 14:19:36.411989 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-621e-account-create-update-dksd9"] Mar 13 14:19:36 crc kubenswrapper[4898]: I0313 14:19:36.794383 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-baf6-account-create-update-xhptm" Mar 13 14:19:36 crc kubenswrapper[4898]: I0313 14:19:36.798124 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-n7qmc" Mar 13 14:19:36 crc kubenswrapper[4898]: I0313 14:19:36.955837 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45215dff-dfeb-4b68-bc5c-d36aba0ea6b8-operator-scripts\") pod \"45215dff-dfeb-4b68-bc5c-d36aba0ea6b8\" (UID: \"45215dff-dfeb-4b68-bc5c-d36aba0ea6b8\") " Mar 13 14:19:36 crc kubenswrapper[4898]: I0313 14:19:36.956448 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9b8hp\" (UniqueName: \"kubernetes.io/projected/81f4ee1a-c4d2-415d-9021-6503f03f8441-kube-api-access-9b8hp\") pod \"81f4ee1a-c4d2-415d-9021-6503f03f8441\" (UID: \"81f4ee1a-c4d2-415d-9021-6503f03f8441\") " Mar 13 14:19:36 crc kubenswrapper[4898]: I0313 14:19:36.956542 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qgsl\" (UniqueName: \"kubernetes.io/projected/45215dff-dfeb-4b68-bc5c-d36aba0ea6b8-kube-api-access-8qgsl\") pod \"45215dff-dfeb-4b68-bc5c-d36aba0ea6b8\" (UID: \"45215dff-dfeb-4b68-bc5c-d36aba0ea6b8\") " Mar 13 14:19:36 crc kubenswrapper[4898]: I0313 14:19:36.956602 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81f4ee1a-c4d2-415d-9021-6503f03f8441-operator-scripts\") pod \"81f4ee1a-c4d2-415d-9021-6503f03f8441\" (UID: \"81f4ee1a-c4d2-415d-9021-6503f03f8441\") " Mar 13 14:19:36 crc kubenswrapper[4898]: I0313 14:19:36.957509 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81f4ee1a-c4d2-415d-9021-6503f03f8441-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "81f4ee1a-c4d2-415d-9021-6503f03f8441" (UID: "81f4ee1a-c4d2-415d-9021-6503f03f8441"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:19:36 crc kubenswrapper[4898]: I0313 14:19:36.957615 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45215dff-dfeb-4b68-bc5c-d36aba0ea6b8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "45215dff-dfeb-4b68-bc5c-d36aba0ea6b8" (UID: "45215dff-dfeb-4b68-bc5c-d36aba0ea6b8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:19:36 crc kubenswrapper[4898]: I0313 14:19:36.958100 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81f4ee1a-c4d2-415d-9021-6503f03f8441-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:36 crc kubenswrapper[4898]: I0313 14:19:36.958125 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45215dff-dfeb-4b68-bc5c-d36aba0ea6b8-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:36 crc kubenswrapper[4898]: I0313 14:19:36.962196 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81f4ee1a-c4d2-415d-9021-6503f03f8441-kube-api-access-9b8hp" (OuterVolumeSpecName: "kube-api-access-9b8hp") pod "81f4ee1a-c4d2-415d-9021-6503f03f8441" (UID: "81f4ee1a-c4d2-415d-9021-6503f03f8441"). InnerVolumeSpecName "kube-api-access-9b8hp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:19:36 crc kubenswrapper[4898]: I0313 14:19:36.971500 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45215dff-dfeb-4b68-bc5c-d36aba0ea6b8-kube-api-access-8qgsl" (OuterVolumeSpecName: "kube-api-access-8qgsl") pod "45215dff-dfeb-4b68-bc5c-d36aba0ea6b8" (UID: "45215dff-dfeb-4b68-bc5c-d36aba0ea6b8"). InnerVolumeSpecName "kube-api-access-8qgsl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:19:37 crc kubenswrapper[4898]: I0313 14:19:37.008731 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-55n8q"] Mar 13 14:19:37 crc kubenswrapper[4898]: E0313 14:19:37.009468 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45215dff-dfeb-4b68-bc5c-d36aba0ea6b8" containerName="mariadb-database-create" Mar 13 14:19:37 crc kubenswrapper[4898]: I0313 14:19:37.009598 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="45215dff-dfeb-4b68-bc5c-d36aba0ea6b8" containerName="mariadb-database-create" Mar 13 14:19:37 crc kubenswrapper[4898]: E0313 14:19:37.009681 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81f4ee1a-c4d2-415d-9021-6503f03f8441" containerName="mariadb-account-create-update" Mar 13 14:19:37 crc kubenswrapper[4898]: I0313 14:19:37.009773 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="81f4ee1a-c4d2-415d-9021-6503f03f8441" containerName="mariadb-account-create-update" Mar 13 14:19:37 crc kubenswrapper[4898]: I0313 14:19:37.010107 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="45215dff-dfeb-4b68-bc5c-d36aba0ea6b8" containerName="mariadb-database-create" Mar 13 14:19:37 crc kubenswrapper[4898]: I0313 14:19:37.010227 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="81f4ee1a-c4d2-415d-9021-6503f03f8441" containerName="mariadb-account-create-update" Mar 13 14:19:37 crc kubenswrapper[4898]: I0313 14:19:37.011695 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-55n8q" Mar 13 14:19:37 crc kubenswrapper[4898]: I0313 14:19:37.015863 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 13 14:19:37 crc kubenswrapper[4898]: I0313 14:19:37.021243 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-55n8q"] Mar 13 14:19:37 crc kubenswrapper[4898]: I0313 14:19:37.028965 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-n7qmc" Mar 13 14:19:37 crc kubenswrapper[4898]: I0313 14:19:37.028994 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-n7qmc" event={"ID":"45215dff-dfeb-4b68-bc5c-d36aba0ea6b8","Type":"ContainerDied","Data":"950bfa55031ccf3b1124b837e6461f7251be7748323abc050c86691a891901f7"} Mar 13 14:19:37 crc kubenswrapper[4898]: I0313 14:19:37.032491 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="950bfa55031ccf3b1124b837e6461f7251be7748323abc050c86691a891901f7" Mar 13 14:19:37 crc kubenswrapper[4898]: I0313 14:19:37.040682 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-baf6-account-create-update-xhptm" event={"ID":"81f4ee1a-c4d2-415d-9021-6503f03f8441","Type":"ContainerDied","Data":"fc5174c2bf5424ca7939a87e54c12b86416a2de52f15aa0500110576232c34fd"} Mar 13 14:19:37 crc kubenswrapper[4898]: I0313 14:19:37.040731 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc5174c2bf5424ca7939a87e54c12b86416a2de52f15aa0500110576232c34fd" Mar 13 14:19:37 crc kubenswrapper[4898]: I0313 14:19:37.040800 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-baf6-account-create-update-xhptm" Mar 13 14:19:37 crc kubenswrapper[4898]: I0313 14:19:37.047669 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-621e-account-create-update-dksd9" event={"ID":"a8c46fcc-fd9b-4073-99e6-28aadcdd823e","Type":"ContainerStarted","Data":"1b7f5b79b9cbc006ae8ac33cf3d709c72ff92b310ff2867c772246e7be6d5aff"} Mar 13 14:19:37 crc kubenswrapper[4898]: I0313 14:19:37.047741 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-621e-account-create-update-dksd9" event={"ID":"a8c46fcc-fd9b-4073-99e6-28aadcdd823e","Type":"ContainerStarted","Data":"00edd84a4cb86280e20ee727db08d5b81851d9132863c79d667d12f0d3c65999"} Mar 13 14:19:37 crc kubenswrapper[4898]: I0313 14:19:37.052082 4898 generic.go:334] "Generic (PLEG): container finished" podID="59bdafe7-9c43-4acc-a212-864bdf38d5b4" containerID="9199e9b5bfad44aa55bebdeb17820a815d88bcb2d174de10793bf2e6e2845fc2" exitCode=0 Mar 13 14:19:37 crc kubenswrapper[4898]: I0313 14:19:37.052234 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-cdnq7" event={"ID":"59bdafe7-9c43-4acc-a212-864bdf38d5b4","Type":"ContainerDied","Data":"9199e9b5bfad44aa55bebdeb17820a815d88bcb2d174de10793bf2e6e2845fc2"} Mar 13 14:19:37 crc kubenswrapper[4898]: I0313 14:19:37.052325 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-cdnq7" event={"ID":"59bdafe7-9c43-4acc-a212-864bdf38d5b4","Type":"ContainerStarted","Data":"341d71b9d52042a949c7679712659dfd1f85d596c533f795b07d24d74bb3c431"} Mar 13 14:19:37 crc kubenswrapper[4898]: I0313 14:19:37.063658 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-621e-account-create-update-dksd9" podStartSLOduration=2.063631705 podStartE2EDuration="2.063631705s" podCreationTimestamp="2026-03-13 14:19:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:19:37.06306678 +0000 UTC m=+1412.064655029" watchObservedRunningTime="2026-03-13 14:19:37.063631705 +0000 UTC m=+1412.065219954" Mar 13 14:19:37 crc kubenswrapper[4898]: I0313 14:19:37.064476 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9b8hp\" (UniqueName: \"kubernetes.io/projected/81f4ee1a-c4d2-415d-9021-6503f03f8441-kube-api-access-9b8hp\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:37 crc kubenswrapper[4898]: I0313 14:19:37.064509 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qgsl\" (UniqueName: \"kubernetes.io/projected/45215dff-dfeb-4b68-bc5c-d36aba0ea6b8-kube-api-access-8qgsl\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:37 crc kubenswrapper[4898]: I0313 14:19:37.166421 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/91512eed-d544-4f70-b8ba-eda9f6b1bfef-operator-scripts\") pod \"root-account-create-update-55n8q\" (UID: \"91512eed-d544-4f70-b8ba-eda9f6b1bfef\") " pod="openstack/root-account-create-update-55n8q" Mar 13 14:19:37 crc kubenswrapper[4898]: I0313 14:19:37.166623 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8dnp\" (UniqueName: \"kubernetes.io/projected/91512eed-d544-4f70-b8ba-eda9f6b1bfef-kube-api-access-z8dnp\") pod \"root-account-create-update-55n8q\" (UID: \"91512eed-d544-4f70-b8ba-eda9f6b1bfef\") " pod="openstack/root-account-create-update-55n8q" Mar 13 14:19:37 crc kubenswrapper[4898]: I0313 14:19:37.268455 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8dnp\" (UniqueName: \"kubernetes.io/projected/91512eed-d544-4f70-b8ba-eda9f6b1bfef-kube-api-access-z8dnp\") pod \"root-account-create-update-55n8q\" (UID: \"91512eed-d544-4f70-b8ba-eda9f6b1bfef\") " 
pod="openstack/root-account-create-update-55n8q" Mar 13 14:19:37 crc kubenswrapper[4898]: I0313 14:19:37.268617 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/91512eed-d544-4f70-b8ba-eda9f6b1bfef-operator-scripts\") pod \"root-account-create-update-55n8q\" (UID: \"91512eed-d544-4f70-b8ba-eda9f6b1bfef\") " pod="openstack/root-account-create-update-55n8q" Mar 13 14:19:37 crc kubenswrapper[4898]: I0313 14:19:37.269337 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/91512eed-d544-4f70-b8ba-eda9f6b1bfef-operator-scripts\") pod \"root-account-create-update-55n8q\" (UID: \"91512eed-d544-4f70-b8ba-eda9f6b1bfef\") " pod="openstack/root-account-create-update-55n8q" Mar 13 14:19:37 crc kubenswrapper[4898]: I0313 14:19:37.285599 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8dnp\" (UniqueName: \"kubernetes.io/projected/91512eed-d544-4f70-b8ba-eda9f6b1bfef-kube-api-access-z8dnp\") pod \"root-account-create-update-55n8q\" (UID: \"91512eed-d544-4f70-b8ba-eda9f6b1bfef\") " pod="openstack/root-account-create-update-55n8q" Mar 13 14:19:37 crc kubenswrapper[4898]: I0313 14:19:37.391959 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-55n8q" Mar 13 14:19:37 crc kubenswrapper[4898]: I0313 14:19:37.474356 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/794bd82b-e289-4b31-b0cf-f1285452e783-etc-swift\") pod \"swift-storage-0\" (UID: \"794bd82b-e289-4b31-b0cf-f1285452e783\") " pod="openstack/swift-storage-0" Mar 13 14:19:37 crc kubenswrapper[4898]: E0313 14:19:37.474540 4898 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 13 14:19:37 crc kubenswrapper[4898]: E0313 14:19:37.474581 4898 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 13 14:19:37 crc kubenswrapper[4898]: E0313 14:19:37.474653 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/794bd82b-e289-4b31-b0cf-f1285452e783-etc-swift podName:794bd82b-e289-4b31-b0cf-f1285452e783 nodeName:}" failed. No retries permitted until 2026-03-13 14:19:41.474632515 +0000 UTC m=+1416.476220754 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/794bd82b-e289-4b31-b0cf-f1285452e783-etc-swift") pod "swift-storage-0" (UID: "794bd82b-e289-4b31-b0cf-f1285452e783") : configmap "swift-ring-files" not found Mar 13 14:19:37 crc kubenswrapper[4898]: I0313 14:19:37.757426 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28c184e1-ac85-4c3b-b138-3b728eb97ca3" path="/var/lib/kubelet/pods/28c184e1-ac85-4c3b-b138-3b728eb97ca3/volumes" Mar 13 14:19:38 crc kubenswrapper[4898]: I0313 14:19:38.063727 4898 generic.go:334] "Generic (PLEG): container finished" podID="a8c46fcc-fd9b-4073-99e6-28aadcdd823e" containerID="1b7f5b79b9cbc006ae8ac33cf3d709c72ff92b310ff2867c772246e7be6d5aff" exitCode=0 Mar 13 14:19:38 crc kubenswrapper[4898]: I0313 14:19:38.063768 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-621e-account-create-update-dksd9" event={"ID":"a8c46fcc-fd9b-4073-99e6-28aadcdd823e","Type":"ContainerDied","Data":"1b7f5b79b9cbc006ae8ac33cf3d709c72ff92b310ff2867c772246e7be6d5aff"} Mar 13 14:19:39 crc kubenswrapper[4898]: I0313 14:19:39.458890 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-cdnq7" Mar 13 14:19:39 crc kubenswrapper[4898]: I0313 14:19:39.516108 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68d6j\" (UniqueName: \"kubernetes.io/projected/59bdafe7-9c43-4acc-a212-864bdf38d5b4-kube-api-access-68d6j\") pod \"59bdafe7-9c43-4acc-a212-864bdf38d5b4\" (UID: \"59bdafe7-9c43-4acc-a212-864bdf38d5b4\") " Mar 13 14:19:39 crc kubenswrapper[4898]: I0313 14:19:39.516242 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59bdafe7-9c43-4acc-a212-864bdf38d5b4-operator-scripts\") pod \"59bdafe7-9c43-4acc-a212-864bdf38d5b4\" (UID: \"59bdafe7-9c43-4acc-a212-864bdf38d5b4\") " Mar 13 14:19:39 crc kubenswrapper[4898]: I0313 14:19:39.517320 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59bdafe7-9c43-4acc-a212-864bdf38d5b4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "59bdafe7-9c43-4acc-a212-864bdf38d5b4" (UID: "59bdafe7-9c43-4acc-a212-864bdf38d5b4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:19:39 crc kubenswrapper[4898]: I0313 14:19:39.519117 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-621e-account-create-update-dksd9" Mar 13 14:19:39 crc kubenswrapper[4898]: I0313 14:19:39.522792 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59bdafe7-9c43-4acc-a212-864bdf38d5b4-kube-api-access-68d6j" (OuterVolumeSpecName: "kube-api-access-68d6j") pod "59bdafe7-9c43-4acc-a212-864bdf38d5b4" (UID: "59bdafe7-9c43-4acc-a212-864bdf38d5b4"). InnerVolumeSpecName "kube-api-access-68d6j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:19:39 crc kubenswrapper[4898]: I0313 14:19:39.534636 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7fd796d7df-gm2pz" podUID="cddd3df9-e505-4f25-988d-8cba87eaefbe" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.149:5353: i/o timeout" Mar 13 14:19:39 crc kubenswrapper[4898]: I0313 14:19:39.621282 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwcdp\" (UniqueName: \"kubernetes.io/projected/a8c46fcc-fd9b-4073-99e6-28aadcdd823e-kube-api-access-rwcdp\") pod \"a8c46fcc-fd9b-4073-99e6-28aadcdd823e\" (UID: \"a8c46fcc-fd9b-4073-99e6-28aadcdd823e\") " Mar 13 14:19:39 crc kubenswrapper[4898]: I0313 14:19:39.621936 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8c46fcc-fd9b-4073-99e6-28aadcdd823e-operator-scripts\") pod \"a8c46fcc-fd9b-4073-99e6-28aadcdd823e\" (UID: \"a8c46fcc-fd9b-4073-99e6-28aadcdd823e\") " Mar 13 14:19:39 crc kubenswrapper[4898]: I0313 14:19:39.622418 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59bdafe7-9c43-4acc-a212-864bdf38d5b4-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:39 crc kubenswrapper[4898]: I0313 14:19:39.622438 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68d6j\" (UniqueName: \"kubernetes.io/projected/59bdafe7-9c43-4acc-a212-864bdf38d5b4-kube-api-access-68d6j\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:39 crc kubenswrapper[4898]: I0313 14:19:39.623227 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8c46fcc-fd9b-4073-99e6-28aadcdd823e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a8c46fcc-fd9b-4073-99e6-28aadcdd823e" (UID: "a8c46fcc-fd9b-4073-99e6-28aadcdd823e"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:19:39 crc kubenswrapper[4898]: I0313 14:19:39.628070 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8c46fcc-fd9b-4073-99e6-28aadcdd823e-kube-api-access-rwcdp" (OuterVolumeSpecName: "kube-api-access-rwcdp") pod "a8c46fcc-fd9b-4073-99e6-28aadcdd823e" (UID: "a8c46fcc-fd9b-4073-99e6-28aadcdd823e"). InnerVolumeSpecName "kube-api-access-rwcdp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:19:39 crc kubenswrapper[4898]: I0313 14:19:39.724928 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8c46fcc-fd9b-4073-99e6-28aadcdd823e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:39 crc kubenswrapper[4898]: I0313 14:19:39.725153 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rwcdp\" (UniqueName: \"kubernetes.io/projected/a8c46fcc-fd9b-4073-99e6-28aadcdd823e-kube-api-access-rwcdp\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:39 crc kubenswrapper[4898]: I0313 14:19:39.846268 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-55n8q"] Mar 13 14:19:40 crc kubenswrapper[4898]: I0313 14:19:40.085247 4898 generic.go:334] "Generic (PLEG): container finished" podID="1e6f6f0d-db24-4fdb-a872-ce2c527a791b" containerID="285742e6a5783e0185985fab65a301e652357df458cfc070967a14f0e7b5987b" exitCode=0 Mar 13 14:19:40 crc kubenswrapper[4898]: I0313 14:19:40.085325 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1e6f6f0d-db24-4fdb-a872-ce2c527a791b","Type":"ContainerDied","Data":"285742e6a5783e0185985fab65a301e652357df458cfc070967a14f0e7b5987b"} Mar 13 14:19:40 crc kubenswrapper[4898]: I0313 14:19:40.090097 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-db-create-cdnq7" event={"ID":"59bdafe7-9c43-4acc-a212-864bdf38d5b4","Type":"ContainerDied","Data":"341d71b9d52042a949c7679712659dfd1f85d596c533f795b07d24d74bb3c431"} Mar 13 14:19:40 crc kubenswrapper[4898]: I0313 14:19:40.090132 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="341d71b9d52042a949c7679712659dfd1f85d596c533f795b07d24d74bb3c431" Mar 13 14:19:40 crc kubenswrapper[4898]: I0313 14:19:40.090192 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-cdnq7" Mar 13 14:19:40 crc kubenswrapper[4898]: I0313 14:19:40.095469 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-ztbp9" event={"ID":"4a6f0bfb-5db5-440c-a93f-0d6fe159401d","Type":"ContainerStarted","Data":"9e25ca1915d093420431c75152fe45db09d916d8afabcd6133622b3bfdcf8934"} Mar 13 14:19:40 crc kubenswrapper[4898]: I0313 14:19:40.099908 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-621e-account-create-update-dksd9" event={"ID":"a8c46fcc-fd9b-4073-99e6-28aadcdd823e","Type":"ContainerDied","Data":"00edd84a4cb86280e20ee727db08d5b81851d9132863c79d667d12f0d3c65999"} Mar 13 14:19:40 crc kubenswrapper[4898]: I0313 14:19:40.099939 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00edd84a4cb86280e20ee727db08d5b81851d9132863c79d667d12f0d3c65999" Mar 13 14:19:40 crc kubenswrapper[4898]: I0313 14:19:40.099982 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-621e-account-create-update-dksd9"
Mar 13 14:19:40 crc kubenswrapper[4898]: I0313 14:19:40.103118 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-55n8q" event={"ID":"91512eed-d544-4f70-b8ba-eda9f6b1bfef","Type":"ContainerStarted","Data":"33dd2d6e0ac7d2f137fe32246deb8758bfab8a7e6e24808a6205586e1001969c"}
Mar 13 14:19:40 crc kubenswrapper[4898]: I0313 14:19:40.103323 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-55n8q" event={"ID":"91512eed-d544-4f70-b8ba-eda9f6b1bfef","Type":"ContainerStarted","Data":"fd546e50d07fdb189d91bb5d0791e846219e0ffc737e0423be745421015de34e"}
Mar 13 14:19:40 crc kubenswrapper[4898]: I0313 14:19:40.136848 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-ztbp9" podStartSLOduration=1.952848254 podStartE2EDuration="6.136830408s" podCreationTimestamp="2026-03-13 14:19:34 +0000 UTC" firstStartedPulling="2026-03-13 14:19:35.16066316 +0000 UTC m=+1410.162251399" lastFinishedPulling="2026-03-13 14:19:39.344645294 +0000 UTC m=+1414.346233553" observedRunningTime="2026-03-13 14:19:40.127920396 +0000 UTC m=+1415.129508635" watchObservedRunningTime="2026-03-13 14:19:40.136830408 +0000 UTC m=+1415.138418637"
Mar 13 14:19:40 crc kubenswrapper[4898]: I0313 14:19:40.146398 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-55n8q" podStartSLOduration=4.146382046 podStartE2EDuration="4.146382046s" podCreationTimestamp="2026-03-13 14:19:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:19:40.140218766 +0000 UTC m=+1415.141807005" watchObservedRunningTime="2026-03-13 14:19:40.146382046 +0000 UTC m=+1415.147970285"
Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.114813 4898 generic.go:334] "Generic (PLEG): container finished" podID="91512eed-d544-4f70-b8ba-eda9f6b1bfef" containerID="33dd2d6e0ac7d2f137fe32246deb8758bfab8a7e6e24808a6205586e1001969c" exitCode=0
Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.115981 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-55n8q" event={"ID":"91512eed-d544-4f70-b8ba-eda9f6b1bfef","Type":"ContainerDied","Data":"33dd2d6e0ac7d2f137fe32246deb8758bfab8a7e6e24808a6205586e1001969c"}
Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.172795 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-ppqg7"]
Mar 13 14:19:41 crc kubenswrapper[4898]: E0313 14:19:41.173628 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59bdafe7-9c43-4acc-a212-864bdf38d5b4" containerName="mariadb-database-create"
Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.173655 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="59bdafe7-9c43-4acc-a212-864bdf38d5b4" containerName="mariadb-database-create"
Mar 13 14:19:41 crc kubenswrapper[4898]: E0313 14:19:41.173733 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8c46fcc-fd9b-4073-99e6-28aadcdd823e" containerName="mariadb-account-create-update"
Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.173745 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8c46fcc-fd9b-4073-99e6-28aadcdd823e" containerName="mariadb-account-create-update"
Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.174227 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="59bdafe7-9c43-4acc-a212-864bdf38d5b4" containerName="mariadb-database-create"
Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.174262 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8c46fcc-fd9b-4073-99e6-28aadcdd823e" containerName="mariadb-account-create-update"
Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.175352 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-ppqg7"
Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.178004 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-ppqg7"]
Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.253973 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-d2b7-account-create-update-ggzw8"]
Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.255335 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-d2b7-account-create-update-ggzw8"
Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.255332 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc61df36-ac68-4cf0-9456-140bccb5435c-operator-scripts\") pod \"keystone-db-create-ppqg7\" (UID: \"bc61df36-ac68-4cf0-9456-140bccb5435c\") " pod="openstack/keystone-db-create-ppqg7"
Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.255491 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vlfz\" (UniqueName: \"kubernetes.io/projected/bc61df36-ac68-4cf0-9456-140bccb5435c-kube-api-access-5vlfz\") pod \"keystone-db-create-ppqg7\" (UID: \"bc61df36-ac68-4cf0-9456-140bccb5435c\") " pod="openstack/keystone-db-create-ppqg7"
Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.258201 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret"
Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.262650 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-d2b7-account-create-update-ggzw8"]
Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.355750 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-zzflk"]
Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.357019 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vlfz\" (UniqueName: \"kubernetes.io/projected/bc61df36-ac68-4cf0-9456-140bccb5435c-kube-api-access-5vlfz\") pod \"keystone-db-create-ppqg7\" (UID: \"bc61df36-ac68-4cf0-9456-140bccb5435c\") " pod="openstack/keystone-db-create-ppqg7"
Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.357074 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrszf\" (UniqueName: \"kubernetes.io/projected/32a060a9-dd52-4192-bc48-b9ea7a918458-kube-api-access-wrszf\") pod \"keystone-d2b7-account-create-update-ggzw8\" (UID: \"32a060a9-dd52-4192-bc48-b9ea7a918458\") " pod="openstack/keystone-d2b7-account-create-update-ggzw8"
Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.357223 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32a060a9-dd52-4192-bc48-b9ea7a918458-operator-scripts\") pod \"keystone-d2b7-account-create-update-ggzw8\" (UID: \"32a060a9-dd52-4192-bc48-b9ea7a918458\") " pod="openstack/keystone-d2b7-account-create-update-ggzw8"
Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.357258 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc61df36-ac68-4cf0-9456-140bccb5435c-operator-scripts\") pod \"keystone-db-create-ppqg7\" (UID: \"bc61df36-ac68-4cf0-9456-140bccb5435c\") " pod="openstack/keystone-db-create-ppqg7"
Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.357337 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-zzflk"
Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.357890 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc61df36-ac68-4cf0-9456-140bccb5435c-operator-scripts\") pod \"keystone-db-create-ppqg7\" (UID: \"bc61df36-ac68-4cf0-9456-140bccb5435c\") " pod="openstack/keystone-db-create-ppqg7"
Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.366129 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-zzflk"]
Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.396451 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vlfz\" (UniqueName: \"kubernetes.io/projected/bc61df36-ac68-4cf0-9456-140bccb5435c-kube-api-access-5vlfz\") pod \"keystone-db-create-ppqg7\" (UID: \"bc61df36-ac68-4cf0-9456-140bccb5435c\") " pod="openstack/keystone-db-create-ppqg7"
Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.456036 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-35b4-account-create-update-7rdfs"]
Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.457551 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-35b4-account-create-update-7rdfs"
Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.459306 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32a060a9-dd52-4192-bc48-b9ea7a918458-operator-scripts\") pod \"keystone-d2b7-account-create-update-ggzw8\" (UID: \"32a060a9-dd52-4192-bc48-b9ea7a918458\") " pod="openstack/keystone-d2b7-account-create-update-ggzw8"
Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.459402 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jv5f\" (UniqueName: \"kubernetes.io/projected/f58c984f-f43f-42dc-90a5-aebbe79a47a5-kube-api-access-9jv5f\") pod \"placement-db-create-zzflk\" (UID: \"f58c984f-f43f-42dc-90a5-aebbe79a47a5\") " pod="openstack/placement-db-create-zzflk"
Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.459444 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrszf\" (UniqueName: \"kubernetes.io/projected/32a060a9-dd52-4192-bc48-b9ea7a918458-kube-api-access-wrszf\") pod \"keystone-d2b7-account-create-update-ggzw8\" (UID: \"32a060a9-dd52-4192-bc48-b9ea7a918458\") " pod="openstack/keystone-d2b7-account-create-update-ggzw8"
Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.459545 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f58c984f-f43f-42dc-90a5-aebbe79a47a5-operator-scripts\") pod \"placement-db-create-zzflk\" (UID: \"f58c984f-f43f-42dc-90a5-aebbe79a47a5\") " pod="openstack/placement-db-create-zzflk"
Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.460383 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32a060a9-dd52-4192-bc48-b9ea7a918458-operator-scripts\") pod \"keystone-d2b7-account-create-update-ggzw8\" (UID: \"32a060a9-dd52-4192-bc48-b9ea7a918458\") " pod="openstack/keystone-d2b7-account-create-update-ggzw8"
Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.461851 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret"
Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.482031 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-35b4-account-create-update-7rdfs"]
Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.483135 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrszf\" (UniqueName: \"kubernetes.io/projected/32a060a9-dd52-4192-bc48-b9ea7a918458-kube-api-access-wrszf\") pod \"keystone-d2b7-account-create-update-ggzw8\" (UID: \"32a060a9-dd52-4192-bc48-b9ea7a918458\") " pod="openstack/keystone-d2b7-account-create-update-ggzw8"
Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.499539 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-ppqg7"
Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.561280 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0bf799d3-e4d4-439d-b3da-d5467064f6f1-operator-scripts\") pod \"placement-35b4-account-create-update-7rdfs\" (UID: \"0bf799d3-e4d4-439d-b3da-d5467064f6f1\") " pod="openstack/placement-35b4-account-create-update-7rdfs"
Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.561395 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/794bd82b-e289-4b31-b0cf-f1285452e783-etc-swift\") pod \"swift-storage-0\" (UID: \"794bd82b-e289-4b31-b0cf-f1285452e783\") " pod="openstack/swift-storage-0"
Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.561472 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f58c984f-f43f-42dc-90a5-aebbe79a47a5-operator-scripts\") pod \"placement-db-create-zzflk\" (UID: \"f58c984f-f43f-42dc-90a5-aebbe79a47a5\") " pod="openstack/placement-db-create-zzflk"
Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.561700 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jv5f\" (UniqueName: \"kubernetes.io/projected/f58c984f-f43f-42dc-90a5-aebbe79a47a5-kube-api-access-9jv5f\") pod \"placement-db-create-zzflk\" (UID: \"f58c984f-f43f-42dc-90a5-aebbe79a47a5\") " pod="openstack/placement-db-create-zzflk"
Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.561783 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksxl8\" (UniqueName: \"kubernetes.io/projected/0bf799d3-e4d4-439d-b3da-d5467064f6f1-kube-api-access-ksxl8\") pod \"placement-35b4-account-create-update-7rdfs\" (UID: \"0bf799d3-e4d4-439d-b3da-d5467064f6f1\") " pod="openstack/placement-35b4-account-create-update-7rdfs"
Mar 13 14:19:41 crc kubenswrapper[4898]: E0313 14:19:41.562013 4898 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 13 14:19:41 crc kubenswrapper[4898]: E0313 14:19:41.562036 4898 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 13 14:19:41 crc kubenswrapper[4898]: E0313 14:19:41.562082 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/794bd82b-e289-4b31-b0cf-f1285452e783-etc-swift podName:794bd82b-e289-4b31-b0cf-f1285452e783 nodeName:}" failed. No retries permitted until 2026-03-13 14:19:49.562065218 +0000 UTC m=+1424.563653457 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/794bd82b-e289-4b31-b0cf-f1285452e783-etc-swift") pod "swift-storage-0" (UID: "794bd82b-e289-4b31-b0cf-f1285452e783") : configmap "swift-ring-files" not found
Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.563083 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f58c984f-f43f-42dc-90a5-aebbe79a47a5-operator-scripts\") pod \"placement-db-create-zzflk\" (UID: \"f58c984f-f43f-42dc-90a5-aebbe79a47a5\") " pod="openstack/placement-db-create-zzflk"
Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.577587 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-d2b7-account-create-update-ggzw8"
Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.586418 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jv5f\" (UniqueName: \"kubernetes.io/projected/f58c984f-f43f-42dc-90a5-aebbe79a47a5-kube-api-access-9jv5f\") pod \"placement-db-create-zzflk\" (UID: \"f58c984f-f43f-42dc-90a5-aebbe79a47a5\") " pod="openstack/placement-db-create-zzflk"
Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.664063 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksxl8\" (UniqueName: \"kubernetes.io/projected/0bf799d3-e4d4-439d-b3da-d5467064f6f1-kube-api-access-ksxl8\") pod \"placement-35b4-account-create-update-7rdfs\" (UID: \"0bf799d3-e4d4-439d-b3da-d5467064f6f1\") " pod="openstack/placement-35b4-account-create-update-7rdfs"
Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.664132 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0bf799d3-e4d4-439d-b3da-d5467064f6f1-operator-scripts\") pod \"placement-35b4-account-create-update-7rdfs\" (UID: \"0bf799d3-e4d4-439d-b3da-d5467064f6f1\") " pod="openstack/placement-35b4-account-create-update-7rdfs"
Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.665808 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0bf799d3-e4d4-439d-b3da-d5467064f6f1-operator-scripts\") pod \"placement-35b4-account-create-update-7rdfs\" (UID: \"0bf799d3-e4d4-439d-b3da-d5467064f6f1\") " pod="openstack/placement-35b4-account-create-update-7rdfs"
Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.683617 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-zzflk"
Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.689924 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksxl8\" (UniqueName: \"kubernetes.io/projected/0bf799d3-e4d4-439d-b3da-d5467064f6f1-kube-api-access-ksxl8\") pod \"placement-35b4-account-create-update-7rdfs\" (UID: \"0bf799d3-e4d4-439d-b3da-d5467064f6f1\") " pod="openstack/placement-35b4-account-create-update-7rdfs"
Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.700091 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-35b4-account-create-update-7rdfs"
Mar 13 14:19:42 crc kubenswrapper[4898]: I0313 14:19:42.036754 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-ppqg7"]
Mar 13 14:19:42 crc kubenswrapper[4898]: I0313 14:19:42.132535 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-ppqg7" event={"ID":"bc61df36-ac68-4cf0-9456-140bccb5435c","Type":"ContainerStarted","Data":"1ce06125124c6bb74ef67a1bb1a6e818c24f60300a2a80ac7fc268e205f2e9db"}
Mar 13 14:19:42 crc kubenswrapper[4898]: I0313 14:19:42.171447 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-d2b7-account-create-update-ggzw8"]
Mar 13 14:19:42 crc kubenswrapper[4898]: I0313 14:19:42.295736 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-zzflk"]
Mar 13 14:19:42 crc kubenswrapper[4898]: I0313 14:19:42.408211 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-35b4-account-create-update-7rdfs"]
Mar 13 14:19:42 crc kubenswrapper[4898]: W0313 14:19:42.424935 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0bf799d3_e4d4_439d_b3da_d5467064f6f1.slice/crio-25447370a3f4e28f8e8ccf8af1d7f9221a24319685b5d3199e4478e305eb1ec0 WatchSource:0}: Error finding container 25447370a3f4e28f8e8ccf8af1d7f9221a24319685b5d3199e4478e305eb1ec0: Status 404 returned error can't find the container with id 25447370a3f4e28f8e8ccf8af1d7f9221a24319685b5d3199e4478e305eb1ec0
Mar 13 14:19:42 crc kubenswrapper[4898]: I0313 14:19:42.736711 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-n82b8"]
Mar 13 14:19:42 crc kubenswrapper[4898]: I0313 14:19:42.738873 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-n82b8"
Mar 13 14:19:42 crc kubenswrapper[4898]: I0313 14:19:42.750708 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-55n8q"
Mar 13 14:19:42 crc kubenswrapper[4898]: I0313 14:19:42.768613 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-n82b8"]
Mar 13 14:19:42 crc kubenswrapper[4898]: I0313 14:19:42.790336 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/91512eed-d544-4f70-b8ba-eda9f6b1bfef-operator-scripts\") pod \"91512eed-d544-4f70-b8ba-eda9f6b1bfef\" (UID: \"91512eed-d544-4f70-b8ba-eda9f6b1bfef\") "
Mar 13 14:19:42 crc kubenswrapper[4898]: I0313 14:19:42.790496 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8dnp\" (UniqueName: \"kubernetes.io/projected/91512eed-d544-4f70-b8ba-eda9f6b1bfef-kube-api-access-z8dnp\") pod \"91512eed-d544-4f70-b8ba-eda9f6b1bfef\" (UID: \"91512eed-d544-4f70-b8ba-eda9f6b1bfef\") "
Mar 13 14:19:42 crc kubenswrapper[4898]: I0313 14:19:42.790911 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrtkm\" (UniqueName: \"kubernetes.io/projected/ba5ed93a-91b4-4942-a32c-ab02a536e3d4-kube-api-access-zrtkm\") pod \"mysqld-exporter-openstack-cell1-db-create-n82b8\" (UID: \"ba5ed93a-91b4-4942-a32c-ab02a536e3d4\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-n82b8"
Mar 13 14:19:42 crc kubenswrapper[4898]: I0313 14:19:42.790961 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba5ed93a-91b4-4942-a32c-ab02a536e3d4-operator-scripts\") pod \"mysqld-exporter-openstack-cell1-db-create-n82b8\" (UID: \"ba5ed93a-91b4-4942-a32c-ab02a536e3d4\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-n82b8"
Mar 13 14:19:42 crc kubenswrapper[4898]: I0313 14:19:42.791102 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91512eed-d544-4f70-b8ba-eda9f6b1bfef-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "91512eed-d544-4f70-b8ba-eda9f6b1bfef" (UID: "91512eed-d544-4f70-b8ba-eda9f6b1bfef"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 14:19:42 crc kubenswrapper[4898]: I0313 14:19:42.802146 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91512eed-d544-4f70-b8ba-eda9f6b1bfef-kube-api-access-z8dnp" (OuterVolumeSpecName: "kube-api-access-z8dnp") pod "91512eed-d544-4f70-b8ba-eda9f6b1bfef" (UID: "91512eed-d544-4f70-b8ba-eda9f6b1bfef"). InnerVolumeSpecName "kube-api-access-z8dnp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:19:42 crc kubenswrapper[4898]: I0313 14:19:42.895268 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrtkm\" (UniqueName: \"kubernetes.io/projected/ba5ed93a-91b4-4942-a32c-ab02a536e3d4-kube-api-access-zrtkm\") pod \"mysqld-exporter-openstack-cell1-db-create-n82b8\" (UID: \"ba5ed93a-91b4-4942-a32c-ab02a536e3d4\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-n82b8"
Mar 13 14:19:42 crc kubenswrapper[4898]: I0313 14:19:42.895336 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba5ed93a-91b4-4942-a32c-ab02a536e3d4-operator-scripts\") pod \"mysqld-exporter-openstack-cell1-db-create-n82b8\" (UID: \"ba5ed93a-91b4-4942-a32c-ab02a536e3d4\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-n82b8"
Mar 13 14:19:42 crc kubenswrapper[4898]: I0313 14:19:42.895457 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/91512eed-d544-4f70-b8ba-eda9f6b1bfef-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 13 14:19:42 crc kubenswrapper[4898]: I0313 14:19:42.895470 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8dnp\" (UniqueName: \"kubernetes.io/projected/91512eed-d544-4f70-b8ba-eda9f6b1bfef-kube-api-access-z8dnp\") on node \"crc\" DevicePath \"\""
Mar 13 14:19:42 crc kubenswrapper[4898]: I0313 14:19:42.896219 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba5ed93a-91b4-4942-a32c-ab02a536e3d4-operator-scripts\") pod \"mysqld-exporter-openstack-cell1-db-create-n82b8\" (UID: \"ba5ed93a-91b4-4942-a32c-ab02a536e3d4\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-n82b8"
Mar 13 14:19:42 crc kubenswrapper[4898]: I0313 14:19:42.915793 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrtkm\" (UniqueName: \"kubernetes.io/projected/ba5ed93a-91b4-4942-a32c-ab02a536e3d4-kube-api-access-zrtkm\") pod \"mysqld-exporter-openstack-cell1-db-create-n82b8\" (UID: \"ba5ed93a-91b4-4942-a32c-ab02a536e3d4\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-n82b8"
Mar 13 14:19:42 crc kubenswrapper[4898]: I0313 14:19:42.932348 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-ce03-account-create-update-425qr"]
Mar 13 14:19:42 crc kubenswrapper[4898]: E0313 14:19:42.937495 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91512eed-d544-4f70-b8ba-eda9f6b1bfef" containerName="mariadb-account-create-update"
Mar 13 14:19:42 crc kubenswrapper[4898]: I0313 14:19:42.937528 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="91512eed-d544-4f70-b8ba-eda9f6b1bfef" containerName="mariadb-account-create-update"
Mar 13 14:19:42 crc kubenswrapper[4898]: I0313 14:19:42.937886 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="91512eed-d544-4f70-b8ba-eda9f6b1bfef" containerName="mariadb-account-create-update"
Mar 13 14:19:42 crc kubenswrapper[4898]: I0313 14:19:42.941210 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-ce03-account-create-update-425qr"
Mar 13 14:19:42 crc kubenswrapper[4898]: I0313 14:19:42.945324 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-openstack-cell1-db-secret"
Mar 13 14:19:42 crc kubenswrapper[4898]: I0313 14:19:42.961357 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-2sp5q"
Mar 13 14:19:42 crc kubenswrapper[4898]: I0313 14:19:42.967844 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-ce03-account-create-update-425qr"]
Mar 13 14:19:42 crc kubenswrapper[4898]: I0313 14:19:42.998315 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/586ccc66-1989-46e5-98ad-b70c7e88e6bc-operator-scripts\") pod \"mysqld-exporter-ce03-account-create-update-425qr\" (UID: \"586ccc66-1989-46e5-98ad-b70c7e88e6bc\") " pod="openstack/mysqld-exporter-ce03-account-create-update-425qr"
Mar 13 14:19:42 crc kubenswrapper[4898]: I0313 14:19:42.998497 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2zbj\" (UniqueName: \"kubernetes.io/projected/586ccc66-1989-46e5-98ad-b70c7e88e6bc-kube-api-access-h2zbj\") pod \"mysqld-exporter-ce03-account-create-update-425qr\" (UID: \"586ccc66-1989-46e5-98ad-b70c7e88e6bc\") " pod="openstack/mysqld-exporter-ce03-account-create-update-425qr"
Mar 13 14:19:43 crc kubenswrapper[4898]: I0313 14:19:43.018759 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-n82b8"
Mar 13 14:19:43 crc kubenswrapper[4898]: I0313 14:19:43.028023 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-x9tcr"]
Mar 13 14:19:43 crc kubenswrapper[4898]: I0313 14:19:43.028388 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-x9tcr" podUID="00d79476-a8c0-4bad-81ae-6b50afea8601" containerName="dnsmasq-dns" containerID="cri-o://47f7b7782ee0db0b141a3f7ddac8cf1c7a3089fed6ebef010a3382327d07522d" gracePeriod=10
Mar 13 14:19:43 crc kubenswrapper[4898]: I0313 14:19:43.100961 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/586ccc66-1989-46e5-98ad-b70c7e88e6bc-operator-scripts\") pod \"mysqld-exporter-ce03-account-create-update-425qr\" (UID: \"586ccc66-1989-46e5-98ad-b70c7e88e6bc\") " pod="openstack/mysqld-exporter-ce03-account-create-update-425qr"
Mar 13 14:19:43 crc kubenswrapper[4898]: I0313 14:19:43.103207 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2zbj\" (UniqueName: \"kubernetes.io/projected/586ccc66-1989-46e5-98ad-b70c7e88e6bc-kube-api-access-h2zbj\") pod \"mysqld-exporter-ce03-account-create-update-425qr\" (UID: \"586ccc66-1989-46e5-98ad-b70c7e88e6bc\") " pod="openstack/mysqld-exporter-ce03-account-create-update-425qr"
Mar 13 14:19:43 crc kubenswrapper[4898]: I0313 14:19:43.104671 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/586ccc66-1989-46e5-98ad-b70c7e88e6bc-operator-scripts\") pod \"mysqld-exporter-ce03-account-create-update-425qr\" (UID: \"586ccc66-1989-46e5-98ad-b70c7e88e6bc\") " pod="openstack/mysqld-exporter-ce03-account-create-update-425qr"
Mar 13 14:19:43 crc kubenswrapper[4898]: I0313 14:19:43.126570 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2zbj\" (UniqueName: \"kubernetes.io/projected/586ccc66-1989-46e5-98ad-b70c7e88e6bc-kube-api-access-h2zbj\") pod \"mysqld-exporter-ce03-account-create-update-425qr\" (UID: \"586ccc66-1989-46e5-98ad-b70c7e88e6bc\") " pod="openstack/mysqld-exporter-ce03-account-create-update-425qr"
Mar 13 14:19:43 crc kubenswrapper[4898]: I0313 14:19:43.165343 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-55n8q" event={"ID":"91512eed-d544-4f70-b8ba-eda9f6b1bfef","Type":"ContainerDied","Data":"fd546e50d07fdb189d91bb5d0791e846219e0ffc737e0423be745421015de34e"}
Mar 13 14:19:43 crc kubenswrapper[4898]: I0313 14:19:43.166401 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd546e50d07fdb189d91bb5d0791e846219e0ffc737e0423be745421015de34e"
Mar 13 14:19:43 crc kubenswrapper[4898]: I0313 14:19:43.166039 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-55n8q"
Mar 13 14:19:43 crc kubenswrapper[4898]: I0313 14:19:43.188016 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-35b4-account-create-update-7rdfs" event={"ID":"0bf799d3-e4d4-439d-b3da-d5467064f6f1","Type":"ContainerStarted","Data":"df7deb06b863e0990f2d81a2c25739f29fd7b5122c0d32c2314963f2204b89f3"}
Mar 13 14:19:43 crc kubenswrapper[4898]: I0313 14:19:43.188074 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-35b4-account-create-update-7rdfs" event={"ID":"0bf799d3-e4d4-439d-b3da-d5467064f6f1","Type":"ContainerStarted","Data":"25447370a3f4e28f8e8ccf8af1d7f9221a24319685b5d3199e4478e305eb1ec0"}
Mar 13 14:19:43 crc kubenswrapper[4898]: I0313 14:19:43.225062 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-35b4-account-create-update-7rdfs" podStartSLOduration=2.225041311 podStartE2EDuration="2.225041311s" podCreationTimestamp="2026-03-13 14:19:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:19:43.224141048 +0000 UTC m=+1418.225729287" watchObservedRunningTime="2026-03-13 14:19:43.225041311 +0000 UTC m=+1418.226629550"
Mar 13 14:19:43 crc kubenswrapper[4898]: I0313 14:19:43.225270 4898 generic.go:334] "Generic (PLEG): container finished" podID="ee084354-4d32-4d3c-96a4-1e4e7eef5d85" containerID="319d11416db34d4c2bde21b35bf9b79fc6c55b22cfe14271a9be5dde11f3c078" exitCode=0
Mar 13 14:19:43 crc kubenswrapper[4898]: I0313 14:19:43.225365 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"ee084354-4d32-4d3c-96a4-1e4e7eef5d85","Type":"ContainerDied","Data":"319d11416db34d4c2bde21b35bf9b79fc6c55b22cfe14271a9be5dde11f3c078"}
Mar 13 14:19:43 crc kubenswrapper[4898]: I0313 14:19:43.236640 4898 generic.go:334] "Generic (PLEG): container finished" podID="bc61df36-ac68-4cf0-9456-140bccb5435c" containerID="e22a5e1114b923439d9f143c3b64b35dbca324bd3cf616ce49fd0caf5c66d873" exitCode=0
Mar 13 14:19:43 crc kubenswrapper[4898]: I0313 14:19:43.236735 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-ppqg7" event={"ID":"bc61df36-ac68-4cf0-9456-140bccb5435c","Type":"ContainerDied","Data":"e22a5e1114b923439d9f143c3b64b35dbca324bd3cf616ce49fd0caf5c66d873"}
Mar 13 14:19:43 crc kubenswrapper[4898]: I0313 14:19:43.244561 4898 generic.go:334] "Generic (PLEG): container finished" podID="32a060a9-dd52-4192-bc48-b9ea7a918458" containerID="592e2145b8382848d105acb5e5275b8dd688df9ab9d3d5caa34a389dd2742086" exitCode=0
Mar 13 14:19:43 crc kubenswrapper[4898]: I0313 14:19:43.244676 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-d2b7-account-create-update-ggzw8" event={"ID":"32a060a9-dd52-4192-bc48-b9ea7a918458","Type":"ContainerDied","Data":"592e2145b8382848d105acb5e5275b8dd688df9ab9d3d5caa34a389dd2742086"}
Mar 13 14:19:43 crc kubenswrapper[4898]: I0313 14:19:43.244880 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-d2b7-account-create-update-ggzw8" event={"ID":"32a060a9-dd52-4192-bc48-b9ea7a918458","Type":"ContainerStarted","Data":"2ae64cef16b6acabaca5f7ce48033aaffd92aad88a639e6b6b3fcc38fb562bda"}
Mar 13 14:19:43 crc kubenswrapper[4898]: I0313 14:19:43.252243 4898 generic.go:334] "Generic (PLEG): container finished" podID="818e3f41-30c4-4a49-b490-0d868fc2b2b8" containerID="6ac94c751f27a4d12d02923377c883f4669b7b2f835e8c6d8eb98e37f2b620ef" exitCode=0
Mar 13 14:19:43 crc kubenswrapper[4898]: I0313 14:19:43.252475 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"818e3f41-30c4-4a49-b490-0d868fc2b2b8","Type":"ContainerDied","Data":"6ac94c751f27a4d12d02923377c883f4669b7b2f835e8c6d8eb98e37f2b620ef"}
Mar 13 14:19:43 crc kubenswrapper[4898]: I0313 14:19:43.288289 4898 generic.go:334] "Generic (PLEG): container finished" podID="d56bd826-4f42-409d-ae41-9bfc70d1e038" containerID="cb002d235371a7e7beebe07dd448307d31c6dae66e8fbd1dd6c0c499e634cca9" exitCode=0
Mar 13 14:19:43 crc kubenswrapper[4898]: I0313 14:19:43.288364 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d56bd826-4f42-409d-ae41-9bfc70d1e038","Type":"ContainerDied","Data":"cb002d235371a7e7beebe07dd448307d31c6dae66e8fbd1dd6c0c499e634cca9"}
Mar 13 14:19:43 crc kubenswrapper[4898]: I0313 14:19:43.303179 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-zzflk" event={"ID":"f58c984f-f43f-42dc-90a5-aebbe79a47a5","Type":"ContainerStarted","Data":"f7ee82c0bf1917642e2854f8aafb4c28fc338ae0b377633b5ccd89fd1a0294f4"}
Mar 13 14:19:43 crc kubenswrapper[4898]: I0313 14:19:43.303233 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-zzflk" event={"ID":"f58c984f-f43f-42dc-90a5-aebbe79a47a5","Type":"ContainerStarted","Data":"639177af0c66d1dccc94daa2cc6a580d4f6472dd79a9bbc6e239f21d1c8817da"}
Mar 13 14:19:43 crc kubenswrapper[4898]: I0313 14:19:43.340424 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-ce03-account-create-update-425qr"
Mar 13 14:19:43 crc kubenswrapper[4898]: I0313 14:19:43.538566 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-zzflk" podStartSLOduration=2.538546536 podStartE2EDuration="2.538546536s" podCreationTimestamp="2026-03-13 14:19:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:19:43.443617567 +0000 UTC m=+1418.445205806" watchObservedRunningTime="2026-03-13 14:19:43.538546536 +0000 UTC m=+1418.540134775"
Mar 13 14:19:43 crc kubenswrapper[4898]: I0313 14:19:43.609456 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-55n8q"]
Mar 13 14:19:43 crc kubenswrapper[4898]: I0313 14:19:43.660405 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-55n8q"]
Mar 13 14:19:43 crc kubenswrapper[4898]: I0313 14:19:43.778248 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91512eed-d544-4f70-b8ba-eda9f6b1bfef" path="/var/lib/kubelet/pods/91512eed-d544-4f70-b8ba-eda9f6b1bfef/volumes"
Mar 13 14:19:43 crc kubenswrapper[4898]: I0313 14:19:43.830853 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-n82b8"]
Mar 13 14:19:43 crc kubenswrapper[4898]: W0313 14:19:43.896454 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba5ed93a_91b4_4942_a32c_ab02a536e3d4.slice/crio-5ca508ecb94469968a2e06dd2dc51d5ad3bb989409259ece9ea2999f68912648 WatchSource:0}: Error finding container 5ca508ecb94469968a2e06dd2dc51d5ad3bb989409259ece9ea2999f68912648: Status 404 returned error can't find the container with id 5ca508ecb94469968a2e06dd2dc51d5ad3bb989409259ece9ea2999f68912648
Mar 13 14:19:43 crc
kubenswrapper[4898]: I0313 14:19:43.982045 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-x9tcr" Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.152863 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/00d79476-a8c0-4bad-81ae-6b50afea8601-ovsdbserver-nb\") pod \"00d79476-a8c0-4bad-81ae-6b50afea8601\" (UID: \"00d79476-a8c0-4bad-81ae-6b50afea8601\") " Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.152923 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00d79476-a8c0-4bad-81ae-6b50afea8601-dns-svc\") pod \"00d79476-a8c0-4bad-81ae-6b50afea8601\" (UID: \"00d79476-a8c0-4bad-81ae-6b50afea8601\") " Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.153025 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00d79476-a8c0-4bad-81ae-6b50afea8601-config\") pod \"00d79476-a8c0-4bad-81ae-6b50afea8601\" (UID: \"00d79476-a8c0-4bad-81ae-6b50afea8601\") " Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.153210 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00d79476-a8c0-4bad-81ae-6b50afea8601-ovsdbserver-sb\") pod \"00d79476-a8c0-4bad-81ae-6b50afea8601\" (UID: \"00d79476-a8c0-4bad-81ae-6b50afea8601\") " Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.153243 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9dz9f\" (UniqueName: \"kubernetes.io/projected/00d79476-a8c0-4bad-81ae-6b50afea8601-kube-api-access-9dz9f\") pod \"00d79476-a8c0-4bad-81ae-6b50afea8601\" (UID: \"00d79476-a8c0-4bad-81ae-6b50afea8601\") " Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.159472 4898 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00d79476-a8c0-4bad-81ae-6b50afea8601-kube-api-access-9dz9f" (OuterVolumeSpecName: "kube-api-access-9dz9f") pod "00d79476-a8c0-4bad-81ae-6b50afea8601" (UID: "00d79476-a8c0-4bad-81ae-6b50afea8601"). InnerVolumeSpecName "kube-api-access-9dz9f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.228790 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00d79476-a8c0-4bad-81ae-6b50afea8601-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "00d79476-a8c0-4bad-81ae-6b50afea8601" (UID: "00d79476-a8c0-4bad-81ae-6b50afea8601"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.234319 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00d79476-a8c0-4bad-81ae-6b50afea8601-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "00d79476-a8c0-4bad-81ae-6b50afea8601" (UID: "00d79476-a8c0-4bad-81ae-6b50afea8601"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.242747 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00d79476-a8c0-4bad-81ae-6b50afea8601-config" (OuterVolumeSpecName: "config") pod "00d79476-a8c0-4bad-81ae-6b50afea8601" (UID: "00d79476-a8c0-4bad-81ae-6b50afea8601"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.243863 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00d79476-a8c0-4bad-81ae-6b50afea8601-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "00d79476-a8c0-4bad-81ae-6b50afea8601" (UID: "00d79476-a8c0-4bad-81ae-6b50afea8601"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.255123 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00d79476-a8c0-4bad-81ae-6b50afea8601-config\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.255157 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00d79476-a8c0-4bad-81ae-6b50afea8601-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.255169 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9dz9f\" (UniqueName: \"kubernetes.io/projected/00d79476-a8c0-4bad-81ae-6b50afea8601-kube-api-access-9dz9f\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.255178 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/00d79476-a8c0-4bad-81ae-6b50afea8601-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.255187 4898 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00d79476-a8c0-4bad-81ae-6b50afea8601-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.262014 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-ce03-account-create-update-425qr"] Mar 13 
14:19:44 crc kubenswrapper[4898]: W0313 14:19:44.268198 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod586ccc66_1989_46e5_98ad_b70c7e88e6bc.slice/crio-b38fcfa140b37179564ee614932bc1713c8394aa057a914d2d0914aa1268a307 WatchSource:0}: Error finding container b38fcfa140b37179564ee614932bc1713c8394aa057a914d2d0914aa1268a307: Status 404 returned error can't find the container with id b38fcfa140b37179564ee614932bc1713c8394aa057a914d2d0914aa1268a307 Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.316594 4898 generic.go:334] "Generic (PLEG): container finished" podID="0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b" containerID="8e18090ad1757c0b15ba6a519121358ec8fea5c9816c6426d3dd165832b431af" exitCode=0 Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.316666 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b","Type":"ContainerDied","Data":"8e18090ad1757c0b15ba6a519121358ec8fea5c9816c6426d3dd165832b431af"} Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.332763 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-ce03-account-create-update-425qr" event={"ID":"586ccc66-1989-46e5-98ad-b70c7e88e6bc","Type":"ContainerStarted","Data":"b38fcfa140b37179564ee614932bc1713c8394aa057a914d2d0914aa1268a307"} Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.336017 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-n82b8" event={"ID":"ba5ed93a-91b4-4942-a32c-ab02a536e3d4","Type":"ContainerStarted","Data":"b9cd470c8d031dbf9ec18b998e1bb21c765853e9617c7315b08f8356dad9258f"} Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.336065 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-n82b8" 
event={"ID":"ba5ed93a-91b4-4942-a32c-ab02a536e3d4","Type":"ContainerStarted","Data":"5ca508ecb94469968a2e06dd2dc51d5ad3bb989409259ece9ea2999f68912648"} Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.349419 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"ee084354-4d32-4d3c-96a4-1e4e7eef5d85","Type":"ContainerStarted","Data":"fdd228971531e06c4cfdc0dd4d0052c10c0646d03035ab33629bba605b7a9d8b"} Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.350278 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-2" Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.352955 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d56bd826-4f42-409d-ae41-9bfc70d1e038","Type":"ContainerStarted","Data":"d377b62f42012aae1789077dde2b4c09f8f770f73f941f01fe11eb21f5b88378"} Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.353611 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.357822 4898 generic.go:334] "Generic (PLEG): container finished" podID="f58c984f-f43f-42dc-90a5-aebbe79a47a5" containerID="f7ee82c0bf1917642e2854f8aafb4c28fc338ae0b377633b5ccd89fd1a0294f4" exitCode=0 Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.357876 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-zzflk" event={"ID":"f58c984f-f43f-42dc-90a5-aebbe79a47a5","Type":"ContainerDied","Data":"f7ee82c0bf1917642e2854f8aafb4c28fc338ae0b377633b5ccd89fd1a0294f4"} Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.372688 4898 generic.go:334] "Generic (PLEG): container finished" podID="00d79476-a8c0-4bad-81ae-6b50afea8601" containerID="47f7b7782ee0db0b141a3f7ddac8cf1c7a3089fed6ebef010a3382327d07522d" exitCode=0 Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.372798 4898 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-x9tcr" event={"ID":"00d79476-a8c0-4bad-81ae-6b50afea8601","Type":"ContainerDied","Data":"47f7b7782ee0db0b141a3f7ddac8cf1c7a3089fed6ebef010a3382327d07522d"} Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.372830 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-x9tcr" event={"ID":"00d79476-a8c0-4bad-81ae-6b50afea8601","Type":"ContainerDied","Data":"feb144c4ab58e7daafcb4fa68f1d3e330823c649be75d07c3fb9eb354883c85d"} Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.372846 4898 scope.go:117] "RemoveContainer" containerID="47f7b7782ee0db0b141a3f7ddac8cf1c7a3089fed6ebef010a3382327d07522d" Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.373005 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-x9tcr" Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.389410 4898 generic.go:334] "Generic (PLEG): container finished" podID="0bf799d3-e4d4-439d-b3da-d5467064f6f1" containerID="df7deb06b863e0990f2d81a2c25739f29fd7b5122c0d32c2314963f2204b89f3" exitCode=0 Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.389493 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-35b4-account-create-update-7rdfs" event={"ID":"0bf799d3-e4d4-439d-b3da-d5467064f6f1","Type":"ContainerDied","Data":"df7deb06b863e0990f2d81a2c25739f29fd7b5122c0d32c2314963f2204b89f3"} Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.410358 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-openstack-cell1-db-create-n82b8" podStartSLOduration=2.410333421 podStartE2EDuration="2.410333421s" podCreationTimestamp="2026-03-13 14:19:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:19:44.404311484 +0000 UTC 
m=+1419.405899723" watchObservedRunningTime="2026-03-13 14:19:44.410333421 +0000 UTC m=+1419.411921660" Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.421081 4898 scope.go:117] "RemoveContainer" containerID="994a2410c9adb8d8fd6c4f99ea789f0c24f7ab61b1131418ac256a7ab707b9be" Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.437340 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"818e3f41-30c4-4a49-b490-0d868fc2b2b8","Type":"ContainerStarted","Data":"122b0ca68068adfd3963153faa26d42ce9ae7ae836229a17a2096dab37be0af4"} Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.437533 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-1" Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.476488 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=43.112626882 podStartE2EDuration="59.476465541s" podCreationTimestamp="2026-03-13 14:18:45 +0000 UTC" firstStartedPulling="2026-03-13 14:18:52.691946578 +0000 UTC m=+1367.693534817" lastFinishedPulling="2026-03-13 14:19:09.055785237 +0000 UTC m=+1384.057373476" observedRunningTime="2026-03-13 14:19:44.461447971 +0000 UTC m=+1419.463036220" watchObservedRunningTime="2026-03-13 14:19:44.476465541 +0000 UTC m=+1419.478053780" Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.523502 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-2" podStartSLOduration=43.269996896 podStartE2EDuration="59.523484694s" podCreationTimestamp="2026-03-13 14:18:45 +0000 UTC" firstStartedPulling="2026-03-13 14:18:52.691355543 +0000 UTC m=+1367.692943782" lastFinishedPulling="2026-03-13 14:19:08.944843331 +0000 UTC m=+1383.946431580" observedRunningTime="2026-03-13 14:19:44.512434177 +0000 UTC m=+1419.514022426" watchObservedRunningTime="2026-03-13 14:19:44.523484694 +0000 UTC m=+1419.525072933" Mar 13 
14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.525987 4898 scope.go:117] "RemoveContainer" containerID="47f7b7782ee0db0b141a3f7ddac8cf1c7a3089fed6ebef010a3382327d07522d" Mar 13 14:19:44 crc kubenswrapper[4898]: E0313 14:19:44.527129 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47f7b7782ee0db0b141a3f7ddac8cf1c7a3089fed6ebef010a3382327d07522d\": container with ID starting with 47f7b7782ee0db0b141a3f7ddac8cf1c7a3089fed6ebef010a3382327d07522d not found: ID does not exist" containerID="47f7b7782ee0db0b141a3f7ddac8cf1c7a3089fed6ebef010a3382327d07522d" Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.527162 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47f7b7782ee0db0b141a3f7ddac8cf1c7a3089fed6ebef010a3382327d07522d"} err="failed to get container status \"47f7b7782ee0db0b141a3f7ddac8cf1c7a3089fed6ebef010a3382327d07522d\": rpc error: code = NotFound desc = could not find container \"47f7b7782ee0db0b141a3f7ddac8cf1c7a3089fed6ebef010a3382327d07522d\": container with ID starting with 47f7b7782ee0db0b141a3f7ddac8cf1c7a3089fed6ebef010a3382327d07522d not found: ID does not exist" Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.527186 4898 scope.go:117] "RemoveContainer" containerID="994a2410c9adb8d8fd6c4f99ea789f0c24f7ab61b1131418ac256a7ab707b9be" Mar 13 14:19:44 crc kubenswrapper[4898]: E0313 14:19:44.528059 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"994a2410c9adb8d8fd6c4f99ea789f0c24f7ab61b1131418ac256a7ab707b9be\": container with ID starting with 994a2410c9adb8d8fd6c4f99ea789f0c24f7ab61b1131418ac256a7ab707b9be not found: ID does not exist" containerID="994a2410c9adb8d8fd6c4f99ea789f0c24f7ab61b1131418ac256a7ab707b9be" Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.528091 4898 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"994a2410c9adb8d8fd6c4f99ea789f0c24f7ab61b1131418ac256a7ab707b9be"} err="failed to get container status \"994a2410c9adb8d8fd6c4f99ea789f0c24f7ab61b1131418ac256a7ab707b9be\": rpc error: code = NotFound desc = could not find container \"994a2410c9adb8d8fd6c4f99ea789f0c24f7ab61b1131418ac256a7ab707b9be\": container with ID starting with 994a2410c9adb8d8fd6c4f99ea789f0c24f7ab61b1131418ac256a7ab707b9be not found: ID does not exist" Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.662338 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-x9tcr"] Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.706640 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-x9tcr"] Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.712060 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-1" podStartSLOduration=43.431047135 podStartE2EDuration="59.712032508s" podCreationTimestamp="2026-03-13 14:18:45 +0000 UTC" firstStartedPulling="2026-03-13 14:18:52.743451858 +0000 UTC m=+1367.745040097" lastFinishedPulling="2026-03-13 14:19:09.024437231 +0000 UTC m=+1384.026025470" observedRunningTime="2026-03-13 14:19:44.622735036 +0000 UTC m=+1419.624323295" watchObservedRunningTime="2026-03-13 14:19:44.712032508 +0000 UTC m=+1419.713620757" Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.896547 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.129714 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-d2b7-account-create-update-ggzw8" Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.146465 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-ppqg7" Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.206125 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc61df36-ac68-4cf0-9456-140bccb5435c-operator-scripts\") pod \"bc61df36-ac68-4cf0-9456-140bccb5435c\" (UID: \"bc61df36-ac68-4cf0-9456-140bccb5435c\") " Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.206478 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrszf\" (UniqueName: \"kubernetes.io/projected/32a060a9-dd52-4192-bc48-b9ea7a918458-kube-api-access-wrszf\") pod \"32a060a9-dd52-4192-bc48-b9ea7a918458\" (UID: \"32a060a9-dd52-4192-bc48-b9ea7a918458\") " Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.206512 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32a060a9-dd52-4192-bc48-b9ea7a918458-operator-scripts\") pod \"32a060a9-dd52-4192-bc48-b9ea7a918458\" (UID: \"32a060a9-dd52-4192-bc48-b9ea7a918458\") " Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.206551 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vlfz\" (UniqueName: \"kubernetes.io/projected/bc61df36-ac68-4cf0-9456-140bccb5435c-kube-api-access-5vlfz\") pod \"bc61df36-ac68-4cf0-9456-140bccb5435c\" (UID: \"bc61df36-ac68-4cf0-9456-140bccb5435c\") " Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.206863 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc61df36-ac68-4cf0-9456-140bccb5435c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bc61df36-ac68-4cf0-9456-140bccb5435c" (UID: "bc61df36-ac68-4cf0-9456-140bccb5435c"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.207211 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32a060a9-dd52-4192-bc48-b9ea7a918458-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "32a060a9-dd52-4192-bc48-b9ea7a918458" (UID: "32a060a9-dd52-4192-bc48-b9ea7a918458"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.207324 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc61df36-ac68-4cf0-9456-140bccb5435c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.213716 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32a060a9-dd52-4192-bc48-b9ea7a918458-kube-api-access-wrszf" (OuterVolumeSpecName: "kube-api-access-wrszf") pod "32a060a9-dd52-4192-bc48-b9ea7a918458" (UID: "32a060a9-dd52-4192-bc48-b9ea7a918458"). InnerVolumeSpecName "kube-api-access-wrszf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.216308 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc61df36-ac68-4cf0-9456-140bccb5435c-kube-api-access-5vlfz" (OuterVolumeSpecName: "kube-api-access-5vlfz") pod "bc61df36-ac68-4cf0-9456-140bccb5435c" (UID: "bc61df36-ac68-4cf0-9456-140bccb5435c"). InnerVolumeSpecName "kube-api-access-5vlfz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.309799 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrszf\" (UniqueName: \"kubernetes.io/projected/32a060a9-dd52-4192-bc48-b9ea7a918458-kube-api-access-wrszf\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.309834 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32a060a9-dd52-4192-bc48-b9ea7a918458-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.309844 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vlfz\" (UniqueName: \"kubernetes.io/projected/bc61df36-ac68-4cf0-9456-140bccb5435c-kube-api-access-5vlfz\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.448949 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-ppqg7" event={"ID":"bc61df36-ac68-4cf0-9456-140bccb5435c","Type":"ContainerDied","Data":"1ce06125124c6bb74ef67a1bb1a6e818c24f60300a2a80ac7fc268e205f2e9db"} Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.448988 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ce06125124c6bb74ef67a1bb1a6e818c24f60300a2a80ac7fc268e205f2e9db" Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.449036 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-ppqg7" Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.454348 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-d2b7-account-create-update-ggzw8" Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.454369 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-d2b7-account-create-update-ggzw8" event={"ID":"32a060a9-dd52-4192-bc48-b9ea7a918458","Type":"ContainerDied","Data":"2ae64cef16b6acabaca5f7ce48033aaffd92aad88a639e6b6b3fcc38fb562bda"} Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.454406 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ae64cef16b6acabaca5f7ce48033aaffd92aad88a639e6b6b3fcc38fb562bda" Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.458438 4898 generic.go:334] "Generic (PLEG): container finished" podID="586ccc66-1989-46e5-98ad-b70c7e88e6bc" containerID="66d662834083b3a8826084dc54618cf384de1f9336d6d06012f43689e8e15545" exitCode=0 Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.458492 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-ce03-account-create-update-425qr" event={"ID":"586ccc66-1989-46e5-98ad-b70c7e88e6bc","Type":"ContainerDied","Data":"66d662834083b3a8826084dc54618cf384de1f9336d6d06012f43689e8e15545"} Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.464540 4898 generic.go:334] "Generic (PLEG): container finished" podID="ba5ed93a-91b4-4942-a32c-ab02a536e3d4" containerID="b9cd470c8d031dbf9ec18b998e1bb21c765853e9617c7315b08f8356dad9258f" exitCode=0 Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.464644 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-n82b8" event={"ID":"ba5ed93a-91b4-4942-a32c-ab02a536e3d4","Type":"ContainerDied","Data":"b9cd470c8d031dbf9ec18b998e1bb21c765853e9617c7315b08f8356dad9258f"} Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.471000 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b","Type":"ContainerStarted","Data":"1483af6a8a31cc8d457a629a4f59e975d8c4af4c9fe53636f469b1c72aabc655"} Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.555737 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=44.194072891 podStartE2EDuration="1m0.555718863s" podCreationTimestamp="2026-03-13 14:18:45 +0000 UTC" firstStartedPulling="2026-03-13 14:18:52.67049546 +0000 UTC m=+1367.672083699" lastFinishedPulling="2026-03-13 14:19:09.032141432 +0000 UTC m=+1384.033729671" observedRunningTime="2026-03-13 14:19:45.544414459 +0000 UTC m=+1420.546002698" watchObservedRunningTime="2026-03-13 14:19:45.555718863 +0000 UTC m=+1420.557307102" Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.777722 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00d79476-a8c0-4bad-81ae-6b50afea8601" path="/var/lib/kubelet/pods/00d79476-a8c0-4bad-81ae-6b50afea8601/volumes" Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.790476 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-x75zk"] Mar 13 14:19:45 crc kubenswrapper[4898]: E0313 14:19:45.790977 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32a060a9-dd52-4192-bc48-b9ea7a918458" containerName="mariadb-account-create-update" Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.790993 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="32a060a9-dd52-4192-bc48-b9ea7a918458" containerName="mariadb-account-create-update" Mar 13 14:19:45 crc kubenswrapper[4898]: E0313 14:19:45.791002 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc61df36-ac68-4cf0-9456-140bccb5435c" containerName="mariadb-database-create" Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.791008 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc61df36-ac68-4cf0-9456-140bccb5435c" 
containerName="mariadb-database-create" Mar 13 14:19:45 crc kubenswrapper[4898]: E0313 14:19:45.791027 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00d79476-a8c0-4bad-81ae-6b50afea8601" containerName="dnsmasq-dns" Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.791033 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="00d79476-a8c0-4bad-81ae-6b50afea8601" containerName="dnsmasq-dns" Mar 13 14:19:45 crc kubenswrapper[4898]: E0313 14:19:45.791049 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00d79476-a8c0-4bad-81ae-6b50afea8601" containerName="init" Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.791055 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="00d79476-a8c0-4bad-81ae-6b50afea8601" containerName="init" Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.791254 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc61df36-ac68-4cf0-9456-140bccb5435c" containerName="mariadb-database-create" Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.791273 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="00d79476-a8c0-4bad-81ae-6b50afea8601" containerName="dnsmasq-dns" Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.791284 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="32a060a9-dd52-4192-bc48-b9ea7a918458" containerName="mariadb-account-create-update" Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.791995 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-x75zk" Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.795226 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.795414 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-t4f8x" Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.806441 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-x75zk"] Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.927138 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/74d4aeca-15ec-4f63-87e0-20daa6f3e70f-db-sync-config-data\") pod \"glance-db-sync-x75zk\" (UID: \"74d4aeca-15ec-4f63-87e0-20daa6f3e70f\") " pod="openstack/glance-db-sync-x75zk" Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.927218 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74d4aeca-15ec-4f63-87e0-20daa6f3e70f-config-data\") pod \"glance-db-sync-x75zk\" (UID: \"74d4aeca-15ec-4f63-87e0-20daa6f3e70f\") " pod="openstack/glance-db-sync-x75zk" Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.927351 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74d4aeca-15ec-4f63-87e0-20daa6f3e70f-combined-ca-bundle\") pod \"glance-db-sync-x75zk\" (UID: \"74d4aeca-15ec-4f63-87e0-20daa6f3e70f\") " pod="openstack/glance-db-sync-x75zk" Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.927410 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rvbp\" (UniqueName: 
\"kubernetes.io/projected/74d4aeca-15ec-4f63-87e0-20daa6f3e70f-kube-api-access-8rvbp\") pod \"glance-db-sync-x75zk\" (UID: \"74d4aeca-15ec-4f63-87e0-20daa6f3e70f\") " pod="openstack/glance-db-sync-x75zk" Mar 13 14:19:46 crc kubenswrapper[4898]: I0313 14:19:46.029116 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74d4aeca-15ec-4f63-87e0-20daa6f3e70f-combined-ca-bundle\") pod \"glance-db-sync-x75zk\" (UID: \"74d4aeca-15ec-4f63-87e0-20daa6f3e70f\") " pod="openstack/glance-db-sync-x75zk" Mar 13 14:19:46 crc kubenswrapper[4898]: I0313 14:19:46.029470 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rvbp\" (UniqueName: \"kubernetes.io/projected/74d4aeca-15ec-4f63-87e0-20daa6f3e70f-kube-api-access-8rvbp\") pod \"glance-db-sync-x75zk\" (UID: \"74d4aeca-15ec-4f63-87e0-20daa6f3e70f\") " pod="openstack/glance-db-sync-x75zk" Mar 13 14:19:46 crc kubenswrapper[4898]: I0313 14:19:46.029601 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/74d4aeca-15ec-4f63-87e0-20daa6f3e70f-db-sync-config-data\") pod \"glance-db-sync-x75zk\" (UID: \"74d4aeca-15ec-4f63-87e0-20daa6f3e70f\") " pod="openstack/glance-db-sync-x75zk" Mar 13 14:19:46 crc kubenswrapper[4898]: I0313 14:19:46.029631 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74d4aeca-15ec-4f63-87e0-20daa6f3e70f-config-data\") pod \"glance-db-sync-x75zk\" (UID: \"74d4aeca-15ec-4f63-87e0-20daa6f3e70f\") " pod="openstack/glance-db-sync-x75zk" Mar 13 14:19:46 crc kubenswrapper[4898]: I0313 14:19:46.034483 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74d4aeca-15ec-4f63-87e0-20daa6f3e70f-config-data\") pod \"glance-db-sync-x75zk\" (UID: 
\"74d4aeca-15ec-4f63-87e0-20daa6f3e70f\") " pod="openstack/glance-db-sync-x75zk" Mar 13 14:19:46 crc kubenswrapper[4898]: I0313 14:19:46.036047 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74d4aeca-15ec-4f63-87e0-20daa6f3e70f-combined-ca-bundle\") pod \"glance-db-sync-x75zk\" (UID: \"74d4aeca-15ec-4f63-87e0-20daa6f3e70f\") " pod="openstack/glance-db-sync-x75zk" Mar 13 14:19:46 crc kubenswrapper[4898]: I0313 14:19:46.046510 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/74d4aeca-15ec-4f63-87e0-20daa6f3e70f-db-sync-config-data\") pod \"glance-db-sync-x75zk\" (UID: \"74d4aeca-15ec-4f63-87e0-20daa6f3e70f\") " pod="openstack/glance-db-sync-x75zk" Mar 13 14:19:46 crc kubenswrapper[4898]: I0313 14:19:46.160586 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rvbp\" (UniqueName: \"kubernetes.io/projected/74d4aeca-15ec-4f63-87e0-20daa6f3e70f-kube-api-access-8rvbp\") pod \"glance-db-sync-x75zk\" (UID: \"74d4aeca-15ec-4f63-87e0-20daa6f3e70f\") " pod="openstack/glance-db-sync-x75zk" Mar 13 14:19:46 crc kubenswrapper[4898]: I0313 14:19:46.279598 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-35b4-account-create-update-7rdfs" Mar 13 14:19:46 crc kubenswrapper[4898]: I0313 14:19:46.412531 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-x75zk" Mar 13 14:19:46 crc kubenswrapper[4898]: I0313 14:19:46.438081 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ksxl8\" (UniqueName: \"kubernetes.io/projected/0bf799d3-e4d4-439d-b3da-d5467064f6f1-kube-api-access-ksxl8\") pod \"0bf799d3-e4d4-439d-b3da-d5467064f6f1\" (UID: \"0bf799d3-e4d4-439d-b3da-d5467064f6f1\") " Mar 13 14:19:46 crc kubenswrapper[4898]: I0313 14:19:46.438332 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0bf799d3-e4d4-439d-b3da-d5467064f6f1-operator-scripts\") pod \"0bf799d3-e4d4-439d-b3da-d5467064f6f1\" (UID: \"0bf799d3-e4d4-439d-b3da-d5467064f6f1\") " Mar 13 14:19:46 crc kubenswrapper[4898]: I0313 14:19:46.438927 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bf799d3-e4d4-439d-b3da-d5467064f6f1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0bf799d3-e4d4-439d-b3da-d5467064f6f1" (UID: "0bf799d3-e4d4-439d-b3da-d5467064f6f1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:19:46 crc kubenswrapper[4898]: I0313 14:19:46.442432 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bf799d3-e4d4-439d-b3da-d5467064f6f1-kube-api-access-ksxl8" (OuterVolumeSpecName: "kube-api-access-ksxl8") pod "0bf799d3-e4d4-439d-b3da-d5467064f6f1" (UID: "0bf799d3-e4d4-439d-b3da-d5467064f6f1"). InnerVolumeSpecName "kube-api-access-ksxl8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:19:46 crc kubenswrapper[4898]: I0313 14:19:46.499557 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-35b4-account-create-update-7rdfs" Mar 13 14:19:46 crc kubenswrapper[4898]: I0313 14:19:46.499752 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-35b4-account-create-update-7rdfs" event={"ID":"0bf799d3-e4d4-439d-b3da-d5467064f6f1","Type":"ContainerDied","Data":"25447370a3f4e28f8e8ccf8af1d7f9221a24319685b5d3199e4478e305eb1ec0"} Mar 13 14:19:46 crc kubenswrapper[4898]: I0313 14:19:46.500065 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25447370a3f4e28f8e8ccf8af1d7f9221a24319685b5d3199e4478e305eb1ec0" Mar 13 14:19:46 crc kubenswrapper[4898]: I0313 14:19:46.540270 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0bf799d3-e4d4-439d-b3da-d5467064f6f1-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:46 crc kubenswrapper[4898]: I0313 14:19:46.540494 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ksxl8\" (UniqueName: \"kubernetes.io/projected/0bf799d3-e4d4-439d-b3da-d5467064f6f1-kube-api-access-ksxl8\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:46 crc kubenswrapper[4898]: I0313 14:19:46.708379 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-zzflk" Mar 13 14:19:46 crc kubenswrapper[4898]: I0313 14:19:46.853820 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f58c984f-f43f-42dc-90a5-aebbe79a47a5-operator-scripts\") pod \"f58c984f-f43f-42dc-90a5-aebbe79a47a5\" (UID: \"f58c984f-f43f-42dc-90a5-aebbe79a47a5\") " Mar 13 14:19:46 crc kubenswrapper[4898]: I0313 14:19:46.854322 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f58c984f-f43f-42dc-90a5-aebbe79a47a5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f58c984f-f43f-42dc-90a5-aebbe79a47a5" (UID: "f58c984f-f43f-42dc-90a5-aebbe79a47a5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:19:46 crc kubenswrapper[4898]: I0313 14:19:46.854467 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jv5f\" (UniqueName: \"kubernetes.io/projected/f58c984f-f43f-42dc-90a5-aebbe79a47a5-kube-api-access-9jv5f\") pod \"f58c984f-f43f-42dc-90a5-aebbe79a47a5\" (UID: \"f58c984f-f43f-42dc-90a5-aebbe79a47a5\") " Mar 13 14:19:46 crc kubenswrapper[4898]: I0313 14:19:46.856939 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f58c984f-f43f-42dc-90a5-aebbe79a47a5-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:46 crc kubenswrapper[4898]: I0313 14:19:46.858562 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f58c984f-f43f-42dc-90a5-aebbe79a47a5-kube-api-access-9jv5f" (OuterVolumeSpecName: "kube-api-access-9jv5f") pod "f58c984f-f43f-42dc-90a5-aebbe79a47a5" (UID: "f58c984f-f43f-42dc-90a5-aebbe79a47a5"). InnerVolumeSpecName "kube-api-access-9jv5f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:19:46 crc kubenswrapper[4898]: I0313 14:19:46.959429 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jv5f\" (UniqueName: \"kubernetes.io/projected/f58c984f-f43f-42dc-90a5-aebbe79a47a5-kube-api-access-9jv5f\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.063748 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-kwhth"] Mar 13 14:19:47 crc kubenswrapper[4898]: E0313 14:19:47.064435 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bf799d3-e4d4-439d-b3da-d5467064f6f1" containerName="mariadb-account-create-update" Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.064448 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bf799d3-e4d4-439d-b3da-d5467064f6f1" containerName="mariadb-account-create-update" Mar 13 14:19:47 crc kubenswrapper[4898]: E0313 14:19:47.064460 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f58c984f-f43f-42dc-90a5-aebbe79a47a5" containerName="mariadb-database-create" Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.064465 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="f58c984f-f43f-42dc-90a5-aebbe79a47a5" containerName="mariadb-database-create" Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.064659 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bf799d3-e4d4-439d-b3da-d5467064f6f1" containerName="mariadb-account-create-update" Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.064680 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="f58c984f-f43f-42dc-90a5-aebbe79a47a5" containerName="mariadb-database-create" Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.065396 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-kwhth" Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.070271 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.084409 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-kwhth"] Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.101111 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-ce03-account-create-update-425qr" Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.122624 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-n82b8" Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.129496 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.165307 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2tg7\" (UniqueName: \"kubernetes.io/projected/f555bcf8-c516-44eb-aaf3-446734ea39c2-kube-api-access-q2tg7\") pod \"root-account-create-update-kwhth\" (UID: \"f555bcf8-c516-44eb-aaf3-446734ea39c2\") " pod="openstack/root-account-create-update-kwhth" Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.165353 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f555bcf8-c516-44eb-aaf3-446734ea39c2-operator-scripts\") pod \"root-account-create-update-kwhth\" (UID: \"f555bcf8-c516-44eb-aaf3-446734ea39c2\") " pod="openstack/root-account-create-update-kwhth" Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.266981 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-h2zbj\" (UniqueName: \"kubernetes.io/projected/586ccc66-1989-46e5-98ad-b70c7e88e6bc-kube-api-access-h2zbj\") pod \"586ccc66-1989-46e5-98ad-b70c7e88e6bc\" (UID: \"586ccc66-1989-46e5-98ad-b70c7e88e6bc\") " Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.267032 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrtkm\" (UniqueName: \"kubernetes.io/projected/ba5ed93a-91b4-4942-a32c-ab02a536e3d4-kube-api-access-zrtkm\") pod \"ba5ed93a-91b4-4942-a32c-ab02a536e3d4\" (UID: \"ba5ed93a-91b4-4942-a32c-ab02a536e3d4\") " Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.267173 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba5ed93a-91b4-4942-a32c-ab02a536e3d4-operator-scripts\") pod \"ba5ed93a-91b4-4942-a32c-ab02a536e3d4\" (UID: \"ba5ed93a-91b4-4942-a32c-ab02a536e3d4\") " Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.267290 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/586ccc66-1989-46e5-98ad-b70c7e88e6bc-operator-scripts\") pod \"586ccc66-1989-46e5-98ad-b70c7e88e6bc\" (UID: \"586ccc66-1989-46e5-98ad-b70c7e88e6bc\") " Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.268009 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/586ccc66-1989-46e5-98ad-b70c7e88e6bc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "586ccc66-1989-46e5-98ad-b70c7e88e6bc" (UID: "586ccc66-1989-46e5-98ad-b70c7e88e6bc"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.268080 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba5ed93a-91b4-4942-a32c-ab02a536e3d4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ba5ed93a-91b4-4942-a32c-ab02a536e3d4" (UID: "ba5ed93a-91b4-4942-a32c-ab02a536e3d4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.268477 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2tg7\" (UniqueName: \"kubernetes.io/projected/f555bcf8-c516-44eb-aaf3-446734ea39c2-kube-api-access-q2tg7\") pod \"root-account-create-update-kwhth\" (UID: \"f555bcf8-c516-44eb-aaf3-446734ea39c2\") " pod="openstack/root-account-create-update-kwhth" Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.268881 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f555bcf8-c516-44eb-aaf3-446734ea39c2-operator-scripts\") pod \"root-account-create-update-kwhth\" (UID: \"f555bcf8-c516-44eb-aaf3-446734ea39c2\") " pod="openstack/root-account-create-update-kwhth" Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.269403 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f555bcf8-c516-44eb-aaf3-446734ea39c2-operator-scripts\") pod \"root-account-create-update-kwhth\" (UID: \"f555bcf8-c516-44eb-aaf3-446734ea39c2\") " pod="openstack/root-account-create-update-kwhth" Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.269418 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba5ed93a-91b4-4942-a32c-ab02a536e3d4-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:47 
crc kubenswrapper[4898]: I0313 14:19:47.269821 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/586ccc66-1989-46e5-98ad-b70c7e88e6bc-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.270368 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/586ccc66-1989-46e5-98ad-b70c7e88e6bc-kube-api-access-h2zbj" (OuterVolumeSpecName: "kube-api-access-h2zbj") pod "586ccc66-1989-46e5-98ad-b70c7e88e6bc" (UID: "586ccc66-1989-46e5-98ad-b70c7e88e6bc"). InnerVolumeSpecName "kube-api-access-h2zbj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.272219 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba5ed93a-91b4-4942-a32c-ab02a536e3d4-kube-api-access-zrtkm" (OuterVolumeSpecName: "kube-api-access-zrtkm") pod "ba5ed93a-91b4-4942-a32c-ab02a536e3d4" (UID: "ba5ed93a-91b4-4942-a32c-ab02a536e3d4"). InnerVolumeSpecName "kube-api-access-zrtkm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.287747 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2tg7\" (UniqueName: \"kubernetes.io/projected/f555bcf8-c516-44eb-aaf3-446734ea39c2-kube-api-access-q2tg7\") pod \"root-account-create-update-kwhth\" (UID: \"f555bcf8-c516-44eb-aaf3-446734ea39c2\") " pod="openstack/root-account-create-update-kwhth" Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.371365 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h2zbj\" (UniqueName: \"kubernetes.io/projected/586ccc66-1989-46e5-98ad-b70c7e88e6bc-kube-api-access-h2zbj\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.371398 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrtkm\" (UniqueName: \"kubernetes.io/projected/ba5ed93a-91b4-4942-a32c-ab02a536e3d4-kube-api-access-zrtkm\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.397554 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-kwhth" Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.517928 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-x75zk"] Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.522272 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-zzflk" event={"ID":"f58c984f-f43f-42dc-90a5-aebbe79a47a5","Type":"ContainerDied","Data":"639177af0c66d1dccc94daa2cc6a580d4f6472dd79a9bbc6e239f21d1c8817da"} Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.522318 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-zzflk" Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.522334 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="639177af0c66d1dccc94daa2cc6a580d4f6472dd79a9bbc6e239f21d1c8817da" Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.529373 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-ce03-account-create-update-425qr" event={"ID":"586ccc66-1989-46e5-98ad-b70c7e88e6bc","Type":"ContainerDied","Data":"b38fcfa140b37179564ee614932bc1713c8394aa057a914d2d0914aa1268a307"} Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.529407 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b38fcfa140b37179564ee614932bc1713c8394aa057a914d2d0914aa1268a307" Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.529520 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-ce03-account-create-update-425qr" Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.542542 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-n82b8" event={"ID":"ba5ed93a-91b4-4942-a32c-ab02a536e3d4","Type":"ContainerDied","Data":"5ca508ecb94469968a2e06dd2dc51d5ad3bb989409259ece9ea2999f68912648"} Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.542650 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ca508ecb94469968a2e06dd2dc51d5ad3bb989409259ece9ea2999f68912648" Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.542770 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-n82b8" Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.959819 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-kwhth"] Mar 13 14:19:48 crc kubenswrapper[4898]: I0313 14:19:48.557995 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-kwhth" event={"ID":"f555bcf8-c516-44eb-aaf3-446734ea39c2","Type":"ContainerStarted","Data":"ae2833d21b214266cf912f562e6ce0013ae2d27327887e2c914fb19e35f69b2e"} Mar 13 14:19:48 crc kubenswrapper[4898]: I0313 14:19:48.558344 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-kwhth" event={"ID":"f555bcf8-c516-44eb-aaf3-446734ea39c2","Type":"ContainerStarted","Data":"44806f086dd8f53a481f443ac6d3d5090de65c2b1eec8674ee6ce1b1966c8da0"} Mar 13 14:19:48 crc kubenswrapper[4898]: I0313 14:19:48.563535 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-x75zk" event={"ID":"74d4aeca-15ec-4f63-87e0-20daa6f3e70f","Type":"ContainerStarted","Data":"cfd74a4d5e0c74c0810797d2427e3c0958d75483bdf99da9d6e8e1c6bac07623"} Mar 13 14:19:48 crc kubenswrapper[4898]: I0313 14:19:48.581983 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-kwhth" podStartSLOduration=1.581965044 podStartE2EDuration="1.581965044s" podCreationTimestamp="2026-03-13 14:19:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:19:48.57410522 +0000 UTC m=+1423.575693459" watchObservedRunningTime="2026-03-13 14:19:48.581965044 +0000 UTC m=+1423.583553283" Mar 13 14:19:48 crc kubenswrapper[4898]: I0313 14:19:48.926849 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-6ddbb5776b-mx8sz" 
podUID="8f6fd2de-efa6-4d17-aa5e-4f44ced1f822" containerName="console" containerID="cri-o://5fefdf0ffc03648b76b936160e7ddb5fac4056b104b304ca9509ff6217a2c4fc" gracePeriod=15 Mar 13 14:19:49 crc kubenswrapper[4898]: I0313 14:19:49.619136 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/794bd82b-e289-4b31-b0cf-f1285452e783-etc-swift\") pod \"swift-storage-0\" (UID: \"794bd82b-e289-4b31-b0cf-f1285452e783\") " pod="openstack/swift-storage-0" Mar 13 14:19:49 crc kubenswrapper[4898]: E0313 14:19:49.619578 4898 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 13 14:19:49 crc kubenswrapper[4898]: E0313 14:19:49.619592 4898 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 13 14:19:49 crc kubenswrapper[4898]: E0313 14:19:49.619638 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/794bd82b-e289-4b31-b0cf-f1285452e783-etc-swift podName:794bd82b-e289-4b31-b0cf-f1285452e783 nodeName:}" failed. No retries permitted until 2026-03-13 14:20:05.619621624 +0000 UTC m=+1440.621209863 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/794bd82b-e289-4b31-b0cf-f1285452e783-etc-swift") pod "swift-storage-0" (UID: "794bd82b-e289-4b31-b0cf-f1285452e783") : configmap "swift-ring-files" not found Mar 13 14:19:49 crc kubenswrapper[4898]: I0313 14:19:49.677972 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6ddbb5776b-mx8sz_8f6fd2de-efa6-4d17-aa5e-4f44ced1f822/console/0.log" Mar 13 14:19:49 crc kubenswrapper[4898]: I0313 14:19:49.678042 4898 generic.go:334] "Generic (PLEG): container finished" podID="8f6fd2de-efa6-4d17-aa5e-4f44ced1f822" containerID="5fefdf0ffc03648b76b936160e7ddb5fac4056b104b304ca9509ff6217a2c4fc" exitCode=2 Mar 13 14:19:49 crc kubenswrapper[4898]: I0313 14:19:49.678460 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6ddbb5776b-mx8sz" event={"ID":"8f6fd2de-efa6-4d17-aa5e-4f44ced1f822","Type":"ContainerDied","Data":"5fefdf0ffc03648b76b936160e7ddb5fac4056b104b304ca9509ff6217a2c4fc"} Mar 13 14:19:49 crc kubenswrapper[4898]: I0313 14:19:49.770339 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6ddbb5776b-mx8sz_8f6fd2de-efa6-4d17-aa5e-4f44ced1f822/console/0.log" Mar 13 14:19:49 crc kubenswrapper[4898]: I0313 14:19:49.770443 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6ddbb5776b-mx8sz" Mar 13 14:19:49 crc kubenswrapper[4898]: I0313 14:19:49.928701 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-trusted-ca-bundle\") pod \"8f6fd2de-efa6-4d17-aa5e-4f44ced1f822\" (UID: \"8f6fd2de-efa6-4d17-aa5e-4f44ced1f822\") " Mar 13 14:19:49 crc kubenswrapper[4898]: I0313 14:19:49.928849 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98jnr\" (UniqueName: \"kubernetes.io/projected/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-kube-api-access-98jnr\") pod \"8f6fd2de-efa6-4d17-aa5e-4f44ced1f822\" (UID: \"8f6fd2de-efa6-4d17-aa5e-4f44ced1f822\") " Mar 13 14:19:49 crc kubenswrapper[4898]: I0313 14:19:49.928872 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-service-ca\") pod \"8f6fd2de-efa6-4d17-aa5e-4f44ced1f822\" (UID: \"8f6fd2de-efa6-4d17-aa5e-4f44ced1f822\") " Mar 13 14:19:49 crc kubenswrapper[4898]: I0313 14:19:49.928917 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-console-oauth-config\") pod \"8f6fd2de-efa6-4d17-aa5e-4f44ced1f822\" (UID: \"8f6fd2de-efa6-4d17-aa5e-4f44ced1f822\") " Mar 13 14:19:49 crc kubenswrapper[4898]: I0313 14:19:49.928961 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-console-config\") pod \"8f6fd2de-efa6-4d17-aa5e-4f44ced1f822\" (UID: \"8f6fd2de-efa6-4d17-aa5e-4f44ced1f822\") " Mar 13 14:19:49 crc kubenswrapper[4898]: I0313 14:19:49.928994 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-console-serving-cert\") pod \"8f6fd2de-efa6-4d17-aa5e-4f44ced1f822\" (UID: \"8f6fd2de-efa6-4d17-aa5e-4f44ced1f822\") " Mar 13 14:19:49 crc kubenswrapper[4898]: I0313 14:19:49.929467 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "8f6fd2de-efa6-4d17-aa5e-4f44ced1f822" (UID: "8f6fd2de-efa6-4d17-aa5e-4f44ced1f822"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:19:49 crc kubenswrapper[4898]: I0313 14:19:49.929526 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-service-ca" (OuterVolumeSpecName: "service-ca") pod "8f6fd2de-efa6-4d17-aa5e-4f44ced1f822" (UID: "8f6fd2de-efa6-4d17-aa5e-4f44ced1f822"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:19:49 crc kubenswrapper[4898]: I0313 14:19:49.929839 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-console-config" (OuterVolumeSpecName: "console-config") pod "8f6fd2de-efa6-4d17-aa5e-4f44ced1f822" (UID: "8f6fd2de-efa6-4d17-aa5e-4f44ced1f822"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 14:19:49 crc kubenswrapper[4898]: I0313 14:19:49.929992 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-oauth-serving-cert\") pod \"8f6fd2de-efa6-4d17-aa5e-4f44ced1f822\" (UID: \"8f6fd2de-efa6-4d17-aa5e-4f44ced1f822\") "
Mar 13 14:19:49 crc kubenswrapper[4898]: I0313 14:19:49.930424 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "8f6fd2de-efa6-4d17-aa5e-4f44ced1f822" (UID: "8f6fd2de-efa6-4d17-aa5e-4f44ced1f822"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 14:19:49 crc kubenswrapper[4898]: I0313 14:19:49.930510 4898 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 13 14:19:49 crc kubenswrapper[4898]: I0313 14:19:49.930534 4898 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 14:19:49 crc kubenswrapper[4898]: I0313 14:19:49.930603 4898 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-service-ca\") on node \"crc\" DevicePath \"\""
Mar 13 14:19:49 crc kubenswrapper[4898]: I0313 14:19:49.930615 4898 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-console-config\") on node \"crc\" DevicePath \"\""
Mar 13 14:19:49 crc kubenswrapper[4898]: I0313 14:19:49.934989 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-kube-api-access-98jnr" (OuterVolumeSpecName: "kube-api-access-98jnr") pod "8f6fd2de-efa6-4d17-aa5e-4f44ced1f822" (UID: "8f6fd2de-efa6-4d17-aa5e-4f44ced1f822"). InnerVolumeSpecName "kube-api-access-98jnr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:19:49 crc kubenswrapper[4898]: I0313 14:19:49.935321 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "8f6fd2de-efa6-4d17-aa5e-4f44ced1f822" (UID: "8f6fd2de-efa6-4d17-aa5e-4f44ced1f822"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:19:49 crc kubenswrapper[4898]: I0313 14:19:49.948070 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "8f6fd2de-efa6-4d17-aa5e-4f44ced1f822" (UID: "8f6fd2de-efa6-4d17-aa5e-4f44ced1f822"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:19:50 crc kubenswrapper[4898]: I0313 14:19:50.032193 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98jnr\" (UniqueName: \"kubernetes.io/projected/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-kube-api-access-98jnr\") on node \"crc\" DevicePath \"\""
Mar 13 14:19:50 crc kubenswrapper[4898]: I0313 14:19:50.032234 4898 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-console-oauth-config\") on node \"crc\" DevicePath \"\""
Mar 13 14:19:50 crc kubenswrapper[4898]: I0313 14:19:50.032248 4898 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-console-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 13 14:19:50 crc kubenswrapper[4898]: I0313 14:19:50.233627 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-j79bj" podUID="a506ef1a-354a-49c8-b63d-4db4b9ecdcfe" containerName="ovn-controller" probeResult="failure" output=<
Mar 13 14:19:50 crc kubenswrapper[4898]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Mar 13 14:19:50 crc kubenswrapper[4898]: >
Mar 13 14:19:50 crc kubenswrapper[4898]: I0313 14:19:50.313303 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-r9tmf"
Mar 13 14:19:50 crc kubenswrapper[4898]: I0313 14:19:50.691716 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6ddbb5776b-mx8sz_8f6fd2de-efa6-4d17-aa5e-4f44ced1f822/console/0.log"
Mar 13 14:19:50 crc kubenswrapper[4898]: I0313 14:19:50.691852 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6ddbb5776b-mx8sz" event={"ID":"8f6fd2de-efa6-4d17-aa5e-4f44ced1f822","Type":"ContainerDied","Data":"25f0f90d86df62bf31d201c4dfd7dca1ef0e3998bd4fd076756d1a9449231afa"}
Mar 13 14:19:50 crc kubenswrapper[4898]: I0313 14:19:50.691922 4898 scope.go:117] "RemoveContainer" containerID="5fefdf0ffc03648b76b936160e7ddb5fac4056b104b304ca9509ff6217a2c4fc"
Mar 13 14:19:50 crc kubenswrapper[4898]: I0313 14:19:50.692129 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6ddbb5776b-mx8sz"
Mar 13 14:19:50 crc kubenswrapper[4898]: I0313 14:19:50.694494 4898 generic.go:334] "Generic (PLEG): container finished" podID="4a6f0bfb-5db5-440c-a93f-0d6fe159401d" containerID="9e25ca1915d093420431c75152fe45db09d916d8afabcd6133622b3bfdcf8934" exitCode=0
Mar 13 14:19:50 crc kubenswrapper[4898]: I0313 14:19:50.694552 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-ztbp9" event={"ID":"4a6f0bfb-5db5-440c-a93f-0d6fe159401d","Type":"ContainerDied","Data":"9e25ca1915d093420431c75152fe45db09d916d8afabcd6133622b3bfdcf8934"}
Mar 13 14:19:50 crc kubenswrapper[4898]: I0313 14:19:50.695838 4898 generic.go:334] "Generic (PLEG): container finished" podID="f555bcf8-c516-44eb-aaf3-446734ea39c2" containerID="ae2833d21b214266cf912f562e6ce0013ae2d27327887e2c914fb19e35f69b2e" exitCode=0
Mar 13 14:19:50 crc kubenswrapper[4898]: I0313 14:19:50.695871 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-kwhth" event={"ID":"f555bcf8-c516-44eb-aaf3-446734ea39c2","Type":"ContainerDied","Data":"ae2833d21b214266cf912f562e6ce0013ae2d27327887e2c914fb19e35f69b2e"}
Mar 13 14:19:50 crc kubenswrapper[4898]: I0313 14:19:50.771418 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6ddbb5776b-mx8sz"]
Mar 13 14:19:50 crc kubenswrapper[4898]: I0313 14:19:50.785809 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6ddbb5776b-mx8sz"]
Mar 13 14:19:51 crc kubenswrapper[4898]: I0313 14:19:51.755420 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f6fd2de-efa6-4d17-aa5e-4f44ced1f822" path="/var/lib/kubelet/pods/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822/volumes"
Mar 13 14:19:53 crc kubenswrapper[4898]: I0313 14:19:53.046048 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-0"]
Mar 13 14:19:53 crc kubenswrapper[4898]: E0313 14:19:53.046779 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="586ccc66-1989-46e5-98ad-b70c7e88e6bc" containerName="mariadb-account-create-update"
Mar 13 14:19:53 crc kubenswrapper[4898]: I0313 14:19:53.046798 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="586ccc66-1989-46e5-98ad-b70c7e88e6bc" containerName="mariadb-account-create-update"
Mar 13 14:19:53 crc kubenswrapper[4898]: E0313 14:19:53.046840 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f6fd2de-efa6-4d17-aa5e-4f44ced1f822" containerName="console"
Mar 13 14:19:53 crc kubenswrapper[4898]: I0313 14:19:53.046848 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f6fd2de-efa6-4d17-aa5e-4f44ced1f822" containerName="console"
Mar 13 14:19:53 crc kubenswrapper[4898]: E0313 14:19:53.046862 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba5ed93a-91b4-4942-a32c-ab02a536e3d4" containerName="mariadb-database-create"
Mar 13 14:19:53 crc kubenswrapper[4898]: I0313 14:19:53.046871 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba5ed93a-91b4-4942-a32c-ab02a536e3d4" containerName="mariadb-database-create"
Mar 13 14:19:53 crc kubenswrapper[4898]: I0313 14:19:53.047364 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f6fd2de-efa6-4d17-aa5e-4f44ced1f822" containerName="console"
Mar 13 14:19:53 crc kubenswrapper[4898]: I0313 14:19:53.047387 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba5ed93a-91b4-4942-a32c-ab02a536e3d4" containerName="mariadb-database-create"
Mar 13 14:19:53 crc kubenswrapper[4898]: I0313 14:19:53.047416 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="586ccc66-1989-46e5-98ad-b70c7e88e6bc" containerName="mariadb-account-create-update"
Mar 13 14:19:53 crc kubenswrapper[4898]: I0313 14:19:53.048138 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0"
Mar 13 14:19:53 crc kubenswrapper[4898]: I0313 14:19:53.050233 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-config-data"
Mar 13 14:19:53 crc kubenswrapper[4898]: I0313 14:19:53.067090 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"]
Mar 13 14:19:53 crc kubenswrapper[4898]: I0313 14:19:53.103480 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c27f029-bffd-4f8f-bb24-c1c9c245d38c-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"8c27f029-bffd-4f8f-bb24-c1c9c245d38c\") " pod="openstack/mysqld-exporter-0"
Mar 13 14:19:53 crc kubenswrapper[4898]: I0313 14:19:53.103581 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8775h\" (UniqueName: \"kubernetes.io/projected/8c27f029-bffd-4f8f-bb24-c1c9c245d38c-kube-api-access-8775h\") pod \"mysqld-exporter-0\" (UID: \"8c27f029-bffd-4f8f-bb24-c1c9c245d38c\") " pod="openstack/mysqld-exporter-0"
Mar 13 14:19:53 crc kubenswrapper[4898]: I0313 14:19:53.103715 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c27f029-bffd-4f8f-bb24-c1c9c245d38c-config-data\") pod \"mysqld-exporter-0\" (UID: \"8c27f029-bffd-4f8f-bb24-c1c9c245d38c\") " pod="openstack/mysqld-exporter-0"
Mar 13 14:19:53 crc kubenswrapper[4898]: I0313 14:19:53.206009 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8775h\" (UniqueName: \"kubernetes.io/projected/8c27f029-bffd-4f8f-bb24-c1c9c245d38c-kube-api-access-8775h\") pod \"mysqld-exporter-0\" (UID: \"8c27f029-bffd-4f8f-bb24-c1c9c245d38c\") " pod="openstack/mysqld-exporter-0"
Mar 13 14:19:53 crc kubenswrapper[4898]: I0313 14:19:53.206163 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c27f029-bffd-4f8f-bb24-c1c9c245d38c-config-data\") pod \"mysqld-exporter-0\" (UID: \"8c27f029-bffd-4f8f-bb24-c1c9c245d38c\") " pod="openstack/mysqld-exporter-0"
Mar 13 14:19:53 crc kubenswrapper[4898]: I0313 14:19:53.206237 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c27f029-bffd-4f8f-bb24-c1c9c245d38c-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"8c27f029-bffd-4f8f-bb24-c1c9c245d38c\") " pod="openstack/mysqld-exporter-0"
Mar 13 14:19:53 crc kubenswrapper[4898]: I0313 14:19:53.214930 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c27f029-bffd-4f8f-bb24-c1c9c245d38c-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"8c27f029-bffd-4f8f-bb24-c1c9c245d38c\") " pod="openstack/mysqld-exporter-0"
Mar 13 14:19:53 crc kubenswrapper[4898]: I0313 14:19:53.215368 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c27f029-bffd-4f8f-bb24-c1c9c245d38c-config-data\") pod \"mysqld-exporter-0\" (UID: \"8c27f029-bffd-4f8f-bb24-c1c9c245d38c\") " pod="openstack/mysqld-exporter-0"
Mar 13 14:19:53 crc kubenswrapper[4898]: I0313 14:19:53.224219 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8775h\" (UniqueName: \"kubernetes.io/projected/8c27f029-bffd-4f8f-bb24-c1c9c245d38c-kube-api-access-8775h\") pod \"mysqld-exporter-0\" (UID: \"8c27f029-bffd-4f8f-bb24-c1c9c245d38c\") " pod="openstack/mysqld-exporter-0"
Mar 13 14:19:53 crc kubenswrapper[4898]: I0313 14:19:53.371052 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0"
Mar 13 14:19:55 crc kubenswrapper[4898]: I0313 14:19:55.212158 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-j79bj" podUID="a506ef1a-354a-49c8-b63d-4db4b9ecdcfe" containerName="ovn-controller" probeResult="failure" output=<
Mar 13 14:19:55 crc kubenswrapper[4898]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Mar 13 14:19:55 crc kubenswrapper[4898]: >
Mar 13 14:19:55 crc kubenswrapper[4898]: I0313 14:19:55.269052 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-r9tmf"
Mar 13 14:19:55 crc kubenswrapper[4898]: I0313 14:19:55.476118 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-j79bj-config-2f2rl"]
Mar 13 14:19:55 crc kubenswrapper[4898]: I0313 14:19:55.477695 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-j79bj-config-2f2rl"
Mar 13 14:19:55 crc kubenswrapper[4898]: I0313 14:19:55.479528 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts"
Mar 13 14:19:55 crc kubenswrapper[4898]: I0313 14:19:55.493851 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-j79bj-config-2f2rl"]
Mar 13 14:19:55 crc kubenswrapper[4898]: I0313 14:19:55.556190 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62mk8\" (UniqueName: \"kubernetes.io/projected/c5ddf723-16ad-425f-bae9-86adc8fe2a3d-kube-api-access-62mk8\") pod \"ovn-controller-j79bj-config-2f2rl\" (UID: \"c5ddf723-16ad-425f-bae9-86adc8fe2a3d\") " pod="openstack/ovn-controller-j79bj-config-2f2rl"
Mar 13 14:19:55 crc kubenswrapper[4898]: I0313 14:19:55.556339 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c5ddf723-16ad-425f-bae9-86adc8fe2a3d-additional-scripts\") pod \"ovn-controller-j79bj-config-2f2rl\" (UID: \"c5ddf723-16ad-425f-bae9-86adc8fe2a3d\") " pod="openstack/ovn-controller-j79bj-config-2f2rl"
Mar 13 14:19:55 crc kubenswrapper[4898]: I0313 14:19:55.556495 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c5ddf723-16ad-425f-bae9-86adc8fe2a3d-var-run\") pod \"ovn-controller-j79bj-config-2f2rl\" (UID: \"c5ddf723-16ad-425f-bae9-86adc8fe2a3d\") " pod="openstack/ovn-controller-j79bj-config-2f2rl"
Mar 13 14:19:55 crc kubenswrapper[4898]: I0313 14:19:55.556530 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c5ddf723-16ad-425f-bae9-86adc8fe2a3d-var-log-ovn\") pod \"ovn-controller-j79bj-config-2f2rl\" (UID: \"c5ddf723-16ad-425f-bae9-86adc8fe2a3d\") " pod="openstack/ovn-controller-j79bj-config-2f2rl"
Mar 13 14:19:55 crc kubenswrapper[4898]: I0313 14:19:55.556614 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c5ddf723-16ad-425f-bae9-86adc8fe2a3d-var-run-ovn\") pod \"ovn-controller-j79bj-config-2f2rl\" (UID: \"c5ddf723-16ad-425f-bae9-86adc8fe2a3d\") " pod="openstack/ovn-controller-j79bj-config-2f2rl"
Mar 13 14:19:55 crc kubenswrapper[4898]: I0313 14:19:55.556687 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c5ddf723-16ad-425f-bae9-86adc8fe2a3d-scripts\") pod \"ovn-controller-j79bj-config-2f2rl\" (UID: \"c5ddf723-16ad-425f-bae9-86adc8fe2a3d\") " pod="openstack/ovn-controller-j79bj-config-2f2rl"
Mar 13 14:19:55 crc kubenswrapper[4898]: I0313 14:19:55.658571 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62mk8\" (UniqueName: \"kubernetes.io/projected/c5ddf723-16ad-425f-bae9-86adc8fe2a3d-kube-api-access-62mk8\") pod \"ovn-controller-j79bj-config-2f2rl\" (UID: \"c5ddf723-16ad-425f-bae9-86adc8fe2a3d\") " pod="openstack/ovn-controller-j79bj-config-2f2rl"
Mar 13 14:19:55 crc kubenswrapper[4898]: I0313 14:19:55.658623 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c5ddf723-16ad-425f-bae9-86adc8fe2a3d-additional-scripts\") pod \"ovn-controller-j79bj-config-2f2rl\" (UID: \"c5ddf723-16ad-425f-bae9-86adc8fe2a3d\") " pod="openstack/ovn-controller-j79bj-config-2f2rl"
Mar 13 14:19:55 crc kubenswrapper[4898]: I0313 14:19:55.658684 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c5ddf723-16ad-425f-bae9-86adc8fe2a3d-var-run\") pod \"ovn-controller-j79bj-config-2f2rl\" (UID: \"c5ddf723-16ad-425f-bae9-86adc8fe2a3d\") " pod="openstack/ovn-controller-j79bj-config-2f2rl"
Mar 13 14:19:55 crc kubenswrapper[4898]: I0313 14:19:55.658703 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c5ddf723-16ad-425f-bae9-86adc8fe2a3d-var-log-ovn\") pod \"ovn-controller-j79bj-config-2f2rl\" (UID: \"c5ddf723-16ad-425f-bae9-86adc8fe2a3d\") " pod="openstack/ovn-controller-j79bj-config-2f2rl"
Mar 13 14:19:55 crc kubenswrapper[4898]: I0313 14:19:55.658738 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c5ddf723-16ad-425f-bae9-86adc8fe2a3d-var-run-ovn\") pod \"ovn-controller-j79bj-config-2f2rl\" (UID: \"c5ddf723-16ad-425f-bae9-86adc8fe2a3d\") " pod="openstack/ovn-controller-j79bj-config-2f2rl"
Mar 13 14:19:55 crc kubenswrapper[4898]: I0313 14:19:55.658767 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c5ddf723-16ad-425f-bae9-86adc8fe2a3d-scripts\") pod \"ovn-controller-j79bj-config-2f2rl\" (UID: \"c5ddf723-16ad-425f-bae9-86adc8fe2a3d\") " pod="openstack/ovn-controller-j79bj-config-2f2rl"
Mar 13 14:19:55 crc kubenswrapper[4898]: I0313 14:19:55.659345 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c5ddf723-16ad-425f-bae9-86adc8fe2a3d-var-run\") pod \"ovn-controller-j79bj-config-2f2rl\" (UID: \"c5ddf723-16ad-425f-bae9-86adc8fe2a3d\") " pod="openstack/ovn-controller-j79bj-config-2f2rl"
Mar 13 14:19:55 crc kubenswrapper[4898]: I0313 14:19:55.660168 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c5ddf723-16ad-425f-bae9-86adc8fe2a3d-additional-scripts\") pod \"ovn-controller-j79bj-config-2f2rl\" (UID: \"c5ddf723-16ad-425f-bae9-86adc8fe2a3d\") " pod="openstack/ovn-controller-j79bj-config-2f2rl"
Mar 13 14:19:55 crc kubenswrapper[4898]: I0313 14:19:55.660252 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c5ddf723-16ad-425f-bae9-86adc8fe2a3d-var-log-ovn\") pod \"ovn-controller-j79bj-config-2f2rl\" (UID: \"c5ddf723-16ad-425f-bae9-86adc8fe2a3d\") " pod="openstack/ovn-controller-j79bj-config-2f2rl"
Mar 13 14:19:55 crc kubenswrapper[4898]: I0313 14:19:55.660303 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c5ddf723-16ad-425f-bae9-86adc8fe2a3d-var-run-ovn\") pod \"ovn-controller-j79bj-config-2f2rl\" (UID: \"c5ddf723-16ad-425f-bae9-86adc8fe2a3d\") " pod="openstack/ovn-controller-j79bj-config-2f2rl"
Mar 13 14:19:55 crc kubenswrapper[4898]: I0313 14:19:55.660623 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c5ddf723-16ad-425f-bae9-86adc8fe2a3d-scripts\") pod \"ovn-controller-j79bj-config-2f2rl\" (UID: \"c5ddf723-16ad-425f-bae9-86adc8fe2a3d\") " pod="openstack/ovn-controller-j79bj-config-2f2rl"
Mar 13 14:19:55 crc kubenswrapper[4898]: I0313 14:19:55.696636 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62mk8\" (UniqueName: \"kubernetes.io/projected/c5ddf723-16ad-425f-bae9-86adc8fe2a3d-kube-api-access-62mk8\") pod \"ovn-controller-j79bj-config-2f2rl\" (UID: \"c5ddf723-16ad-425f-bae9-86adc8fe2a3d\") " pod="openstack/ovn-controller-j79bj-config-2f2rl"
Mar 13 14:19:55 crc kubenswrapper[4898]: I0313 14:19:55.802070 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-j79bj-config-2f2rl"
Mar 13 14:19:56 crc kubenswrapper[4898]: I0313 14:19:56.008009 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-kwhth"
Mar 13 14:19:56 crc kubenswrapper[4898]: I0313 14:19:56.014913 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-ztbp9"
Mar 13 14:19:56 crc kubenswrapper[4898]: I0313 14:19:56.065689 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmzv9\" (UniqueName: \"kubernetes.io/projected/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-kube-api-access-gmzv9\") pod \"4a6f0bfb-5db5-440c-a93f-0d6fe159401d\" (UID: \"4a6f0bfb-5db5-440c-a93f-0d6fe159401d\") "
Mar 13 14:19:56 crc kubenswrapper[4898]: I0313 14:19:56.066220 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f555bcf8-c516-44eb-aaf3-446734ea39c2-operator-scripts\") pod \"f555bcf8-c516-44eb-aaf3-446734ea39c2\" (UID: \"f555bcf8-c516-44eb-aaf3-446734ea39c2\") "
Mar 13 14:19:56 crc kubenswrapper[4898]: I0313 14:19:56.066254 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-swiftconf\") pod \"4a6f0bfb-5db5-440c-a93f-0d6fe159401d\" (UID: \"4a6f0bfb-5db5-440c-a93f-0d6fe159401d\") "
Mar 13 14:19:56 crc kubenswrapper[4898]: I0313 14:19:56.066282 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2tg7\" (UniqueName: \"kubernetes.io/projected/f555bcf8-c516-44eb-aaf3-446734ea39c2-kube-api-access-q2tg7\") pod \"f555bcf8-c516-44eb-aaf3-446734ea39c2\" (UID: \"f555bcf8-c516-44eb-aaf3-446734ea39c2\") "
Mar 13 14:19:56 crc kubenswrapper[4898]: I0313 14:19:56.066317 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-etc-swift\") pod \"4a6f0bfb-5db5-440c-a93f-0d6fe159401d\" (UID: \"4a6f0bfb-5db5-440c-a93f-0d6fe159401d\") "
Mar 13 14:19:56 crc kubenswrapper[4898]: I0313 14:19:56.066362 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-dispersionconf\") pod \"4a6f0bfb-5db5-440c-a93f-0d6fe159401d\" (UID: \"4a6f0bfb-5db5-440c-a93f-0d6fe159401d\") "
Mar 13 14:19:56 crc kubenswrapper[4898]: I0313 14:19:56.066440 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-scripts\") pod \"4a6f0bfb-5db5-440c-a93f-0d6fe159401d\" (UID: \"4a6f0bfb-5db5-440c-a93f-0d6fe159401d\") "
Mar 13 14:19:56 crc kubenswrapper[4898]: I0313 14:19:56.066475 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-combined-ca-bundle\") pod \"4a6f0bfb-5db5-440c-a93f-0d6fe159401d\" (UID: \"4a6f0bfb-5db5-440c-a93f-0d6fe159401d\") "
Mar 13 14:19:56 crc kubenswrapper[4898]: I0313 14:19:56.066516 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-ring-data-devices\") pod \"4a6f0bfb-5db5-440c-a93f-0d6fe159401d\" (UID: \"4a6f0bfb-5db5-440c-a93f-0d6fe159401d\") "
Mar 13 14:19:56 crc kubenswrapper[4898]: I0313 14:19:56.067681 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "4a6f0bfb-5db5-440c-a93f-0d6fe159401d" (UID: "4a6f0bfb-5db5-440c-a93f-0d6fe159401d"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 14:19:56 crc kubenswrapper[4898]: I0313 14:19:56.068059 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f555bcf8-c516-44eb-aaf3-446734ea39c2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f555bcf8-c516-44eb-aaf3-446734ea39c2" (UID: "f555bcf8-c516-44eb-aaf3-446734ea39c2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 14:19:56 crc kubenswrapper[4898]: I0313 14:19:56.077595 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "4a6f0bfb-5db5-440c-a93f-0d6fe159401d" (UID: "4a6f0bfb-5db5-440c-a93f-0d6fe159401d"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 14:19:56 crc kubenswrapper[4898]: I0313 14:19:56.077761 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-kube-api-access-gmzv9" (OuterVolumeSpecName: "kube-api-access-gmzv9") pod "4a6f0bfb-5db5-440c-a93f-0d6fe159401d" (UID: "4a6f0bfb-5db5-440c-a93f-0d6fe159401d"). InnerVolumeSpecName "kube-api-access-gmzv9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:19:56 crc kubenswrapper[4898]: I0313 14:19:56.082126 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f555bcf8-c516-44eb-aaf3-446734ea39c2-kube-api-access-q2tg7" (OuterVolumeSpecName: "kube-api-access-q2tg7") pod "f555bcf8-c516-44eb-aaf3-446734ea39c2" (UID: "f555bcf8-c516-44eb-aaf3-446734ea39c2"). InnerVolumeSpecName "kube-api-access-q2tg7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:19:56 crc kubenswrapper[4898]: I0313 14:19:56.088031 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "4a6f0bfb-5db5-440c-a93f-0d6fe159401d" (UID: "4a6f0bfb-5db5-440c-a93f-0d6fe159401d"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:19:56 crc kubenswrapper[4898]: I0313 14:19:56.099829 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-scripts" (OuterVolumeSpecName: "scripts") pod "4a6f0bfb-5db5-440c-a93f-0d6fe159401d" (UID: "4a6f0bfb-5db5-440c-a93f-0d6fe159401d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 14:19:56 crc kubenswrapper[4898]: E0313 14:19:56.120056 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-swiftconf podName:4a6f0bfb-5db5-440c-a93f-0d6fe159401d nodeName:}" failed. No retries permitted until 2026-03-13 14:19:56.620026378 +0000 UTC m=+1431.621614617 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "swiftconf" (UniqueName: "kubernetes.io/secret/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-swiftconf") pod "4a6f0bfb-5db5-440c-a93f-0d6fe159401d" (UID: "4a6f0bfb-5db5-440c-a93f-0d6fe159401d") : error deleting /var/lib/kubelet/pods/4a6f0bfb-5db5-440c-a93f-0d6fe159401d/volume-subpaths: remove /var/lib/kubelet/pods/4a6f0bfb-5db5-440c-a93f-0d6fe159401d/volume-subpaths: no such file or directory
Mar 13 14:19:56 crc kubenswrapper[4898]: I0313 14:19:56.122247 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4a6f0bfb-5db5-440c-a93f-0d6fe159401d" (UID: "4a6f0bfb-5db5-440c-a93f-0d6fe159401d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:19:56 crc kubenswrapper[4898]: I0313 14:19:56.170878 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-scripts\") on node \"crc\" DevicePath \"\""
Mar 13 14:19:56 crc kubenswrapper[4898]: I0313 14:19:56.170921 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 14:19:56 crc kubenswrapper[4898]: I0313 14:19:56.170934 4898 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-ring-data-devices\") on node \"crc\" DevicePath \"\""
Mar 13 14:19:56 crc kubenswrapper[4898]: I0313 14:19:56.170945 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gmzv9\" (UniqueName: \"kubernetes.io/projected/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-kube-api-access-gmzv9\") on node \"crc\" DevicePath \"\""
Mar 13 14:19:56 crc kubenswrapper[4898]: I0313 14:19:56.170956 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f555bcf8-c516-44eb-aaf3-446734ea39c2-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 13 14:19:56 crc kubenswrapper[4898]: I0313 14:19:56.170965 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2tg7\" (UniqueName: \"kubernetes.io/projected/f555bcf8-c516-44eb-aaf3-446734ea39c2-kube-api-access-q2tg7\") on node \"crc\" DevicePath \"\""
Mar 13 14:19:56 crc kubenswrapper[4898]: I0313 14:19:56.170973 4898 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 13 14:19:56 crc kubenswrapper[4898]: I0313 14:19:56.170981 4898 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-dispersionconf\") on node \"crc\" DevicePath \"\""
Mar 13 14:19:56 crc kubenswrapper[4898]: I0313 14:19:56.680004 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-swiftconf\") pod \"4a6f0bfb-5db5-440c-a93f-0d6fe159401d\" (UID: \"4a6f0bfb-5db5-440c-a93f-0d6fe159401d\") "
Mar 13 14:19:56 crc kubenswrapper[4898]: I0313 14:19:56.685165 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "4a6f0bfb-5db5-440c-a93f-0d6fe159401d" (UID: "4a6f0bfb-5db5-440c-a93f-0d6fe159401d"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:19:56 crc kubenswrapper[4898]: I0313 14:19:56.784996 4898 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-swiftconf\") on node \"crc\" DevicePath \"\""
Mar 13 14:19:56 crc kubenswrapper[4898]: I0313 14:19:56.792097 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-kwhth"
Mar 13 14:19:56 crc kubenswrapper[4898]: I0313 14:19:56.792100 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-kwhth" event={"ID":"f555bcf8-c516-44eb-aaf3-446734ea39c2","Type":"ContainerDied","Data":"44806f086dd8f53a481f443ac6d3d5090de65c2b1eec8674ee6ce1b1966c8da0"}
Mar 13 14:19:56 crc kubenswrapper[4898]: I0313 14:19:56.792234 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44806f086dd8f53a481f443ac6d3d5090de65c2b1eec8674ee6ce1b1966c8da0"
Mar 13 14:19:56 crc kubenswrapper[4898]: I0313 14:19:56.793821 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-ztbp9" event={"ID":"4a6f0bfb-5db5-440c-a93f-0d6fe159401d","Type":"ContainerDied","Data":"7e54bc8fd0c558fc45ec5afd29c72815086cd68050f8ecbef97610bde93e64ce"}
Mar 13 14:19:56 crc kubenswrapper[4898]: I0313 14:19:56.793858 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e54bc8fd0c558fc45ec5afd29c72815086cd68050f8ecbef97610bde93e64ce"
Mar 13 14:19:56 crc kubenswrapper[4898]: I0313 14:19:56.793883 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-ztbp9"
Mar 13 14:19:56 crc kubenswrapper[4898]: I0313 14:19:56.893497 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="d56bd826-4f42-409d-ae41-9bfc70d1e038" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.133:5671: connect: connection refused"
Mar 13 14:19:57 crc kubenswrapper[4898]: I0313 14:19:57.133308 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.134:5671: connect: connection refused"
Mar 13 14:19:57 crc kubenswrapper[4898]: I0313 14:19:57.213147 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-2" podUID="ee084354-4d32-4d3c-96a4-1e4e7eef5d85" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.135:5671: connect: connection refused"
Mar 13 14:19:57 crc kubenswrapper[4898]: I0313 14:19:57.274038 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-1" podUID="818e3f41-30c4-4a49-b490-0d868fc2b2b8" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.136:5671: connect: connection refused"
Mar 13 14:19:58 crc kubenswrapper[4898]: I0313 14:19:58.603350 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-kwhth"]
Mar 13 14:19:58 crc kubenswrapper[4898]: I0313 14:19:58.615328 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-kwhth"]
Mar 13 14:19:59 crc kubenswrapper[4898]: I0313 14:19:59.755505 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f555bcf8-c516-44eb-aaf3-446734ea39c2" path="/var/lib/kubelet/pods/f555bcf8-c516-44eb-aaf3-446734ea39c2/volumes"
Mar 13 14:20:00 crc kubenswrapper[4898]: I0313 14:20:00.139415 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556860-xbwgj"]
Mar 13 14:20:00 crc kubenswrapper[4898]: E0313 14:20:00.140448 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f555bcf8-c516-44eb-aaf3-446734ea39c2" containerName="mariadb-account-create-update"
Mar 13 14:20:00 crc kubenswrapper[4898]: I0313 14:20:00.140467 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="f555bcf8-c516-44eb-aaf3-446734ea39c2" containerName="mariadb-account-create-update"
Mar 13 14:20:00 crc kubenswrapper[4898]: E0313 14:20:00.140501 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a6f0bfb-5db5-440c-a93f-0d6fe159401d" containerName="swift-ring-rebalance"
Mar 13 14:20:00 crc kubenswrapper[4898]: I0313 14:20:00.140508 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a6f0bfb-5db5-440c-a93f-0d6fe159401d" containerName="swift-ring-rebalance"
Mar 13 14:20:00 crc kubenswrapper[4898]: I0313 14:20:00.140719 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a6f0bfb-5db5-440c-a93f-0d6fe159401d" containerName="swift-ring-rebalance"
Mar 13 14:20:00 crc kubenswrapper[4898]: I0313 14:20:00.140750 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="f555bcf8-c516-44eb-aaf3-446734ea39c2" containerName="mariadb-account-create-update"
Mar 13 14:20:00 crc kubenswrapper[4898]: I0313 14:20:00.141472 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556860-xbwgj"
Mar 13 14:20:00 crc kubenswrapper[4898]: I0313 14:20:00.145381 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 13 14:20:00 crc kubenswrapper[4898]: I0313 14:20:00.145519 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 13 14:20:00 crc kubenswrapper[4898]: I0313 14:20:00.146531 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps"
Mar 13 14:20:00 crc kubenswrapper[4898]: I0313 14:20:00.155690 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556860-xbwgj"]
Mar 13 14:20:00 crc kubenswrapper[4898]: I0313 14:20:00.208056 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-j79bj" podUID="a506ef1a-354a-49c8-b63d-4db4b9ecdcfe" containerName="ovn-controller" probeResult="failure" output=<
Mar 13 14:20:00 crc kubenswrapper[4898]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Mar 13 14:20:00 crc kubenswrapper[4898]: >
Mar 13 14:20:00 crc kubenswrapper[4898]: I0313 14:20:00.302038 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wh72f\" (UniqueName: \"kubernetes.io/projected/02521dff-1dee-4839-ab35-a4bfa82bc405-kube-api-access-wh72f\") pod \"auto-csr-approver-29556860-xbwgj\" (UID: \"02521dff-1dee-4839-ab35-a4bfa82bc405\") " pod="openshift-infra/auto-csr-approver-29556860-xbwgj"
Mar 13 14:20:00 crc kubenswrapper[4898]: I0313 14:20:00.404856 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wh72f\" (UniqueName: \"kubernetes.io/projected/02521dff-1dee-4839-ab35-a4bfa82bc405-kube-api-access-wh72f\") pod \"auto-csr-approver-29556860-xbwgj\" (UID:
\"02521dff-1dee-4839-ab35-a4bfa82bc405\") " pod="openshift-infra/auto-csr-approver-29556860-xbwgj" Mar 13 14:20:00 crc kubenswrapper[4898]: I0313 14:20:00.423288 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wh72f\" (UniqueName: \"kubernetes.io/projected/02521dff-1dee-4839-ab35-a4bfa82bc405-kube-api-access-wh72f\") pod \"auto-csr-approver-29556860-xbwgj\" (UID: \"02521dff-1dee-4839-ab35-a4bfa82bc405\") " pod="openshift-infra/auto-csr-approver-29556860-xbwgj" Mar 13 14:20:00 crc kubenswrapper[4898]: I0313 14:20:00.470096 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556860-xbwgj" Mar 13 14:20:02 crc kubenswrapper[4898]: I0313 14:20:02.063287 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-vswd4"] Mar 13 14:20:02 crc kubenswrapper[4898]: I0313 14:20:02.065575 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-vswd4" Mar 13 14:20:02 crc kubenswrapper[4898]: I0313 14:20:02.068265 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 13 14:20:02 crc kubenswrapper[4898]: I0313 14:20:02.074360 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-vswd4"] Mar 13 14:20:02 crc kubenswrapper[4898]: I0313 14:20:02.144919 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lq9fz\" (UniqueName: \"kubernetes.io/projected/26c5ac36-8689-4bf3-8755-a84e22377e2a-kube-api-access-lq9fz\") pod \"root-account-create-update-vswd4\" (UID: \"26c5ac36-8689-4bf3-8755-a84e22377e2a\") " pod="openstack/root-account-create-update-vswd4" Mar 13 14:20:02 crc kubenswrapper[4898]: I0313 14:20:02.145246 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/26c5ac36-8689-4bf3-8755-a84e22377e2a-operator-scripts\") pod \"root-account-create-update-vswd4\" (UID: \"26c5ac36-8689-4bf3-8755-a84e22377e2a\") " pod="openstack/root-account-create-update-vswd4" Mar 13 14:20:02 crc kubenswrapper[4898]: I0313 14:20:02.247589 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lq9fz\" (UniqueName: \"kubernetes.io/projected/26c5ac36-8689-4bf3-8755-a84e22377e2a-kube-api-access-lq9fz\") pod \"root-account-create-update-vswd4\" (UID: \"26c5ac36-8689-4bf3-8755-a84e22377e2a\") " pod="openstack/root-account-create-update-vswd4" Mar 13 14:20:02 crc kubenswrapper[4898]: I0313 14:20:02.247862 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/26c5ac36-8689-4bf3-8755-a84e22377e2a-operator-scripts\") pod \"root-account-create-update-vswd4\" (UID: \"26c5ac36-8689-4bf3-8755-a84e22377e2a\") " pod="openstack/root-account-create-update-vswd4" Mar 13 14:20:02 crc kubenswrapper[4898]: I0313 14:20:02.248810 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/26c5ac36-8689-4bf3-8755-a84e22377e2a-operator-scripts\") pod \"root-account-create-update-vswd4\" (UID: \"26c5ac36-8689-4bf3-8755-a84e22377e2a\") " pod="openstack/root-account-create-update-vswd4" Mar 13 14:20:02 crc kubenswrapper[4898]: I0313 14:20:02.267945 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lq9fz\" (UniqueName: \"kubernetes.io/projected/26c5ac36-8689-4bf3-8755-a84e22377e2a-kube-api-access-lq9fz\") pod \"root-account-create-update-vswd4\" (UID: \"26c5ac36-8689-4bf3-8755-a84e22377e2a\") " pod="openstack/root-account-create-update-vswd4" Mar 13 14:20:02 crc kubenswrapper[4898]: I0313 14:20:02.394175 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-vswd4" Mar 13 14:20:03 crc kubenswrapper[4898]: I0313 14:20:03.355344 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Mar 13 14:20:03 crc kubenswrapper[4898]: I0313 14:20:03.391876 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-j79bj-config-2f2rl"] Mar 13 14:20:03 crc kubenswrapper[4898]: I0313 14:20:03.559528 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556860-xbwgj"] Mar 13 14:20:03 crc kubenswrapper[4898]: I0313 14:20:03.569180 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-vswd4"] Mar 13 14:20:03 crc kubenswrapper[4898]: I0313 14:20:03.888243 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"8c27f029-bffd-4f8f-bb24-c1c9c245d38c","Type":"ContainerStarted","Data":"fe553b08f29dc87c01c836389227794f6bc900596f5a85dd1ed792d64aa19876"} Mar 13 14:20:03 crc kubenswrapper[4898]: I0313 14:20:03.892213 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-j79bj-config-2f2rl" event={"ID":"c5ddf723-16ad-425f-bae9-86adc8fe2a3d","Type":"ContainerStarted","Data":"41174a57eeced7f987ec1ea67f79e818399b7ca23c9e404468462af2e7f7d393"} Mar 13 14:20:03 crc kubenswrapper[4898]: I0313 14:20:03.892253 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-j79bj-config-2f2rl" event={"ID":"c5ddf723-16ad-425f-bae9-86adc8fe2a3d","Type":"ContainerStarted","Data":"2969707a1c4df4f3788f85982e6db252de3869c756e5d786fc65dcca1a81e648"} Mar 13 14:20:03 crc kubenswrapper[4898]: I0313 14:20:03.895021 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-vswd4" 
event={"ID":"26c5ac36-8689-4bf3-8755-a84e22377e2a","Type":"ContainerStarted","Data":"c91cc1f40aaa9775749654fff4fe79567271db256673b0ded8ef7bcbbed0be52"} Mar 13 14:20:03 crc kubenswrapper[4898]: I0313 14:20:03.895070 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-vswd4" event={"ID":"26c5ac36-8689-4bf3-8755-a84e22377e2a","Type":"ContainerStarted","Data":"5d888e90db0c648dc7f94b37e9185e80b994aa04738c4f7716d99c355585bcd1"} Mar 13 14:20:03 crc kubenswrapper[4898]: I0313 14:20:03.898416 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-x75zk" event={"ID":"74d4aeca-15ec-4f63-87e0-20daa6f3e70f","Type":"ContainerStarted","Data":"81635cb2b5569daf242649d4672547e48f7dd06f334c45c3f33ff7ef44a8ec5f"} Mar 13 14:20:03 crc kubenswrapper[4898]: I0313 14:20:03.900383 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556860-xbwgj" event={"ID":"02521dff-1dee-4839-ab35-a4bfa82bc405","Type":"ContainerStarted","Data":"704384ed7c0f9c137486ec08c213a83e3fe7f2792127e2bd89444beb085393db"} Mar 13 14:20:03 crc kubenswrapper[4898]: I0313 14:20:03.903407 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1e6f6f0d-db24-4fdb-a872-ce2c527a791b","Type":"ContainerStarted","Data":"97d23bf2eed915823c76967ee540ea114f9905dd429a2e87daddfc2a1255581d"} Mar 13 14:20:03 crc kubenswrapper[4898]: I0313 14:20:03.928097 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-j79bj-config-2f2rl" podStartSLOduration=8.928068519 podStartE2EDuration="8.928068519s" podCreationTimestamp="2026-03-13 14:19:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:20:03.911760996 +0000 UTC m=+1438.913349235" watchObservedRunningTime="2026-03-13 14:20:03.928068519 +0000 UTC m=+1438.929656778" Mar 13 
14:20:03 crc kubenswrapper[4898]: I0313 14:20:03.941007 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-x75zk" podStartSLOduration=3.541535182 podStartE2EDuration="18.940989505s" podCreationTimestamp="2026-03-13 14:19:45 +0000 UTC" firstStartedPulling="2026-03-13 14:19:47.537808636 +0000 UTC m=+1422.539396875" lastFinishedPulling="2026-03-13 14:20:02.937262949 +0000 UTC m=+1437.938851198" observedRunningTime="2026-03-13 14:20:03.930334778 +0000 UTC m=+1438.931923017" watchObservedRunningTime="2026-03-13 14:20:03.940989505 +0000 UTC m=+1438.942577734" Mar 13 14:20:03 crc kubenswrapper[4898]: I0313 14:20:03.972153 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-vswd4" podStartSLOduration=1.972128803 podStartE2EDuration="1.972128803s" podCreationTimestamp="2026-03-13 14:20:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:20:03.957652078 +0000 UTC m=+1438.959240317" watchObservedRunningTime="2026-03-13 14:20:03.972128803 +0000 UTC m=+1438.973717042" Mar 13 14:20:04 crc kubenswrapper[4898]: I0313 14:20:04.917922 4898 generic.go:334] "Generic (PLEG): container finished" podID="26c5ac36-8689-4bf3-8755-a84e22377e2a" containerID="c91cc1f40aaa9775749654fff4fe79567271db256673b0ded8ef7bcbbed0be52" exitCode=0 Mar 13 14:20:04 crc kubenswrapper[4898]: I0313 14:20:04.918504 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-vswd4" event={"ID":"26c5ac36-8689-4bf3-8755-a84e22377e2a","Type":"ContainerDied","Data":"c91cc1f40aaa9775749654fff4fe79567271db256673b0ded8ef7bcbbed0be52"} Mar 13 14:20:04 crc kubenswrapper[4898]: I0313 14:20:04.926667 4898 generic.go:334] "Generic (PLEG): container finished" podID="c5ddf723-16ad-425f-bae9-86adc8fe2a3d" 
containerID="41174a57eeced7f987ec1ea67f79e818399b7ca23c9e404468462af2e7f7d393" exitCode=0 Mar 13 14:20:04 crc kubenswrapper[4898]: I0313 14:20:04.926730 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-j79bj-config-2f2rl" event={"ID":"c5ddf723-16ad-425f-bae9-86adc8fe2a3d","Type":"ContainerDied","Data":"41174a57eeced7f987ec1ea67f79e818399b7ca23c9e404468462af2e7f7d393"} Mar 13 14:20:05 crc kubenswrapper[4898]: I0313 14:20:05.349357 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-j79bj" Mar 13 14:20:05 crc kubenswrapper[4898]: I0313 14:20:05.664480 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/794bd82b-e289-4b31-b0cf-f1285452e783-etc-swift\") pod \"swift-storage-0\" (UID: \"794bd82b-e289-4b31-b0cf-f1285452e783\") " pod="openstack/swift-storage-0" Mar 13 14:20:05 crc kubenswrapper[4898]: I0313 14:20:05.675046 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/794bd82b-e289-4b31-b0cf-f1285452e783-etc-swift\") pod \"swift-storage-0\" (UID: \"794bd82b-e289-4b31-b0cf-f1285452e783\") " pod="openstack/swift-storage-0" Mar 13 14:20:05 crc kubenswrapper[4898]: I0313 14:20:05.912820 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 13 14:20:05 crc kubenswrapper[4898]: I0313 14:20:05.940301 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"8c27f029-bffd-4f8f-bb24-c1c9c245d38c","Type":"ContainerStarted","Data":"169bac7a2a87f00863ce79b3524b65b62b7c3fd09f57a0d5d216a587ead2fc00"} Mar 13 14:20:05 crc kubenswrapper[4898]: I0313 14:20:05.943363 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556860-xbwgj" event={"ID":"02521dff-1dee-4839-ab35-a4bfa82bc405","Type":"ContainerStarted","Data":"36c5ec42fae468ab48602c979ddc403d899fdca66a51026d30aea73028cc2339"} Mar 13 14:20:05 crc kubenswrapper[4898]: I0313 14:20:05.983703 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-0" podStartSLOduration=11.395220488 podStartE2EDuration="12.983678442s" podCreationTimestamp="2026-03-13 14:19:53 +0000 UTC" firstStartedPulling="2026-03-13 14:20:03.365004653 +0000 UTC m=+1438.366592892" lastFinishedPulling="2026-03-13 14:20:04.953462607 +0000 UTC m=+1439.955050846" observedRunningTime="2026-03-13 14:20:05.961367472 +0000 UTC m=+1440.962955731" watchObservedRunningTime="2026-03-13 14:20:05.983678442 +0000 UTC m=+1440.985266701" Mar 13 14:20:05 crc kubenswrapper[4898]: I0313 14:20:05.993382 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556860-xbwgj" podStartSLOduration=4.46352042 podStartE2EDuration="5.993365863s" podCreationTimestamp="2026-03-13 14:20:00 +0000 UTC" firstStartedPulling="2026-03-13 14:20:03.611018239 +0000 UTC m=+1438.612606478" lastFinishedPulling="2026-03-13 14:20:05.140863682 +0000 UTC m=+1440.142451921" observedRunningTime="2026-03-13 14:20:05.983598549 +0000 UTC m=+1440.985186788" watchObservedRunningTime="2026-03-13 14:20:05.993365863 +0000 UTC m=+1440.994954102" Mar 13 14:20:06 crc kubenswrapper[4898]: I0313 
14:20:06.529201 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-j79bj-config-2f2rl" Mar 13 14:20:06 crc kubenswrapper[4898]: I0313 14:20:06.535622 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-vswd4" Mar 13 14:20:06 crc kubenswrapper[4898]: I0313 14:20:06.687865 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/26c5ac36-8689-4bf3-8755-a84e22377e2a-operator-scripts\") pod \"26c5ac36-8689-4bf3-8755-a84e22377e2a\" (UID: \"26c5ac36-8689-4bf3-8755-a84e22377e2a\") " Mar 13 14:20:06 crc kubenswrapper[4898]: I0313 14:20:06.687965 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c5ddf723-16ad-425f-bae9-86adc8fe2a3d-var-run\") pod \"c5ddf723-16ad-425f-bae9-86adc8fe2a3d\" (UID: \"c5ddf723-16ad-425f-bae9-86adc8fe2a3d\") " Mar 13 14:20:06 crc kubenswrapper[4898]: I0313 14:20:06.688140 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c5ddf723-16ad-425f-bae9-86adc8fe2a3d-additional-scripts\") pod \"c5ddf723-16ad-425f-bae9-86adc8fe2a3d\" (UID: \"c5ddf723-16ad-425f-bae9-86adc8fe2a3d\") " Mar 13 14:20:06 crc kubenswrapper[4898]: I0313 14:20:06.688175 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lq9fz\" (UniqueName: \"kubernetes.io/projected/26c5ac36-8689-4bf3-8755-a84e22377e2a-kube-api-access-lq9fz\") pod \"26c5ac36-8689-4bf3-8755-a84e22377e2a\" (UID: \"26c5ac36-8689-4bf3-8755-a84e22377e2a\") " Mar 13 14:20:06 crc kubenswrapper[4898]: I0313 14:20:06.688234 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/c5ddf723-16ad-425f-bae9-86adc8fe2a3d-scripts\") pod \"c5ddf723-16ad-425f-bae9-86adc8fe2a3d\" (UID: \"c5ddf723-16ad-425f-bae9-86adc8fe2a3d\") " Mar 13 14:20:06 crc kubenswrapper[4898]: I0313 14:20:06.688267 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c5ddf723-16ad-425f-bae9-86adc8fe2a3d-var-log-ovn\") pod \"c5ddf723-16ad-425f-bae9-86adc8fe2a3d\" (UID: \"c5ddf723-16ad-425f-bae9-86adc8fe2a3d\") " Mar 13 14:20:06 crc kubenswrapper[4898]: I0313 14:20:06.688329 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c5ddf723-16ad-425f-bae9-86adc8fe2a3d-var-run-ovn\") pod \"c5ddf723-16ad-425f-bae9-86adc8fe2a3d\" (UID: \"c5ddf723-16ad-425f-bae9-86adc8fe2a3d\") " Mar 13 14:20:06 crc kubenswrapper[4898]: I0313 14:20:06.688434 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62mk8\" (UniqueName: \"kubernetes.io/projected/c5ddf723-16ad-425f-bae9-86adc8fe2a3d-kube-api-access-62mk8\") pod \"c5ddf723-16ad-425f-bae9-86adc8fe2a3d\" (UID: \"c5ddf723-16ad-425f-bae9-86adc8fe2a3d\") " Mar 13 14:20:06 crc kubenswrapper[4898]: I0313 14:20:06.689060 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c5ddf723-16ad-425f-bae9-86adc8fe2a3d-var-run" (OuterVolumeSpecName: "var-run") pod "c5ddf723-16ad-425f-bae9-86adc8fe2a3d" (UID: "c5ddf723-16ad-425f-bae9-86adc8fe2a3d"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 14:20:06 crc kubenswrapper[4898]: I0313 14:20:06.689123 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c5ddf723-16ad-425f-bae9-86adc8fe2a3d-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "c5ddf723-16ad-425f-bae9-86adc8fe2a3d" (UID: "c5ddf723-16ad-425f-bae9-86adc8fe2a3d"). 
InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 14:20:06 crc kubenswrapper[4898]: I0313 14:20:06.689190 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c5ddf723-16ad-425f-bae9-86adc8fe2a3d-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "c5ddf723-16ad-425f-bae9-86adc8fe2a3d" (UID: "c5ddf723-16ad-425f-bae9-86adc8fe2a3d"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 14:20:06 crc kubenswrapper[4898]: I0313 14:20:06.689624 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26c5ac36-8689-4bf3-8755-a84e22377e2a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "26c5ac36-8689-4bf3-8755-a84e22377e2a" (UID: "26c5ac36-8689-4bf3-8755-a84e22377e2a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:20:06 crc kubenswrapper[4898]: I0313 14:20:06.689844 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5ddf723-16ad-425f-bae9-86adc8fe2a3d-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "c5ddf723-16ad-425f-bae9-86adc8fe2a3d" (UID: "c5ddf723-16ad-425f-bae9-86adc8fe2a3d"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:20:06 crc kubenswrapper[4898]: I0313 14:20:06.690078 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5ddf723-16ad-425f-bae9-86adc8fe2a3d-scripts" (OuterVolumeSpecName: "scripts") pod "c5ddf723-16ad-425f-bae9-86adc8fe2a3d" (UID: "c5ddf723-16ad-425f-bae9-86adc8fe2a3d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:20:06 crc kubenswrapper[4898]: I0313 14:20:06.693486 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 13 14:20:06 crc kubenswrapper[4898]: I0313 14:20:06.696272 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26c5ac36-8689-4bf3-8755-a84e22377e2a-kube-api-access-lq9fz" (OuterVolumeSpecName: "kube-api-access-lq9fz") pod "26c5ac36-8689-4bf3-8755-a84e22377e2a" (UID: "26c5ac36-8689-4bf3-8755-a84e22377e2a"). InnerVolumeSpecName "kube-api-access-lq9fz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:20:06 crc kubenswrapper[4898]: I0313 14:20:06.698170 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5ddf723-16ad-425f-bae9-86adc8fe2a3d-kube-api-access-62mk8" (OuterVolumeSpecName: "kube-api-access-62mk8") pod "c5ddf723-16ad-425f-bae9-86adc8fe2a3d" (UID: "c5ddf723-16ad-425f-bae9-86adc8fe2a3d"). InnerVolumeSpecName "kube-api-access-62mk8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:20:06 crc kubenswrapper[4898]: I0313 14:20:06.790392 4898 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c5ddf723-16ad-425f-bae9-86adc8fe2a3d-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:06 crc kubenswrapper[4898]: I0313 14:20:06.790431 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lq9fz\" (UniqueName: \"kubernetes.io/projected/26c5ac36-8689-4bf3-8755-a84e22377e2a-kube-api-access-lq9fz\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:06 crc kubenswrapper[4898]: I0313 14:20:06.790444 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c5ddf723-16ad-425f-bae9-86adc8fe2a3d-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:06 crc kubenswrapper[4898]: I0313 14:20:06.790453 4898 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c5ddf723-16ad-425f-bae9-86adc8fe2a3d-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:06 crc kubenswrapper[4898]: I0313 14:20:06.790461 4898 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c5ddf723-16ad-425f-bae9-86adc8fe2a3d-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:06 crc kubenswrapper[4898]: I0313 14:20:06.790476 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62mk8\" (UniqueName: \"kubernetes.io/projected/c5ddf723-16ad-425f-bae9-86adc8fe2a3d-kube-api-access-62mk8\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:06 crc kubenswrapper[4898]: I0313 14:20:06.790485 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/26c5ac36-8689-4bf3-8755-a84e22377e2a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:06 crc kubenswrapper[4898]: I0313 
14:20:06.790494 4898 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c5ddf723-16ad-425f-bae9-86adc8fe2a3d-var-run\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:06 crc kubenswrapper[4898]: I0313 14:20:06.894374 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:20:06 crc kubenswrapper[4898]: I0313 14:20:06.986015 4898 generic.go:334] "Generic (PLEG): container finished" podID="02521dff-1dee-4839-ab35-a4bfa82bc405" containerID="36c5ec42fae468ab48602c979ddc403d899fdca66a51026d30aea73028cc2339" exitCode=0 Mar 13 14:20:06 crc kubenswrapper[4898]: I0313 14:20:06.986112 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556860-xbwgj" event={"ID":"02521dff-1dee-4839-ab35-a4bfa82bc405","Type":"ContainerDied","Data":"36c5ec42fae468ab48602c979ddc403d899fdca66a51026d30aea73028cc2339"} Mar 13 14:20:06 crc kubenswrapper[4898]: I0313 14:20:06.989287 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1e6f6f0d-db24-4fdb-a872-ce2c527a791b","Type":"ContainerStarted","Data":"7e2e94efde300a128961ef88908d969db64215b6995d2b3cb2e15a3f40e8a65c"} Mar 13 14:20:06 crc kubenswrapper[4898]: I0313 14:20:06.993371 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-j79bj-config-2f2rl" event={"ID":"c5ddf723-16ad-425f-bae9-86adc8fe2a3d","Type":"ContainerDied","Data":"2969707a1c4df4f3788f85982e6db252de3869c756e5d786fc65dcca1a81e648"} Mar 13 14:20:06 crc kubenswrapper[4898]: I0313 14:20:06.993413 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2969707a1c4df4f3788f85982e6db252de3869c756e5d786fc65dcca1a81e648" Mar 13 14:20:06 crc kubenswrapper[4898]: I0313 14:20:06.993493 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-j79bj-config-2f2rl" Mar 13 14:20:06 crc kubenswrapper[4898]: I0313 14:20:06.995851 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"794bd82b-e289-4b31-b0cf-f1285452e783","Type":"ContainerStarted","Data":"61fd48329f75beb01938657f89ddfb5c6fc8974bf8ca503f4cd35506daaa8d03"} Mar 13 14:20:07 crc kubenswrapper[4898]: I0313 14:20:07.003984 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-vswd4" Mar 13 14:20:07 crc kubenswrapper[4898]: I0313 14:20:07.005580 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-vswd4" event={"ID":"26c5ac36-8689-4bf3-8755-a84e22377e2a","Type":"ContainerDied","Data":"5d888e90db0c648dc7f94b37e9185e80b994aa04738c4f7716d99c355585bcd1"} Mar 13 14:20:07 crc kubenswrapper[4898]: I0313 14:20:07.005665 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d888e90db0c648dc7f94b37e9185e80b994aa04738c4f7716d99c355585bcd1" Mar 13 14:20:07 crc kubenswrapper[4898]: I0313 14:20:07.129848 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.134:5671: connect: connection refused" Mar 13 14:20:07 crc kubenswrapper[4898]: I0313 14:20:07.214138 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-2" Mar 13 14:20:07 crc kubenswrapper[4898]: I0313 14:20:07.272311 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-1" podUID="818e3f41-30c4-4a49-b490-0d868fc2b2b8" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.136:5671: connect: connection refused" Mar 13 14:20:07 crc kubenswrapper[4898]: I0313 14:20:07.628072 4898 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/ovn-controller-j79bj-config-2f2rl"] Mar 13 14:20:07 crc kubenswrapper[4898]: I0313 14:20:07.648096 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-j79bj-config-2f2rl"] Mar 13 14:20:07 crc kubenswrapper[4898]: I0313 14:20:07.753945 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5ddf723-16ad-425f-bae9-86adc8fe2a3d" path="/var/lib/kubelet/pods/c5ddf723-16ad-425f-bae9-86adc8fe2a3d/volumes" Mar 13 14:20:08 crc kubenswrapper[4898]: I0313 14:20:08.412993 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556860-xbwgj" Mar 13 14:20:08 crc kubenswrapper[4898]: I0313 14:20:08.531006 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wh72f\" (UniqueName: \"kubernetes.io/projected/02521dff-1dee-4839-ab35-a4bfa82bc405-kube-api-access-wh72f\") pod \"02521dff-1dee-4839-ab35-a4bfa82bc405\" (UID: \"02521dff-1dee-4839-ab35-a4bfa82bc405\") " Mar 13 14:20:08 crc kubenswrapper[4898]: I0313 14:20:08.537999 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02521dff-1dee-4839-ab35-a4bfa82bc405-kube-api-access-wh72f" (OuterVolumeSpecName: "kube-api-access-wh72f") pod "02521dff-1dee-4839-ab35-a4bfa82bc405" (UID: "02521dff-1dee-4839-ab35-a4bfa82bc405"). InnerVolumeSpecName "kube-api-access-wh72f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:20:08 crc kubenswrapper[4898]: I0313 14:20:08.636182 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wh72f\" (UniqueName: \"kubernetes.io/projected/02521dff-1dee-4839-ab35-a4bfa82bc405-kube-api-access-wh72f\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:08 crc kubenswrapper[4898]: I0313 14:20:08.665945 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-vswd4"] Mar 13 14:20:08 crc kubenswrapper[4898]: I0313 14:20:08.674440 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-vswd4"] Mar 13 14:20:08 crc kubenswrapper[4898]: I0313 14:20:08.872233 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556854-6mxhz"] Mar 13 14:20:08 crc kubenswrapper[4898]: I0313 14:20:08.883005 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556854-6mxhz"] Mar 13 14:20:09 crc kubenswrapper[4898]: I0313 14:20:09.035466 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"794bd82b-e289-4b31-b0cf-f1285452e783","Type":"ContainerStarted","Data":"85fbdd6a017089c207ebb2fda009421eabef1062d20993ddf2ad0ac0b4237787"} Mar 13 14:20:09 crc kubenswrapper[4898]: I0313 14:20:09.035507 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"794bd82b-e289-4b31-b0cf-f1285452e783","Type":"ContainerStarted","Data":"07cc49f90a9c9b623e87b89a0b1f6c9600328ee32955d8414c03933f8ef12ee2"} Mar 13 14:20:09 crc kubenswrapper[4898]: I0313 14:20:09.035519 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"794bd82b-e289-4b31-b0cf-f1285452e783","Type":"ContainerStarted","Data":"74a02d2191ed8fbd85f8c87a59e58cac67e744563a2564af19540e247a0af73c"} Mar 13 14:20:09 crc kubenswrapper[4898]: I0313 14:20:09.035528 4898 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"794bd82b-e289-4b31-b0cf-f1285452e783","Type":"ContainerStarted","Data":"aa180d09f36c6285291090ea50ab34d0371d807ce5e52914f90991a1fb1de6a1"} Mar 13 14:20:09 crc kubenswrapper[4898]: I0313 14:20:09.037745 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556860-xbwgj" event={"ID":"02521dff-1dee-4839-ab35-a4bfa82bc405","Type":"ContainerDied","Data":"704384ed7c0f9c137486ec08c213a83e3fe7f2792127e2bd89444beb085393db"} Mar 13 14:20:09 crc kubenswrapper[4898]: I0313 14:20:09.037780 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="704384ed7c0f9c137486ec08c213a83e3fe7f2792127e2bd89444beb085393db" Mar 13 14:20:09 crc kubenswrapper[4898]: I0313 14:20:09.037832 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556860-xbwgj" Mar 13 14:20:09 crc kubenswrapper[4898]: I0313 14:20:09.751843 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26c5ac36-8689-4bf3-8755-a84e22377e2a" path="/var/lib/kubelet/pods/26c5ac36-8689-4bf3-8755-a84e22377e2a/volumes" Mar 13 14:20:09 crc kubenswrapper[4898]: I0313 14:20:09.752766 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35372caa-772c-434c-8fb2-3b82926c1521" path="/var/lib/kubelet/pods/35372caa-772c-434c-8fb2-3b82926c1521/volumes" Mar 13 14:20:12 crc kubenswrapper[4898]: I0313 14:20:12.076273 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1e6f6f0d-db24-4fdb-a872-ce2c527a791b","Type":"ContainerStarted","Data":"409bf1215c88a0899cc23bebec9f8cdea126be501e95b10215c77324e13fe278"} Mar 13 14:20:12 crc kubenswrapper[4898]: I0313 14:20:12.084684 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"794bd82b-e289-4b31-b0cf-f1285452e783","Type":"ContainerStarted","Data":"1d1f86d9f0bfde7af6c1c4523635ef06639e513d20e8518878792757a9dd2b0a"} Mar 13 14:20:12 crc kubenswrapper[4898]: I0313 14:20:12.084768 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"794bd82b-e289-4b31-b0cf-f1285452e783","Type":"ContainerStarted","Data":"bf3060e295b94781eff853dbb64a2279fcac419c2e558e1c4859725f8c17f921"} Mar 13 14:20:12 crc kubenswrapper[4898]: I0313 14:20:12.084785 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"794bd82b-e289-4b31-b0cf-f1285452e783","Type":"ContainerStarted","Data":"7243cd0755637497d2fd05db7578e90dec5fef84329e018ced2d69d7fc21df87"} Mar 13 14:20:12 crc kubenswrapper[4898]: I0313 14:20:12.084795 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"794bd82b-e289-4b31-b0cf-f1285452e783","Type":"ContainerStarted","Data":"ddb42c8c65fca96b5642211c30c422d299ebd8e9b93dc178ff16cefd9f8c664c"} Mar 13 14:20:12 crc kubenswrapper[4898]: I0313 14:20:12.093300 4898 generic.go:334] "Generic (PLEG): container finished" podID="74d4aeca-15ec-4f63-87e0-20daa6f3e70f" containerID="81635cb2b5569daf242649d4672547e48f7dd06f334c45c3f33ff7ef44a8ec5f" exitCode=0 Mar 13 14:20:12 crc kubenswrapper[4898]: I0313 14:20:12.093345 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-x75zk" event={"ID":"74d4aeca-15ec-4f63-87e0-20daa6f3e70f","Type":"ContainerDied","Data":"81635cb2b5569daf242649d4672547e48f7dd06f334c45c3f33ff7ef44a8ec5f"} Mar 13 14:20:12 crc kubenswrapper[4898]: I0313 14:20:12.104933 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-85xgb"] Mar 13 14:20:12 crc kubenswrapper[4898]: E0313 14:20:12.105434 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02521dff-1dee-4839-ab35-a4bfa82bc405" containerName="oc" Mar 13 14:20:12 crc 
kubenswrapper[4898]: I0313 14:20:12.105461 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="02521dff-1dee-4839-ab35-a4bfa82bc405" containerName="oc" Mar 13 14:20:12 crc kubenswrapper[4898]: E0313 14:20:12.105481 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5ddf723-16ad-425f-bae9-86adc8fe2a3d" containerName="ovn-config" Mar 13 14:20:12 crc kubenswrapper[4898]: I0313 14:20:12.105491 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5ddf723-16ad-425f-bae9-86adc8fe2a3d" containerName="ovn-config" Mar 13 14:20:12 crc kubenswrapper[4898]: E0313 14:20:12.105512 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26c5ac36-8689-4bf3-8755-a84e22377e2a" containerName="mariadb-account-create-update" Mar 13 14:20:12 crc kubenswrapper[4898]: I0313 14:20:12.105519 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="26c5ac36-8689-4bf3-8755-a84e22377e2a" containerName="mariadb-account-create-update" Mar 13 14:20:12 crc kubenswrapper[4898]: I0313 14:20:12.105762 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="26c5ac36-8689-4bf3-8755-a84e22377e2a" containerName="mariadb-account-create-update" Mar 13 14:20:12 crc kubenswrapper[4898]: I0313 14:20:12.105801 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5ddf723-16ad-425f-bae9-86adc8fe2a3d" containerName="ovn-config" Mar 13 14:20:12 crc kubenswrapper[4898]: I0313 14:20:12.105816 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="02521dff-1dee-4839-ab35-a4bfa82bc405" containerName="oc" Mar 13 14:20:12 crc kubenswrapper[4898]: I0313 14:20:12.106740 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-85xgb" Mar 13 14:20:12 crc kubenswrapper[4898]: I0313 14:20:12.108470 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 13 14:20:12 crc kubenswrapper[4898]: I0313 14:20:12.126835 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=13.186634207 podStartE2EDuration="44.126817874s" podCreationTimestamp="2026-03-13 14:19:28 +0000 UTC" firstStartedPulling="2026-03-13 14:19:40.08731106 +0000 UTC m=+1415.088899299" lastFinishedPulling="2026-03-13 14:20:11.027494727 +0000 UTC m=+1446.029082966" observedRunningTime="2026-03-13 14:20:12.110946912 +0000 UTC m=+1447.112535161" watchObservedRunningTime="2026-03-13 14:20:12.126817874 +0000 UTC m=+1447.128406113" Mar 13 14:20:12 crc kubenswrapper[4898]: I0313 14:20:12.127987 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-85xgb"] Mar 13 14:20:12 crc kubenswrapper[4898]: I0313 14:20:12.262294 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb2a6e17-835f-43f9-9b2b-eb5f39df5450-operator-scripts\") pod \"root-account-create-update-85xgb\" (UID: \"bb2a6e17-835f-43f9-9b2b-eb5f39df5450\") " pod="openstack/root-account-create-update-85xgb" Mar 13 14:20:12 crc kubenswrapper[4898]: I0313 14:20:12.262412 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkbcr\" (UniqueName: \"kubernetes.io/projected/bb2a6e17-835f-43f9-9b2b-eb5f39df5450-kube-api-access-xkbcr\") pod \"root-account-create-update-85xgb\" (UID: \"bb2a6e17-835f-43f9-9b2b-eb5f39df5450\") " pod="openstack/root-account-create-update-85xgb" Mar 13 14:20:12 crc kubenswrapper[4898]: I0313 14:20:12.365223 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb2a6e17-835f-43f9-9b2b-eb5f39df5450-operator-scripts\") pod \"root-account-create-update-85xgb\" (UID: \"bb2a6e17-835f-43f9-9b2b-eb5f39df5450\") " pod="openstack/root-account-create-update-85xgb" Mar 13 14:20:12 crc kubenswrapper[4898]: I0313 14:20:12.365418 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkbcr\" (UniqueName: \"kubernetes.io/projected/bb2a6e17-835f-43f9-9b2b-eb5f39df5450-kube-api-access-xkbcr\") pod \"root-account-create-update-85xgb\" (UID: \"bb2a6e17-835f-43f9-9b2b-eb5f39df5450\") " pod="openstack/root-account-create-update-85xgb" Mar 13 14:20:12 crc kubenswrapper[4898]: I0313 14:20:12.366665 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb2a6e17-835f-43f9-9b2b-eb5f39df5450-operator-scripts\") pod \"root-account-create-update-85xgb\" (UID: \"bb2a6e17-835f-43f9-9b2b-eb5f39df5450\") " pod="openstack/root-account-create-update-85xgb" Mar 13 14:20:12 crc kubenswrapper[4898]: I0313 14:20:12.392153 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkbcr\" (UniqueName: \"kubernetes.io/projected/bb2a6e17-835f-43f9-9b2b-eb5f39df5450-kube-api-access-xkbcr\") pod \"root-account-create-update-85xgb\" (UID: \"bb2a6e17-835f-43f9-9b2b-eb5f39df5450\") " pod="openstack/root-account-create-update-85xgb" Mar 13 14:20:12 crc kubenswrapper[4898]: I0313 14:20:12.432927 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-85xgb" Mar 13 14:20:12 crc kubenswrapper[4898]: I0313 14:20:12.957142 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-85xgb"] Mar 13 14:20:13 crc kubenswrapper[4898]: I0313 14:20:13.529536 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-x75zk" Mar 13 14:20:13 crc kubenswrapper[4898]: I0313 14:20:13.583200 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:13 crc kubenswrapper[4898]: I0313 14:20:13.583249 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:13 crc kubenswrapper[4898]: I0313 14:20:13.588153 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:13 crc kubenswrapper[4898]: I0313 14:20:13.597834 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74d4aeca-15ec-4f63-87e0-20daa6f3e70f-combined-ca-bundle\") pod \"74d4aeca-15ec-4f63-87e0-20daa6f3e70f\" (UID: \"74d4aeca-15ec-4f63-87e0-20daa6f3e70f\") " Mar 13 14:20:13 crc kubenswrapper[4898]: I0313 14:20:13.598031 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rvbp\" (UniqueName: \"kubernetes.io/projected/74d4aeca-15ec-4f63-87e0-20daa6f3e70f-kube-api-access-8rvbp\") pod \"74d4aeca-15ec-4f63-87e0-20daa6f3e70f\" (UID: \"74d4aeca-15ec-4f63-87e0-20daa6f3e70f\") " Mar 13 14:20:13 crc kubenswrapper[4898]: I0313 14:20:13.598092 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/74d4aeca-15ec-4f63-87e0-20daa6f3e70f-db-sync-config-data\") pod \"74d4aeca-15ec-4f63-87e0-20daa6f3e70f\" (UID: \"74d4aeca-15ec-4f63-87e0-20daa6f3e70f\") " Mar 13 14:20:13 crc kubenswrapper[4898]: I0313 14:20:13.598351 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74d4aeca-15ec-4f63-87e0-20daa6f3e70f-config-data\") pod \"74d4aeca-15ec-4f63-87e0-20daa6f3e70f\" (UID: 
\"74d4aeca-15ec-4f63-87e0-20daa6f3e70f\") " Mar 13 14:20:13 crc kubenswrapper[4898]: I0313 14:20:13.606102 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74d4aeca-15ec-4f63-87e0-20daa6f3e70f-kube-api-access-8rvbp" (OuterVolumeSpecName: "kube-api-access-8rvbp") pod "74d4aeca-15ec-4f63-87e0-20daa6f3e70f" (UID: "74d4aeca-15ec-4f63-87e0-20daa6f3e70f"). InnerVolumeSpecName "kube-api-access-8rvbp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:20:13 crc kubenswrapper[4898]: I0313 14:20:13.607266 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74d4aeca-15ec-4f63-87e0-20daa6f3e70f-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "74d4aeca-15ec-4f63-87e0-20daa6f3e70f" (UID: "74d4aeca-15ec-4f63-87e0-20daa6f3e70f"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:20:13 crc kubenswrapper[4898]: I0313 14:20:13.642961 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74d4aeca-15ec-4f63-87e0-20daa6f3e70f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "74d4aeca-15ec-4f63-87e0-20daa6f3e70f" (UID: "74d4aeca-15ec-4f63-87e0-20daa6f3e70f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:20:13 crc kubenswrapper[4898]: I0313 14:20:13.668228 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74d4aeca-15ec-4f63-87e0-20daa6f3e70f-config-data" (OuterVolumeSpecName: "config-data") pod "74d4aeca-15ec-4f63-87e0-20daa6f3e70f" (UID: "74d4aeca-15ec-4f63-87e0-20daa6f3e70f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:20:13 crc kubenswrapper[4898]: I0313 14:20:13.703062 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74d4aeca-15ec-4f63-87e0-20daa6f3e70f-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:13 crc kubenswrapper[4898]: I0313 14:20:13.703093 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74d4aeca-15ec-4f63-87e0-20daa6f3e70f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:13 crc kubenswrapper[4898]: I0313 14:20:13.703104 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8rvbp\" (UniqueName: \"kubernetes.io/projected/74d4aeca-15ec-4f63-87e0-20daa6f3e70f-kube-api-access-8rvbp\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:13 crc kubenswrapper[4898]: I0313 14:20:13.703114 4898 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/74d4aeca-15ec-4f63-87e0-20daa6f3e70f-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:14 crc kubenswrapper[4898]: I0313 14:20:14.134084 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-x75zk" event={"ID":"74d4aeca-15ec-4f63-87e0-20daa6f3e70f","Type":"ContainerDied","Data":"cfd74a4d5e0c74c0810797d2427e3c0958d75483bdf99da9d6e8e1c6bac07623"} Mar 13 14:20:14 crc kubenswrapper[4898]: I0313 14:20:14.134392 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cfd74a4d5e0c74c0810797d2427e3c0958d75483bdf99da9d6e8e1c6bac07623" Mar 13 14:20:14 crc kubenswrapper[4898]: I0313 14:20:14.134392 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-x75zk" Mar 13 14:20:14 crc kubenswrapper[4898]: I0313 14:20:14.136285 4898 generic.go:334] "Generic (PLEG): container finished" podID="bb2a6e17-835f-43f9-9b2b-eb5f39df5450" containerID="2dec706ec3d47e7f4d03ac7b64859e218da33cd45852b8891c58c2ae0bd97657" exitCode=0 Mar 13 14:20:14 crc kubenswrapper[4898]: I0313 14:20:14.136696 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-85xgb" event={"ID":"bb2a6e17-835f-43f9-9b2b-eb5f39df5450","Type":"ContainerDied","Data":"2dec706ec3d47e7f4d03ac7b64859e218da33cd45852b8891c58c2ae0bd97657"} Mar 13 14:20:14 crc kubenswrapper[4898]: I0313 14:20:14.136719 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-85xgb" event={"ID":"bb2a6e17-835f-43f9-9b2b-eb5f39df5450","Type":"ContainerStarted","Data":"181aa714130faecb2d84d29217d772e112b7d72b0fc2b841c35e29cd892d1122"} Mar 13 14:20:14 crc kubenswrapper[4898]: I0313 14:20:14.144266 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"794bd82b-e289-4b31-b0cf-f1285452e783","Type":"ContainerStarted","Data":"aafdfdd1a601b89501743e15f3469c5ea2c342f38447f1690153d7e24bba8022"} Mar 13 14:20:14 crc kubenswrapper[4898]: I0313 14:20:14.144337 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"794bd82b-e289-4b31-b0cf-f1285452e783","Type":"ContainerStarted","Data":"890e73f52dd9968314b64b76f20e6b3db9c84a625238bdd3b9a5c74ecf1248ce"} Mar 13 14:20:14 crc kubenswrapper[4898]: I0313 14:20:14.144363 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"794bd82b-e289-4b31-b0cf-f1285452e783","Type":"ContainerStarted","Data":"9a63ddefb3d41c2bfcbd0ca7b3b7e0b3cff83463706daf2b7ef2a74388e1a4ae"} Mar 13 14:20:14 crc kubenswrapper[4898]: I0313 14:20:14.144378 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/swift-storage-0" event={"ID":"794bd82b-e289-4b31-b0cf-f1285452e783","Type":"ContainerStarted","Data":"28ab4729f099fc69a2484181e480e3100ae17275d794caca863b770eb3c5762e"} Mar 13 14:20:14 crc kubenswrapper[4898]: I0313 14:20:14.145992 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:14 crc kubenswrapper[4898]: I0313 14:20:14.575491 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-wnv9x"] Mar 13 14:20:14 crc kubenswrapper[4898]: E0313 14:20:14.577125 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74d4aeca-15ec-4f63-87e0-20daa6f3e70f" containerName="glance-db-sync" Mar 13 14:20:14 crc kubenswrapper[4898]: I0313 14:20:14.577151 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="74d4aeca-15ec-4f63-87e0-20daa6f3e70f" containerName="glance-db-sync" Mar 13 14:20:14 crc kubenswrapper[4898]: I0313 14:20:14.577412 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="74d4aeca-15ec-4f63-87e0-20daa6f3e70f" containerName="glance-db-sync" Mar 13 14:20:14 crc kubenswrapper[4898]: I0313 14:20:14.578817 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-wnv9x" Mar 13 14:20:14 crc kubenswrapper[4898]: I0313 14:20:14.589135 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-wnv9x"] Mar 13 14:20:14 crc kubenswrapper[4898]: I0313 14:20:14.674776 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgnbz\" (UniqueName: \"kubernetes.io/projected/49fea8fc-372c-4cf8-a710-7fff58db294d-kube-api-access-zgnbz\") pod \"dnsmasq-dns-5b946c75cc-wnv9x\" (UID: \"49fea8fc-372c-4cf8-a710-7fff58db294d\") " pod="openstack/dnsmasq-dns-5b946c75cc-wnv9x" Mar 13 14:20:14 crc kubenswrapper[4898]: I0313 14:20:14.674848 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/49fea8fc-372c-4cf8-a710-7fff58db294d-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-wnv9x\" (UID: \"49fea8fc-372c-4cf8-a710-7fff58db294d\") " pod="openstack/dnsmasq-dns-5b946c75cc-wnv9x" Mar 13 14:20:14 crc kubenswrapper[4898]: I0313 14:20:14.675232 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49fea8fc-372c-4cf8-a710-7fff58db294d-config\") pod \"dnsmasq-dns-5b946c75cc-wnv9x\" (UID: \"49fea8fc-372c-4cf8-a710-7fff58db294d\") " pod="openstack/dnsmasq-dns-5b946c75cc-wnv9x" Mar 13 14:20:14 crc kubenswrapper[4898]: I0313 14:20:14.675525 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/49fea8fc-372c-4cf8-a710-7fff58db294d-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-wnv9x\" (UID: \"49fea8fc-372c-4cf8-a710-7fff58db294d\") " pod="openstack/dnsmasq-dns-5b946c75cc-wnv9x" Mar 13 14:20:14 crc kubenswrapper[4898]: I0313 14:20:14.675558 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/49fea8fc-372c-4cf8-a710-7fff58db294d-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-wnv9x\" (UID: \"49fea8fc-372c-4cf8-a710-7fff58db294d\") " pod="openstack/dnsmasq-dns-5b946c75cc-wnv9x" Mar 13 14:20:14 crc kubenswrapper[4898]: I0313 14:20:14.779503 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/49fea8fc-372c-4cf8-a710-7fff58db294d-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-wnv9x\" (UID: \"49fea8fc-372c-4cf8-a710-7fff58db294d\") " pod="openstack/dnsmasq-dns-5b946c75cc-wnv9x" Mar 13 14:20:14 crc kubenswrapper[4898]: I0313 14:20:14.779590 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/49fea8fc-372c-4cf8-a710-7fff58db294d-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-wnv9x\" (UID: \"49fea8fc-372c-4cf8-a710-7fff58db294d\") " pod="openstack/dnsmasq-dns-5b946c75cc-wnv9x" Mar 13 14:20:14 crc kubenswrapper[4898]: I0313 14:20:14.779691 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49fea8fc-372c-4cf8-a710-7fff58db294d-config\") pod \"dnsmasq-dns-5b946c75cc-wnv9x\" (UID: \"49fea8fc-372c-4cf8-a710-7fff58db294d\") " pod="openstack/dnsmasq-dns-5b946c75cc-wnv9x" Mar 13 14:20:14 crc kubenswrapper[4898]: I0313 14:20:14.780592 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49fea8fc-372c-4cf8-a710-7fff58db294d-config\") pod \"dnsmasq-dns-5b946c75cc-wnv9x\" (UID: \"49fea8fc-372c-4cf8-a710-7fff58db294d\") " pod="openstack/dnsmasq-dns-5b946c75cc-wnv9x" Mar 13 14:20:14 crc kubenswrapper[4898]: I0313 14:20:14.780790 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/49fea8fc-372c-4cf8-a710-7fff58db294d-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-wnv9x\" (UID: \"49fea8fc-372c-4cf8-a710-7fff58db294d\") " pod="openstack/dnsmasq-dns-5b946c75cc-wnv9x" Mar 13 14:20:14 crc kubenswrapper[4898]: I0313 14:20:14.780848 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/49fea8fc-372c-4cf8-a710-7fff58db294d-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-wnv9x\" (UID: \"49fea8fc-372c-4cf8-a710-7fff58db294d\") " pod="openstack/dnsmasq-dns-5b946c75cc-wnv9x" Mar 13 14:20:14 crc kubenswrapper[4898]: I0313 14:20:14.781116 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgnbz\" (UniqueName: \"kubernetes.io/projected/49fea8fc-372c-4cf8-a710-7fff58db294d-kube-api-access-zgnbz\") pod \"dnsmasq-dns-5b946c75cc-wnv9x\" (UID: \"49fea8fc-372c-4cf8-a710-7fff58db294d\") " pod="openstack/dnsmasq-dns-5b946c75cc-wnv9x" Mar 13 14:20:14 crc kubenswrapper[4898]: I0313 14:20:14.782022 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/49fea8fc-372c-4cf8-a710-7fff58db294d-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-wnv9x\" (UID: \"49fea8fc-372c-4cf8-a710-7fff58db294d\") " pod="openstack/dnsmasq-dns-5b946c75cc-wnv9x" Mar 13 14:20:14 crc kubenswrapper[4898]: I0313 14:20:14.782712 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/49fea8fc-372c-4cf8-a710-7fff58db294d-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-wnv9x\" (UID: \"49fea8fc-372c-4cf8-a710-7fff58db294d\") " pod="openstack/dnsmasq-dns-5b946c75cc-wnv9x" Mar 13 14:20:14 crc kubenswrapper[4898]: I0313 14:20:14.801913 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgnbz\" (UniqueName: 
\"kubernetes.io/projected/49fea8fc-372c-4cf8-a710-7fff58db294d-kube-api-access-zgnbz\") pod \"dnsmasq-dns-5b946c75cc-wnv9x\" (UID: \"49fea8fc-372c-4cf8-a710-7fff58db294d\") " pod="openstack/dnsmasq-dns-5b946c75cc-wnv9x" Mar 13 14:20:14 crc kubenswrapper[4898]: I0313 14:20:14.904565 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-wnv9x" Mar 13 14:20:15 crc kubenswrapper[4898]: I0313 14:20:15.185032 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"794bd82b-e289-4b31-b0cf-f1285452e783","Type":"ContainerStarted","Data":"28c14e620136a22405bc4f45510146b94e4e9107f878c32e3d9678b082910e5e"} Mar 13 14:20:15 crc kubenswrapper[4898]: I0313 14:20:15.501622 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-wnv9x"] Mar 13 14:20:15 crc kubenswrapper[4898]: I0313 14:20:15.761285 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-85xgb" Mar 13 14:20:15 crc kubenswrapper[4898]: I0313 14:20:15.915290 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb2a6e17-835f-43f9-9b2b-eb5f39df5450-operator-scripts\") pod \"bb2a6e17-835f-43f9-9b2b-eb5f39df5450\" (UID: \"bb2a6e17-835f-43f9-9b2b-eb5f39df5450\") " Mar 13 14:20:15 crc kubenswrapper[4898]: I0313 14:20:15.915585 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkbcr\" (UniqueName: \"kubernetes.io/projected/bb2a6e17-835f-43f9-9b2b-eb5f39df5450-kube-api-access-xkbcr\") pod \"bb2a6e17-835f-43f9-9b2b-eb5f39df5450\" (UID: \"bb2a6e17-835f-43f9-9b2b-eb5f39df5450\") " Mar 13 14:20:15 crc kubenswrapper[4898]: I0313 14:20:15.915919 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/bb2a6e17-835f-43f9-9b2b-eb5f39df5450-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bb2a6e17-835f-43f9-9b2b-eb5f39df5450" (UID: "bb2a6e17-835f-43f9-9b2b-eb5f39df5450"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:20:15 crc kubenswrapper[4898]: I0313 14:20:15.916264 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb2a6e17-835f-43f9-9b2b-eb5f39df5450-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:15 crc kubenswrapper[4898]: I0313 14:20:15.920352 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb2a6e17-835f-43f9-9b2b-eb5f39df5450-kube-api-access-xkbcr" (OuterVolumeSpecName: "kube-api-access-xkbcr") pod "bb2a6e17-835f-43f9-9b2b-eb5f39df5450" (UID: "bb2a6e17-835f-43f9-9b2b-eb5f39df5450"). InnerVolumeSpecName "kube-api-access-xkbcr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:20:16 crc kubenswrapper[4898]: I0313 14:20:16.018613 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkbcr\" (UniqueName: \"kubernetes.io/projected/bb2a6e17-835f-43f9-9b2b-eb5f39df5450-kube-api-access-xkbcr\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:16 crc kubenswrapper[4898]: I0313 14:20:16.211129 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-85xgb" event={"ID":"bb2a6e17-835f-43f9-9b2b-eb5f39df5450","Type":"ContainerDied","Data":"181aa714130faecb2d84d29217d772e112b7d72b0fc2b841c35e29cd892d1122"} Mar 13 14:20:16 crc kubenswrapper[4898]: I0313 14:20:16.211161 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-85xgb" Mar 13 14:20:16 crc kubenswrapper[4898]: I0313 14:20:16.211168 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="181aa714130faecb2d84d29217d772e112b7d72b0fc2b841c35e29cd892d1122" Mar 13 14:20:16 crc kubenswrapper[4898]: I0313 14:20:16.212090 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-wnv9x" event={"ID":"49fea8fc-372c-4cf8-a710-7fff58db294d","Type":"ContainerStarted","Data":"bad9604b0b4948880032fa39bf17f3a7090e87a9e7998fc75541062cf197564a"} Mar 13 14:20:16 crc kubenswrapper[4898]: I0313 14:20:16.973766 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 13 14:20:17 crc kubenswrapper[4898]: I0313 14:20:17.132865 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 13 14:20:17 crc kubenswrapper[4898]: I0313 14:20:17.232459 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="1e6f6f0d-db24-4fdb-a872-ce2c527a791b" containerName="prometheus" containerID="cri-o://97d23bf2eed915823c76967ee540ea114f9905dd429a2e87daddfc2a1255581d" gracePeriod=600 Mar 13 14:20:17 crc kubenswrapper[4898]: I0313 14:20:17.232521 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"794bd82b-e289-4b31-b0cf-f1285452e783","Type":"ContainerStarted","Data":"68514ae839a261706159c6e729ae696ff8e3ed08553c87bdc766bf73a5e48096"} Mar 13 14:20:17 crc kubenswrapper[4898]: I0313 14:20:17.232532 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="1e6f6f0d-db24-4fdb-a872-ce2c527a791b" containerName="config-reloader" containerID="cri-o://7e2e94efde300a128961ef88908d969db64215b6995d2b3cb2e15a3f40e8a65c" gracePeriod=600 Mar 13 14:20:17 crc 
kubenswrapper[4898]: I0313 14:20:17.232619 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="1e6f6f0d-db24-4fdb-a872-ce2c527a791b" containerName="thanos-sidecar" containerID="cri-o://409bf1215c88a0899cc23bebec9f8cdea126be501e95b10215c77324e13fe278" gracePeriod=600
Mar 13 14:20:17 crc kubenswrapper[4898]: I0313 14:20:17.276693 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-1"
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.241731 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.244307 4898 generic.go:334] "Generic (PLEG): container finished" podID="1e6f6f0d-db24-4fdb-a872-ce2c527a791b" containerID="409bf1215c88a0899cc23bebec9f8cdea126be501e95b10215c77324e13fe278" exitCode=0
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.244353 4898 generic.go:334] "Generic (PLEG): container finished" podID="1e6f6f0d-db24-4fdb-a872-ce2c527a791b" containerID="7e2e94efde300a128961ef88908d969db64215b6995d2b3cb2e15a3f40e8a65c" exitCode=0
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.244364 4898 generic.go:334] "Generic (PLEG): container finished" podID="1e6f6f0d-db24-4fdb-a872-ce2c527a791b" containerID="97d23bf2eed915823c76967ee540ea114f9905dd429a2e87daddfc2a1255581d" exitCode=0
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.244424 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1e6f6f0d-db24-4fdb-a872-ce2c527a791b","Type":"ContainerDied","Data":"409bf1215c88a0899cc23bebec9f8cdea126be501e95b10215c77324e13fe278"}
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.244458 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1e6f6f0d-db24-4fdb-a872-ce2c527a791b","Type":"ContainerDied","Data":"7e2e94efde300a128961ef88908d969db64215b6995d2b3cb2e15a3f40e8a65c"}
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.244470 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1e6f6f0d-db24-4fdb-a872-ce2c527a791b","Type":"ContainerDied","Data":"97d23bf2eed915823c76967ee540ea114f9905dd429a2e87daddfc2a1255581d"}
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.244481 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1e6f6f0d-db24-4fdb-a872-ce2c527a791b","Type":"ContainerDied","Data":"0b0a8b36ebbdd732871a36ff41874dd5aa3cc4a1c6906bb4c6b8440e7e4c245d"}
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.244500 4898 scope.go:117] "RemoveContainer" containerID="409bf1215c88a0899cc23bebec9f8cdea126be501e95b10215c77324e13fe278"
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.250150 4898 generic.go:334] "Generic (PLEG): container finished" podID="49fea8fc-372c-4cf8-a710-7fff58db294d" containerID="f6416d1e5f0d5a80d9a232b6c88402bfa85e5da16423a5a28fb3cfb07228cafb" exitCode=0
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.250220 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-wnv9x" event={"ID":"49fea8fc-372c-4cf8-a710-7fff58db294d","Type":"ContainerDied","Data":"f6416d1e5f0d5a80d9a232b6c88402bfa85e5da16423a5a28fb3cfb07228cafb"}
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.276144 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-prometheus-metric-storage-rulefiles-0\") pod \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") "
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.276194 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-web-config\") pod \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") "
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.276215 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-prometheus-metric-storage-rulefiles-2\") pod \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") "
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.276274 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jxl2\" (UniqueName: \"kubernetes.io/projected/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-kube-api-access-5jxl2\") pod \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") "
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.276343 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-config\") pod \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") "
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.276397 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-thanos-prometheus-http-client-file\") pod \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") "
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.276418 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-prometheus-metric-storage-rulefiles-1\") pod \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") "
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.276466 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-tls-assets\") pod \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") "
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.277563 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-537e992e-0c7e-4e28-8105-b535a72a793c\") pod \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") "
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.277644 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-config-out\") pod \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") "
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.279698 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "1e6f6f0d-db24-4fdb-a872-ce2c527a791b" (UID: "1e6f6f0d-db24-4fdb-a872-ce2c527a791b"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.280399 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"794bd82b-e289-4b31-b0cf-f1285452e783","Type":"ContainerStarted","Data":"37bfb67d8b8561314d97aa77ffcf7c283a19fa233a2ac0c014fa13378475f9db"}
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.287085 4898 scope.go:117] "RemoveContainer" containerID="7e2e94efde300a128961ef88908d969db64215b6995d2b3cb2e15a3f40e8a65c"
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.287391 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "1e6f6f0d-db24-4fdb-a872-ce2c527a791b" (UID: "1e6f6f0d-db24-4fdb-a872-ce2c527a791b"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.287977 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "1e6f6f0d-db24-4fdb-a872-ce2c527a791b" (UID: "1e6f6f0d-db24-4fdb-a872-ce2c527a791b"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.288387 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-config-out" (OuterVolumeSpecName: "config-out") pod "1e6f6f0d-db24-4fdb-a872-ce2c527a791b" (UID: "1e6f6f0d-db24-4fdb-a872-ce2c527a791b"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.288548 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-config" (OuterVolumeSpecName: "config") pod "1e6f6f0d-db24-4fdb-a872-ce2c527a791b" (UID: "1e6f6f0d-db24-4fdb-a872-ce2c527a791b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.290807 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-kube-api-access-5jxl2" (OuterVolumeSpecName: "kube-api-access-5jxl2") pod "1e6f6f0d-db24-4fdb-a872-ce2c527a791b" (UID: "1e6f6f0d-db24-4fdb-a872-ce2c527a791b"). InnerVolumeSpecName "kube-api-access-5jxl2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.291545 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "1e6f6f0d-db24-4fdb-a872-ce2c527a791b" (UID: "1e6f6f0d-db24-4fdb-a872-ce2c527a791b"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.291749 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "1e6f6f0d-db24-4fdb-a872-ce2c527a791b" (UID: "1e6f6f0d-db24-4fdb-a872-ce2c527a791b"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.314442 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-web-config" (OuterVolumeSpecName: "web-config") pod "1e6f6f0d-db24-4fdb-a872-ce2c527a791b" (UID: "1e6f6f0d-db24-4fdb-a872-ce2c527a791b"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.323850 4898 scope.go:117] "RemoveContainer" containerID="97d23bf2eed915823c76967ee540ea114f9905dd429a2e87daddfc2a1255581d"
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.350096 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-537e992e-0c7e-4e28-8105-b535a72a793c" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "1e6f6f0d-db24-4fdb-a872-ce2c527a791b" (UID: "1e6f6f0d-db24-4fdb-a872-ce2c527a791b"). InnerVolumeSpecName "pvc-537e992e-0c7e-4e28-8105-b535a72a793c". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.379656 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=39.892313837 podStartE2EDuration="46.379635595s" podCreationTimestamp="2026-03-13 14:19:32 +0000 UTC" firstStartedPulling="2026-03-13 14:20:06.706367342 +0000 UTC m=+1441.707955581" lastFinishedPulling="2026-03-13 14:20:13.1936891 +0000 UTC m=+1448.195277339" observedRunningTime="2026-03-13 14:20:18.377625413 +0000 UTC m=+1453.379213652" watchObservedRunningTime="2026-03-13 14:20:18.379635595 +0000 UTC m=+1453.381223844"
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.381227 4898 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\""
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.386298 4898 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-web-config\") on node \"crc\" DevicePath \"\""
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.386320 4898 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\""
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.386338 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5jxl2\" (UniqueName: \"kubernetes.io/projected/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-kube-api-access-5jxl2\") on node \"crc\" DevicePath \"\""
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.386352 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-config\") on node \"crc\" DevicePath \"\""
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.386364 4898 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\""
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.386376 4898 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\""
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.386388 4898 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-tls-assets\") on node \"crc\" DevicePath \"\""
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.386417 4898 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-537e992e-0c7e-4e28-8105-b535a72a793c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-537e992e-0c7e-4e28-8105-b535a72a793c\") on node \"crc\" "
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.386433 4898 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-config-out\") on node \"crc\" DevicePath \"\""
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.419153 4898 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.419328 4898 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-537e992e-0c7e-4e28-8105-b535a72a793c" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-537e992e-0c7e-4e28-8105-b535a72a793c") on node "crc"
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.488317 4898 reconciler_common.go:293] "Volume detached for volume \"pvc-537e992e-0c7e-4e28-8105-b535a72a793c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-537e992e-0c7e-4e28-8105-b535a72a793c\") on node \"crc\" DevicePath \"\""
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.504097 4898 scope.go:117] "RemoveContainer" containerID="285742e6a5783e0185985fab65a301e652357df458cfc070967a14f0e7b5987b"
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.542268 4898 scope.go:117] "RemoveContainer" containerID="409bf1215c88a0899cc23bebec9f8cdea126be501e95b10215c77324e13fe278"
Mar 13 14:20:18 crc kubenswrapper[4898]: E0313 14:20:18.542757 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"409bf1215c88a0899cc23bebec9f8cdea126be501e95b10215c77324e13fe278\": container with ID starting with 409bf1215c88a0899cc23bebec9f8cdea126be501e95b10215c77324e13fe278 not found: ID does not exist" containerID="409bf1215c88a0899cc23bebec9f8cdea126be501e95b10215c77324e13fe278"
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.542795 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"409bf1215c88a0899cc23bebec9f8cdea126be501e95b10215c77324e13fe278"} err="failed to get container status \"409bf1215c88a0899cc23bebec9f8cdea126be501e95b10215c77324e13fe278\": rpc error: code = NotFound desc = could not find container \"409bf1215c88a0899cc23bebec9f8cdea126be501e95b10215c77324e13fe278\": container with ID starting with 409bf1215c88a0899cc23bebec9f8cdea126be501e95b10215c77324e13fe278 not found: ID does not exist"
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.542821 4898 scope.go:117] "RemoveContainer" containerID="7e2e94efde300a128961ef88908d969db64215b6995d2b3cb2e15a3f40e8a65c"
Mar 13 14:20:18 crc kubenswrapper[4898]: E0313 14:20:18.543280 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e2e94efde300a128961ef88908d969db64215b6995d2b3cb2e15a3f40e8a65c\": container with ID starting with 7e2e94efde300a128961ef88908d969db64215b6995d2b3cb2e15a3f40e8a65c not found: ID does not exist" containerID="7e2e94efde300a128961ef88908d969db64215b6995d2b3cb2e15a3f40e8a65c"
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.543309 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e2e94efde300a128961ef88908d969db64215b6995d2b3cb2e15a3f40e8a65c"} err="failed to get container status \"7e2e94efde300a128961ef88908d969db64215b6995d2b3cb2e15a3f40e8a65c\": rpc error: code = NotFound desc = could not find container \"7e2e94efde300a128961ef88908d969db64215b6995d2b3cb2e15a3f40e8a65c\": container with ID starting with 7e2e94efde300a128961ef88908d969db64215b6995d2b3cb2e15a3f40e8a65c not found: ID does not exist"
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.543329 4898 scope.go:117] "RemoveContainer" containerID="97d23bf2eed915823c76967ee540ea114f9905dd429a2e87daddfc2a1255581d"
Mar 13 14:20:18 crc kubenswrapper[4898]: E0313 14:20:18.543629 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97d23bf2eed915823c76967ee540ea114f9905dd429a2e87daddfc2a1255581d\": container with ID starting with 97d23bf2eed915823c76967ee540ea114f9905dd429a2e87daddfc2a1255581d not found: ID does not exist" containerID="97d23bf2eed915823c76967ee540ea114f9905dd429a2e87daddfc2a1255581d"
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.543726 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97d23bf2eed915823c76967ee540ea114f9905dd429a2e87daddfc2a1255581d"} err="failed to get container status \"97d23bf2eed915823c76967ee540ea114f9905dd429a2e87daddfc2a1255581d\": rpc error: code = NotFound desc = could not find container \"97d23bf2eed915823c76967ee540ea114f9905dd429a2e87daddfc2a1255581d\": container with ID starting with 97d23bf2eed915823c76967ee540ea114f9905dd429a2e87daddfc2a1255581d not found: ID does not exist"
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.543800 4898 scope.go:117] "RemoveContainer" containerID="285742e6a5783e0185985fab65a301e652357df458cfc070967a14f0e7b5987b"
Mar 13 14:20:18 crc kubenswrapper[4898]: E0313 14:20:18.544169 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"285742e6a5783e0185985fab65a301e652357df458cfc070967a14f0e7b5987b\": container with ID starting with 285742e6a5783e0185985fab65a301e652357df458cfc070967a14f0e7b5987b not found: ID does not exist" containerID="285742e6a5783e0185985fab65a301e652357df458cfc070967a14f0e7b5987b"
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.544204 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"285742e6a5783e0185985fab65a301e652357df458cfc070967a14f0e7b5987b"} err="failed to get container status \"285742e6a5783e0185985fab65a301e652357df458cfc070967a14f0e7b5987b\": rpc error: code = NotFound desc = could not find container \"285742e6a5783e0185985fab65a301e652357df458cfc070967a14f0e7b5987b\": container with ID starting with 285742e6a5783e0185985fab65a301e652357df458cfc070967a14f0e7b5987b not found: ID does not exist"
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.544223 4898 scope.go:117] "RemoveContainer" containerID="409bf1215c88a0899cc23bebec9f8cdea126be501e95b10215c77324e13fe278"
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.544512 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"409bf1215c88a0899cc23bebec9f8cdea126be501e95b10215c77324e13fe278"} err="failed to get container status \"409bf1215c88a0899cc23bebec9f8cdea126be501e95b10215c77324e13fe278\": rpc error: code = NotFound desc = could not find container \"409bf1215c88a0899cc23bebec9f8cdea126be501e95b10215c77324e13fe278\": container with ID starting with 409bf1215c88a0899cc23bebec9f8cdea126be501e95b10215c77324e13fe278 not found: ID does not exist"
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.544600 4898 scope.go:117] "RemoveContainer" containerID="7e2e94efde300a128961ef88908d969db64215b6995d2b3cb2e15a3f40e8a65c"
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.544947 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e2e94efde300a128961ef88908d969db64215b6995d2b3cb2e15a3f40e8a65c"} err="failed to get container status \"7e2e94efde300a128961ef88908d969db64215b6995d2b3cb2e15a3f40e8a65c\": rpc error: code = NotFound desc = could not find container \"7e2e94efde300a128961ef88908d969db64215b6995d2b3cb2e15a3f40e8a65c\": container with ID starting with 7e2e94efde300a128961ef88908d969db64215b6995d2b3cb2e15a3f40e8a65c not found: ID does not exist"
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.545001 4898 scope.go:117] "RemoveContainer" containerID="97d23bf2eed915823c76967ee540ea114f9905dd429a2e87daddfc2a1255581d"
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.545392 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97d23bf2eed915823c76967ee540ea114f9905dd429a2e87daddfc2a1255581d"} err="failed to get container status \"97d23bf2eed915823c76967ee540ea114f9905dd429a2e87daddfc2a1255581d\": rpc error: code = NotFound desc = could not find container \"97d23bf2eed915823c76967ee540ea114f9905dd429a2e87daddfc2a1255581d\": container with ID starting with 97d23bf2eed915823c76967ee540ea114f9905dd429a2e87daddfc2a1255581d not found: ID does not exist"
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.545412 4898 scope.go:117] "RemoveContainer" containerID="285742e6a5783e0185985fab65a301e652357df458cfc070967a14f0e7b5987b"
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.545694 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"285742e6a5783e0185985fab65a301e652357df458cfc070967a14f0e7b5987b"} err="failed to get container status \"285742e6a5783e0185985fab65a301e652357df458cfc070967a14f0e7b5987b\": rpc error: code = NotFound desc = could not find container \"285742e6a5783e0185985fab65a301e652357df458cfc070967a14f0e7b5987b\": container with ID starting with 285742e6a5783e0185985fab65a301e652357df458cfc070967a14f0e7b5987b not found: ID does not exist"
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.545725 4898 scope.go:117] "RemoveContainer" containerID="409bf1215c88a0899cc23bebec9f8cdea126be501e95b10215c77324e13fe278"
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.546043 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"409bf1215c88a0899cc23bebec9f8cdea126be501e95b10215c77324e13fe278"} err="failed to get container status \"409bf1215c88a0899cc23bebec9f8cdea126be501e95b10215c77324e13fe278\": rpc error: code = NotFound desc = could not find container \"409bf1215c88a0899cc23bebec9f8cdea126be501e95b10215c77324e13fe278\": container with ID starting with 409bf1215c88a0899cc23bebec9f8cdea126be501e95b10215c77324e13fe278 not found: ID does not exist"
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.546062 4898 scope.go:117] "RemoveContainer" containerID="7e2e94efde300a128961ef88908d969db64215b6995d2b3cb2e15a3f40e8a65c"
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.546324 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e2e94efde300a128961ef88908d969db64215b6995d2b3cb2e15a3f40e8a65c"} err="failed to get container status \"7e2e94efde300a128961ef88908d969db64215b6995d2b3cb2e15a3f40e8a65c\": rpc error: code = NotFound desc = could not find container \"7e2e94efde300a128961ef88908d969db64215b6995d2b3cb2e15a3f40e8a65c\": container with ID starting with 7e2e94efde300a128961ef88908d969db64215b6995d2b3cb2e15a3f40e8a65c not found: ID does not exist"
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.546406 4898 scope.go:117] "RemoveContainer" containerID="97d23bf2eed915823c76967ee540ea114f9905dd429a2e87daddfc2a1255581d"
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.546816 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97d23bf2eed915823c76967ee540ea114f9905dd429a2e87daddfc2a1255581d"} err="failed to get container status \"97d23bf2eed915823c76967ee540ea114f9905dd429a2e87daddfc2a1255581d\": rpc error: code = NotFound desc = could not find container \"97d23bf2eed915823c76967ee540ea114f9905dd429a2e87daddfc2a1255581d\": container with ID starting with 97d23bf2eed915823c76967ee540ea114f9905dd429a2e87daddfc2a1255581d not found: ID does not exist"
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.546845 4898 scope.go:117] "RemoveContainer" containerID="285742e6a5783e0185985fab65a301e652357df458cfc070967a14f0e7b5987b"
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.547125 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"285742e6a5783e0185985fab65a301e652357df458cfc070967a14f0e7b5987b"} err="failed to get container status \"285742e6a5783e0185985fab65a301e652357df458cfc070967a14f0e7b5987b\": rpc error: code = NotFound desc = could not find container \"285742e6a5783e0185985fab65a301e652357df458cfc070967a14f0e7b5987b\": container with ID starting with 285742e6a5783e0185985fab65a301e652357df458cfc070967a14f0e7b5987b not found: ID does not exist"
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.719494 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-wnv9x"]
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.766924 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-cz6vf"]
Mar 13 14:20:18 crc kubenswrapper[4898]: E0313 14:20:18.767319 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e6f6f0d-db24-4fdb-a872-ce2c527a791b" containerName="init-config-reloader"
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.767337 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e6f6f0d-db24-4fdb-a872-ce2c527a791b" containerName="init-config-reloader"
Mar 13 14:20:18 crc kubenswrapper[4898]: E0313 14:20:18.767366 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e6f6f0d-db24-4fdb-a872-ce2c527a791b" containerName="thanos-sidecar"
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.767374 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e6f6f0d-db24-4fdb-a872-ce2c527a791b" containerName="thanos-sidecar"
Mar 13 14:20:18 crc kubenswrapper[4898]: E0313 14:20:18.767387 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb2a6e17-835f-43f9-9b2b-eb5f39df5450" containerName="mariadb-account-create-update"
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.767394 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb2a6e17-835f-43f9-9b2b-eb5f39df5450" containerName="mariadb-account-create-update"
Mar 13 14:20:18 crc kubenswrapper[4898]: E0313 14:20:18.767404 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e6f6f0d-db24-4fdb-a872-ce2c527a791b" containerName="config-reloader"
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.767410 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e6f6f0d-db24-4fdb-a872-ce2c527a791b" containerName="config-reloader"
Mar 13 14:20:18 crc kubenswrapper[4898]: E0313 14:20:18.767426 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e6f6f0d-db24-4fdb-a872-ce2c527a791b" containerName="prometheus"
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.767433 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e6f6f0d-db24-4fdb-a872-ce2c527a791b" containerName="prometheus"
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.767601 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e6f6f0d-db24-4fdb-a872-ce2c527a791b" containerName="prometheus"
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.767629 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e6f6f0d-db24-4fdb-a872-ce2c527a791b" containerName="config-reloader"
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.767639 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e6f6f0d-db24-4fdb-a872-ce2c527a791b" containerName="thanos-sidecar"
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.767649 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb2a6e17-835f-43f9-9b2b-eb5f39df5450" containerName="mariadb-account-create-update"
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.768661 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-cz6vf"
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.770683 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0"
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.788677 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-cz6vf"]
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.794973 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f1520e0-d7d9-4992-9ca5-1b2e98313d33-config\") pod \"dnsmasq-dns-74f6bcbc87-cz6vf\" (UID: \"9f1520e0-d7d9-4992-9ca5-1b2e98313d33\") " pod="openstack/dnsmasq-dns-74f6bcbc87-cz6vf"
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.795078 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dpzg\" (UniqueName: \"kubernetes.io/projected/9f1520e0-d7d9-4992-9ca5-1b2e98313d33-kube-api-access-9dpzg\") pod \"dnsmasq-dns-74f6bcbc87-cz6vf\" (UID: \"9f1520e0-d7d9-4992-9ca5-1b2e98313d33\") " pod="openstack/dnsmasq-dns-74f6bcbc87-cz6vf"
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.795116 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9f1520e0-d7d9-4992-9ca5-1b2e98313d33-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-cz6vf\" (UID: \"9f1520e0-d7d9-4992-9ca5-1b2e98313d33\") " pod="openstack/dnsmasq-dns-74f6bcbc87-cz6vf"
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.795147 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9f1520e0-d7d9-4992-9ca5-1b2e98313d33-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-cz6vf\" (UID: \"9f1520e0-d7d9-4992-9ca5-1b2e98313d33\") " pod="openstack/dnsmasq-dns-74f6bcbc87-cz6vf"
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.795239 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9f1520e0-d7d9-4992-9ca5-1b2e98313d33-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-cz6vf\" (UID: \"9f1520e0-d7d9-4992-9ca5-1b2e98313d33\") " pod="openstack/dnsmasq-dns-74f6bcbc87-cz6vf"
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.795297 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f1520e0-d7d9-4992-9ca5-1b2e98313d33-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-cz6vf\" (UID: \"9f1520e0-d7d9-4992-9ca5-1b2e98313d33\") " pod="openstack/dnsmasq-dns-74f6bcbc87-cz6vf"
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.830049 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-85xgb"]
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.839671 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-85xgb"]
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.896994 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dpzg\" (UniqueName: \"kubernetes.io/projected/9f1520e0-d7d9-4992-9ca5-1b2e98313d33-kube-api-access-9dpzg\") pod \"dnsmasq-dns-74f6bcbc87-cz6vf\" (UID: \"9f1520e0-d7d9-4992-9ca5-1b2e98313d33\") " pod="openstack/dnsmasq-dns-74f6bcbc87-cz6vf"
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.897075 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9f1520e0-d7d9-4992-9ca5-1b2e98313d33-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-cz6vf\" (UID: \"9f1520e0-d7d9-4992-9ca5-1b2e98313d33\") " pod="openstack/dnsmasq-dns-74f6bcbc87-cz6vf"
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.897115 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9f1520e0-d7d9-4992-9ca5-1b2e98313d33-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-cz6vf\" (UID: \"9f1520e0-d7d9-4992-9ca5-1b2e98313d33\") " pod="openstack/dnsmasq-dns-74f6bcbc87-cz6vf"
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.897174 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9f1520e0-d7d9-4992-9ca5-1b2e98313d33-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-cz6vf\" (UID: \"9f1520e0-d7d9-4992-9ca5-1b2e98313d33\") " pod="openstack/dnsmasq-dns-74f6bcbc87-cz6vf"
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.897248 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f1520e0-d7d9-4992-9ca5-1b2e98313d33-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-cz6vf\" (UID: \"9f1520e0-d7d9-4992-9ca5-1b2e98313d33\") " pod="openstack/dnsmasq-dns-74f6bcbc87-cz6vf"
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.897276 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f1520e0-d7d9-4992-9ca5-1b2e98313d33-config\") pod \"dnsmasq-dns-74f6bcbc87-cz6vf\" (UID: \"9f1520e0-d7d9-4992-9ca5-1b2e98313d33\") " pod="openstack/dnsmasq-dns-74f6bcbc87-cz6vf"
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.898187 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9f1520e0-d7d9-4992-9ca5-1b2e98313d33-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-cz6vf\" (UID: \"9f1520e0-d7d9-4992-9ca5-1b2e98313d33\") " pod="openstack/dnsmasq-dns-74f6bcbc87-cz6vf"
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.898239 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9f1520e0-d7d9-4992-9ca5-1b2e98313d33-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-cz6vf\" (UID: \"9f1520e0-d7d9-4992-9ca5-1b2e98313d33\") " pod="openstack/dnsmasq-dns-74f6bcbc87-cz6vf"
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.898242 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9f1520e0-d7d9-4992-9ca5-1b2e98313d33-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-cz6vf\" (UID: \"9f1520e0-d7d9-4992-9ca5-1b2e98313d33\") " pod="openstack/dnsmasq-dns-74f6bcbc87-cz6vf"
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.898367 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f1520e0-d7d9-4992-9ca5-1b2e98313d33-config\") pod \"dnsmasq-dns-74f6bcbc87-cz6vf\" (UID: \"9f1520e0-d7d9-4992-9ca5-1b2e98313d33\") " pod="openstack/dnsmasq-dns-74f6bcbc87-cz6vf"
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.898368 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f1520e0-d7d9-4992-9ca5-1b2e98313d33-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-cz6vf\" (UID: \"9f1520e0-d7d9-4992-9ca5-1b2e98313d33\") " pod="openstack/dnsmasq-dns-74f6bcbc87-cz6vf"
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.921841 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dpzg\" (UniqueName: \"kubernetes.io/projected/9f1520e0-d7d9-4992-9ca5-1b2e98313d33-kube-api-access-9dpzg\") pod \"dnsmasq-dns-74f6bcbc87-cz6vf\" (UID: \"9f1520e0-d7d9-4992-9ca5-1b2e98313d33\") " pod="openstack/dnsmasq-dns-74f6bcbc87-cz6vf"
Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.084360 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-cz6vf"
Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.306939 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.311568 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-wnv9x" event={"ID":"49fea8fc-372c-4cf8-a710-7fff58db294d","Type":"ContainerStarted","Data":"8147ab7966f5020767fcc27772ab175b42f8528278c1d7d00f8082b71c3d4288"}
Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.311607 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b946c75cc-wnv9x" podUID="49fea8fc-372c-4cf8-a710-7fff58db294d" containerName="dnsmasq-dns" containerID="cri-o://8147ab7966f5020767fcc27772ab175b42f8528278c1d7d00f8082b71c3d4288" gracePeriod=10
Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.311640 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b946c75cc-wnv9x"
Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.351344 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b946c75cc-wnv9x" podStartSLOduration=5.351324979 podStartE2EDuration="5.351324979s" podCreationTimestamp="2026-03-13 14:20:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:20:19.341290389 +0000 UTC m=+1454.342878638" watchObservedRunningTime="2026-03-13 14:20:19.351324979 +0000 UTC m=+1454.352913208"
Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.403752 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.438624 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api"
pods=["openstack/prometheus-metric-storage-0"] Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.474848 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.477993 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.486356 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.490399 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.491631 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-g7dw2" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.492172 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.492330 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.492420 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.492548 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.493007 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.495265 4898 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"prometheus-metric-storage-tls-assets-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.497925 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.608582 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-cz6vf"] Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.619597 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hv4p5\" (UniqueName: \"kubernetes.io/projected/d555bd54-f4d5-4b06-9517-32b4fe687f4b-kube-api-access-hv4p5\") pod \"prometheus-metric-storage-0\" (UID: \"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.619651 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d555bd54-f4d5-4b06-9517-32b4fe687f4b-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.619723 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d555bd54-f4d5-4b06-9517-32b4fe687f4b-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.619766 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d555bd54-f4d5-4b06-9517-32b4fe687f4b-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: 
\"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.619793 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/d555bd54-f4d5-4b06-9517-32b4fe687f4b-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.619819 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/d555bd54-f4d5-4b06-9517-32b4fe687f4b-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.619876 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d555bd54-f4d5-4b06-9517-32b4fe687f4b-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.619929 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-537e992e-0c7e-4e28-8105-b535a72a793c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-537e992e-0c7e-4e28-8105-b535a72a793c\") pod \"prometheus-metric-storage-0\" (UID: \"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.619962 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d555bd54-f4d5-4b06-9517-32b4fe687f4b-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.620176 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d555bd54-f4d5-4b06-9517-32b4fe687f4b-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.620214 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/d555bd54-f4d5-4b06-9517-32b4fe687f4b-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.620237 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d555bd54-f4d5-4b06-9517-32b4fe687f4b-config\") pod \"prometheus-metric-storage-0\" (UID: \"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.620251 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/d555bd54-f4d5-4b06-9517-32b4fe687f4b-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: 
\"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.730775 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d555bd54-f4d5-4b06-9517-32b4fe687f4b-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.730822 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-537e992e-0c7e-4e28-8105-b535a72a793c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-537e992e-0c7e-4e28-8105-b535a72a793c\") pod \"prometheus-metric-storage-0\" (UID: \"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.730862 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d555bd54-f4d5-4b06-9517-32b4fe687f4b-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.730882 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d555bd54-f4d5-4b06-9517-32b4fe687f4b-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.730939 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: 
\"kubernetes.io/secret/d555bd54-f4d5-4b06-9517-32b4fe687f4b-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.730962 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d555bd54-f4d5-4b06-9517-32b4fe687f4b-config\") pod \"prometheus-metric-storage-0\" (UID: \"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.730978 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/d555bd54-f4d5-4b06-9517-32b4fe687f4b-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.731012 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hv4p5\" (UniqueName: \"kubernetes.io/projected/d555bd54-f4d5-4b06-9517-32b4fe687f4b-kube-api-access-hv4p5\") pod \"prometheus-metric-storage-0\" (UID: \"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.731031 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d555bd54-f4d5-4b06-9517-32b4fe687f4b-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.731075 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d555bd54-f4d5-4b06-9517-32b4fe687f4b-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.731108 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d555bd54-f4d5-4b06-9517-32b4fe687f4b-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.731129 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/d555bd54-f4d5-4b06-9517-32b4fe687f4b-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.731151 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/d555bd54-f4d5-4b06-9517-32b4fe687f4b-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.749149 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d555bd54-f4d5-4b06-9517-32b4fe687f4b-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") " 
pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.758288 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d555bd54-f4d5-4b06-9517-32b4fe687f4b-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.759417 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d555bd54-f4d5-4b06-9517-32b4fe687f4b-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.759882 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/d555bd54-f4d5-4b06-9517-32b4fe687f4b-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.760790 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/d555bd54-f4d5-4b06-9517-32b4fe687f4b-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.762250 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d555bd54-f4d5-4b06-9517-32b4fe687f4b-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") " 
pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.817924 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d555bd54-f4d5-4b06-9517-32b4fe687f4b-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.818929 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d555bd54-f4d5-4b06-9517-32b4fe687f4b-config\") pod \"prometheus-metric-storage-0\" (UID: \"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.819311 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d555bd54-f4d5-4b06-9517-32b4fe687f4b-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.831592 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/d555bd54-f4d5-4b06-9517-32b4fe687f4b-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.832634 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: 
\"kubernetes.io/secret/d555bd54-f4d5-4b06-9517-32b4fe687f4b-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.834377 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hv4p5\" (UniqueName: \"kubernetes.io/projected/d555bd54-f4d5-4b06-9517-32b4fe687f4b-kube-api-access-hv4p5\") pod \"prometheus-metric-storage-0\" (UID: \"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.975324 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e6f6f0d-db24-4fdb-a872-ce2c527a791b" path="/var/lib/kubelet/pods/1e6f6f0d-db24-4fdb-a872-ce2c527a791b/volumes" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.976490 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb2a6e17-835f-43f9-9b2b-eb5f39df5450" path="/var/lib/kubelet/pods/bb2a6e17-835f-43f9-9b2b-eb5f39df5450/volumes" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.998699 4898 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.998964 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-537e992e-0c7e-4e28-8105-b535a72a793c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-537e992e-0c7e-4e28-8105-b535a72a793c\") pod \"prometheus-metric-storage-0\" (UID: \"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7123a2111c2c1fcd673a2fa4cbaef2c14fcdb159a9a269edbe99c5cdea18ee2d/globalmount\"" pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.027010 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-88gdv"] Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.028459 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-88gdv" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.071159 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-wnv9x" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.128711 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-88gdv"] Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.137513 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49fea8fc-372c-4cf8-a710-7fff58db294d-config\") pod \"49fea8fc-372c-4cf8-a710-7fff58db294d\" (UID: \"49fea8fc-372c-4cf8-a710-7fff58db294d\") " Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.137624 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/49fea8fc-372c-4cf8-a710-7fff58db294d-ovsdbserver-sb\") pod \"49fea8fc-372c-4cf8-a710-7fff58db294d\" (UID: \"49fea8fc-372c-4cf8-a710-7fff58db294d\") " Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.137842 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/49fea8fc-372c-4cf8-a710-7fff58db294d-dns-svc\") pod \"49fea8fc-372c-4cf8-a710-7fff58db294d\" (UID: \"49fea8fc-372c-4cf8-a710-7fff58db294d\") " Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.137885 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/49fea8fc-372c-4cf8-a710-7fff58db294d-ovsdbserver-nb\") pod \"49fea8fc-372c-4cf8-a710-7fff58db294d\" (UID: \"49fea8fc-372c-4cf8-a710-7fff58db294d\") " Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.138005 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgnbz\" (UniqueName: \"kubernetes.io/projected/49fea8fc-372c-4cf8-a710-7fff58db294d-kube-api-access-zgnbz\") pod \"49fea8fc-372c-4cf8-a710-7fff58db294d\" (UID: \"49fea8fc-372c-4cf8-a710-7fff58db294d\") " Mar 13 14:20:20 
crc kubenswrapper[4898]: I0313 14:20:20.138246 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bzvr\" (UniqueName: \"kubernetes.io/projected/f8a8516c-5aee-4eae-a59b-498f97c1b92b-kube-api-access-2bzvr\") pod \"cinder-db-create-88gdv\" (UID: \"f8a8516c-5aee-4eae-a59b-498f97c1b92b\") " pod="openstack/cinder-db-create-88gdv" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.138404 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8a8516c-5aee-4eae-a59b-498f97c1b92b-operator-scripts\") pod \"cinder-db-create-88gdv\" (UID: \"f8a8516c-5aee-4eae-a59b-498f97c1b92b\") " pod="openstack/cinder-db-create-88gdv" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.217276 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-4e00-account-create-update-92bgz"] Mar 13 14:20:20 crc kubenswrapper[4898]: E0313 14:20:20.217695 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49fea8fc-372c-4cf8-a710-7fff58db294d" containerName="init" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.217707 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="49fea8fc-372c-4cf8-a710-7fff58db294d" containerName="init" Mar 13 14:20:20 crc kubenswrapper[4898]: E0313 14:20:20.217729 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49fea8fc-372c-4cf8-a710-7fff58db294d" containerName="dnsmasq-dns" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.217736 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="49fea8fc-372c-4cf8-a710-7fff58db294d" containerName="dnsmasq-dns" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.217938 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="49fea8fc-372c-4cf8-a710-7fff58db294d" containerName="dnsmasq-dns" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.218613 4898 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-4e00-account-create-update-92bgz" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.225552 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.231302 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49fea8fc-372c-4cf8-a710-7fff58db294d-kube-api-access-zgnbz" (OuterVolumeSpecName: "kube-api-access-zgnbz") pod "49fea8fc-372c-4cf8-a710-7fff58db294d" (UID: "49fea8fc-372c-4cf8-a710-7fff58db294d"). InnerVolumeSpecName "kube-api-access-zgnbz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.240161 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4fe3416e-f08a-43c9-8e12-a89c1e849208-operator-scripts\") pod \"cinder-4e00-account-create-update-92bgz\" (UID: \"4fe3416e-f08a-43c9-8e12-a89c1e849208\") " pod="openstack/cinder-4e00-account-create-update-92bgz" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.240256 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8a8516c-5aee-4eae-a59b-498f97c1b92b-operator-scripts\") pod \"cinder-db-create-88gdv\" (UID: \"f8a8516c-5aee-4eae-a59b-498f97c1b92b\") " pod="openstack/cinder-db-create-88gdv" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.240314 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5xdf\" (UniqueName: \"kubernetes.io/projected/4fe3416e-f08a-43c9-8e12-a89c1e849208-kube-api-access-w5xdf\") pod \"cinder-4e00-account-create-update-92bgz\" (UID: \"4fe3416e-f08a-43c9-8e12-a89c1e849208\") " 
pod="openstack/cinder-4e00-account-create-update-92bgz"
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.240345 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bzvr\" (UniqueName: \"kubernetes.io/projected/f8a8516c-5aee-4eae-a59b-498f97c1b92b-kube-api-access-2bzvr\") pod \"cinder-db-create-88gdv\" (UID: \"f8a8516c-5aee-4eae-a59b-498f97c1b92b\") " pod="openstack/cinder-db-create-88gdv"
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.240479 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgnbz\" (UniqueName: \"kubernetes.io/projected/49fea8fc-372c-4cf8-a710-7fff58db294d-kube-api-access-zgnbz\") on node \"crc\" DevicePath \"\""
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.241393 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8a8516c-5aee-4eae-a59b-498f97c1b92b-operator-scripts\") pod \"cinder-db-create-88gdv\" (UID: \"f8a8516c-5aee-4eae-a59b-498f97c1b92b\") " pod="openstack/cinder-db-create-88gdv"
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.281435 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bzvr\" (UniqueName: \"kubernetes.io/projected/f8a8516c-5aee-4eae-a59b-498f97c1b92b-kube-api-access-2bzvr\") pod \"cinder-db-create-88gdv\" (UID: \"f8a8516c-5aee-4eae-a59b-498f97c1b92b\") " pod="openstack/cinder-db-create-88gdv"
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.308539 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-4e00-account-create-update-92bgz"]
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.351771 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4fe3416e-f08a-43c9-8e12-a89c1e849208-operator-scripts\") pod \"cinder-4e00-account-create-update-92bgz\" (UID: \"4fe3416e-f08a-43c9-8e12-a89c1e849208\") " pod="openstack/cinder-4e00-account-create-update-92bgz"
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.352044 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5xdf\" (UniqueName: \"kubernetes.io/projected/4fe3416e-f08a-43c9-8e12-a89c1e849208-kube-api-access-w5xdf\") pod \"cinder-4e00-account-create-update-92bgz\" (UID: \"4fe3416e-f08a-43c9-8e12-a89c1e849208\") " pod="openstack/cinder-4e00-account-create-update-92bgz"
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.376621 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4fe3416e-f08a-43c9-8e12-a89c1e849208-operator-scripts\") pod \"cinder-4e00-account-create-update-92bgz\" (UID: \"4fe3416e-f08a-43c9-8e12-a89c1e849208\") " pod="openstack/cinder-4e00-account-create-update-92bgz"
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.382514 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-88gdv"
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.404357 4898 generic.go:334] "Generic (PLEG): container finished" podID="49fea8fc-372c-4cf8-a710-7fff58db294d" containerID="8147ab7966f5020767fcc27772ab175b42f8528278c1d7d00f8082b71c3d4288" exitCode=0
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.404428 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-wnv9x" event={"ID":"49fea8fc-372c-4cf8-a710-7fff58db294d","Type":"ContainerDied","Data":"8147ab7966f5020767fcc27772ab175b42f8528278c1d7d00f8082b71c3d4288"}
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.404455 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-wnv9x" event={"ID":"49fea8fc-372c-4cf8-a710-7fff58db294d","Type":"ContainerDied","Data":"bad9604b0b4948880032fa39bf17f3a7090e87a9e7998fc75541062cf197564a"}
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.404471 4898 scope.go:117] "RemoveContainer" containerID="8147ab7966f5020767fcc27772ab175b42f8528278c1d7d00f8082b71c3d4288"
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.404602 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-wnv9x"
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.431953 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-dxpl9"]
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.434598 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-dxpl9"
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.442093 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.442305 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.442459 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-tdc5n"
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.445979 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.454788 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bd476\" (UniqueName: \"kubernetes.io/projected/8fdab36c-41db-4a9c-9cbe-47e1761c6df5-kube-api-access-bd476\") pod \"keystone-db-sync-dxpl9\" (UID: \"8fdab36c-41db-4a9c-9cbe-47e1761c6df5\") " pod="openstack/keystone-db-sync-dxpl9"
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.454853 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fdab36c-41db-4a9c-9cbe-47e1761c6df5-combined-ca-bundle\") pod \"keystone-db-sync-dxpl9\" (UID: \"8fdab36c-41db-4a9c-9cbe-47e1761c6df5\") " pod="openstack/keystone-db-sync-dxpl9"
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.454874 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fdab36c-41db-4a9c-9cbe-47e1761c6df5-config-data\") pod \"keystone-db-sync-dxpl9\" (UID: \"8fdab36c-41db-4a9c-9cbe-47e1761c6df5\") " pod="openstack/keystone-db-sync-dxpl9"
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.461013 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-cz6vf" event={"ID":"9f1520e0-d7d9-4992-9ca5-1b2e98313d33","Type":"ContainerStarted","Data":"e9c150eab9ccbad529dce767553bfd6f06b4b8420400e4a2a3ec291dc3f4b819"}
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.490781 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-dxpl9"]
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.499478 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5xdf\" (UniqueName: \"kubernetes.io/projected/4fe3416e-f08a-43c9-8e12-a89c1e849208-kube-api-access-w5xdf\") pod \"cinder-4e00-account-create-update-92bgz\" (UID: \"4fe3416e-f08a-43c9-8e12-a89c1e849208\") " pod="openstack/cinder-4e00-account-create-update-92bgz"
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.541137 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-275nk"]
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.542469 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-275nk"
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.552425 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-275nk"]
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.563572 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-4e00-account-create-update-92bgz"
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.569282 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fdab36c-41db-4a9c-9cbe-47e1761c6df5-combined-ca-bundle\") pod \"keystone-db-sync-dxpl9\" (UID: \"8fdab36c-41db-4a9c-9cbe-47e1761c6df5\") " pod="openstack/keystone-db-sync-dxpl9"
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.569325 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fdab36c-41db-4a9c-9cbe-47e1761c6df5-config-data\") pod \"keystone-db-sync-dxpl9\" (UID: \"8fdab36c-41db-4a9c-9cbe-47e1761c6df5\") " pod="openstack/keystone-db-sync-dxpl9"
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.569600 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bd476\" (UniqueName: \"kubernetes.io/projected/8fdab36c-41db-4a9c-9cbe-47e1761c6df5-kube-api-access-bd476\") pod \"keystone-db-sync-dxpl9\" (UID: \"8fdab36c-41db-4a9c-9cbe-47e1761c6df5\") " pod="openstack/keystone-db-sync-dxpl9"
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.595955 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-9d98-account-create-update-t77sf"]
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.598730 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-9d98-account-create-update-t77sf"
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.602399 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret"
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.608267 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fdab36c-41db-4a9c-9cbe-47e1761c6df5-config-data\") pod \"keystone-db-sync-dxpl9\" (UID: \"8fdab36c-41db-4a9c-9cbe-47e1761c6df5\") " pod="openstack/keystone-db-sync-dxpl9"
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.612265 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-9d98-account-create-update-t77sf"]
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.620419 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fdab36c-41db-4a9c-9cbe-47e1761c6df5-combined-ca-bundle\") pod \"keystone-db-sync-dxpl9\" (UID: \"8fdab36c-41db-4a9c-9cbe-47e1761c6df5\") " pod="openstack/keystone-db-sync-dxpl9"
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.625643 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bd476\" (UniqueName: \"kubernetes.io/projected/8fdab36c-41db-4a9c-9cbe-47e1761c6df5-kube-api-access-bd476\") pod \"keystone-db-sync-dxpl9\" (UID: \"8fdab36c-41db-4a9c-9cbe-47e1761c6df5\") " pod="openstack/keystone-db-sync-dxpl9"
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.626890 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-95vbj"]
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.628104 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-95vbj"
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.640307 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-95vbj"]
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.668091 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-1082-account-create-update-2jjkd"]
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.669940 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-1082-account-create-update-2jjkd"
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.672270 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret"
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.676989 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2wn7\" (UniqueName: \"kubernetes.io/projected/b04d3edd-a550-465a-9ef2-2cbea4126ceb-kube-api-access-p2wn7\") pod \"heat-db-create-275nk\" (UID: \"b04d3edd-a550-465a-9ef2-2cbea4126ceb\") " pod="openstack/heat-db-create-275nk"
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.677170 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1aa06f21-2d35-4d03-86b9-01d9354826da-operator-scripts\") pod \"heat-9d98-account-create-update-t77sf\" (UID: \"1aa06f21-2d35-4d03-86b9-01d9354826da\") " pod="openstack/heat-9d98-account-create-update-t77sf"
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.677228 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sxnt\" (UniqueName: \"kubernetes.io/projected/1aa06f21-2d35-4d03-86b9-01d9354826da-kube-api-access-9sxnt\") pod \"heat-9d98-account-create-update-t77sf\" (UID: \"1aa06f21-2d35-4d03-86b9-01d9354826da\") " pod="openstack/heat-9d98-account-create-update-t77sf"
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.677266 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crvxm\" (UniqueName: \"kubernetes.io/projected/b83b860f-ed6c-46b2-862a-fbda9af7dc89-kube-api-access-crvxm\") pod \"neutron-db-create-95vbj\" (UID: \"b83b860f-ed6c-46b2-862a-fbda9af7dc89\") " pod="openstack/neutron-db-create-95vbj"
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.677327 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b83b860f-ed6c-46b2-862a-fbda9af7dc89-operator-scripts\") pod \"neutron-db-create-95vbj\" (UID: \"b83b860f-ed6c-46b2-862a-fbda9af7dc89\") " pod="openstack/neutron-db-create-95vbj"
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.678216 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b04d3edd-a550-465a-9ef2-2cbea4126ceb-operator-scripts\") pod \"heat-db-create-275nk\" (UID: \"b04d3edd-a550-465a-9ef2-2cbea4126ceb\") " pod="openstack/heat-db-create-275nk"
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.682008 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-1082-account-create-update-2jjkd"]
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.722697 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-23f7-account-create-update-z479t"]
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.724611 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-23f7-account-create-update-z479t"
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.730553 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret"
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.761020 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49fea8fc-372c-4cf8-a710-7fff58db294d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "49fea8fc-372c-4cf8-a710-7fff58db294d" (UID: "49fea8fc-372c-4cf8-a710-7fff58db294d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.775345 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-23f7-account-create-update-z479t"]
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.780313 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xg2zd\" (UniqueName: \"kubernetes.io/projected/f9e4fe10-07d1-4f4e-bb04-5ab11b8ad56d-kube-api-access-xg2zd\") pod \"barbican-23f7-account-create-update-z479t\" (UID: \"f9e4fe10-07d1-4f4e-bb04-5ab11b8ad56d\") " pod="openstack/barbican-23f7-account-create-update-z479t"
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.780409 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2wn7\" (UniqueName: \"kubernetes.io/projected/b04d3edd-a550-465a-9ef2-2cbea4126ceb-kube-api-access-p2wn7\") pod \"heat-db-create-275nk\" (UID: \"b04d3edd-a550-465a-9ef2-2cbea4126ceb\") " pod="openstack/heat-db-create-275nk"
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.780522 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9e4fe10-07d1-4f4e-bb04-5ab11b8ad56d-operator-scripts\") pod \"barbican-23f7-account-create-update-z479t\" (UID: \"f9e4fe10-07d1-4f4e-bb04-5ab11b8ad56d\") " pod="openstack/barbican-23f7-account-create-update-z479t"
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.780560 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1aa06f21-2d35-4d03-86b9-01d9354826da-operator-scripts\") pod \"heat-9d98-account-create-update-t77sf\" (UID: \"1aa06f21-2d35-4d03-86b9-01d9354826da\") " pod="openstack/heat-9d98-account-create-update-t77sf"
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.780602 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sxnt\" (UniqueName: \"kubernetes.io/projected/1aa06f21-2d35-4d03-86b9-01d9354826da-kube-api-access-9sxnt\") pod \"heat-9d98-account-create-update-t77sf\" (UID: \"1aa06f21-2d35-4d03-86b9-01d9354826da\") " pod="openstack/heat-9d98-account-create-update-t77sf"
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.780817 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crvxm\" (UniqueName: \"kubernetes.io/projected/b83b860f-ed6c-46b2-862a-fbda9af7dc89-kube-api-access-crvxm\") pod \"neutron-db-create-95vbj\" (UID: \"b83b860f-ed6c-46b2-862a-fbda9af7dc89\") " pod="openstack/neutron-db-create-95vbj"
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.780867 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lc97\" (UniqueName: \"kubernetes.io/projected/71459d1c-2acb-4e15-a30d-09dd0f7f7951-kube-api-access-4lc97\") pod \"neutron-1082-account-create-update-2jjkd\" (UID: \"71459d1c-2acb-4e15-a30d-09dd0f7f7951\") " pod="openstack/neutron-1082-account-create-update-2jjkd"
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.781125 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b83b860f-ed6c-46b2-862a-fbda9af7dc89-operator-scripts\") pod \"neutron-db-create-95vbj\" (UID: \"b83b860f-ed6c-46b2-862a-fbda9af7dc89\") " pod="openstack/neutron-db-create-95vbj"
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.781233 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71459d1c-2acb-4e15-a30d-09dd0f7f7951-operator-scripts\") pod \"neutron-1082-account-create-update-2jjkd\" (UID: \"71459d1c-2acb-4e15-a30d-09dd0f7f7951\") " pod="openstack/neutron-1082-account-create-update-2jjkd"
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.781400 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b04d3edd-a550-465a-9ef2-2cbea4126ceb-operator-scripts\") pod \"heat-db-create-275nk\" (UID: \"b04d3edd-a550-465a-9ef2-2cbea4126ceb\") " pod="openstack/heat-db-create-275nk"
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.781672 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/49fea8fc-372c-4cf8-a710-7fff58db294d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.783466 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b83b860f-ed6c-46b2-862a-fbda9af7dc89-operator-scripts\") pod \"neutron-db-create-95vbj\" (UID: \"b83b860f-ed6c-46b2-862a-fbda9af7dc89\") " pod="openstack/neutron-db-create-95vbj"
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.784376 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1aa06f21-2d35-4d03-86b9-01d9354826da-operator-scripts\") pod \"heat-9d98-account-create-update-t77sf\" (UID: \"1aa06f21-2d35-4d03-86b9-01d9354826da\") " pod="openstack/heat-9d98-account-create-update-t77sf"
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.796199 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b04d3edd-a550-465a-9ef2-2cbea4126ceb-operator-scripts\") pod \"heat-db-create-275nk\" (UID: \"b04d3edd-a550-465a-9ef2-2cbea4126ceb\") " pod="openstack/heat-db-create-275nk"
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.811335 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-452kj"]
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.813084 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-452kj"
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.819795 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crvxm\" (UniqueName: \"kubernetes.io/projected/b83b860f-ed6c-46b2-862a-fbda9af7dc89-kube-api-access-crvxm\") pod \"neutron-db-create-95vbj\" (UID: \"b83b860f-ed6c-46b2-862a-fbda9af7dc89\") " pod="openstack/neutron-db-create-95vbj"
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.819989 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sxnt\" (UniqueName: \"kubernetes.io/projected/1aa06f21-2d35-4d03-86b9-01d9354826da-kube-api-access-9sxnt\") pod \"heat-9d98-account-create-update-t77sf\" (UID: \"1aa06f21-2d35-4d03-86b9-01d9354826da\") " pod="openstack/heat-9d98-account-create-update-t77sf"
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.821557 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2wn7\" (UniqueName: \"kubernetes.io/projected/b04d3edd-a550-465a-9ef2-2cbea4126ceb-kube-api-access-p2wn7\") pod \"heat-db-create-275nk\" (UID: \"b04d3edd-a550-465a-9ef2-2cbea4126ceb\") " pod="openstack/heat-db-create-275nk"
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.833026 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-452kj"]
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.846831 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-537e992e-0c7e-4e28-8105-b535a72a793c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-537e992e-0c7e-4e28-8105-b535a72a793c\") pod \"prometheus-metric-storage-0\" (UID: \"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") " pod="openstack/prometheus-metric-storage-0"
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.861753 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49fea8fc-372c-4cf8-a710-7fff58db294d-config" (OuterVolumeSpecName: "config") pod "49fea8fc-372c-4cf8-a710-7fff58db294d" (UID: "49fea8fc-372c-4cf8-a710-7fff58db294d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.870211 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49fea8fc-372c-4cf8-a710-7fff58db294d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "49fea8fc-372c-4cf8-a710-7fff58db294d" (UID: "49fea8fc-372c-4cf8-a710-7fff58db294d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.886227 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9e4fe10-07d1-4f4e-bb04-5ab11b8ad56d-operator-scripts\") pod \"barbican-23f7-account-create-update-z479t\" (UID: \"f9e4fe10-07d1-4f4e-bb04-5ab11b8ad56d\") " pod="openstack/barbican-23f7-account-create-update-z479t"
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.886290 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkkr5\" (UniqueName: \"kubernetes.io/projected/bea88065-1eff-42e2-809a-443c15bda0ac-kube-api-access-xkkr5\") pod \"barbican-db-create-452kj\" (UID: \"bea88065-1eff-42e2-809a-443c15bda0ac\") " pod="openstack/barbican-db-create-452kj"
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.886364 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lc97\" (UniqueName: \"kubernetes.io/projected/71459d1c-2acb-4e15-a30d-09dd0f7f7951-kube-api-access-4lc97\") pod \"neutron-1082-account-create-update-2jjkd\" (UID: \"71459d1c-2acb-4e15-a30d-09dd0f7f7951\") " pod="openstack/neutron-1082-account-create-update-2jjkd"
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.886423 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71459d1c-2acb-4e15-a30d-09dd0f7f7951-operator-scripts\") pod \"neutron-1082-account-create-update-2jjkd\" (UID: \"71459d1c-2acb-4e15-a30d-09dd0f7f7951\") " pod="openstack/neutron-1082-account-create-update-2jjkd"
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.886456 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bea88065-1eff-42e2-809a-443c15bda0ac-operator-scripts\") pod \"barbican-db-create-452kj\" (UID: \"bea88065-1eff-42e2-809a-443c15bda0ac\") " pod="openstack/barbican-db-create-452kj"
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.886503 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xg2zd\" (UniqueName: \"kubernetes.io/projected/f9e4fe10-07d1-4f4e-bb04-5ab11b8ad56d-kube-api-access-xg2zd\") pod \"barbican-23f7-account-create-update-z479t\" (UID: \"f9e4fe10-07d1-4f4e-bb04-5ab11b8ad56d\") " pod="openstack/barbican-23f7-account-create-update-z479t"
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.886635 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49fea8fc-372c-4cf8-a710-7fff58db294d-config\") on node \"crc\" DevicePath \"\""
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.886654 4898 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/49fea8fc-372c-4cf8-a710-7fff58db294d-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.887088 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9e4fe10-07d1-4f4e-bb04-5ab11b8ad56d-operator-scripts\") pod \"barbican-23f7-account-create-update-z479t\" (UID: \"f9e4fe10-07d1-4f4e-bb04-5ab11b8ad56d\") " pod="openstack/barbican-23f7-account-create-update-z479t"
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.887713 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71459d1c-2acb-4e15-a30d-09dd0f7f7951-operator-scripts\") pod \"neutron-1082-account-create-update-2jjkd\" (UID: \"71459d1c-2acb-4e15-a30d-09dd0f7f7951\") " pod="openstack/neutron-1082-account-create-update-2jjkd"
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.898652 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49fea8fc-372c-4cf8-a710-7fff58db294d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "49fea8fc-372c-4cf8-a710-7fff58db294d" (UID: "49fea8fc-372c-4cf8-a710-7fff58db294d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.940775 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lc97\" (UniqueName: \"kubernetes.io/projected/71459d1c-2acb-4e15-a30d-09dd0f7f7951-kube-api-access-4lc97\") pod \"neutron-1082-account-create-update-2jjkd\" (UID: \"71459d1c-2acb-4e15-a30d-09dd0f7f7951\") " pod="openstack/neutron-1082-account-create-update-2jjkd"
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.957821 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xg2zd\" (UniqueName: \"kubernetes.io/projected/f9e4fe10-07d1-4f4e-bb04-5ab11b8ad56d-kube-api-access-xg2zd\") pod \"barbican-23f7-account-create-update-z479t\" (UID: \"f9e4fe10-07d1-4f4e-bb04-5ab11b8ad56d\") " pod="openstack/barbican-23f7-account-create-update-z479t"
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.988143 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bea88065-1eff-42e2-809a-443c15bda0ac-operator-scripts\") pod \"barbican-db-create-452kj\" (UID: \"bea88065-1eff-42e2-809a-443c15bda0ac\") " pod="openstack/barbican-db-create-452kj"
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.988706 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkkr5\" (UniqueName: \"kubernetes.io/projected/bea88065-1eff-42e2-809a-443c15bda0ac-kube-api-access-xkkr5\") pod \"barbican-db-create-452kj\" (UID: \"bea88065-1eff-42e2-809a-443c15bda0ac\") " pod="openstack/barbican-db-create-452kj"
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.988934 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/49fea8fc-372c-4cf8-a710-7fff58db294d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.989033 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bea88065-1eff-42e2-809a-443c15bda0ac-operator-scripts\") pod \"barbican-db-create-452kj\" (UID: \"bea88065-1eff-42e2-809a-443c15bda0ac\") " pod="openstack/barbican-db-create-452kj"
Mar 13 14:20:21 crc kubenswrapper[4898]: I0313 14:20:21.001852 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Mar 13 14:20:21 crc kubenswrapper[4898]: I0313 14:20:21.008349 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkkr5\" (UniqueName: \"kubernetes.io/projected/bea88065-1eff-42e2-809a-443c15bda0ac-kube-api-access-xkkr5\") pod \"barbican-db-create-452kj\" (UID: \"bea88065-1eff-42e2-809a-443c15bda0ac\") " pod="openstack/barbican-db-create-452kj"
Mar 13 14:20:21 crc kubenswrapper[4898]: I0313 14:20:21.024155 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-4e00-account-create-update-92bgz"]
Mar 13 14:20:21 crc kubenswrapper[4898]: I0313 14:20:21.122448 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-dxpl9"
Mar 13 14:20:21 crc kubenswrapper[4898]: I0313 14:20:21.123145 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-wnv9x"]
Mar 13 14:20:21 crc kubenswrapper[4898]: I0313 14:20:21.132284 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-wnv9x"]
Mar 13 14:20:21 crc kubenswrapper[4898]: I0313 14:20:21.143259 4898 scope.go:117] "RemoveContainer" containerID="f6416d1e5f0d5a80d9a232b6c88402bfa85e5da16423a5a28fb3cfb07228cafb"
Mar 13 14:20:21 crc kubenswrapper[4898]: I0313 14:20:21.151997 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-275nk"
Mar 13 14:20:21 crc kubenswrapper[4898]: I0313 14:20:21.181690 4898 scope.go:117] "RemoveContainer" containerID="8147ab7966f5020767fcc27772ab175b42f8528278c1d7d00f8082b71c3d4288"
Mar 13 14:20:21 crc kubenswrapper[4898]: E0313 14:20:21.185701 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8147ab7966f5020767fcc27772ab175b42f8528278c1d7d00f8082b71c3d4288\": container with ID starting with 8147ab7966f5020767fcc27772ab175b42f8528278c1d7d00f8082b71c3d4288 not found: ID does not exist" containerID="8147ab7966f5020767fcc27772ab175b42f8528278c1d7d00f8082b71c3d4288"
Mar 13 14:20:21 crc kubenswrapper[4898]: I0313 14:20:21.185739 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8147ab7966f5020767fcc27772ab175b42f8528278c1d7d00f8082b71c3d4288"} err="failed to get container status \"8147ab7966f5020767fcc27772ab175b42f8528278c1d7d00f8082b71c3d4288\": rpc error: code = NotFound desc = could not find container \"8147ab7966f5020767fcc27772ab175b42f8528278c1d7d00f8082b71c3d4288\": container with ID starting with 8147ab7966f5020767fcc27772ab175b42f8528278c1d7d00f8082b71c3d4288 not found: ID does not exist"
Mar 13 14:20:21 crc kubenswrapper[4898]: I0313 14:20:21.185761 4898 scope.go:117] "RemoveContainer" containerID="f6416d1e5f0d5a80d9a232b6c88402bfa85e5da16423a5a28fb3cfb07228cafb"
Mar 13 14:20:21 crc kubenswrapper[4898]: E0313 14:20:21.186949 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6416d1e5f0d5a80d9a232b6c88402bfa85e5da16423a5a28fb3cfb07228cafb\": container with ID starting with f6416d1e5f0d5a80d9a232b6c88402bfa85e5da16423a5a28fb3cfb07228cafb not found: ID does not exist" containerID="f6416d1e5f0d5a80d9a232b6c88402bfa85e5da16423a5a28fb3cfb07228cafb"
Mar 13 14:20:21 crc kubenswrapper[4898]: I0313 14:20:21.187008 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6416d1e5f0d5a80d9a232b6c88402bfa85e5da16423a5a28fb3cfb07228cafb"} err="failed to get container status \"f6416d1e5f0d5a80d9a232b6c88402bfa85e5da16423a5a28fb3cfb07228cafb\": rpc error: code = NotFound desc = could not find container \"f6416d1e5f0d5a80d9a232b6c88402bfa85e5da16423a5a28fb3cfb07228cafb\": container with ID starting with f6416d1e5f0d5a80d9a232b6c88402bfa85e5da16423a5a28fb3cfb07228cafb not found: ID does not exist"
Mar 13 14:20:21 crc kubenswrapper[4898]: I0313 14:20:21.196977 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-9d98-account-create-update-t77sf"
Mar 13 14:20:21 crc kubenswrapper[4898]: I0313 14:20:21.199144 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-95vbj"
Mar 13 14:20:21 crc kubenswrapper[4898]: I0313 14:20:21.219345 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-1082-account-create-update-2jjkd"
Mar 13 14:20:21 crc kubenswrapper[4898]: I0313 14:20:21.264265 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-23f7-account-create-update-z479t"
Mar 13 14:20:21 crc kubenswrapper[4898]: I0313 14:20:21.282175 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-452kj"
Mar 13 14:20:21 crc kubenswrapper[4898]: I0313 14:20:21.309842 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-88gdv"]
Mar 13 14:20:21 crc kubenswrapper[4898]: W0313 14:20:21.325047 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8a8516c_5aee_4eae_a59b_498f97c1b92b.slice/crio-0f13e48b7821712cb1bba68d5ed2ca19eb5fbddf9b7a29ac27b157d895fe2e3c WatchSource:0}: Error finding container 0f13e48b7821712cb1bba68d5ed2ca19eb5fbddf9b7a29ac27b157d895fe2e3c: Status 404 returned error can't find the container with id 0f13e48b7821712cb1bba68d5ed2ca19eb5fbddf9b7a29ac27b157d895fe2e3c
Mar 13 14:20:21 crc kubenswrapper[4898]: I0313 14:20:21.520498 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-88gdv" event={"ID":"f8a8516c-5aee-4eae-a59b-498f97c1b92b","Type":"ContainerStarted","Data":"0f13e48b7821712cb1bba68d5ed2ca19eb5fbddf9b7a29ac27b157d895fe2e3c"}
Mar 13 14:20:21 crc kubenswrapper[4898]: I0313 14:20:21.530938 4898 generic.go:334] "Generic (PLEG): container finished" podID="9f1520e0-d7d9-4992-9ca5-1b2e98313d33" containerID="d7d9b7775bd2555a7c4636350359292fb8c65661ac123c18de4ec1ec3c6ad5d5" exitCode=0
Mar 13 14:20:21 crc kubenswrapper[4898]: I0313 14:20:21.531030 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-cz6vf" event={"ID":"9f1520e0-d7d9-4992-9ca5-1b2e98313d33","Type":"ContainerDied","Data":"d7d9b7775bd2555a7c4636350359292fb8c65661ac123c18de4ec1ec3c6ad5d5"}
Mar 13 14:20:21 crc kubenswrapper[4898]: I0313 14:20:21.542932 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-4e00-account-create-update-92bgz" event={"ID":"4fe3416e-f08a-43c9-8e12-a89c1e849208","Type":"ContainerStarted","Data":"c12b1a7614a831f1604c6feaf4bf42c52496fffc128594a26f7d1a8c71f58636"}
Mar 13 14:20:21 crc kubenswrapper[4898]: I0313 14:20:21.542970 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-4e00-account-create-update-92bgz" event={"ID":"4fe3416e-f08a-43c9-8e12-a89c1e849208","Type":"ContainerStarted","Data":"bb908e3bf15681fcffb32fcf09fd96e847d58d3dc20f8731e8a57f7767257538"}
Mar 13 14:20:21 crc kubenswrapper[4898]: I0313 14:20:21.624266 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-4e00-account-create-update-92bgz" podStartSLOduration=2.624246574 podStartE2EDuration="2.624246574s" podCreationTimestamp="2026-03-13 14:20:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:20:21.602344465 +0000 UTC m=+1456.603932724" watchObservedRunningTime="2026-03-13 14:20:21.624246574 +0000 UTC m=+1456.625834813"
Mar 13 14:20:21 crc kubenswrapper[4898]: I0313 14:20:21.704466 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Mar 13 14:20:21 crc kubenswrapper[4898]: I0313 14:20:21.769055 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49fea8fc-372c-4cf8-a710-7fff58db294d" path="/var/lib/kubelet/pods/49fea8fc-372c-4cf8-a710-7fff58db294d/volumes"
Mar 13 14:20:21 crc kubenswrapper[4898]: I0313 14:20:21.811012 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-dxpl9"]
Mar 13 14:20:21 crc kubenswrapper[4898]: I0313 14:20:21.929486 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-275nk"]
Mar 13 14:20:22 crc kubenswrapper[4898]: I0313 14:20:22.118308 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api"
pods=["openstack/heat-9d98-account-create-update-t77sf"] Mar 13 14:20:22 crc kubenswrapper[4898]: E0313 14:20:22.346583 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4fe3416e_f08a_43c9_8e12_a89c1e849208.slice/crio-c12b1a7614a831f1604c6feaf4bf42c52496fffc128594a26f7d1a8c71f58636.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4fe3416e_f08a_43c9_8e12_a89c1e849208.slice/crio-conmon-c12b1a7614a831f1604c6feaf4bf42c52496fffc128594a26f7d1a8c71f58636.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8a8516c_5aee_4eae_a59b_498f97c1b92b.slice/crio-conmon-4e72b0c7dc05dd72f43622aacb47475aacbf01f3e30ece85d27e66c47011712e.scope\": RecentStats: unable to find data in memory cache]" Mar 13 14:20:22 crc kubenswrapper[4898]: W0313 14:20:22.380655 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbea88065_1eff_42e2_809a_443c15bda0ac.slice/crio-c2d35d9d1f6a33fbdc7f1c2681ee05eeaa35e95f2c8bc22309c279e2bc914c17 WatchSource:0}: Error finding container c2d35d9d1f6a33fbdc7f1c2681ee05eeaa35e95f2c8bc22309c279e2bc914c17: Status 404 returned error can't find the container with id c2d35d9d1f6a33fbdc7f1c2681ee05eeaa35e95f2c8bc22309c279e2bc914c17 Mar 13 14:20:22 crc kubenswrapper[4898]: I0313 14:20:22.394977 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-452kj"] Mar 13 14:20:22 crc kubenswrapper[4898]: I0313 14:20:22.426085 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-95vbj"] Mar 13 14:20:22 crc kubenswrapper[4898]: I0313 14:20:22.548724 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/barbican-23f7-account-create-update-z479t"] Mar 13 14:20:22 crc kubenswrapper[4898]: I0313 14:20:22.567611 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-1082-account-create-update-2jjkd"] Mar 13 14:20:22 crc kubenswrapper[4898]: I0313 14:20:22.570760 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-95vbj" event={"ID":"b83b860f-ed6c-46b2-862a-fbda9af7dc89","Type":"ContainerStarted","Data":"a7fdcb25a0c057c7c997cd177fd4771ba973fb7faff5931de0e11cf08d037dc6"} Mar 13 14:20:22 crc kubenswrapper[4898]: I0313 14:20:22.579562 4898 generic.go:334] "Generic (PLEG): container finished" podID="b04d3edd-a550-465a-9ef2-2cbea4126ceb" containerID="860f247abf99986a680fa9cbd71b3ddb7e0e1a4bc671f3a1ca2277312ff69005" exitCode=0 Mar 13 14:20:22 crc kubenswrapper[4898]: I0313 14:20:22.579643 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-275nk" event={"ID":"b04d3edd-a550-465a-9ef2-2cbea4126ceb","Type":"ContainerDied","Data":"860f247abf99986a680fa9cbd71b3ddb7e0e1a4bc671f3a1ca2277312ff69005"} Mar 13 14:20:22 crc kubenswrapper[4898]: I0313 14:20:22.579669 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-275nk" event={"ID":"b04d3edd-a550-465a-9ef2-2cbea4126ceb","Type":"ContainerStarted","Data":"95bbd9bdb5e436c19160a59956edc16f6c2aa59b93b3dae88e1b3df084a217fb"} Mar 13 14:20:22 crc kubenswrapper[4898]: I0313 14:20:22.587727 4898 generic.go:334] "Generic (PLEG): container finished" podID="f8a8516c-5aee-4eae-a59b-498f97c1b92b" containerID="4e72b0c7dc05dd72f43622aacb47475aacbf01f3e30ece85d27e66c47011712e" exitCode=0 Mar 13 14:20:22 crc kubenswrapper[4898]: I0313 14:20:22.587828 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-88gdv" event={"ID":"f8a8516c-5aee-4eae-a59b-498f97c1b92b","Type":"ContainerDied","Data":"4e72b0c7dc05dd72f43622aacb47475aacbf01f3e30ece85d27e66c47011712e"} Mar 13 14:20:22 crc 
kubenswrapper[4898]: I0313 14:20:22.604633 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-9d98-account-create-update-t77sf" event={"ID":"1aa06f21-2d35-4d03-86b9-01d9354826da","Type":"ContainerStarted","Data":"5f41cf46e25c4bfb4c3cb12738ce8eaa95c275888249cc79416494629ec3b64b"} Mar 13 14:20:22 crc kubenswrapper[4898]: I0313 14:20:22.604702 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-9d98-account-create-update-t77sf" event={"ID":"1aa06f21-2d35-4d03-86b9-01d9354826da","Type":"ContainerStarted","Data":"32ad0f4bc955a1c213252ea528d35e7b749f8d4ec4d59c71217cfbe39f2386ae"} Mar 13 14:20:22 crc kubenswrapper[4898]: I0313 14:20:22.607716 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-cz6vf" event={"ID":"9f1520e0-d7d9-4992-9ca5-1b2e98313d33","Type":"ContainerStarted","Data":"ce178a8ab6289ecc30ada8030035df3e75b119b24ae01060c42a05e394597ae3"} Mar 13 14:20:22 crc kubenswrapper[4898]: I0313 14:20:22.607881 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74f6bcbc87-cz6vf" Mar 13 14:20:22 crc kubenswrapper[4898]: I0313 14:20:22.612491 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-dxpl9" event={"ID":"8fdab36c-41db-4a9c-9cbe-47e1761c6df5","Type":"ContainerStarted","Data":"4a05de9aa49849a88ec18bd7a217b2e7749103f27e3a95daf9185dac7265effb"} Mar 13 14:20:22 crc kubenswrapper[4898]: I0313 14:20:22.619748 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d555bd54-f4d5-4b06-9517-32b4fe687f4b","Type":"ContainerStarted","Data":"394a9c444dfa8cc8c8fe8e8d08721ac7bdec485308f43f3ae940f284836aa510"} Mar 13 14:20:22 crc kubenswrapper[4898]: I0313 14:20:22.623132 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-452kj" 
event={"ID":"bea88065-1eff-42e2-809a-443c15bda0ac","Type":"ContainerStarted","Data":"c2d35d9d1f6a33fbdc7f1c2681ee05eeaa35e95f2c8bc22309c279e2bc914c17"} Mar 13 14:20:22 crc kubenswrapper[4898]: I0313 14:20:22.625997 4898 generic.go:334] "Generic (PLEG): container finished" podID="4fe3416e-f08a-43c9-8e12-a89c1e849208" containerID="c12b1a7614a831f1604c6feaf4bf42c52496fffc128594a26f7d1a8c71f58636" exitCode=0 Mar 13 14:20:22 crc kubenswrapper[4898]: I0313 14:20:22.626044 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-4e00-account-create-update-92bgz" event={"ID":"4fe3416e-f08a-43c9-8e12-a89c1e849208","Type":"ContainerDied","Data":"c12b1a7614a831f1604c6feaf4bf42c52496fffc128594a26f7d1a8c71f58636"} Mar 13 14:20:22 crc kubenswrapper[4898]: I0313 14:20:22.654397 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74f6bcbc87-cz6vf" podStartSLOduration=4.654382405 podStartE2EDuration="4.654382405s" podCreationTimestamp="2026-03-13 14:20:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:20:22.647598169 +0000 UTC m=+1457.649186418" watchObservedRunningTime="2026-03-13 14:20:22.654382405 +0000 UTC m=+1457.655970644" Mar 13 14:20:22 crc kubenswrapper[4898]: I0313 14:20:22.685645 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-9d98-account-create-update-t77sf" podStartSLOduration=2.685625186 podStartE2EDuration="2.685625186s" podCreationTimestamp="2026-03-13 14:20:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:20:22.665951015 +0000 UTC m=+1457.667539264" watchObservedRunningTime="2026-03-13 14:20:22.685625186 +0000 UTC m=+1457.687213425" Mar 13 14:20:23 crc kubenswrapper[4898]: I0313 14:20:23.655332 4898 generic.go:334] "Generic (PLEG): container 
finished" podID="b83b860f-ed6c-46b2-862a-fbda9af7dc89" containerID="3c9e54e03326bbf4751dd95fa7e9b1825e9af82b1eefdf09759081304dd57de9" exitCode=0 Mar 13 14:20:23 crc kubenswrapper[4898]: I0313 14:20:23.655426 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-95vbj" event={"ID":"b83b860f-ed6c-46b2-862a-fbda9af7dc89","Type":"ContainerDied","Data":"3c9e54e03326bbf4751dd95fa7e9b1825e9af82b1eefdf09759081304dd57de9"} Mar 13 14:20:23 crc kubenswrapper[4898]: I0313 14:20:23.657687 4898 generic.go:334] "Generic (PLEG): container finished" podID="bea88065-1eff-42e2-809a-443c15bda0ac" containerID="9d4412724b9aeab2fb38e3b120d6b80b5959d7a8a33631247f92a023f5b56a70" exitCode=0 Mar 13 14:20:23 crc kubenswrapper[4898]: I0313 14:20:23.657735 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-452kj" event={"ID":"bea88065-1eff-42e2-809a-443c15bda0ac","Type":"ContainerDied","Data":"9d4412724b9aeab2fb38e3b120d6b80b5959d7a8a33631247f92a023f5b56a70"} Mar 13 14:20:23 crc kubenswrapper[4898]: I0313 14:20:23.662851 4898 generic.go:334] "Generic (PLEG): container finished" podID="71459d1c-2acb-4e15-a30d-09dd0f7f7951" containerID="7890927e1d3da3f2b5ae266b74631a44cf3eea829b7cc9b79f5ffe9476b7f6a0" exitCode=0 Mar 13 14:20:23 crc kubenswrapper[4898]: I0313 14:20:23.662945 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-1082-account-create-update-2jjkd" event={"ID":"71459d1c-2acb-4e15-a30d-09dd0f7f7951","Type":"ContainerDied","Data":"7890927e1d3da3f2b5ae266b74631a44cf3eea829b7cc9b79f5ffe9476b7f6a0"} Mar 13 14:20:23 crc kubenswrapper[4898]: I0313 14:20:23.662971 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-1082-account-create-update-2jjkd" event={"ID":"71459d1c-2acb-4e15-a30d-09dd0f7f7951","Type":"ContainerStarted","Data":"fb97c5d92d819436e097d301a10d6e1823950e95217ea57b4fa7ae57304a395a"} Mar 13 14:20:23 crc kubenswrapper[4898]: I0313 14:20:23.664357 4898 
generic.go:334] "Generic (PLEG): container finished" podID="1aa06f21-2d35-4d03-86b9-01d9354826da" containerID="5f41cf46e25c4bfb4c3cb12738ce8eaa95c275888249cc79416494629ec3b64b" exitCode=0 Mar 13 14:20:23 crc kubenswrapper[4898]: I0313 14:20:23.664398 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-9d98-account-create-update-t77sf" event={"ID":"1aa06f21-2d35-4d03-86b9-01d9354826da","Type":"ContainerDied","Data":"5f41cf46e25c4bfb4c3cb12738ce8eaa95c275888249cc79416494629ec3b64b"} Mar 13 14:20:23 crc kubenswrapper[4898]: I0313 14:20:23.665977 4898 generic.go:334] "Generic (PLEG): container finished" podID="f9e4fe10-07d1-4f4e-bb04-5ab11b8ad56d" containerID="89083a9a998b87f99f34c5645b2e669a886f2f7b20e68795829134c5acb24ac6" exitCode=0 Mar 13 14:20:23 crc kubenswrapper[4898]: I0313 14:20:23.666020 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-23f7-account-create-update-z479t" event={"ID":"f9e4fe10-07d1-4f4e-bb04-5ab11b8ad56d","Type":"ContainerDied","Data":"89083a9a998b87f99f34c5645b2e669a886f2f7b20e68795829134c5acb24ac6"} Mar 13 14:20:23 crc kubenswrapper[4898]: I0313 14:20:23.666035 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-23f7-account-create-update-z479t" event={"ID":"f9e4fe10-07d1-4f4e-bb04-5ab11b8ad56d","Type":"ContainerStarted","Data":"49bd09daaa7c514816a284920d5932365e68f9d24f10691ae4161b7348f99fc9"} Mar 13 14:20:23 crc kubenswrapper[4898]: I0313 14:20:23.883068 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-pxtss"] Mar 13 14:20:23 crc kubenswrapper[4898]: I0313 14:20:23.884942 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-pxtss" Mar 13 14:20:23 crc kubenswrapper[4898]: I0313 14:20:23.886830 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 13 14:20:23 crc kubenswrapper[4898]: I0313 14:20:23.891256 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-pxtss"] Mar 13 14:20:23 crc kubenswrapper[4898]: I0313 14:20:23.990652 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4tzj\" (UniqueName: \"kubernetes.io/projected/74eb351d-364c-4564-8f8b-67ac844a6abc-kube-api-access-f4tzj\") pod \"root-account-create-update-pxtss\" (UID: \"74eb351d-364c-4564-8f8b-67ac844a6abc\") " pod="openstack/root-account-create-update-pxtss" Mar 13 14:20:23 crc kubenswrapper[4898]: I0313 14:20:23.991176 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74eb351d-364c-4564-8f8b-67ac844a6abc-operator-scripts\") pod \"root-account-create-update-pxtss\" (UID: \"74eb351d-364c-4564-8f8b-67ac844a6abc\") " pod="openstack/root-account-create-update-pxtss" Mar 13 14:20:24 crc kubenswrapper[4898]: I0313 14:20:24.093371 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74eb351d-364c-4564-8f8b-67ac844a6abc-operator-scripts\") pod \"root-account-create-update-pxtss\" (UID: \"74eb351d-364c-4564-8f8b-67ac844a6abc\") " pod="openstack/root-account-create-update-pxtss" Mar 13 14:20:24 crc kubenswrapper[4898]: I0313 14:20:24.093465 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4tzj\" (UniqueName: \"kubernetes.io/projected/74eb351d-364c-4564-8f8b-67ac844a6abc-kube-api-access-f4tzj\") pod \"root-account-create-update-pxtss\" (UID: 
\"74eb351d-364c-4564-8f8b-67ac844a6abc\") " pod="openstack/root-account-create-update-pxtss" Mar 13 14:20:24 crc kubenswrapper[4898]: I0313 14:20:24.093996 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74eb351d-364c-4564-8f8b-67ac844a6abc-operator-scripts\") pod \"root-account-create-update-pxtss\" (UID: \"74eb351d-364c-4564-8f8b-67ac844a6abc\") " pod="openstack/root-account-create-update-pxtss" Mar 13 14:20:24 crc kubenswrapper[4898]: I0313 14:20:24.238428 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4tzj\" (UniqueName: \"kubernetes.io/projected/74eb351d-364c-4564-8f8b-67ac844a6abc-kube-api-access-f4tzj\") pod \"root-account-create-update-pxtss\" (UID: \"74eb351d-364c-4564-8f8b-67ac844a6abc\") " pod="openstack/root-account-create-update-pxtss" Mar 13 14:20:24 crc kubenswrapper[4898]: I0313 14:20:24.368010 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-88gdv" Mar 13 14:20:24 crc kubenswrapper[4898]: I0313 14:20:24.374681 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-4e00-account-create-update-92bgz" Mar 13 14:20:24 crc kubenswrapper[4898]: I0313 14:20:24.393039 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-275nk" Mar 13 14:20:24 crc kubenswrapper[4898]: I0313 14:20:24.500613 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5xdf\" (UniqueName: \"kubernetes.io/projected/4fe3416e-f08a-43c9-8e12-a89c1e849208-kube-api-access-w5xdf\") pod \"4fe3416e-f08a-43c9-8e12-a89c1e849208\" (UID: \"4fe3416e-f08a-43c9-8e12-a89c1e849208\") " Mar 13 14:20:24 crc kubenswrapper[4898]: I0313 14:20:24.500702 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b04d3edd-a550-465a-9ef2-2cbea4126ceb-operator-scripts\") pod \"b04d3edd-a550-465a-9ef2-2cbea4126ceb\" (UID: \"b04d3edd-a550-465a-9ef2-2cbea4126ceb\") " Mar 13 14:20:24 crc kubenswrapper[4898]: I0313 14:20:24.500777 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4fe3416e-f08a-43c9-8e12-a89c1e849208-operator-scripts\") pod \"4fe3416e-f08a-43c9-8e12-a89c1e849208\" (UID: \"4fe3416e-f08a-43c9-8e12-a89c1e849208\") " Mar 13 14:20:24 crc kubenswrapper[4898]: I0313 14:20:24.500816 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2wn7\" (UniqueName: \"kubernetes.io/projected/b04d3edd-a550-465a-9ef2-2cbea4126ceb-kube-api-access-p2wn7\") pod \"b04d3edd-a550-465a-9ef2-2cbea4126ceb\" (UID: \"b04d3edd-a550-465a-9ef2-2cbea4126ceb\") " Mar 13 14:20:24 crc kubenswrapper[4898]: I0313 14:20:24.500867 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8a8516c-5aee-4eae-a59b-498f97c1b92b-operator-scripts\") pod \"f8a8516c-5aee-4eae-a59b-498f97c1b92b\" (UID: \"f8a8516c-5aee-4eae-a59b-498f97c1b92b\") " Mar 13 14:20:24 crc kubenswrapper[4898]: I0313 14:20:24.500923 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-2bzvr\" (UniqueName: \"kubernetes.io/projected/f8a8516c-5aee-4eae-a59b-498f97c1b92b-kube-api-access-2bzvr\") pod \"f8a8516c-5aee-4eae-a59b-498f97c1b92b\" (UID: \"f8a8516c-5aee-4eae-a59b-498f97c1b92b\") " Mar 13 14:20:24 crc kubenswrapper[4898]: I0313 14:20:24.501488 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4fe3416e-f08a-43c9-8e12-a89c1e849208-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4fe3416e-f08a-43c9-8e12-a89c1e849208" (UID: "4fe3416e-f08a-43c9-8e12-a89c1e849208"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:20:24 crc kubenswrapper[4898]: I0313 14:20:24.501485 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b04d3edd-a550-465a-9ef2-2cbea4126ceb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b04d3edd-a550-465a-9ef2-2cbea4126ceb" (UID: "b04d3edd-a550-465a-9ef2-2cbea4126ceb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:20:24 crc kubenswrapper[4898]: I0313 14:20:24.501557 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8a8516c-5aee-4eae-a59b-498f97c1b92b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f8a8516c-5aee-4eae-a59b-498f97c1b92b" (UID: "f8a8516c-5aee-4eae-a59b-498f97c1b92b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:20:24 crc kubenswrapper[4898]: I0313 14:20:24.508288 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b04d3edd-a550-465a-9ef2-2cbea4126ceb-kube-api-access-p2wn7" (OuterVolumeSpecName: "kube-api-access-p2wn7") pod "b04d3edd-a550-465a-9ef2-2cbea4126ceb" (UID: "b04d3edd-a550-465a-9ef2-2cbea4126ceb"). InnerVolumeSpecName "kube-api-access-p2wn7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:20:24 crc kubenswrapper[4898]: I0313 14:20:24.508644 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-pxtss" Mar 13 14:20:24 crc kubenswrapper[4898]: I0313 14:20:24.508782 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8a8516c-5aee-4eae-a59b-498f97c1b92b-kube-api-access-2bzvr" (OuterVolumeSpecName: "kube-api-access-2bzvr") pod "f8a8516c-5aee-4eae-a59b-498f97c1b92b" (UID: "f8a8516c-5aee-4eae-a59b-498f97c1b92b"). InnerVolumeSpecName "kube-api-access-2bzvr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:20:24 crc kubenswrapper[4898]: I0313 14:20:24.532053 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fe3416e-f08a-43c9-8e12-a89c1e849208-kube-api-access-w5xdf" (OuterVolumeSpecName: "kube-api-access-w5xdf") pod "4fe3416e-f08a-43c9-8e12-a89c1e849208" (UID: "4fe3416e-f08a-43c9-8e12-a89c1e849208"). InnerVolumeSpecName "kube-api-access-w5xdf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:20:24 crc kubenswrapper[4898]: I0313 14:20:24.603158 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b04d3edd-a550-465a-9ef2-2cbea4126ceb-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:24 crc kubenswrapper[4898]: I0313 14:20:24.603205 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4fe3416e-f08a-43c9-8e12-a89c1e849208-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:24 crc kubenswrapper[4898]: I0313 14:20:24.603219 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2wn7\" (UniqueName: \"kubernetes.io/projected/b04d3edd-a550-465a-9ef2-2cbea4126ceb-kube-api-access-p2wn7\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:24 crc kubenswrapper[4898]: I0313 14:20:24.603238 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8a8516c-5aee-4eae-a59b-498f97c1b92b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:24 crc kubenswrapper[4898]: I0313 14:20:24.603252 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2bzvr\" (UniqueName: \"kubernetes.io/projected/f8a8516c-5aee-4eae-a59b-498f97c1b92b-kube-api-access-2bzvr\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:24 crc kubenswrapper[4898]: I0313 14:20:24.603265 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5xdf\" (UniqueName: \"kubernetes.io/projected/4fe3416e-f08a-43c9-8e12-a89c1e849208-kube-api-access-w5xdf\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:24 crc kubenswrapper[4898]: I0313 14:20:24.679584 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-4e00-account-create-update-92bgz" Mar 13 14:20:24 crc kubenswrapper[4898]: I0313 14:20:24.679535 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-4e00-account-create-update-92bgz" event={"ID":"4fe3416e-f08a-43c9-8e12-a89c1e849208","Type":"ContainerDied","Data":"bb908e3bf15681fcffb32fcf09fd96e847d58d3dc20f8731e8a57f7767257538"} Mar 13 14:20:24 crc kubenswrapper[4898]: I0313 14:20:24.679715 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb908e3bf15681fcffb32fcf09fd96e847d58d3dc20f8731e8a57f7767257538" Mar 13 14:20:24 crc kubenswrapper[4898]: I0313 14:20:24.682329 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-88gdv" event={"ID":"f8a8516c-5aee-4eae-a59b-498f97c1b92b","Type":"ContainerDied","Data":"0f13e48b7821712cb1bba68d5ed2ca19eb5fbddf9b7a29ac27b157d895fe2e3c"} Mar 13 14:20:24 crc kubenswrapper[4898]: I0313 14:20:24.682358 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f13e48b7821712cb1bba68d5ed2ca19eb5fbddf9b7a29ac27b157d895fe2e3c" Mar 13 14:20:24 crc kubenswrapper[4898]: I0313 14:20:24.682398 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-88gdv" Mar 13 14:20:24 crc kubenswrapper[4898]: I0313 14:20:24.694983 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d555bd54-f4d5-4b06-9517-32b4fe687f4b","Type":"ContainerStarted","Data":"3c90dc434ae18a1a3a436a05040e009b821f261e42f9834113cbf08fabe109e3"} Mar 13 14:20:24 crc kubenswrapper[4898]: I0313 14:20:24.700435 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-275nk" event={"ID":"b04d3edd-a550-465a-9ef2-2cbea4126ceb","Type":"ContainerDied","Data":"95bbd9bdb5e436c19160a59956edc16f6c2aa59b93b3dae88e1b3df084a217fb"} Mar 13 14:20:24 crc kubenswrapper[4898]: I0313 14:20:24.700481 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="95bbd9bdb5e436c19160a59956edc16f6c2aa59b93b3dae88e1b3df084a217fb" Mar 13 14:20:24 crc kubenswrapper[4898]: I0313 14:20:24.700503 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-275nk" Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.659449 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-9d98-account-create-update-t77sf" Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.664356 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-23f7-account-create-update-z479t" Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.680361 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-452kj" Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.708085 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-1082-account-create-update-2jjkd" Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.721729 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-95vbj" Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.751757 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9sxnt\" (UniqueName: \"kubernetes.io/projected/1aa06f21-2d35-4d03-86b9-01d9354826da-kube-api-access-9sxnt\") pod \"1aa06f21-2d35-4d03-86b9-01d9354826da\" (UID: \"1aa06f21-2d35-4d03-86b9-01d9354826da\") " Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.751979 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkkr5\" (UniqueName: \"kubernetes.io/projected/bea88065-1eff-42e2-809a-443c15bda0ac-kube-api-access-xkkr5\") pod \"bea88065-1eff-42e2-809a-443c15bda0ac\" (UID: \"bea88065-1eff-42e2-809a-443c15bda0ac\") " Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.752096 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1aa06f21-2d35-4d03-86b9-01d9354826da-operator-scripts\") pod \"1aa06f21-2d35-4d03-86b9-01d9354826da\" (UID: \"1aa06f21-2d35-4d03-86b9-01d9354826da\") " Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.752147 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4lc97\" (UniqueName: \"kubernetes.io/projected/71459d1c-2acb-4e15-a30d-09dd0f7f7951-kube-api-access-4lc97\") pod \"71459d1c-2acb-4e15-a30d-09dd0f7f7951\" (UID: \"71459d1c-2acb-4e15-a30d-09dd0f7f7951\") " Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.752197 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b83b860f-ed6c-46b2-862a-fbda9af7dc89-operator-scripts\") pod \"b83b860f-ed6c-46b2-862a-fbda9af7dc89\" (UID: \"b83b860f-ed6c-46b2-862a-fbda9af7dc89\") " Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.752265 4898 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-xg2zd\" (UniqueName: \"kubernetes.io/projected/f9e4fe10-07d1-4f4e-bb04-5ab11b8ad56d-kube-api-access-xg2zd\") pod \"f9e4fe10-07d1-4f4e-bb04-5ab11b8ad56d\" (UID: \"f9e4fe10-07d1-4f4e-bb04-5ab11b8ad56d\") " Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.752389 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9e4fe10-07d1-4f4e-bb04-5ab11b8ad56d-operator-scripts\") pod \"f9e4fe10-07d1-4f4e-bb04-5ab11b8ad56d\" (UID: \"f9e4fe10-07d1-4f4e-bb04-5ab11b8ad56d\") " Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.752471 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crvxm\" (UniqueName: \"kubernetes.io/projected/b83b860f-ed6c-46b2-862a-fbda9af7dc89-kube-api-access-crvxm\") pod \"b83b860f-ed6c-46b2-862a-fbda9af7dc89\" (UID: \"b83b860f-ed6c-46b2-862a-fbda9af7dc89\") " Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.752574 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71459d1c-2acb-4e15-a30d-09dd0f7f7951-operator-scripts\") pod \"71459d1c-2acb-4e15-a30d-09dd0f7f7951\" (UID: \"71459d1c-2acb-4e15-a30d-09dd0f7f7951\") " Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.752662 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bea88065-1eff-42e2-809a-443c15bda0ac-operator-scripts\") pod \"bea88065-1eff-42e2-809a-443c15bda0ac\" (UID: \"bea88065-1eff-42e2-809a-443c15bda0ac\") " Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.754760 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1aa06f21-2d35-4d03-86b9-01d9354826da-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"1aa06f21-2d35-4d03-86b9-01d9354826da" (UID: "1aa06f21-2d35-4d03-86b9-01d9354826da"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.755193 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9e4fe10-07d1-4f4e-bb04-5ab11b8ad56d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f9e4fe10-07d1-4f4e-bb04-5ab11b8ad56d" (UID: "f9e4fe10-07d1-4f4e-bb04-5ab11b8ad56d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.756550 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b83b860f-ed6c-46b2-862a-fbda9af7dc89-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b83b860f-ed6c-46b2-862a-fbda9af7dc89" (UID: "b83b860f-ed6c-46b2-862a-fbda9af7dc89"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.759489 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bea88065-1eff-42e2-809a-443c15bda0ac-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bea88065-1eff-42e2-809a-443c15bda0ac" (UID: "bea88065-1eff-42e2-809a-443c15bda0ac"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.762450 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bea88065-1eff-42e2-809a-443c15bda0ac-kube-api-access-xkkr5" (OuterVolumeSpecName: "kube-api-access-xkkr5") pod "bea88065-1eff-42e2-809a-443c15bda0ac" (UID: "bea88065-1eff-42e2-809a-443c15bda0ac"). InnerVolumeSpecName "kube-api-access-xkkr5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.762618 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9e4fe10-07d1-4f4e-bb04-5ab11b8ad56d-kube-api-access-xg2zd" (OuterVolumeSpecName: "kube-api-access-xg2zd") pod "f9e4fe10-07d1-4f4e-bb04-5ab11b8ad56d" (UID: "f9e4fe10-07d1-4f4e-bb04-5ab11b8ad56d"). InnerVolumeSpecName "kube-api-access-xg2zd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.768627 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkkr5\" (UniqueName: \"kubernetes.io/projected/bea88065-1eff-42e2-809a-443c15bda0ac-kube-api-access-xkkr5\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.768664 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1aa06f21-2d35-4d03-86b9-01d9354826da-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.768685 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b83b860f-ed6c-46b2-862a-fbda9af7dc89-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.768697 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xg2zd\" (UniqueName: \"kubernetes.io/projected/f9e4fe10-07d1-4f4e-bb04-5ab11b8ad56d-kube-api-access-xg2zd\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.768710 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9e4fe10-07d1-4f4e-bb04-5ab11b8ad56d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.768723 4898 reconciler_common.go:293] 
"Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bea88065-1eff-42e2-809a-443c15bda0ac-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.774103 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-452kj" Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.774174 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-452kj" event={"ID":"bea88065-1eff-42e2-809a-443c15bda0ac","Type":"ContainerDied","Data":"c2d35d9d1f6a33fbdc7f1c2681ee05eeaa35e95f2c8bc22309c279e2bc914c17"} Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.774219 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2d35d9d1f6a33fbdc7f1c2681ee05eeaa35e95f2c8bc22309c279e2bc914c17" Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.774574 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b83b860f-ed6c-46b2-862a-fbda9af7dc89-kube-api-access-crvxm" (OuterVolumeSpecName: "kube-api-access-crvxm") pod "b83b860f-ed6c-46b2-862a-fbda9af7dc89" (UID: "b83b860f-ed6c-46b2-862a-fbda9af7dc89"). InnerVolumeSpecName "kube-api-access-crvxm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.778748 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-1082-account-create-update-2jjkd" Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.781179 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-9d98-account-create-update-t77sf" Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.782174 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71459d1c-2acb-4e15-a30d-09dd0f7f7951-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "71459d1c-2acb-4e15-a30d-09dd0f7f7951" (UID: "71459d1c-2acb-4e15-a30d-09dd0f7f7951"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.783912 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-1082-account-create-update-2jjkd" event={"ID":"71459d1c-2acb-4e15-a30d-09dd0f7f7951","Type":"ContainerDied","Data":"fb97c5d92d819436e097d301a10d6e1823950e95217ea57b4fa7ae57304a395a"} Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.783948 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb97c5d92d819436e097d301a10d6e1823950e95217ea57b4fa7ae57304a395a" Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.783998 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-9d98-account-create-update-t77sf" event={"ID":"1aa06f21-2d35-4d03-86b9-01d9354826da","Type":"ContainerDied","Data":"32ad0f4bc955a1c213252ea528d35e7b749f8d4ec4d59c71217cfbe39f2386ae"} Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.784009 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="32ad0f4bc955a1c213252ea528d35e7b749f8d4ec4d59c71217cfbe39f2386ae" Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.785421 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-23f7-account-create-update-z479t" event={"ID":"f9e4fe10-07d1-4f4e-bb04-5ab11b8ad56d","Type":"ContainerDied","Data":"49bd09daaa7c514816a284920d5932365e68f9d24f10691ae4161b7348f99fc9"} Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.785460 
4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="49bd09daaa7c514816a284920d5932365e68f9d24f10691ae4161b7348f99fc9" Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.785532 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-23f7-account-create-update-z479t" Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.791939 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-95vbj" event={"ID":"b83b860f-ed6c-46b2-862a-fbda9af7dc89","Type":"ContainerDied","Data":"a7fdcb25a0c057c7c997cd177fd4771ba973fb7faff5931de0e11cf08d037dc6"} Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.791989 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7fdcb25a0c057c7c997cd177fd4771ba973fb7faff5931de0e11cf08d037dc6" Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.792082 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-95vbj" Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.793045 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1aa06f21-2d35-4d03-86b9-01d9354826da-kube-api-access-9sxnt" (OuterVolumeSpecName: "kube-api-access-9sxnt") pod "1aa06f21-2d35-4d03-86b9-01d9354826da" (UID: "1aa06f21-2d35-4d03-86b9-01d9354826da"). InnerVolumeSpecName "kube-api-access-9sxnt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.795811 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71459d1c-2acb-4e15-a30d-09dd0f7f7951-kube-api-access-4lc97" (OuterVolumeSpecName: "kube-api-access-4lc97") pod "71459d1c-2acb-4e15-a30d-09dd0f7f7951" (UID: "71459d1c-2acb-4e15-a30d-09dd0f7f7951"). InnerVolumeSpecName "kube-api-access-4lc97". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.870250 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crvxm\" (UniqueName: \"kubernetes.io/projected/b83b860f-ed6c-46b2-862a-fbda9af7dc89-kube-api-access-crvxm\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.870278 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71459d1c-2acb-4e15-a30d-09dd0f7f7951-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.870288 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9sxnt\" (UniqueName: \"kubernetes.io/projected/1aa06f21-2d35-4d03-86b9-01d9354826da-kube-api-access-9sxnt\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.870296 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4lc97\" (UniqueName: \"kubernetes.io/projected/71459d1c-2acb-4e15-a30d-09dd0f7f7951-kube-api-access-4lc97\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:29 crc kubenswrapper[4898]: I0313 14:20:29.006321 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-pxtss"] Mar 13 14:20:29 crc kubenswrapper[4898]: I0313 14:20:29.086121 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-74f6bcbc87-cz6vf" Mar 13 14:20:29 crc kubenswrapper[4898]: I0313 14:20:29.226161 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-2sp5q"] Mar 13 14:20:29 crc kubenswrapper[4898]: I0313 14:20:29.226739 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-2sp5q" podUID="a37db268-4fcb-45a7-a7bf-fae19a514257" containerName="dnsmasq-dns" 
containerID="cri-o://50431c87d4fda7b6d1207e4343981bd67d4f748124f89954c845f5c4fb0d25f7" gracePeriod=10 Mar 13 14:20:29 crc kubenswrapper[4898]: I0313 14:20:29.830879 4898 generic.go:334] "Generic (PLEG): container finished" podID="74eb351d-364c-4564-8f8b-67ac844a6abc" containerID="58affd3294e9aec78373844bf6912651079de0e76c0d060a1cf7a048a7bc787d" exitCode=0 Mar 13 14:20:29 crc kubenswrapper[4898]: I0313 14:20:29.831253 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-pxtss" event={"ID":"74eb351d-364c-4564-8f8b-67ac844a6abc","Type":"ContainerDied","Data":"58affd3294e9aec78373844bf6912651079de0e76c0d060a1cf7a048a7bc787d"} Mar 13 14:20:29 crc kubenswrapper[4898]: I0313 14:20:29.831307 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-pxtss" event={"ID":"74eb351d-364c-4564-8f8b-67ac844a6abc","Type":"ContainerStarted","Data":"8d0d979c82cb36af5b7db53315fdb00e1e53cc947a1bb29c77b9196d7df5e3d4"} Mar 13 14:20:29 crc kubenswrapper[4898]: I0313 14:20:29.834096 4898 generic.go:334] "Generic (PLEG): container finished" podID="a37db268-4fcb-45a7-a7bf-fae19a514257" containerID="50431c87d4fda7b6d1207e4343981bd67d4f748124f89954c845f5c4fb0d25f7" exitCode=0 Mar 13 14:20:29 crc kubenswrapper[4898]: I0313 14:20:29.834186 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-2sp5q" event={"ID":"a37db268-4fcb-45a7-a7bf-fae19a514257","Type":"ContainerDied","Data":"50431c87d4fda7b6d1207e4343981bd67d4f748124f89954c845f5c4fb0d25f7"} Mar 13 14:20:29 crc kubenswrapper[4898]: I0313 14:20:29.849088 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-dxpl9" event={"ID":"8fdab36c-41db-4a9c-9cbe-47e1761c6df5","Type":"ContainerStarted","Data":"f8a3e423bebf88995b6d32dd30a81abd86b4e9eab9359f63d75400abe5906505"} Mar 13 14:20:29 crc kubenswrapper[4898]: I0313 14:20:29.936344 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-2sp5q" Mar 13 14:20:29 crc kubenswrapper[4898]: I0313 14:20:29.958034 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-dxpl9" podStartSLOduration=2.621679706 podStartE2EDuration="9.958009074s" podCreationTimestamp="2026-03-13 14:20:20 +0000 UTC" firstStartedPulling="2026-03-13 14:20:21.798091277 +0000 UTC m=+1456.799679516" lastFinishedPulling="2026-03-13 14:20:29.134420645 +0000 UTC m=+1464.136008884" observedRunningTime="2026-03-13 14:20:29.886317173 +0000 UTC m=+1464.887905412" watchObservedRunningTime="2026-03-13 14:20:29.958009074 +0000 UTC m=+1464.959597323" Mar 13 14:20:30 crc kubenswrapper[4898]: I0313 14:20:30.003346 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a37db268-4fcb-45a7-a7bf-fae19a514257-dns-svc\") pod \"a37db268-4fcb-45a7-a7bf-fae19a514257\" (UID: \"a37db268-4fcb-45a7-a7bf-fae19a514257\") " Mar 13 14:20:30 crc kubenswrapper[4898]: I0313 14:20:30.003446 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bw842\" (UniqueName: \"kubernetes.io/projected/a37db268-4fcb-45a7-a7bf-fae19a514257-kube-api-access-bw842\") pod \"a37db268-4fcb-45a7-a7bf-fae19a514257\" (UID: \"a37db268-4fcb-45a7-a7bf-fae19a514257\") " Mar 13 14:20:30 crc kubenswrapper[4898]: I0313 14:20:30.003545 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a37db268-4fcb-45a7-a7bf-fae19a514257-config\") pod \"a37db268-4fcb-45a7-a7bf-fae19a514257\" (UID: \"a37db268-4fcb-45a7-a7bf-fae19a514257\") " Mar 13 14:20:30 crc kubenswrapper[4898]: I0313 14:20:30.003665 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a37db268-4fcb-45a7-a7bf-fae19a514257-ovsdbserver-sb\") pod 
\"a37db268-4fcb-45a7-a7bf-fae19a514257\" (UID: \"a37db268-4fcb-45a7-a7bf-fae19a514257\") " Mar 13 14:20:30 crc kubenswrapper[4898]: I0313 14:20:30.003701 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a37db268-4fcb-45a7-a7bf-fae19a514257-ovsdbserver-nb\") pod \"a37db268-4fcb-45a7-a7bf-fae19a514257\" (UID: \"a37db268-4fcb-45a7-a7bf-fae19a514257\") " Mar 13 14:20:30 crc kubenswrapper[4898]: I0313 14:20:30.008509 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a37db268-4fcb-45a7-a7bf-fae19a514257-kube-api-access-bw842" (OuterVolumeSpecName: "kube-api-access-bw842") pod "a37db268-4fcb-45a7-a7bf-fae19a514257" (UID: "a37db268-4fcb-45a7-a7bf-fae19a514257"). InnerVolumeSpecName "kube-api-access-bw842". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:20:30 crc kubenswrapper[4898]: I0313 14:20:30.061629 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a37db268-4fcb-45a7-a7bf-fae19a514257-config" (OuterVolumeSpecName: "config") pod "a37db268-4fcb-45a7-a7bf-fae19a514257" (UID: "a37db268-4fcb-45a7-a7bf-fae19a514257"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:20:30 crc kubenswrapper[4898]: I0313 14:20:30.072174 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a37db268-4fcb-45a7-a7bf-fae19a514257-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a37db268-4fcb-45a7-a7bf-fae19a514257" (UID: "a37db268-4fcb-45a7-a7bf-fae19a514257"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:20:30 crc kubenswrapper[4898]: I0313 14:20:30.072365 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a37db268-4fcb-45a7-a7bf-fae19a514257-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a37db268-4fcb-45a7-a7bf-fae19a514257" (UID: "a37db268-4fcb-45a7-a7bf-fae19a514257"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:20:30 crc kubenswrapper[4898]: I0313 14:20:30.081558 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a37db268-4fcb-45a7-a7bf-fae19a514257-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a37db268-4fcb-45a7-a7bf-fae19a514257" (UID: "a37db268-4fcb-45a7-a7bf-fae19a514257"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:20:30 crc kubenswrapper[4898]: I0313 14:20:30.105554 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a37db268-4fcb-45a7-a7bf-fae19a514257-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:30 crc kubenswrapper[4898]: I0313 14:20:30.105581 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a37db268-4fcb-45a7-a7bf-fae19a514257-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:30 crc kubenswrapper[4898]: I0313 14:20:30.105590 4898 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a37db268-4fcb-45a7-a7bf-fae19a514257-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:30 crc kubenswrapper[4898]: I0313 14:20:30.105599 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bw842\" (UniqueName: \"kubernetes.io/projected/a37db268-4fcb-45a7-a7bf-fae19a514257-kube-api-access-bw842\") on node \"crc\" 
DevicePath \"\"" Mar 13 14:20:30 crc kubenswrapper[4898]: I0313 14:20:30.105610 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a37db268-4fcb-45a7-a7bf-fae19a514257-config\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:30 crc kubenswrapper[4898]: I0313 14:20:30.872736 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-2sp5q" event={"ID":"a37db268-4fcb-45a7-a7bf-fae19a514257","Type":"ContainerDied","Data":"da5d3ec372689f6af3cb9e875471efd89602096089c49f7ef2acb131bc222cb5"} Mar 13 14:20:30 crc kubenswrapper[4898]: I0313 14:20:30.873010 4898 scope.go:117] "RemoveContainer" containerID="50431c87d4fda7b6d1207e4343981bd67d4f748124f89954c845f5c4fb0d25f7" Mar 13 14:20:30 crc kubenswrapper[4898]: I0313 14:20:30.872768 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-2sp5q" Mar 13 14:20:30 crc kubenswrapper[4898]: I0313 14:20:30.917373 4898 scope.go:117] "RemoveContainer" containerID="de23e3ccb82eedd2170e1cd3b17cd796af8186141a4e3a6a0df25cf87c1ac689" Mar 13 14:20:30 crc kubenswrapper[4898]: I0313 14:20:30.925525 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-2sp5q"] Mar 13 14:20:30 crc kubenswrapper[4898]: I0313 14:20:30.937915 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-2sp5q"] Mar 13 14:20:31 crc kubenswrapper[4898]: I0313 14:20:31.391638 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-pxtss" Mar 13 14:20:31 crc kubenswrapper[4898]: I0313 14:20:31.429833 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74eb351d-364c-4564-8f8b-67ac844a6abc-operator-scripts\") pod \"74eb351d-364c-4564-8f8b-67ac844a6abc\" (UID: \"74eb351d-364c-4564-8f8b-67ac844a6abc\") " Mar 13 14:20:31 crc kubenswrapper[4898]: I0313 14:20:31.430077 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4tzj\" (UniqueName: \"kubernetes.io/projected/74eb351d-364c-4564-8f8b-67ac844a6abc-kube-api-access-f4tzj\") pod \"74eb351d-364c-4564-8f8b-67ac844a6abc\" (UID: \"74eb351d-364c-4564-8f8b-67ac844a6abc\") " Mar 13 14:20:31 crc kubenswrapper[4898]: I0313 14:20:31.430606 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74eb351d-364c-4564-8f8b-67ac844a6abc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "74eb351d-364c-4564-8f8b-67ac844a6abc" (UID: "74eb351d-364c-4564-8f8b-67ac844a6abc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:20:31 crc kubenswrapper[4898]: I0313 14:20:31.451356 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74eb351d-364c-4564-8f8b-67ac844a6abc-kube-api-access-f4tzj" (OuterVolumeSpecName: "kube-api-access-f4tzj") pod "74eb351d-364c-4564-8f8b-67ac844a6abc" (UID: "74eb351d-364c-4564-8f8b-67ac844a6abc"). InnerVolumeSpecName "kube-api-access-f4tzj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:20:31 crc kubenswrapper[4898]: I0313 14:20:31.531956 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f4tzj\" (UniqueName: \"kubernetes.io/projected/74eb351d-364c-4564-8f8b-67ac844a6abc-kube-api-access-f4tzj\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:31 crc kubenswrapper[4898]: I0313 14:20:31.531990 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74eb351d-364c-4564-8f8b-67ac844a6abc-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:31 crc kubenswrapper[4898]: I0313 14:20:31.753994 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a37db268-4fcb-45a7-a7bf-fae19a514257" path="/var/lib/kubelet/pods/a37db268-4fcb-45a7-a7bf-fae19a514257/volumes" Mar 13 14:20:31 crc kubenswrapper[4898]: I0313 14:20:31.883360 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-pxtss" event={"ID":"74eb351d-364c-4564-8f8b-67ac844a6abc","Type":"ContainerDied","Data":"8d0d979c82cb36af5b7db53315fdb00e1e53cc947a1bb29c77b9196d7df5e3d4"} Mar 13 14:20:31 crc kubenswrapper[4898]: I0313 14:20:31.883726 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d0d979c82cb36af5b7db53315fdb00e1e53cc947a1bb29c77b9196d7df5e3d4" Mar 13 14:20:31 crc kubenswrapper[4898]: I0313 14:20:31.883399 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-pxtss" Mar 13 14:20:31 crc kubenswrapper[4898]: I0313 14:20:31.888191 4898 generic.go:334] "Generic (PLEG): container finished" podID="d555bd54-f4d5-4b06-9517-32b4fe687f4b" containerID="3c90dc434ae18a1a3a436a05040e009b821f261e42f9834113cbf08fabe109e3" exitCode=0 Mar 13 14:20:31 crc kubenswrapper[4898]: I0313 14:20:31.888229 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d555bd54-f4d5-4b06-9517-32b4fe687f4b","Type":"ContainerDied","Data":"3c90dc434ae18a1a3a436a05040e009b821f261e42f9834113cbf08fabe109e3"} Mar 13 14:20:32 crc kubenswrapper[4898]: I0313 14:20:32.908029 4898 generic.go:334] "Generic (PLEG): container finished" podID="8fdab36c-41db-4a9c-9cbe-47e1761c6df5" containerID="f8a3e423bebf88995b6d32dd30a81abd86b4e9eab9359f63d75400abe5906505" exitCode=0 Mar 13 14:20:32 crc kubenswrapper[4898]: I0313 14:20:32.908066 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-dxpl9" event={"ID":"8fdab36c-41db-4a9c-9cbe-47e1761c6df5","Type":"ContainerDied","Data":"f8a3e423bebf88995b6d32dd30a81abd86b4e9eab9359f63d75400abe5906505"} Mar 13 14:20:32 crc kubenswrapper[4898]: I0313 14:20:32.911067 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d555bd54-f4d5-4b06-9517-32b4fe687f4b","Type":"ContainerStarted","Data":"1ca40490280f990739faac6ee40000e80b9bd47d5063a8c6cda8773de600017d"} Mar 13 14:20:33 crc kubenswrapper[4898]: I0313 14:20:33.159968 4898 scope.go:117] "RemoveContainer" containerID="ad86fe4efa1fa3496cbed8d6aa93dada393bba49f5fe2d8062f7e0508875ea38" Mar 13 14:20:34 crc kubenswrapper[4898]: I0313 14:20:34.341108 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-dxpl9" Mar 13 14:20:34 crc kubenswrapper[4898]: I0313 14:20:34.516361 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bd476\" (UniqueName: \"kubernetes.io/projected/8fdab36c-41db-4a9c-9cbe-47e1761c6df5-kube-api-access-bd476\") pod \"8fdab36c-41db-4a9c-9cbe-47e1761c6df5\" (UID: \"8fdab36c-41db-4a9c-9cbe-47e1761c6df5\") " Mar 13 14:20:34 crc kubenswrapper[4898]: I0313 14:20:34.516483 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fdab36c-41db-4a9c-9cbe-47e1761c6df5-combined-ca-bundle\") pod \"8fdab36c-41db-4a9c-9cbe-47e1761c6df5\" (UID: \"8fdab36c-41db-4a9c-9cbe-47e1761c6df5\") " Mar 13 14:20:34 crc kubenswrapper[4898]: I0313 14:20:34.516658 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fdab36c-41db-4a9c-9cbe-47e1761c6df5-config-data\") pod \"8fdab36c-41db-4a9c-9cbe-47e1761c6df5\" (UID: \"8fdab36c-41db-4a9c-9cbe-47e1761c6df5\") " Mar 13 14:20:34 crc kubenswrapper[4898]: I0313 14:20:34.525004 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fdab36c-41db-4a9c-9cbe-47e1761c6df5-kube-api-access-bd476" (OuterVolumeSpecName: "kube-api-access-bd476") pod "8fdab36c-41db-4a9c-9cbe-47e1761c6df5" (UID: "8fdab36c-41db-4a9c-9cbe-47e1761c6df5"). InnerVolumeSpecName "kube-api-access-bd476". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:20:34 crc kubenswrapper[4898]: I0313 14:20:34.547531 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fdab36c-41db-4a9c-9cbe-47e1761c6df5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8fdab36c-41db-4a9c-9cbe-47e1761c6df5" (UID: "8fdab36c-41db-4a9c-9cbe-47e1761c6df5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:20:34 crc kubenswrapper[4898]: I0313 14:20:34.582375 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fdab36c-41db-4a9c-9cbe-47e1761c6df5-config-data" (OuterVolumeSpecName: "config-data") pod "8fdab36c-41db-4a9c-9cbe-47e1761c6df5" (UID: "8fdab36c-41db-4a9c-9cbe-47e1761c6df5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:20:34 crc kubenswrapper[4898]: I0313 14:20:34.619632 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bd476\" (UniqueName: \"kubernetes.io/projected/8fdab36c-41db-4a9c-9cbe-47e1761c6df5-kube-api-access-bd476\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:34 crc kubenswrapper[4898]: I0313 14:20:34.619671 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fdab36c-41db-4a9c-9cbe-47e1761c6df5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:34 crc kubenswrapper[4898]: I0313 14:20:34.619686 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fdab36c-41db-4a9c-9cbe-47e1761c6df5-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:34 crc kubenswrapper[4898]: I0313 14:20:34.938666 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-dxpl9" event={"ID":"8fdab36c-41db-4a9c-9cbe-47e1761c6df5","Type":"ContainerDied","Data":"4a05de9aa49849a88ec18bd7a217b2e7749103f27e3a95daf9185dac7265effb"} Mar 13 14:20:34 crc kubenswrapper[4898]: I0313 14:20:34.938706 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a05de9aa49849a88ec18bd7a217b2e7749103f27e3a95daf9185dac7265effb" Mar 13 14:20:34 crc kubenswrapper[4898]: I0313 14:20:34.938717 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-dxpl9" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.269990 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-mc8t9"] Mar 13 14:20:35 crc kubenswrapper[4898]: E0313 14:20:35.270808 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b83b860f-ed6c-46b2-862a-fbda9af7dc89" containerName="mariadb-database-create" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.270833 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="b83b860f-ed6c-46b2-862a-fbda9af7dc89" containerName="mariadb-database-create" Mar 13 14:20:35 crc kubenswrapper[4898]: E0313 14:20:35.270844 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fdab36c-41db-4a9c-9cbe-47e1761c6df5" containerName="keystone-db-sync" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.270851 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fdab36c-41db-4a9c-9cbe-47e1761c6df5" containerName="keystone-db-sync" Mar 13 14:20:35 crc kubenswrapper[4898]: E0313 14:20:35.270864 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8a8516c-5aee-4eae-a59b-498f97c1b92b" containerName="mariadb-database-create" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.270873 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8a8516c-5aee-4eae-a59b-498f97c1b92b" containerName="mariadb-database-create" Mar 13 14:20:35 crc kubenswrapper[4898]: E0313 14:20:35.270890 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a37db268-4fcb-45a7-a7bf-fae19a514257" containerName="init" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.270914 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="a37db268-4fcb-45a7-a7bf-fae19a514257" containerName="init" Mar 13 14:20:35 crc kubenswrapper[4898]: E0313 14:20:35.270933 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bea88065-1eff-42e2-809a-443c15bda0ac" 
containerName="mariadb-database-create" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.270941 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="bea88065-1eff-42e2-809a-443c15bda0ac" containerName="mariadb-database-create" Mar 13 14:20:35 crc kubenswrapper[4898]: E0313 14:20:35.270952 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71459d1c-2acb-4e15-a30d-09dd0f7f7951" containerName="mariadb-account-create-update" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.270960 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="71459d1c-2acb-4e15-a30d-09dd0f7f7951" containerName="mariadb-account-create-update" Mar 13 14:20:35 crc kubenswrapper[4898]: E0313 14:20:35.270978 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9e4fe10-07d1-4f4e-bb04-5ab11b8ad56d" containerName="mariadb-account-create-update" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.270986 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9e4fe10-07d1-4f4e-bb04-5ab11b8ad56d" containerName="mariadb-account-create-update" Mar 13 14:20:35 crc kubenswrapper[4898]: E0313 14:20:35.270997 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74eb351d-364c-4564-8f8b-67ac844a6abc" containerName="mariadb-account-create-update" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.271004 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="74eb351d-364c-4564-8f8b-67ac844a6abc" containerName="mariadb-account-create-update" Mar 13 14:20:35 crc kubenswrapper[4898]: E0313 14:20:35.271026 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fe3416e-f08a-43c9-8e12-a89c1e849208" containerName="mariadb-account-create-update" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.271034 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fe3416e-f08a-43c9-8e12-a89c1e849208" containerName="mariadb-account-create-update" Mar 13 14:20:35 crc kubenswrapper[4898]: E0313 
14:20:35.271053 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b04d3edd-a550-465a-9ef2-2cbea4126ceb" containerName="mariadb-database-create" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.271062 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="b04d3edd-a550-465a-9ef2-2cbea4126ceb" containerName="mariadb-database-create" Mar 13 14:20:35 crc kubenswrapper[4898]: E0313 14:20:35.271070 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a37db268-4fcb-45a7-a7bf-fae19a514257" containerName="dnsmasq-dns" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.271078 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="a37db268-4fcb-45a7-a7bf-fae19a514257" containerName="dnsmasq-dns" Mar 13 14:20:35 crc kubenswrapper[4898]: E0313 14:20:35.271093 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1aa06f21-2d35-4d03-86b9-01d9354826da" containerName="mariadb-account-create-update" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.271102 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="1aa06f21-2d35-4d03-86b9-01d9354826da" containerName="mariadb-account-create-update" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.271336 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="a37db268-4fcb-45a7-a7bf-fae19a514257" containerName="dnsmasq-dns" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.271355 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="1aa06f21-2d35-4d03-86b9-01d9354826da" containerName="mariadb-account-create-update" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.271372 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9e4fe10-07d1-4f4e-bb04-5ab11b8ad56d" containerName="mariadb-account-create-update" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.271383 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="b04d3edd-a550-465a-9ef2-2cbea4126ceb" 
containerName="mariadb-database-create" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.271393 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="bea88065-1eff-42e2-809a-443c15bda0ac" containerName="mariadb-database-create" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.271410 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="b83b860f-ed6c-46b2-862a-fbda9af7dc89" containerName="mariadb-database-create" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.271422 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="71459d1c-2acb-4e15-a30d-09dd0f7f7951" containerName="mariadb-account-create-update" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.271438 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fe3416e-f08a-43c9-8e12-a89c1e849208" containerName="mariadb-account-create-update" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.271451 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="74eb351d-364c-4564-8f8b-67ac844a6abc" containerName="mariadb-account-create-update" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.271462 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8a8516c-5aee-4eae-a59b-498f97c1b92b" containerName="mariadb-database-create" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.271473 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fdab36c-41db-4a9c-9cbe-47e1761c6df5" containerName="keystone-db-sync" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.272364 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-mc8t9" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.279449 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.279541 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.279653 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-tdc5n" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.279711 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.285201 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-mc8t9"] Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.286976 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.310644 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-jcdtp"] Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.322031 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-jcdtp" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.334395 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-jcdtp"] Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.398513 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-zgt75"] Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.406936 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-zgt75" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.411882 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-d5fsz" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.411920 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.418934 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-zgt75"] Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.442647 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed76acfb-bb3b-47de-ae85-23de2e792e7b-combined-ca-bundle\") pod \"keystone-bootstrap-mc8t9\" (UID: \"ed76acfb-bb3b-47de-ae85-23de2e792e7b\") " pod="openstack/keystone-bootstrap-mc8t9" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.442708 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bh9wk\" (UniqueName: \"kubernetes.io/projected/ed76acfb-bb3b-47de-ae85-23de2e792e7b-kube-api-access-bh9wk\") pod \"keystone-bootstrap-mc8t9\" (UID: \"ed76acfb-bb3b-47de-ae85-23de2e792e7b\") " pod="openstack/keystone-bootstrap-mc8t9" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.442734 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ed76acfb-bb3b-47de-ae85-23de2e792e7b-credential-keys\") pod \"keystone-bootstrap-mc8t9\" (UID: \"ed76acfb-bb3b-47de-ae85-23de2e792e7b\") " pod="openstack/keystone-bootstrap-mc8t9" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.442752 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ed76acfb-bb3b-47de-ae85-23de2e792e7b-config-data\") pod \"keystone-bootstrap-mc8t9\" (UID: \"ed76acfb-bb3b-47de-ae85-23de2e792e7b\") " pod="openstack/keystone-bootstrap-mc8t9" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.442769 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ed76acfb-bb3b-47de-ae85-23de2e792e7b-fernet-keys\") pod \"keystone-bootstrap-mc8t9\" (UID: \"ed76acfb-bb3b-47de-ae85-23de2e792e7b\") " pod="openstack/keystone-bootstrap-mc8t9" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.442813 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/10787742-bffc-4545-95cc-8f0354246d7c-dns-svc\") pod \"dnsmasq-dns-847c4cc679-jcdtp\" (UID: \"10787742-bffc-4545-95cc-8f0354246d7c\") " pod="openstack/dnsmasq-dns-847c4cc679-jcdtp" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.442847 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/10787742-bffc-4545-95cc-8f0354246d7c-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-jcdtp\" (UID: \"10787742-bffc-4545-95cc-8f0354246d7c\") " pod="openstack/dnsmasq-dns-847c4cc679-jcdtp" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.442873 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/10787742-bffc-4545-95cc-8f0354246d7c-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-jcdtp\" (UID: \"10787742-bffc-4545-95cc-8f0354246d7c\") " pod="openstack/dnsmasq-dns-847c4cc679-jcdtp" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.442887 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/10787742-bffc-4545-95cc-8f0354246d7c-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-jcdtp\" (UID: \"10787742-bffc-4545-95cc-8f0354246d7c\") " pod="openstack/dnsmasq-dns-847c4cc679-jcdtp" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.442923 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sd4hk\" (UniqueName: \"kubernetes.io/projected/10787742-bffc-4545-95cc-8f0354246d7c-kube-api-access-sd4hk\") pod \"dnsmasq-dns-847c4cc679-jcdtp\" (UID: \"10787742-bffc-4545-95cc-8f0354246d7c\") " pod="openstack/dnsmasq-dns-847c4cc679-jcdtp" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.442949 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed76acfb-bb3b-47de-ae85-23de2e792e7b-scripts\") pod \"keystone-bootstrap-mc8t9\" (UID: \"ed76acfb-bb3b-47de-ae85-23de2e792e7b\") " pod="openstack/keystone-bootstrap-mc8t9" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.442979 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10787742-bffc-4545-95cc-8f0354246d7c-config\") pod \"dnsmasq-dns-847c4cc679-jcdtp\" (UID: \"10787742-bffc-4545-95cc-8f0354246d7c\") " pod="openstack/dnsmasq-dns-847c4cc679-jcdtp" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.545238 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bh9wk\" (UniqueName: \"kubernetes.io/projected/ed76acfb-bb3b-47de-ae85-23de2e792e7b-kube-api-access-bh9wk\") pod \"keystone-bootstrap-mc8t9\" (UID: \"ed76acfb-bb3b-47de-ae85-23de2e792e7b\") " pod="openstack/keystone-bootstrap-mc8t9" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.545289 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/ed76acfb-bb3b-47de-ae85-23de2e792e7b-credential-keys\") pod \"keystone-bootstrap-mc8t9\" (UID: \"ed76acfb-bb3b-47de-ae85-23de2e792e7b\") " pod="openstack/keystone-bootstrap-mc8t9" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.545314 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed76acfb-bb3b-47de-ae85-23de2e792e7b-config-data\") pod \"keystone-bootstrap-mc8t9\" (UID: \"ed76acfb-bb3b-47de-ae85-23de2e792e7b\") " pod="openstack/keystone-bootstrap-mc8t9" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.545329 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ed76acfb-bb3b-47de-ae85-23de2e792e7b-fernet-keys\") pod \"keystone-bootstrap-mc8t9\" (UID: \"ed76acfb-bb3b-47de-ae85-23de2e792e7b\") " pod="openstack/keystone-bootstrap-mc8t9" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.545357 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28mft\" (UniqueName: \"kubernetes.io/projected/84a7fd24-4320-4c0e-8ded-0d455252a549-kube-api-access-28mft\") pod \"heat-db-sync-zgt75\" (UID: \"84a7fd24-4320-4c0e-8ded-0d455252a549\") " pod="openstack/heat-db-sync-zgt75" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.545401 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/10787742-bffc-4545-95cc-8f0354246d7c-dns-svc\") pod \"dnsmasq-dns-847c4cc679-jcdtp\" (UID: \"10787742-bffc-4545-95cc-8f0354246d7c\") " pod="openstack/dnsmasq-dns-847c4cc679-jcdtp" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.545437 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/10787742-bffc-4545-95cc-8f0354246d7c-ovsdbserver-nb\") pod 
\"dnsmasq-dns-847c4cc679-jcdtp\" (UID: \"10787742-bffc-4545-95cc-8f0354246d7c\") " pod="openstack/dnsmasq-dns-847c4cc679-jcdtp" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.545462 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/10787742-bffc-4545-95cc-8f0354246d7c-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-jcdtp\" (UID: \"10787742-bffc-4545-95cc-8f0354246d7c\") " pod="openstack/dnsmasq-dns-847c4cc679-jcdtp" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.545478 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/10787742-bffc-4545-95cc-8f0354246d7c-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-jcdtp\" (UID: \"10787742-bffc-4545-95cc-8f0354246d7c\") " pod="openstack/dnsmasq-dns-847c4cc679-jcdtp" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.545504 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sd4hk\" (UniqueName: \"kubernetes.io/projected/10787742-bffc-4545-95cc-8f0354246d7c-kube-api-access-sd4hk\") pod \"dnsmasq-dns-847c4cc679-jcdtp\" (UID: \"10787742-bffc-4545-95cc-8f0354246d7c\") " pod="openstack/dnsmasq-dns-847c4cc679-jcdtp" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.545531 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed76acfb-bb3b-47de-ae85-23de2e792e7b-scripts\") pod \"keystone-bootstrap-mc8t9\" (UID: \"ed76acfb-bb3b-47de-ae85-23de2e792e7b\") " pod="openstack/keystone-bootstrap-mc8t9" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.545559 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10787742-bffc-4545-95cc-8f0354246d7c-config\") pod \"dnsmasq-dns-847c4cc679-jcdtp\" (UID: 
\"10787742-bffc-4545-95cc-8f0354246d7c\") " pod="openstack/dnsmasq-dns-847c4cc679-jcdtp" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.545630 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84a7fd24-4320-4c0e-8ded-0d455252a549-config-data\") pod \"heat-db-sync-zgt75\" (UID: \"84a7fd24-4320-4c0e-8ded-0d455252a549\") " pod="openstack/heat-db-sync-zgt75" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.545665 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed76acfb-bb3b-47de-ae85-23de2e792e7b-combined-ca-bundle\") pod \"keystone-bootstrap-mc8t9\" (UID: \"ed76acfb-bb3b-47de-ae85-23de2e792e7b\") " pod="openstack/keystone-bootstrap-mc8t9" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.545686 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84a7fd24-4320-4c0e-8ded-0d455252a549-combined-ca-bundle\") pod \"heat-db-sync-zgt75\" (UID: \"84a7fd24-4320-4c0e-8ded-0d455252a549\") " pod="openstack/heat-db-sync-zgt75" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.546622 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/10787742-bffc-4545-95cc-8f0354246d7c-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-jcdtp\" (UID: \"10787742-bffc-4545-95cc-8f0354246d7c\") " pod="openstack/dnsmasq-dns-847c4cc679-jcdtp" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.547343 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/10787742-bffc-4545-95cc-8f0354246d7c-dns-svc\") pod \"dnsmasq-dns-847c4cc679-jcdtp\" (UID: \"10787742-bffc-4545-95cc-8f0354246d7c\") " pod="openstack/dnsmasq-dns-847c4cc679-jcdtp" Mar 
13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.547780 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/10787742-bffc-4545-95cc-8f0354246d7c-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-jcdtp\" (UID: \"10787742-bffc-4545-95cc-8f0354246d7c\") " pod="openstack/dnsmasq-dns-847c4cc679-jcdtp" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.548045 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/10787742-bffc-4545-95cc-8f0354246d7c-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-jcdtp\" (UID: \"10787742-bffc-4545-95cc-8f0354246d7c\") " pod="openstack/dnsmasq-dns-847c4cc679-jcdtp" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.548331 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10787742-bffc-4545-95cc-8f0354246d7c-config\") pod \"dnsmasq-dns-847c4cc679-jcdtp\" (UID: \"10787742-bffc-4545-95cc-8f0354246d7c\") " pod="openstack/dnsmasq-dns-847c4cc679-jcdtp" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.553797 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ed76acfb-bb3b-47de-ae85-23de2e792e7b-credential-keys\") pod \"keystone-bootstrap-mc8t9\" (UID: \"ed76acfb-bb3b-47de-ae85-23de2e792e7b\") " pod="openstack/keystone-bootstrap-mc8t9" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.560465 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ed76acfb-bb3b-47de-ae85-23de2e792e7b-fernet-keys\") pod \"keystone-bootstrap-mc8t9\" (UID: \"ed76acfb-bb3b-47de-ae85-23de2e792e7b\") " pod="openstack/keystone-bootstrap-mc8t9" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.570542 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed76acfb-bb3b-47de-ae85-23de2e792e7b-combined-ca-bundle\") pod \"keystone-bootstrap-mc8t9\" (UID: \"ed76acfb-bb3b-47de-ae85-23de2e792e7b\") " pod="openstack/keystone-bootstrap-mc8t9" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.580239 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed76acfb-bb3b-47de-ae85-23de2e792e7b-scripts\") pod \"keystone-bootstrap-mc8t9\" (UID: \"ed76acfb-bb3b-47de-ae85-23de2e792e7b\") " pod="openstack/keystone-bootstrap-mc8t9" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.580753 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed76acfb-bb3b-47de-ae85-23de2e792e7b-config-data\") pod \"keystone-bootstrap-mc8t9\" (UID: \"ed76acfb-bb3b-47de-ae85-23de2e792e7b\") " pod="openstack/keystone-bootstrap-mc8t9" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.606170 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bh9wk\" (UniqueName: \"kubernetes.io/projected/ed76acfb-bb3b-47de-ae85-23de2e792e7b-kube-api-access-bh9wk\") pod \"keystone-bootstrap-mc8t9\" (UID: \"ed76acfb-bb3b-47de-ae85-23de2e792e7b\") " pod="openstack/keystone-bootstrap-mc8t9" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.622030 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-mc8t9" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.623229 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sd4hk\" (UniqueName: \"kubernetes.io/projected/10787742-bffc-4545-95cc-8f0354246d7c-kube-api-access-sd4hk\") pod \"dnsmasq-dns-847c4cc679-jcdtp\" (UID: \"10787742-bffc-4545-95cc-8f0354246d7c\") " pod="openstack/dnsmasq-dns-847c4cc679-jcdtp" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.654396 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-jcdtp" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.655810 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84a7fd24-4320-4c0e-8ded-0d455252a549-config-data\") pod \"heat-db-sync-zgt75\" (UID: \"84a7fd24-4320-4c0e-8ded-0d455252a549\") " pod="openstack/heat-db-sync-zgt75" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.655857 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84a7fd24-4320-4c0e-8ded-0d455252a549-combined-ca-bundle\") pod \"heat-db-sync-zgt75\" (UID: \"84a7fd24-4320-4c0e-8ded-0d455252a549\") " pod="openstack/heat-db-sync-zgt75" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.655917 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28mft\" (UniqueName: \"kubernetes.io/projected/84a7fd24-4320-4c0e-8ded-0d455252a549-kube-api-access-28mft\") pod \"heat-db-sync-zgt75\" (UID: \"84a7fd24-4320-4c0e-8ded-0d455252a549\") " pod="openstack/heat-db-sync-zgt75" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.664347 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-ztp6c"] Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.666108 4898 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-ztp6c" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.666733 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84a7fd24-4320-4c0e-8ded-0d455252a549-combined-ca-bundle\") pod \"heat-db-sync-zgt75\" (UID: \"84a7fd24-4320-4c0e-8ded-0d455252a549\") " pod="openstack/heat-db-sync-zgt75" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.692467 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84a7fd24-4320-4c0e-8ded-0d455252a549-config-data\") pod \"heat-db-sync-zgt75\" (UID: \"84a7fd24-4320-4c0e-8ded-0d455252a549\") " pod="openstack/heat-db-sync-zgt75" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.701931 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-dcn2n" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.702349 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.702474 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.722993 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28mft\" (UniqueName: \"kubernetes.io/projected/84a7fd24-4320-4c0e-8ded-0d455252a549-kube-api-access-28mft\") pod \"heat-db-sync-zgt75\" (UID: \"84a7fd24-4320-4c0e-8ded-0d455252a549\") " pod="openstack/heat-db-sync-zgt75" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.730965 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-hm77q"] Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.732313 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-hm77q" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.742270 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.742451 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-db2fv" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.742611 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.804299 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-zgt75" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.835593 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-ztp6c"] Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.835795 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-hm77q"] Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.922887 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/193b05da-acb9-4512-a2ae-6c03450e6f05-etc-machine-id\") pod \"cinder-db-sync-ztp6c\" (UID: \"193b05da-acb9-4512-a2ae-6c03450e6f05\") " pod="openstack/cinder-db-sync-ztp6c" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.923017 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/193b05da-acb9-4512-a2ae-6c03450e6f05-db-sync-config-data\") pod \"cinder-db-sync-ztp6c\" (UID: \"193b05da-acb9-4512-a2ae-6c03450e6f05\") " pod="openstack/cinder-db-sync-ztp6c" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.923137 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-s2nsj\" (UniqueName: \"kubernetes.io/projected/664deedc-3946-4205-98ad-21759d35d952-kube-api-access-s2nsj\") pod \"neutron-db-sync-hm77q\" (UID: \"664deedc-3946-4205-98ad-21759d35d952\") " pod="openstack/neutron-db-sync-hm77q" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.923161 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82hs7\" (UniqueName: \"kubernetes.io/projected/193b05da-acb9-4512-a2ae-6c03450e6f05-kube-api-access-82hs7\") pod \"cinder-db-sync-ztp6c\" (UID: \"193b05da-acb9-4512-a2ae-6c03450e6f05\") " pod="openstack/cinder-db-sync-ztp6c" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.923217 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/664deedc-3946-4205-98ad-21759d35d952-config\") pod \"neutron-db-sync-hm77q\" (UID: \"664deedc-3946-4205-98ad-21759d35d952\") " pod="openstack/neutron-db-sync-hm77q" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.923334 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/193b05da-acb9-4512-a2ae-6c03450e6f05-config-data\") pod \"cinder-db-sync-ztp6c\" (UID: \"193b05da-acb9-4512-a2ae-6c03450e6f05\") " pod="openstack/cinder-db-sync-ztp6c" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.923399 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/193b05da-acb9-4512-a2ae-6c03450e6f05-combined-ca-bundle\") pod \"cinder-db-sync-ztp6c\" (UID: \"193b05da-acb9-4512-a2ae-6c03450e6f05\") " pod="openstack/cinder-db-sync-ztp6c" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.923585 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/193b05da-acb9-4512-a2ae-6c03450e6f05-scripts\") pod \"cinder-db-sync-ztp6c\" (UID: \"193b05da-acb9-4512-a2ae-6c03450e6f05\") " pod="openstack/cinder-db-sync-ztp6c" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.923726 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/664deedc-3946-4205-98ad-21759d35d952-combined-ca-bundle\") pod \"neutron-db-sync-hm77q\" (UID: \"664deedc-3946-4205-98ad-21759d35d952\") " pod="openstack/neutron-db-sync-hm77q" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.015221 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-xq6ss"] Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.023833 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-xq6ss" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.030758 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.030950 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-rkpbr" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.034432 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/193b05da-acb9-4512-a2ae-6c03450e6f05-db-sync-config-data\") pod \"cinder-db-sync-ztp6c\" (UID: \"193b05da-acb9-4512-a2ae-6c03450e6f05\") " pod="openstack/cinder-db-sync-ztp6c" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.034549 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2nsj\" (UniqueName: \"kubernetes.io/projected/664deedc-3946-4205-98ad-21759d35d952-kube-api-access-s2nsj\") pod \"neutron-db-sync-hm77q\" (UID: 
\"664deedc-3946-4205-98ad-21759d35d952\") " pod="openstack/neutron-db-sync-hm77q" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.034571 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82hs7\" (UniqueName: \"kubernetes.io/projected/193b05da-acb9-4512-a2ae-6c03450e6f05-kube-api-access-82hs7\") pod \"cinder-db-sync-ztp6c\" (UID: \"193b05da-acb9-4512-a2ae-6c03450e6f05\") " pod="openstack/cinder-db-sync-ztp6c" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.034623 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/664deedc-3946-4205-98ad-21759d35d952-config\") pod \"neutron-db-sync-hm77q\" (UID: \"664deedc-3946-4205-98ad-21759d35d952\") " pod="openstack/neutron-db-sync-hm77q" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.034736 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/193b05da-acb9-4512-a2ae-6c03450e6f05-config-data\") pod \"cinder-db-sync-ztp6c\" (UID: \"193b05da-acb9-4512-a2ae-6c03450e6f05\") " pod="openstack/cinder-db-sync-ztp6c" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.034792 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/193b05da-acb9-4512-a2ae-6c03450e6f05-combined-ca-bundle\") pod \"cinder-db-sync-ztp6c\" (UID: \"193b05da-acb9-4512-a2ae-6c03450e6f05\") " pod="openstack/cinder-db-sync-ztp6c" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.034818 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/193b05da-acb9-4512-a2ae-6c03450e6f05-scripts\") pod \"cinder-db-sync-ztp6c\" (UID: \"193b05da-acb9-4512-a2ae-6c03450e6f05\") " pod="openstack/cinder-db-sync-ztp6c" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.034938 4898 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/664deedc-3946-4205-98ad-21759d35d952-combined-ca-bundle\") pod \"neutron-db-sync-hm77q\" (UID: \"664deedc-3946-4205-98ad-21759d35d952\") " pod="openstack/neutron-db-sync-hm77q" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.035078 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/193b05da-acb9-4512-a2ae-6c03450e6f05-etc-machine-id\") pod \"cinder-db-sync-ztp6c\" (UID: \"193b05da-acb9-4512-a2ae-6c03450e6f05\") " pod="openstack/cinder-db-sync-ztp6c" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.035226 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/193b05da-acb9-4512-a2ae-6c03450e6f05-etc-machine-id\") pod \"cinder-db-sync-ztp6c\" (UID: \"193b05da-acb9-4512-a2ae-6c03450e6f05\") " pod="openstack/cinder-db-sync-ztp6c" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.071605 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/193b05da-acb9-4512-a2ae-6c03450e6f05-combined-ca-bundle\") pod \"cinder-db-sync-ztp6c\" (UID: \"193b05da-acb9-4512-a2ae-6c03450e6f05\") " pod="openstack/cinder-db-sync-ztp6c" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.079637 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/193b05da-acb9-4512-a2ae-6c03450e6f05-db-sync-config-data\") pod \"cinder-db-sync-ztp6c\" (UID: \"193b05da-acb9-4512-a2ae-6c03450e6f05\") " pod="openstack/cinder-db-sync-ztp6c" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.080230 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/664deedc-3946-4205-98ad-21759d35d952-config\") pod \"neutron-db-sync-hm77q\" (UID: \"664deedc-3946-4205-98ad-21759d35d952\") " pod="openstack/neutron-db-sync-hm77q" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.101388 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2nsj\" (UniqueName: \"kubernetes.io/projected/664deedc-3946-4205-98ad-21759d35d952-kube-api-access-s2nsj\") pod \"neutron-db-sync-hm77q\" (UID: \"664deedc-3946-4205-98ad-21759d35d952\") " pod="openstack/neutron-db-sync-hm77q" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.110284 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/664deedc-3946-4205-98ad-21759d35d952-combined-ca-bundle\") pod \"neutron-db-sync-hm77q\" (UID: \"664deedc-3946-4205-98ad-21759d35d952\") " pod="openstack/neutron-db-sync-hm77q" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.113337 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82hs7\" (UniqueName: \"kubernetes.io/projected/193b05da-acb9-4512-a2ae-6c03450e6f05-kube-api-access-82hs7\") pod \"cinder-db-sync-ztp6c\" (UID: \"193b05da-acb9-4512-a2ae-6c03450e6f05\") " pod="openstack/cinder-db-sync-ztp6c" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.114761 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/193b05da-acb9-4512-a2ae-6c03450e6f05-config-data\") pod \"cinder-db-sync-ztp6c\" (UID: \"193b05da-acb9-4512-a2ae-6c03450e6f05\") " pod="openstack/cinder-db-sync-ztp6c" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.116663 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/193b05da-acb9-4512-a2ae-6c03450e6f05-scripts\") pod \"cinder-db-sync-ztp6c\" (UID: \"193b05da-acb9-4512-a2ae-6c03450e6f05\") " 
pod="openstack/cinder-db-sync-ztp6c" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.127949 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d555bd54-f4d5-4b06-9517-32b4fe687f4b","Type":"ContainerStarted","Data":"578089192ca9537b8bb073d8f2c027027dc4036f5b750f9a009d81f2e4fe812e"} Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.137491 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-xq6ss"] Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.137585 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ac704482-c7a4-471c-b3c1-d1fdd7e0eb83-db-sync-config-data\") pod \"barbican-db-sync-xq6ss\" (UID: \"ac704482-c7a4-471c-b3c1-d1fdd7e0eb83\") " pod="openstack/barbican-db-sync-xq6ss" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.137647 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac704482-c7a4-471c-b3c1-d1fdd7e0eb83-combined-ca-bundle\") pod \"barbican-db-sync-xq6ss\" (UID: \"ac704482-c7a4-471c-b3c1-d1fdd7e0eb83\") " pod="openstack/barbican-db-sync-xq6ss" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.137712 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwfsv\" (UniqueName: \"kubernetes.io/projected/ac704482-c7a4-471c-b3c1-d1fdd7e0eb83-kube-api-access-cwfsv\") pod \"barbican-db-sync-xq6ss\" (UID: \"ac704482-c7a4-471c-b3c1-d1fdd7e0eb83\") " pod="openstack/barbican-db-sync-xq6ss" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.155771 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-dddqm"] Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.157250 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-dddqm" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.158800 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.160066 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-qr6vd" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.160289 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.170978 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-jcdtp"] Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.183188 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-dddqm"] Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.195836 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-fbs4f"] Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.197856 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-fbs4f" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.205239 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-fbs4f"] Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.239173 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51a3e0c5-0084-4216-a162-3614eafcc162-config-data\") pod \"placement-db-sync-dddqm\" (UID: \"51a3e0c5-0084-4216-a162-3614eafcc162\") " pod="openstack/placement-db-sync-dddqm" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.239215 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51a3e0c5-0084-4216-a162-3614eafcc162-logs\") pod \"placement-db-sync-dddqm\" (UID: \"51a3e0c5-0084-4216-a162-3614eafcc162\") " pod="openstack/placement-db-sync-dddqm" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.239241 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51a3e0c5-0084-4216-a162-3614eafcc162-combined-ca-bundle\") pod \"placement-db-sync-dddqm\" (UID: \"51a3e0c5-0084-4216-a162-3614eafcc162\") " pod="openstack/placement-db-sync-dddqm" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.239307 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwfsv\" (UniqueName: \"kubernetes.io/projected/ac704482-c7a4-471c-b3c1-d1fdd7e0eb83-kube-api-access-cwfsv\") pod \"barbican-db-sync-xq6ss\" (UID: \"ac704482-c7a4-471c-b3c1-d1fdd7e0eb83\") " pod="openstack/barbican-db-sync-xq6ss" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.239807 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-st255\" (UniqueName: 
\"kubernetes.io/projected/51a3e0c5-0084-4216-a162-3614eafcc162-kube-api-access-st255\") pod \"placement-db-sync-dddqm\" (UID: \"51a3e0c5-0084-4216-a162-3614eafcc162\") " pod="openstack/placement-db-sync-dddqm" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.239889 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51a3e0c5-0084-4216-a162-3614eafcc162-scripts\") pod \"placement-db-sync-dddqm\" (UID: \"51a3e0c5-0084-4216-a162-3614eafcc162\") " pod="openstack/placement-db-sync-dddqm" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.240068 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ac704482-c7a4-471c-b3c1-d1fdd7e0eb83-db-sync-config-data\") pod \"barbican-db-sync-xq6ss\" (UID: \"ac704482-c7a4-471c-b3c1-d1fdd7e0eb83\") " pod="openstack/barbican-db-sync-xq6ss" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.240151 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac704482-c7a4-471c-b3c1-d1fdd7e0eb83-combined-ca-bundle\") pod \"barbican-db-sync-xq6ss\" (UID: \"ac704482-c7a4-471c-b3c1-d1fdd7e0eb83\") " pod="openstack/barbican-db-sync-xq6ss" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.257332 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ac704482-c7a4-471c-b3c1-d1fdd7e0eb83-db-sync-config-data\") pod \"barbican-db-sync-xq6ss\" (UID: \"ac704482-c7a4-471c-b3c1-d1fdd7e0eb83\") " pod="openstack/barbican-db-sync-xq6ss" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.258780 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac704482-c7a4-471c-b3c1-d1fdd7e0eb83-combined-ca-bundle\") pod 
\"barbican-db-sync-xq6ss\" (UID: \"ac704482-c7a4-471c-b3c1-d1fdd7e0eb83\") " pod="openstack/barbican-db-sync-xq6ss" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.269085 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwfsv\" (UniqueName: \"kubernetes.io/projected/ac704482-c7a4-471c-b3c1-d1fdd7e0eb83-kube-api-access-cwfsv\") pod \"barbican-db-sync-xq6ss\" (UID: \"ac704482-c7a4-471c-b3c1-d1fdd7e0eb83\") " pod="openstack/barbican-db-sync-xq6ss" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.272881 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-ztp6c" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.287945 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-hm77q" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.342922 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51a3e0c5-0084-4216-a162-3614eafcc162-logs\") pod \"placement-db-sync-dddqm\" (UID: \"51a3e0c5-0084-4216-a162-3614eafcc162\") " pod="openstack/placement-db-sync-dddqm" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.342967 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51a3e0c5-0084-4216-a162-3614eafcc162-config-data\") pod \"placement-db-sync-dddqm\" (UID: \"51a3e0c5-0084-4216-a162-3614eafcc162\") " pod="openstack/placement-db-sync-dddqm" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.342993 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51a3e0c5-0084-4216-a162-3614eafcc162-combined-ca-bundle\") pod \"placement-db-sync-dddqm\" (UID: \"51a3e0c5-0084-4216-a162-3614eafcc162\") " pod="openstack/placement-db-sync-dddqm" Mar 13 14:20:36 crc 
kubenswrapper[4898]: I0313 14:20:36.343018 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04642207-fab0-47bf-9ac4-030bbe91b4f0-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-fbs4f\" (UID: \"04642207-fab0-47bf-9ac4-030bbe91b4f0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-fbs4f" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.343076 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/04642207-fab0-47bf-9ac4-030bbe91b4f0-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-fbs4f\" (UID: \"04642207-fab0-47bf-9ac4-030bbe91b4f0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-fbs4f" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.343123 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-st255\" (UniqueName: \"kubernetes.io/projected/51a3e0c5-0084-4216-a162-3614eafcc162-kube-api-access-st255\") pod \"placement-db-sync-dddqm\" (UID: \"51a3e0c5-0084-4216-a162-3614eafcc162\") " pod="openstack/placement-db-sync-dddqm" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.343146 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04642207-fab0-47bf-9ac4-030bbe91b4f0-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-fbs4f\" (UID: \"04642207-fab0-47bf-9ac4-030bbe91b4f0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-fbs4f" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.343169 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqg7r\" (UniqueName: \"kubernetes.io/projected/04642207-fab0-47bf-9ac4-030bbe91b4f0-kube-api-access-cqg7r\") pod \"dnsmasq-dns-785d8bcb8c-fbs4f\" (UID: \"04642207-fab0-47bf-9ac4-030bbe91b4f0\") " 
pod="openstack/dnsmasq-dns-785d8bcb8c-fbs4f" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.343194 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51a3e0c5-0084-4216-a162-3614eafcc162-scripts\") pod \"placement-db-sync-dddqm\" (UID: \"51a3e0c5-0084-4216-a162-3614eafcc162\") " pod="openstack/placement-db-sync-dddqm" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.343219 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04642207-fab0-47bf-9ac4-030bbe91b4f0-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-fbs4f\" (UID: \"04642207-fab0-47bf-9ac4-030bbe91b4f0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-fbs4f" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.343262 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04642207-fab0-47bf-9ac4-030bbe91b4f0-config\") pod \"dnsmasq-dns-785d8bcb8c-fbs4f\" (UID: \"04642207-fab0-47bf-9ac4-030bbe91b4f0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-fbs4f" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.343343 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51a3e0c5-0084-4216-a162-3614eafcc162-logs\") pod \"placement-db-sync-dddqm\" (UID: \"51a3e0c5-0084-4216-a162-3614eafcc162\") " pod="openstack/placement-db-sync-dddqm" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.346327 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51a3e0c5-0084-4216-a162-3614eafcc162-scripts\") pod \"placement-db-sync-dddqm\" (UID: \"51a3e0c5-0084-4216-a162-3614eafcc162\") " pod="openstack/placement-db-sync-dddqm" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.347923 4898 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51a3e0c5-0084-4216-a162-3614eafcc162-combined-ca-bundle\") pod \"placement-db-sync-dddqm\" (UID: \"51a3e0c5-0084-4216-a162-3614eafcc162\") " pod="openstack/placement-db-sync-dddqm" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.368709 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-xq6ss" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.369470 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-st255\" (UniqueName: \"kubernetes.io/projected/51a3e0c5-0084-4216-a162-3614eafcc162-kube-api-access-st255\") pod \"placement-db-sync-dddqm\" (UID: \"51a3e0c5-0084-4216-a162-3614eafcc162\") " pod="openstack/placement-db-sync-dddqm" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.369659 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51a3e0c5-0084-4216-a162-3614eafcc162-config-data\") pod \"placement-db-sync-dddqm\" (UID: \"51a3e0c5-0084-4216-a162-3614eafcc162\") " pod="openstack/placement-db-sync-dddqm" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.418957 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.420730 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.429962 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-t4f8x" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.430345 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.430479 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.430590 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.445150 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04642207-fab0-47bf-9ac4-030bbe91b4f0-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-fbs4f\" (UID: \"04642207-fab0-47bf-9ac4-030bbe91b4f0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-fbs4f" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.445222 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04642207-fab0-47bf-9ac4-030bbe91b4f0-config\") pod \"dnsmasq-dns-785d8bcb8c-fbs4f\" (UID: \"04642207-fab0-47bf-9ac4-030bbe91b4f0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-fbs4f" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.445312 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04642207-fab0-47bf-9ac4-030bbe91b4f0-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-fbs4f\" (UID: \"04642207-fab0-47bf-9ac4-030bbe91b4f0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-fbs4f" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.445373 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/04642207-fab0-47bf-9ac4-030bbe91b4f0-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-fbs4f\" (UID: \"04642207-fab0-47bf-9ac4-030bbe91b4f0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-fbs4f" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.446556 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04642207-fab0-47bf-9ac4-030bbe91b4f0-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-fbs4f\" (UID: \"04642207-fab0-47bf-9ac4-030bbe91b4f0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-fbs4f" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.446608 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqg7r\" (UniqueName: \"kubernetes.io/projected/04642207-fab0-47bf-9ac4-030bbe91b4f0-kube-api-access-cqg7r\") pod \"dnsmasq-dns-785d8bcb8c-fbs4f\" (UID: \"04642207-fab0-47bf-9ac4-030bbe91b4f0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-fbs4f" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.447861 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04642207-fab0-47bf-9ac4-030bbe91b4f0-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-fbs4f\" (UID: \"04642207-fab0-47bf-9ac4-030bbe91b4f0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-fbs4f" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.448382 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04642207-fab0-47bf-9ac4-030bbe91b4f0-config\") pod \"dnsmasq-dns-785d8bcb8c-fbs4f\" (UID: \"04642207-fab0-47bf-9ac4-030bbe91b4f0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-fbs4f" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.448728 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 
13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.448965 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04642207-fab0-47bf-9ac4-030bbe91b4f0-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-fbs4f\" (UID: \"04642207-fab0-47bf-9ac4-030bbe91b4f0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-fbs4f" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.449940 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/04642207-fab0-47bf-9ac4-030bbe91b4f0-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-fbs4f\" (UID: \"04642207-fab0-47bf-9ac4-030bbe91b4f0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-fbs4f" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.460742 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04642207-fab0-47bf-9ac4-030bbe91b4f0-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-fbs4f\" (UID: \"04642207-fab0-47bf-9ac4-030bbe91b4f0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-fbs4f" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.503870 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-dddqm" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.512874 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.520008 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqg7r\" (UniqueName: \"kubernetes.io/projected/04642207-fab0-47bf-9ac4-030bbe91b4f0-kube-api-access-cqg7r\") pod \"dnsmasq-dns-785d8bcb8c-fbs4f\" (UID: \"04642207-fab0-47bf-9ac4-030bbe91b4f0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-fbs4f" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.521983 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-fbs4f" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.525126 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.525249 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.527526 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.527794 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.550248 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-17b3b094-1a55-406a-a787-e0abb588e5b7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-17b3b094-1a55-406a-a787-e0abb588e5b7\") pod \"glance-default-external-api-0\" (UID: \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\") " pod="openstack/glance-default-external-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.550304 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-scripts\") pod \"glance-default-external-api-0\" (UID: \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\") " pod="openstack/glance-default-external-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.550364 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpxqx\" (UniqueName: \"kubernetes.io/projected/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-kube-api-access-kpxqx\") pod \"glance-default-external-api-0\" (UID: \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\") " 
pod="openstack/glance-default-external-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.550397 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-config-data\") pod \"glance-default-external-api-0\" (UID: \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\") " pod="openstack/glance-default-external-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.550429 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\") " pod="openstack/glance-default-external-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.550493 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-logs\") pod \"glance-default-external-api-0\" (UID: \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\") " pod="openstack/glance-default-external-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.550539 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\") " pod="openstack/glance-default-external-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.550566 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: 
\"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\") " pod="openstack/glance-default-external-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.629890 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-mc8t9"] Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.654505 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-logs\") pod \"glance-default-external-api-0\" (UID: \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\") " pod="openstack/glance-default-external-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.654731 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f9ea679-73ec-46c5-b3b9-25d63398eb35-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.654837 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\") " pod="openstack/glance-default-external-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.655442 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f9ea679-73ec-46c5-b3b9-25d63398eb35-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.655442 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\") " pod="openstack/glance-default-external-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.655543 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\") " pod="openstack/glance-default-external-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.655651 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f9ea679-73ec-46c5-b3b9-25d63398eb35-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.655661 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-logs\") pod \"glance-default-external-api-0\" (UID: \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\") " pod="openstack/glance-default-external-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.655914 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f9ea679-73ec-46c5-b3b9-25d63398eb35-logs\") pod \"glance-default-internal-api-0\" (UID: \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.656015 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-17b3b094-1a55-406a-a787-e0abb588e5b7\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-17b3b094-1a55-406a-a787-e0abb588e5b7\") pod \"glance-default-external-api-0\" (UID: \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\") " pod="openstack/glance-default-external-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.656051 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f9ea679-73ec-46c5-b3b9-25d63398eb35-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.656113 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-scripts\") pod \"glance-default-external-api-0\" (UID: \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\") " pod="openstack/glance-default-external-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.656286 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3\") pod \"glance-default-internal-api-0\" (UID: \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.656376 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpxqx\" (UniqueName: \"kubernetes.io/projected/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-kube-api-access-kpxqx\") pod \"glance-default-external-api-0\" (UID: \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\") " pod="openstack/glance-default-external-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.656794 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7f9ea679-73ec-46c5-b3b9-25d63398eb35-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.656826 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mg4fw\" (UniqueName: \"kubernetes.io/projected/7f9ea679-73ec-46c5-b3b9-25d63398eb35-kube-api-access-mg4fw\") pod \"glance-default-internal-api-0\" (UID: \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.656869 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-config-data\") pod \"glance-default-external-api-0\" (UID: \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\") " pod="openstack/glance-default-external-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.656957 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\") " pod="openstack/glance-default-external-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.661447 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-scripts\") pod \"glance-default-external-api-0\" (UID: \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\") " pod="openstack/glance-default-external-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.661456 4898 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\") " pod="openstack/glance-default-external-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.662773 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\") " pod="openstack/glance-default-external-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.664595 4898 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.664628 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-17b3b094-1a55-406a-a787-e0abb588e5b7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-17b3b094-1a55-406a-a787-e0abb588e5b7\") pod \"glance-default-external-api-0\" (UID: \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1810628263decbcb8d9790a46f0a2a80fe37ecdd6e2a4c05137bd112c0de5f67/globalmount\"" pod="openstack/glance-default-external-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.674054 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-config-data\") pod \"glance-default-external-api-0\" (UID: \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\") " pod="openstack/glance-default-external-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.687067 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpxqx\" (UniqueName: 
\"kubernetes.io/projected/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-kube-api-access-kpxqx\") pod \"glance-default-external-api-0\" (UID: \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\") " pod="openstack/glance-default-external-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.768680 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f9ea679-73ec-46c5-b3b9-25d63398eb35-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.768753 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f9ea679-73ec-46c5-b3b9-25d63398eb35-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.768788 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f9ea679-73ec-46c5-b3b9-25d63398eb35-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.768923 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f9ea679-73ec-46c5-b3b9-25d63398eb35-logs\") pod \"glance-default-internal-api-0\" (UID: \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.768985 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7f9ea679-73ec-46c5-b3b9-25d63398eb35-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.769074 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3\") pod \"glance-default-internal-api-0\" (UID: \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.769124 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7f9ea679-73ec-46c5-b3b9-25d63398eb35-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.769139 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mg4fw\" (UniqueName: \"kubernetes.io/projected/7f9ea679-73ec-46c5-b3b9-25d63398eb35-kube-api-access-mg4fw\") pod \"glance-default-internal-api-0\" (UID: \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.769853 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f9ea679-73ec-46c5-b3b9-25d63398eb35-logs\") pod \"glance-default-internal-api-0\" (UID: \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.770824 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/7f9ea679-73ec-46c5-b3b9-25d63398eb35-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.779623 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f9ea679-73ec-46c5-b3b9-25d63398eb35-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.780741 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f9ea679-73ec-46c5-b3b9-25d63398eb35-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.782524 4898 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.782563 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3\") pod \"glance-default-internal-api-0\" (UID: \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/e32bfaa798492a506ddfd6dd81603c6b252f4a286c98ba8256226389647f45c3/globalmount\"" pod="openstack/glance-default-internal-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.800987 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mg4fw\" (UniqueName: \"kubernetes.io/projected/7f9ea679-73ec-46c5-b3b9-25d63398eb35-kube-api-access-mg4fw\") pod \"glance-default-internal-api-0\" (UID: \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.801257 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.801637 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-17b3b094-1a55-406a-a787-e0abb588e5b7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-17b3b094-1a55-406a-a787-e0abb588e5b7\") pod \"glance-default-external-api-0\" (UID: \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\") " pod="openstack/glance-default-external-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.803727 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.807065 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.807249 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.832030 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f9ea679-73ec-46c5-b3b9-25d63398eb35-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.835471 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f9ea679-73ec-46c5-b3b9-25d63398eb35-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.837825 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.872019 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/247749ae-204b-4e9c-ad1c-f5d924b6f211-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"247749ae-204b-4e9c-ad1c-f5d924b6f211\") " pod="openstack/ceilometer-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.872111 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/247749ae-204b-4e9c-ad1c-f5d924b6f211-run-httpd\") pod \"ceilometer-0\" (UID: \"247749ae-204b-4e9c-ad1c-f5d924b6f211\") " 
pod="openstack/ceilometer-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.872206 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/247749ae-204b-4e9c-ad1c-f5d924b6f211-scripts\") pod \"ceilometer-0\" (UID: \"247749ae-204b-4e9c-ad1c-f5d924b6f211\") " pod="openstack/ceilometer-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.872242 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffnxt\" (UniqueName: \"kubernetes.io/projected/247749ae-204b-4e9c-ad1c-f5d924b6f211-kube-api-access-ffnxt\") pod \"ceilometer-0\" (UID: \"247749ae-204b-4e9c-ad1c-f5d924b6f211\") " pod="openstack/ceilometer-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.872259 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/247749ae-204b-4e9c-ad1c-f5d924b6f211-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"247749ae-204b-4e9c-ad1c-f5d924b6f211\") " pod="openstack/ceilometer-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.872274 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/247749ae-204b-4e9c-ad1c-f5d924b6f211-log-httpd\") pod \"ceilometer-0\" (UID: \"247749ae-204b-4e9c-ad1c-f5d924b6f211\") " pod="openstack/ceilometer-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.872349 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/247749ae-204b-4e9c-ad1c-f5d924b6f211-config-data\") pod \"ceilometer-0\" (UID: \"247749ae-204b-4e9c-ad1c-f5d924b6f211\") " pod="openstack/ceilometer-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.886627 4898 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3\") pod \"glance-default-internal-api-0\" (UID: \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.904529 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.973832 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/247749ae-204b-4e9c-ad1c-f5d924b6f211-config-data\") pod \"ceilometer-0\" (UID: \"247749ae-204b-4e9c-ad1c-f5d924b6f211\") " pod="openstack/ceilometer-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.974285 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/247749ae-204b-4e9c-ad1c-f5d924b6f211-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"247749ae-204b-4e9c-ad1c-f5d924b6f211\") " pod="openstack/ceilometer-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.974332 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/247749ae-204b-4e9c-ad1c-f5d924b6f211-run-httpd\") pod \"ceilometer-0\" (UID: \"247749ae-204b-4e9c-ad1c-f5d924b6f211\") " pod="openstack/ceilometer-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.974390 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/247749ae-204b-4e9c-ad1c-f5d924b6f211-scripts\") pod \"ceilometer-0\" (UID: \"247749ae-204b-4e9c-ad1c-f5d924b6f211\") " pod="openstack/ceilometer-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.974408 4898 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffnxt\" (UniqueName: \"kubernetes.io/projected/247749ae-204b-4e9c-ad1c-f5d924b6f211-kube-api-access-ffnxt\") pod \"ceilometer-0\" (UID: \"247749ae-204b-4e9c-ad1c-f5d924b6f211\") " pod="openstack/ceilometer-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.974460 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/247749ae-204b-4e9c-ad1c-f5d924b6f211-log-httpd\") pod \"ceilometer-0\" (UID: \"247749ae-204b-4e9c-ad1c-f5d924b6f211\") " pod="openstack/ceilometer-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.974501 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/247749ae-204b-4e9c-ad1c-f5d924b6f211-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"247749ae-204b-4e9c-ad1c-f5d924b6f211\") " pod="openstack/ceilometer-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.975768 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/247749ae-204b-4e9c-ad1c-f5d924b6f211-log-httpd\") pod \"ceilometer-0\" (UID: \"247749ae-204b-4e9c-ad1c-f5d924b6f211\") " pod="openstack/ceilometer-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.975997 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/247749ae-204b-4e9c-ad1c-f5d924b6f211-run-httpd\") pod \"ceilometer-0\" (UID: \"247749ae-204b-4e9c-ad1c-f5d924b6f211\") " pod="openstack/ceilometer-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.977734 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/247749ae-204b-4e9c-ad1c-f5d924b6f211-config-data\") pod \"ceilometer-0\" (UID: \"247749ae-204b-4e9c-ad1c-f5d924b6f211\") " 
pod="openstack/ceilometer-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.980813 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/247749ae-204b-4e9c-ad1c-f5d924b6f211-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"247749ae-204b-4e9c-ad1c-f5d924b6f211\") " pod="openstack/ceilometer-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.986994 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/247749ae-204b-4e9c-ad1c-f5d924b6f211-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"247749ae-204b-4e9c-ad1c-f5d924b6f211\") " pod="openstack/ceilometer-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.992005 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffnxt\" (UniqueName: \"kubernetes.io/projected/247749ae-204b-4e9c-ad1c-f5d924b6f211-kube-api-access-ffnxt\") pod \"ceilometer-0\" (UID: \"247749ae-204b-4e9c-ad1c-f5d924b6f211\") " pod="openstack/ceilometer-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.995393 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/247749ae-204b-4e9c-ad1c-f5d924b6f211-scripts\") pod \"ceilometer-0\" (UID: \"247749ae-204b-4e9c-ad1c-f5d924b6f211\") " pod="openstack/ceilometer-0" Mar 13 14:20:37 crc kubenswrapper[4898]: I0313 14:20:37.059056 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-jcdtp"] Mar 13 14:20:37 crc kubenswrapper[4898]: I0313 14:20:37.069525 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 14:20:37 crc kubenswrapper[4898]: I0313 14:20:37.098265 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-zgt75"] Mar 13 14:20:37 crc kubenswrapper[4898]: I0313 14:20:37.162253 4898 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 14:20:37 crc kubenswrapper[4898]: I0313 14:20:37.207480 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-jcdtp" event={"ID":"10787742-bffc-4545-95cc-8f0354246d7c","Type":"ContainerStarted","Data":"f5c205239fce65effb688de2367793807b3add254d60c501123275b62220d66b"} Mar 13 14:20:37 crc kubenswrapper[4898]: I0313 14:20:37.229934 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-zgt75" event={"ID":"84a7fd24-4320-4c0e-8ded-0d455252a549","Type":"ContainerStarted","Data":"ff8a73d5234eb1ed4542baaf925a5bae9eff511012c73d58ac5c330a7c07d613"} Mar 13 14:20:37 crc kubenswrapper[4898]: I0313 14:20:37.237144 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-mc8t9" event={"ID":"ed76acfb-bb3b-47de-ae85-23de2e792e7b","Type":"ContainerStarted","Data":"862d777be88dbcc89854ce3dd00093f5671b014a3c866a71f6f944e62cffdd2c"} Mar 13 14:20:37 crc kubenswrapper[4898]: I0313 14:20:37.240005 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:20:37 crc kubenswrapper[4898]: I0313 14:20:37.241580 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d555bd54-f4d5-4b06-9517-32b4fe687f4b","Type":"ContainerStarted","Data":"110d56f13170c1319b21d99b74fd32c1fea939cfc51a2e23b13c200d65b1e7fc"} Mar 13 14:20:37 crc kubenswrapper[4898]: I0313 14:20:37.263406 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-ztp6c"] Mar 13 14:20:37 crc kubenswrapper[4898]: I0313 14:20:37.289929 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=18.289908257 podStartE2EDuration="18.289908257s" podCreationTimestamp="2026-03-13 14:20:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:20:37.271003536 +0000 UTC m=+1472.272591805" watchObservedRunningTime="2026-03-13 14:20:37.289908257 +0000 UTC m=+1472.291496496" Mar 13 14:20:37 crc kubenswrapper[4898]: W0313 14:20:37.680042 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod51a3e0c5_0084_4216_a162_3614eafcc162.slice/crio-4656230f194edb795af2740a5c0bb83bbde4d4a8b2fd3cee4accb6bdc572ea5d WatchSource:0}: Error finding container 4656230f194edb795af2740a5c0bb83bbde4d4a8b2fd3cee4accb6bdc572ea5d: Status 404 returned error can't find the container with id 4656230f194edb795af2740a5c0bb83bbde4d4a8b2fd3cee4accb6bdc572ea5d Mar 13 14:20:37 crc kubenswrapper[4898]: I0313 14:20:37.686720 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-dddqm"] Mar 13 14:20:37 crc kubenswrapper[4898]: I0313 14:20:37.734208 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-xq6ss"] Mar 13 14:20:37 crc kubenswrapper[4898]: I0313 
14:20:37.760875 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-hm77q"] Mar 13 14:20:37 crc kubenswrapper[4898]: I0313 14:20:37.896955 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-fbs4f"] Mar 13 14:20:37 crc kubenswrapper[4898]: I0313 14:20:37.897024 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 14:20:37 crc kubenswrapper[4898]: I0313 14:20:37.905518 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 14:20:38 crc kubenswrapper[4898]: I0313 14:20:38.166427 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 14:20:38 crc kubenswrapper[4898]: W0313 14:20:38.220281 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda92e3d93_9b4b_4e81_84dc_5cf7aa23f96c.slice/crio-52d408fc697e9533734dc32479d4f8978a1ff97d35d66aafb1e183a3827a836e WatchSource:0}: Error finding container 52d408fc697e9533734dc32479d4f8978a1ff97d35d66aafb1e183a3827a836e: Status 404 returned error can't find the container with id 52d408fc697e9533734dc32479d4f8978a1ff97d35d66aafb1e183a3827a836e Mar 13 14:20:38 crc kubenswrapper[4898]: I0313 14:20:38.308784 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 14:20:38 crc kubenswrapper[4898]: I0313 14:20:38.309744 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-ztp6c" event={"ID":"193b05da-acb9-4512-a2ae-6c03450e6f05","Type":"ContainerStarted","Data":"bfe3b6cf0e5928312929ea860aeb7b7f643553f3479a3beac4f364f3ff4502ae"} Mar 13 14:20:38 crc kubenswrapper[4898]: I0313 14:20:38.311451 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-dddqm" 
event={"ID":"51a3e0c5-0084-4216-a162-3614eafcc162","Type":"ContainerStarted","Data":"4656230f194edb795af2740a5c0bb83bbde4d4a8b2fd3cee4accb6bdc572ea5d"} Mar 13 14:20:38 crc kubenswrapper[4898]: I0313 14:20:38.345213 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-fbs4f" event={"ID":"04642207-fab0-47bf-9ac4-030bbe91b4f0","Type":"ContainerStarted","Data":"a566cd3c2104d4cf2d4de94c8ef5556830e27cbf9385397fa0c4b4a48c1b947c"} Mar 13 14:20:38 crc kubenswrapper[4898]: I0313 14:20:38.345256 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-fbs4f" event={"ID":"04642207-fab0-47bf-9ac4-030bbe91b4f0","Type":"ContainerStarted","Data":"1f8aa6f56c769252ca6f9fa29c34832b03b0bd31e4320e8506b18e27d01c86a7"} Mar 13 14:20:38 crc kubenswrapper[4898]: I0313 14:20:38.348658 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:20:38 crc kubenswrapper[4898]: I0313 14:20:38.398156 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-hm77q" event={"ID":"664deedc-3946-4205-98ad-21759d35d952","Type":"ContainerStarted","Data":"6d9f47030df3927c7cd3d661b2414ebcd6b45bbd0a337a5460ac32062ae1c8cc"} Mar 13 14:20:38 crc kubenswrapper[4898]: I0313 14:20:38.472774 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:20:38 crc kubenswrapper[4898]: I0313 14:20:38.481162 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c","Type":"ContainerStarted","Data":"52d408fc697e9533734dc32479d4f8978a1ff97d35d66aafb1e183a3827a836e"} Mar 13 14:20:38 crc kubenswrapper[4898]: I0313 14:20:38.510881 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-hm77q" podStartSLOduration=3.5108612519999998 podStartE2EDuration="3.510861252s" podCreationTimestamp="2026-03-13 14:20:35 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:20:38.458297308 +0000 UTC m=+1473.459885547" watchObservedRunningTime="2026-03-13 14:20:38.510861252 +0000 UTC m=+1473.512449491" Mar 13 14:20:38 crc kubenswrapper[4898]: I0313 14:20:38.599091 4898 generic.go:334] "Generic (PLEG): container finished" podID="10787742-bffc-4545-95cc-8f0354246d7c" containerID="f3696d82d52d862da7bc6bea3b377d64088527f42e2ee331d8b5ccc92226da57" exitCode=0 Mar 13 14:20:38 crc kubenswrapper[4898]: I0313 14:20:38.599267 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-jcdtp" event={"ID":"10787742-bffc-4545-95cc-8f0354246d7c","Type":"ContainerDied","Data":"f3696d82d52d862da7bc6bea3b377d64088527f42e2ee331d8b5ccc92226da57"} Mar 13 14:20:38 crc kubenswrapper[4898]: I0313 14:20:38.613868 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-xq6ss" event={"ID":"ac704482-c7a4-471c-b3c1-d1fdd7e0eb83","Type":"ContainerStarted","Data":"bafefd0ee86b1967f73e5ee3d1256b9f0f1d84430cc94cf6628cfc6e827e8aad"} Mar 13 14:20:38 crc kubenswrapper[4898]: I0313 14:20:38.641777 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-mc8t9" event={"ID":"ed76acfb-bb3b-47de-ae85-23de2e792e7b","Type":"ContainerStarted","Data":"e1e9eae3fcad1899b5e4ca6e0e525d3dd9661ab59290e1ba7355e6176cbc69f0"} Mar 13 14:20:38 crc kubenswrapper[4898]: I0313 14:20:38.666443 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-mc8t9" podStartSLOduration=3.666420001 podStartE2EDuration="3.666420001s" podCreationTimestamp="2026-03-13 14:20:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:20:38.661359249 +0000 UTC m=+1473.662947488" watchObservedRunningTime="2026-03-13 14:20:38.666420001 
+0000 UTC m=+1473.668008240" Mar 13 14:20:39 crc kubenswrapper[4898]: I0313 14:20:39.359987 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-jcdtp" Mar 13 14:20:39 crc kubenswrapper[4898]: I0313 14:20:39.493622 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sd4hk\" (UniqueName: \"kubernetes.io/projected/10787742-bffc-4545-95cc-8f0354246d7c-kube-api-access-sd4hk\") pod \"10787742-bffc-4545-95cc-8f0354246d7c\" (UID: \"10787742-bffc-4545-95cc-8f0354246d7c\") " Mar 13 14:20:39 crc kubenswrapper[4898]: I0313 14:20:39.494203 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/10787742-bffc-4545-95cc-8f0354246d7c-ovsdbserver-nb\") pod \"10787742-bffc-4545-95cc-8f0354246d7c\" (UID: \"10787742-bffc-4545-95cc-8f0354246d7c\") " Mar 13 14:20:39 crc kubenswrapper[4898]: I0313 14:20:39.494244 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/10787742-bffc-4545-95cc-8f0354246d7c-dns-svc\") pod \"10787742-bffc-4545-95cc-8f0354246d7c\" (UID: \"10787742-bffc-4545-95cc-8f0354246d7c\") " Mar 13 14:20:39 crc kubenswrapper[4898]: I0313 14:20:39.494284 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/10787742-bffc-4545-95cc-8f0354246d7c-dns-swift-storage-0\") pod \"10787742-bffc-4545-95cc-8f0354246d7c\" (UID: \"10787742-bffc-4545-95cc-8f0354246d7c\") " Mar 13 14:20:39 crc kubenswrapper[4898]: I0313 14:20:39.494423 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10787742-bffc-4545-95cc-8f0354246d7c-config\") pod \"10787742-bffc-4545-95cc-8f0354246d7c\" (UID: \"10787742-bffc-4545-95cc-8f0354246d7c\") " Mar 13 14:20:39 
crc kubenswrapper[4898]: I0313 14:20:39.494568 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/10787742-bffc-4545-95cc-8f0354246d7c-ovsdbserver-sb\") pod \"10787742-bffc-4545-95cc-8f0354246d7c\" (UID: \"10787742-bffc-4545-95cc-8f0354246d7c\") " Mar 13 14:20:39 crc kubenswrapper[4898]: I0313 14:20:39.500447 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10787742-bffc-4545-95cc-8f0354246d7c-kube-api-access-sd4hk" (OuterVolumeSpecName: "kube-api-access-sd4hk") pod "10787742-bffc-4545-95cc-8f0354246d7c" (UID: "10787742-bffc-4545-95cc-8f0354246d7c"). InnerVolumeSpecName "kube-api-access-sd4hk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:20:39 crc kubenswrapper[4898]: I0313 14:20:39.538485 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10787742-bffc-4545-95cc-8f0354246d7c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "10787742-bffc-4545-95cc-8f0354246d7c" (UID: "10787742-bffc-4545-95cc-8f0354246d7c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:20:39 crc kubenswrapper[4898]: I0313 14:20:39.550616 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10787742-bffc-4545-95cc-8f0354246d7c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "10787742-bffc-4545-95cc-8f0354246d7c" (UID: "10787742-bffc-4545-95cc-8f0354246d7c"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:20:39 crc kubenswrapper[4898]: I0313 14:20:39.583678 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10787742-bffc-4545-95cc-8f0354246d7c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "10787742-bffc-4545-95cc-8f0354246d7c" (UID: "10787742-bffc-4545-95cc-8f0354246d7c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:20:39 crc kubenswrapper[4898]: I0313 14:20:39.598847 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10787742-bffc-4545-95cc-8f0354246d7c-config" (OuterVolumeSpecName: "config") pod "10787742-bffc-4545-95cc-8f0354246d7c" (UID: "10787742-bffc-4545-95cc-8f0354246d7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:20:39 crc kubenswrapper[4898]: I0313 14:20:39.598942 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10787742-bffc-4545-95cc-8f0354246d7c-config\") pod \"10787742-bffc-4545-95cc-8f0354246d7c\" (UID: \"10787742-bffc-4545-95cc-8f0354246d7c\") " Mar 13 14:20:39 crc kubenswrapper[4898]: I0313 14:20:39.599876 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sd4hk\" (UniqueName: \"kubernetes.io/projected/10787742-bffc-4545-95cc-8f0354246d7c-kube-api-access-sd4hk\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:39 crc kubenswrapper[4898]: I0313 14:20:39.599889 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/10787742-bffc-4545-95cc-8f0354246d7c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:39 crc kubenswrapper[4898]: I0313 14:20:39.599910 4898 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/10787742-bffc-4545-95cc-8f0354246d7c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:39 crc kubenswrapper[4898]: I0313 14:20:39.599919 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/10787742-bffc-4545-95cc-8f0354246d7c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:39 crc kubenswrapper[4898]: W0313 14:20:39.600123 4898 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/10787742-bffc-4545-95cc-8f0354246d7c/volumes/kubernetes.io~configmap/config Mar 13 14:20:39 crc kubenswrapper[4898]: I0313 14:20:39.600135 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10787742-bffc-4545-95cc-8f0354246d7c-config" (OuterVolumeSpecName: "config") pod "10787742-bffc-4545-95cc-8f0354246d7c" (UID: "10787742-bffc-4545-95cc-8f0354246d7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:20:39 crc kubenswrapper[4898]: I0313 14:20:39.605224 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10787742-bffc-4545-95cc-8f0354246d7c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "10787742-bffc-4545-95cc-8f0354246d7c" (UID: "10787742-bffc-4545-95cc-8f0354246d7c"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:20:39 crc kubenswrapper[4898]: I0313 14:20:39.686195 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c","Type":"ContainerStarted","Data":"0df5bf75681ada8dd8a40c022de81b8eec9f36333dccaa2326ce28596ea32dd8"} Mar 13 14:20:39 crc kubenswrapper[4898]: I0313 14:20:39.688223 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"247749ae-204b-4e9c-ad1c-f5d924b6f211","Type":"ContainerStarted","Data":"5fd3c6beb630c0a00d78ffbb0eaf96e2717f61918bd632f3618c99bd721d1714"} Mar 13 14:20:39 crc kubenswrapper[4898]: I0313 14:20:39.690699 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-jcdtp" event={"ID":"10787742-bffc-4545-95cc-8f0354246d7c","Type":"ContainerDied","Data":"f5c205239fce65effb688de2367793807b3add254d60c501123275b62220d66b"} Mar 13 14:20:39 crc kubenswrapper[4898]: I0313 14:20:39.690733 4898 scope.go:117] "RemoveContainer" containerID="f3696d82d52d862da7bc6bea3b377d64088527f42e2ee331d8b5ccc92226da57" Mar 13 14:20:39 crc kubenswrapper[4898]: I0313 14:20:39.690847 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-jcdtp" Mar 13 14:20:39 crc kubenswrapper[4898]: I0313 14:20:39.702413 4898 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/10787742-bffc-4545-95cc-8f0354246d7c-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:39 crc kubenswrapper[4898]: I0313 14:20:39.702448 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10787742-bffc-4545-95cc-8f0354246d7c-config\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:39 crc kubenswrapper[4898]: I0313 14:20:39.711425 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7f9ea679-73ec-46c5-b3b9-25d63398eb35","Type":"ContainerStarted","Data":"55281eef0a29c186bce67ca4ff84ee09f28c634f48016b0a99b76a64d0ade9af"} Mar 13 14:20:39 crc kubenswrapper[4898]: I0313 14:20:39.729636 4898 generic.go:334] "Generic (PLEG): container finished" podID="04642207-fab0-47bf-9ac4-030bbe91b4f0" containerID="a566cd3c2104d4cf2d4de94c8ef5556830e27cbf9385397fa0c4b4a48c1b947c" exitCode=0 Mar 13 14:20:39 crc kubenswrapper[4898]: I0313 14:20:39.729706 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-fbs4f" event={"ID":"04642207-fab0-47bf-9ac4-030bbe91b4f0","Type":"ContainerDied","Data":"a566cd3c2104d4cf2d4de94c8ef5556830e27cbf9385397fa0c4b4a48c1b947c"} Mar 13 14:20:39 crc kubenswrapper[4898]: I0313 14:20:39.729763 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-fbs4f" event={"ID":"04642207-fab0-47bf-9ac4-030bbe91b4f0","Type":"ContainerStarted","Data":"b87af33e145b04333fd2674b6cd7ae8ffea23dd169547cec058f2a13473c72c5"} Mar 13 14:20:39 crc kubenswrapper[4898]: I0313 14:20:39.730917 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-785d8bcb8c-fbs4f" Mar 13 14:20:39 crc kubenswrapper[4898]: I0313 
14:20:39.768533 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-785d8bcb8c-fbs4f" podStartSLOduration=4.7685112610000004 podStartE2EDuration="4.768511261s" podCreationTimestamp="2026-03-13 14:20:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:20:39.751150531 +0000 UTC m=+1474.752738770" watchObservedRunningTime="2026-03-13 14:20:39.768511261 +0000 UTC m=+1474.770099500" Mar 13 14:20:39 crc kubenswrapper[4898]: I0313 14:20:39.784162 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-hm77q" event={"ID":"664deedc-3946-4205-98ad-21759d35d952","Type":"ContainerStarted","Data":"d1d45dfadc513f7c3e24c556ad6a45d1d7e050a0da971045c5c1181fc58bc3d2"} Mar 13 14:20:40 crc kubenswrapper[4898]: I0313 14:20:40.761171 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7f9ea679-73ec-46c5-b3b9-25d63398eb35","Type":"ContainerStarted","Data":"5d3511c79cb347abef153850ae05dbbda4e475dc7ccff6ee262eef01f5d5c250"} Mar 13 14:20:41 crc kubenswrapper[4898]: I0313 14:20:41.002450 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:41 crc kubenswrapper[4898]: I0313 14:20:41.788328 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7f9ea679-73ec-46c5-b3b9-25d63398eb35","Type":"ContainerStarted","Data":"9e586b7dba7e52cd9e5256f1170f84b5de3fe664830b8456825bc0fe70ad9fed"} Mar 13 14:20:41 crc kubenswrapper[4898]: I0313 14:20:41.788555 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="7f9ea679-73ec-46c5-b3b9-25d63398eb35" containerName="glance-httpd" containerID="cri-o://9e586b7dba7e52cd9e5256f1170f84b5de3fe664830b8456825bc0fe70ad9fed" 
gracePeriod=30 Mar 13 14:20:41 crc kubenswrapper[4898]: I0313 14:20:41.788485 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="7f9ea679-73ec-46c5-b3b9-25d63398eb35" containerName="glance-log" containerID="cri-o://5d3511c79cb347abef153850ae05dbbda4e475dc7ccff6ee262eef01f5d5c250" gracePeriod=30 Mar 13 14:20:41 crc kubenswrapper[4898]: I0313 14:20:41.792673 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c","Type":"ContainerStarted","Data":"dfc29ad43e0c709dcb1ef1785b269ec83a25e185502f0001b9412c1b48560e60"} Mar 13 14:20:41 crc kubenswrapper[4898]: I0313 14:20:41.792782 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c" containerName="glance-log" containerID="cri-o://0df5bf75681ada8dd8a40c022de81b8eec9f36333dccaa2326ce28596ea32dd8" gracePeriod=30 Mar 13 14:20:41 crc kubenswrapper[4898]: I0313 14:20:41.792800 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c" containerName="glance-httpd" containerID="cri-o://dfc29ad43e0c709dcb1ef1785b269ec83a25e185502f0001b9412c1b48560e60" gracePeriod=30 Mar 13 14:20:41 crc kubenswrapper[4898]: I0313 14:20:41.814151 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.814127044 podStartE2EDuration="6.814127044s" podCreationTimestamp="2026-03-13 14:20:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:20:41.81321551 +0000 UTC m=+1476.814803749" watchObservedRunningTime="2026-03-13 14:20:41.814127044 +0000 UTC m=+1476.815715283" Mar 13 14:20:41 crc 
kubenswrapper[4898]: I0313 14:20:41.858436 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=6.858417114 podStartE2EDuration="6.858417114s" podCreationTimestamp="2026-03-13 14:20:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:20:41.847737497 +0000 UTC m=+1476.849325756" watchObservedRunningTime="2026-03-13 14:20:41.858417114 +0000 UTC m=+1476.860005353" Mar 13 14:20:42 crc kubenswrapper[4898]: I0313 14:20:42.805800 4898 generic.go:334] "Generic (PLEG): container finished" podID="7f9ea679-73ec-46c5-b3b9-25d63398eb35" containerID="9e586b7dba7e52cd9e5256f1170f84b5de3fe664830b8456825bc0fe70ad9fed" exitCode=0 Mar 13 14:20:42 crc kubenswrapper[4898]: I0313 14:20:42.807404 4898 generic.go:334] "Generic (PLEG): container finished" podID="7f9ea679-73ec-46c5-b3b9-25d63398eb35" containerID="5d3511c79cb347abef153850ae05dbbda4e475dc7ccff6ee262eef01f5d5c250" exitCode=143 Mar 13 14:20:42 crc kubenswrapper[4898]: I0313 14:20:42.805933 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7f9ea679-73ec-46c5-b3b9-25d63398eb35","Type":"ContainerDied","Data":"9e586b7dba7e52cd9e5256f1170f84b5de3fe664830b8456825bc0fe70ad9fed"} Mar 13 14:20:42 crc kubenswrapper[4898]: I0313 14:20:42.807637 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7f9ea679-73ec-46c5-b3b9-25d63398eb35","Type":"ContainerDied","Data":"5d3511c79cb347abef153850ae05dbbda4e475dc7ccff6ee262eef01f5d5c250"} Mar 13 14:20:42 crc kubenswrapper[4898]: I0313 14:20:42.811112 4898 generic.go:334] "Generic (PLEG): container finished" podID="a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c" containerID="dfc29ad43e0c709dcb1ef1785b269ec83a25e185502f0001b9412c1b48560e60" exitCode=0 Mar 13 14:20:42 crc kubenswrapper[4898]: I0313 
14:20:42.811139 4898 generic.go:334] "Generic (PLEG): container finished" podID="a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c" containerID="0df5bf75681ada8dd8a40c022de81b8eec9f36333dccaa2326ce28596ea32dd8" exitCode=143 Mar 13 14:20:42 crc kubenswrapper[4898]: I0313 14:20:42.811167 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c","Type":"ContainerDied","Data":"dfc29ad43e0c709dcb1ef1785b269ec83a25e185502f0001b9412c1b48560e60"} Mar 13 14:20:42 crc kubenswrapper[4898]: I0313 14:20:42.811205 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c","Type":"ContainerDied","Data":"0df5bf75681ada8dd8a40c022de81b8eec9f36333dccaa2326ce28596ea32dd8"} Mar 13 14:20:42 crc kubenswrapper[4898]: I0313 14:20:42.813122 4898 generic.go:334] "Generic (PLEG): container finished" podID="ed76acfb-bb3b-47de-ae85-23de2e792e7b" containerID="e1e9eae3fcad1899b5e4ca6e0e525d3dd9661ab59290e1ba7355e6176cbc69f0" exitCode=0 Mar 13 14:20:42 crc kubenswrapper[4898]: I0313 14:20:42.813150 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-mc8t9" event={"ID":"ed76acfb-bb3b-47de-ae85-23de2e792e7b","Type":"ContainerDied","Data":"e1e9eae3fcad1899b5e4ca6e0e525d3dd9661ab59290e1ba7355e6176cbc69f0"} Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.755195 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-mc8t9" Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.781551 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.783099 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.855633 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bh9wk\" (UniqueName: \"kubernetes.io/projected/ed76acfb-bb3b-47de-ae85-23de2e792e7b-kube-api-access-bh9wk\") pod \"ed76acfb-bb3b-47de-ae85-23de2e792e7b\" (UID: \"ed76acfb-bb3b-47de-ae85-23de2e792e7b\") " Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.855817 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed76acfb-bb3b-47de-ae85-23de2e792e7b-combined-ca-bundle\") pod \"ed76acfb-bb3b-47de-ae85-23de2e792e7b\" (UID: \"ed76acfb-bb3b-47de-ae85-23de2e792e7b\") " Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.855881 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed76acfb-bb3b-47de-ae85-23de2e792e7b-scripts\") pod \"ed76acfb-bb3b-47de-ae85-23de2e792e7b\" (UID: \"ed76acfb-bb3b-47de-ae85-23de2e792e7b\") " Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.855972 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ed76acfb-bb3b-47de-ae85-23de2e792e7b-fernet-keys\") pod \"ed76acfb-bb3b-47de-ae85-23de2e792e7b\" (UID: \"ed76acfb-bb3b-47de-ae85-23de2e792e7b\") " Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.856030 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ed76acfb-bb3b-47de-ae85-23de2e792e7b-credential-keys\") pod \"ed76acfb-bb3b-47de-ae85-23de2e792e7b\" (UID: \"ed76acfb-bb3b-47de-ae85-23de2e792e7b\") " Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.856098 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/ed76acfb-bb3b-47de-ae85-23de2e792e7b-config-data\") pod \"ed76acfb-bb3b-47de-ae85-23de2e792e7b\" (UID: \"ed76acfb-bb3b-47de-ae85-23de2e792e7b\") " Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.866413 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed76acfb-bb3b-47de-ae85-23de2e792e7b-kube-api-access-bh9wk" (OuterVolumeSpecName: "kube-api-access-bh9wk") pod "ed76acfb-bb3b-47de-ae85-23de2e792e7b" (UID: "ed76acfb-bb3b-47de-ae85-23de2e792e7b"). InnerVolumeSpecName "kube-api-access-bh9wk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.868187 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed76acfb-bb3b-47de-ae85-23de2e792e7b-scripts" (OuterVolumeSpecName: "scripts") pod "ed76acfb-bb3b-47de-ae85-23de2e792e7b" (UID: "ed76acfb-bb3b-47de-ae85-23de2e792e7b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.870706 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed76acfb-bb3b-47de-ae85-23de2e792e7b-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "ed76acfb-bb3b-47de-ae85-23de2e792e7b" (UID: "ed76acfb-bb3b-47de-ae85-23de2e792e7b"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.873749 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed76acfb-bb3b-47de-ae85-23de2e792e7b-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "ed76acfb-bb3b-47de-ae85-23de2e792e7b" (UID: "ed76acfb-bb3b-47de-ae85-23de2e792e7b"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.874324 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-mc8t9" event={"ID":"ed76acfb-bb3b-47de-ae85-23de2e792e7b","Type":"ContainerDied","Data":"862d777be88dbcc89854ce3dd00093f5671b014a3c866a71f6f944e62cffdd2c"} Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.874362 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="862d777be88dbcc89854ce3dd00093f5671b014a3c866a71f6f944e62cffdd2c" Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.874426 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-mc8t9" Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.885791 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7f9ea679-73ec-46c5-b3b9-25d63398eb35","Type":"ContainerDied","Data":"55281eef0a29c186bce67ca4ff84ee09f28c634f48016b0a99b76a64d0ade9af"} Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.885840 4898 scope.go:117] "RemoveContainer" containerID="9e586b7dba7e52cd9e5256f1170f84b5de3fe664830b8456825bc0fe70ad9fed" Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.885957 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.893501 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c","Type":"ContainerDied","Data":"52d408fc697e9533734dc32479d4f8978a1ff97d35d66aafb1e183a3827a836e"} Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.893577 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.920413 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed76acfb-bb3b-47de-ae85-23de2e792e7b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ed76acfb-bb3b-47de-ae85-23de2e792e7b" (UID: "ed76acfb-bb3b-47de-ae85-23de2e792e7b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.920479 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-mc8t9"] Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.950704 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-mc8t9"] Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.953210 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed76acfb-bb3b-47de-ae85-23de2e792e7b-config-data" (OuterVolumeSpecName: "config-data") pod "ed76acfb-bb3b-47de-ae85-23de2e792e7b" (UID: "ed76acfb-bb3b-47de-ae85-23de2e792e7b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.958908 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpxqx\" (UniqueName: \"kubernetes.io/projected/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-kube-api-access-kpxqx\") pod \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\" (UID: \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\") " Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.958958 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f9ea679-73ec-46c5-b3b9-25d63398eb35-internal-tls-certs\") pod \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\" (UID: \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\") " Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.959011 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7f9ea679-73ec-46c5-b3b9-25d63398eb35-httpd-run\") pod \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\" (UID: \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\") " Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.959047 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-logs\") pod \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\" (UID: \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\") " Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.959090 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-public-tls-certs\") pod \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\" (UID: \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\") " Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.959125 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-combined-ca-bundle\") pod \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\" (UID: \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\") " Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.959154 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-httpd-run\") pod \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\" (UID: \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\") " Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.959259 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f9ea679-73ec-46c5-b3b9-25d63398eb35-config-data\") pod \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\" (UID: \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\") " Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.959280 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f9ea679-73ec-46c5-b3b9-25d63398eb35-combined-ca-bundle\") pod \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\" (UID: \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\") " Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.959295 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f9ea679-73ec-46c5-b3b9-25d63398eb35-scripts\") pod \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\" (UID: \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\") " Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.959407 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3\") pod \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\" (UID: \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\") " Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.959460 4898 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f9ea679-73ec-46c5-b3b9-25d63398eb35-logs\") pod \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\" (UID: \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\") " Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.959516 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-17b3b094-1a55-406a-a787-e0abb588e5b7\") pod \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\" (UID: \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\") " Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.959594 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-scripts\") pod \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\" (UID: \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\") " Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.959633 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg4fw\" (UniqueName: \"kubernetes.io/projected/7f9ea679-73ec-46c5-b3b9-25d63398eb35-kube-api-access-mg4fw\") pod \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\" (UID: \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\") " Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.959650 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-config-data\") pod \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\" (UID: \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\") " Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.960060 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed76acfb-bb3b-47de-ae85-23de2e792e7b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 
14:20:44.960075 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed76acfb-bb3b-47de-ae85-23de2e792e7b-scripts\") on node \"crc\" DevicePath \"\""
Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.960087 4898 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ed76acfb-bb3b-47de-ae85-23de2e792e7b-fernet-keys\") on node \"crc\" DevicePath \"\""
Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.960096 4898 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ed76acfb-bb3b-47de-ae85-23de2e792e7b-credential-keys\") on node \"crc\" DevicePath \"\""
Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.960105 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed76acfb-bb3b-47de-ae85-23de2e792e7b-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.960113 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bh9wk\" (UniqueName: \"kubernetes.io/projected/ed76acfb-bb3b-47de-ae85-23de2e792e7b-kube-api-access-bh9wk\") on node \"crc\" DevicePath \"\""
Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.960765 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f9ea679-73ec-46c5-b3b9-25d63398eb35-logs" (OuterVolumeSpecName: "logs") pod "7f9ea679-73ec-46c5-b3b9-25d63398eb35" (UID: "7f9ea679-73ec-46c5-b3b9-25d63398eb35"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.960817 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f9ea679-73ec-46c5-b3b9-25d63398eb35-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "7f9ea679-73ec-46c5-b3b9-25d63398eb35" (UID: "7f9ea679-73ec-46c5-b3b9-25d63398eb35"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.960855 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-logs" (OuterVolumeSpecName: "logs") pod "a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c" (UID: "a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.961016 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c" (UID: "a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.965663 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-kube-api-access-kpxqx" (OuterVolumeSpecName: "kube-api-access-kpxqx") pod "a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c" (UID: "a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c"). InnerVolumeSpecName "kube-api-access-kpxqx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.967613 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-scripts" (OuterVolumeSpecName: "scripts") pod "a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c" (UID: "a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.968786 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f9ea679-73ec-46c5-b3b9-25d63398eb35-scripts" (OuterVolumeSpecName: "scripts") pod "7f9ea679-73ec-46c5-b3b9-25d63398eb35" (UID: "7f9ea679-73ec-46c5-b3b9-25d63398eb35"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.969490 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f9ea679-73ec-46c5-b3b9-25d63398eb35-kube-api-access-mg4fw" (OuterVolumeSpecName: "kube-api-access-mg4fw") pod "7f9ea679-73ec-46c5-b3b9-25d63398eb35" (UID: "7f9ea679-73ec-46c5-b3b9-25d63398eb35"). InnerVolumeSpecName "kube-api-access-mg4fw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.996479 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3" (OuterVolumeSpecName: "glance") pod "7f9ea679-73ec-46c5-b3b9-25d63398eb35" (UID: "7f9ea679-73ec-46c5-b3b9-25d63398eb35"). InnerVolumeSpecName "pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.014780 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c" (UID: "a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.017306 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-ljct7"]
Mar 13 14:20:45 crc kubenswrapper[4898]: E0313 14:20:45.030338 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed76acfb-bb3b-47de-ae85-23de2e792e7b" containerName="keystone-bootstrap"
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.030375 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed76acfb-bb3b-47de-ae85-23de2e792e7b" containerName="keystone-bootstrap"
Mar 13 14:20:45 crc kubenswrapper[4898]: E0313 14:20:45.030390 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c" containerName="glance-httpd"
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.030400 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c" containerName="glance-httpd"
Mar 13 14:20:45 crc kubenswrapper[4898]: E0313 14:20:45.030414 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10787742-bffc-4545-95cc-8f0354246d7c" containerName="init"
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.030422 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="10787742-bffc-4545-95cc-8f0354246d7c" containerName="init"
Mar 13 14:20:45 crc kubenswrapper[4898]: E0313 14:20:45.030441 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f9ea679-73ec-46c5-b3b9-25d63398eb35" containerName="glance-log"
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.030451 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f9ea679-73ec-46c5-b3b9-25d63398eb35" containerName="glance-log"
Mar 13 14:20:45 crc kubenswrapper[4898]: E0313 14:20:45.030466 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c" containerName="glance-log"
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.030473 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c" containerName="glance-log"
Mar 13 14:20:45 crc kubenswrapper[4898]: E0313 14:20:45.030487 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f9ea679-73ec-46c5-b3b9-25d63398eb35" containerName="glance-httpd"
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.030494 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f9ea679-73ec-46c5-b3b9-25d63398eb35" containerName="glance-httpd"
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.030871 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed76acfb-bb3b-47de-ae85-23de2e792e7b" containerName="keystone-bootstrap"
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.030892 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f9ea679-73ec-46c5-b3b9-25d63398eb35" containerName="glance-httpd"
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.030925 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c" containerName="glance-log"
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.030944 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="10787742-bffc-4545-95cc-8f0354246d7c" containerName="init"
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.030953 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f9ea679-73ec-46c5-b3b9-25d63398eb35" containerName="glance-log"
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.030977 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c" containerName="glance-httpd"
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.031828 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-ljct7"]
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.034884 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-ljct7"
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.040814 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-17b3b094-1a55-406a-a787-e0abb588e5b7" (OuterVolumeSpecName: "glance") pod "a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c" (UID: "a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c"). InnerVolumeSpecName "pvc-17b3b094-1a55-406a-a787-e0abb588e5b7". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.063392 4898 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-17b3b094-1a55-406a-a787-e0abb588e5b7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-17b3b094-1a55-406a-a787-e0abb588e5b7\") on node \"crc\" "
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.063451 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-scripts\") on node \"crc\" DevicePath \"\""
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.063466 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg4fw\" (UniqueName: \"kubernetes.io/projected/7f9ea679-73ec-46c5-b3b9-25d63398eb35-kube-api-access-mg4fw\") on node \"crc\" DevicePath \"\""
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.063481 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kpxqx\" (UniqueName: \"kubernetes.io/projected/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-kube-api-access-kpxqx\") on node \"crc\" DevicePath \"\""
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.063492 4898 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7f9ea679-73ec-46c5-b3b9-25d63398eb35-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.063505 4898 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-logs\") on node \"crc\" DevicePath \"\""
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.063535 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.063545 4898 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.063557 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f9ea679-73ec-46c5-b3b9-25d63398eb35-scripts\") on node \"crc\" DevicePath \"\""
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.063577 4898 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3\") on node \"crc\" "
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.063590 4898 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f9ea679-73ec-46c5-b3b9-25d63398eb35-logs\") on node \"crc\" DevicePath \"\""
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.079476 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f9ea679-73ec-46c5-b3b9-25d63398eb35-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7f9ea679-73ec-46c5-b3b9-25d63398eb35" (UID: "7f9ea679-73ec-46c5-b3b9-25d63398eb35"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.095138 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f9ea679-73ec-46c5-b3b9-25d63398eb35-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7f9ea679-73ec-46c5-b3b9-25d63398eb35" (UID: "7f9ea679-73ec-46c5-b3b9-25d63398eb35"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.102223 4898 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.102355 4898 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-17b3b094-1a55-406a-a787-e0abb588e5b7" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-17b3b094-1a55-406a-a787-e0abb588e5b7") on node "crc"
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.115441 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c" (UID: "a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.120098 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f9ea679-73ec-46c5-b3b9-25d63398eb35-config-data" (OuterVolumeSpecName: "config-data") pod "7f9ea679-73ec-46c5-b3b9-25d63398eb35" (UID: "7f9ea679-73ec-46c5-b3b9-25d63398eb35"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.121947 4898 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.122154 4898 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3") on node "crc"
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.140882 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-config-data" (OuterVolumeSpecName: "config-data") pod "a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c" (UID: "a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.164763 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f68a4dd-fec8-4e60-a89c-69ce09fc5700-combined-ca-bundle\") pod \"keystone-bootstrap-ljct7\" (UID: \"0f68a4dd-fec8-4e60-a89c-69ce09fc5700\") " pod="openstack/keystone-bootstrap-ljct7"
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.164801 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0f68a4dd-fec8-4e60-a89c-69ce09fc5700-credential-keys\") pod \"keystone-bootstrap-ljct7\" (UID: \"0f68a4dd-fec8-4e60-a89c-69ce09fc5700\") " pod="openstack/keystone-bootstrap-ljct7"
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.164819 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0f68a4dd-fec8-4e60-a89c-69ce09fc5700-fernet-keys\") pod \"keystone-bootstrap-ljct7\" (UID: \"0f68a4dd-fec8-4e60-a89c-69ce09fc5700\") " pod="openstack/keystone-bootstrap-ljct7"
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.164995 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f68a4dd-fec8-4e60-a89c-69ce09fc5700-config-data\") pod \"keystone-bootstrap-ljct7\" (UID: \"0f68a4dd-fec8-4e60-a89c-69ce09fc5700\") " pod="openstack/keystone-bootstrap-ljct7"
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.165034 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f68a4dd-fec8-4e60-a89c-69ce09fc5700-scripts\") pod \"keystone-bootstrap-ljct7\" (UID: \"0f68a4dd-fec8-4e60-a89c-69ce09fc5700\") " pod="openstack/keystone-bootstrap-ljct7"
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.165060 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9s6v\" (UniqueName: \"kubernetes.io/projected/0f68a4dd-fec8-4e60-a89c-69ce09fc5700-kube-api-access-j9s6v\") pod \"keystone-bootstrap-ljct7\" (UID: \"0f68a4dd-fec8-4e60-a89c-69ce09fc5700\") " pod="openstack/keystone-bootstrap-ljct7"
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.165143 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f9ea679-73ec-46c5-b3b9-25d63398eb35-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.165189 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f9ea679-73ec-46c5-b3b9-25d63398eb35-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.165229 4898 reconciler_common.go:293] "Volume detached for volume \"pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3\") on node \"crc\" DevicePath \"\""
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.165243 4898 reconciler_common.go:293] "Volume detached for volume \"pvc-17b3b094-1a55-406a-a787-e0abb588e5b7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-17b3b094-1a55-406a-a787-e0abb588e5b7\") on node \"crc\" DevicePath \"\""
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.165258 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.165268 4898 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f9ea679-73ec-46c5-b3b9-25d63398eb35-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.165279 4898 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.252731 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.267255 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f68a4dd-fec8-4e60-a89c-69ce09fc5700-config-data\") pod \"keystone-bootstrap-ljct7\" (UID: \"0f68a4dd-fec8-4e60-a89c-69ce09fc5700\") " pod="openstack/keystone-bootstrap-ljct7"
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.267362 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f68a4dd-fec8-4e60-a89c-69ce09fc5700-scripts\") pod \"keystone-bootstrap-ljct7\" (UID: \"0f68a4dd-fec8-4e60-a89c-69ce09fc5700\") " pod="openstack/keystone-bootstrap-ljct7"
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.267431 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9s6v\" (UniqueName: \"kubernetes.io/projected/0f68a4dd-fec8-4e60-a89c-69ce09fc5700-kube-api-access-j9s6v\") pod \"keystone-bootstrap-ljct7\" (UID: \"0f68a4dd-fec8-4e60-a89c-69ce09fc5700\") " pod="openstack/keystone-bootstrap-ljct7"
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.267533 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f68a4dd-fec8-4e60-a89c-69ce09fc5700-combined-ca-bundle\") pod \"keystone-bootstrap-ljct7\" (UID: \"0f68a4dd-fec8-4e60-a89c-69ce09fc5700\") " pod="openstack/keystone-bootstrap-ljct7"
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.267582 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0f68a4dd-fec8-4e60-a89c-69ce09fc5700-credential-keys\") pod \"keystone-bootstrap-ljct7\" (UID: \"0f68a4dd-fec8-4e60-a89c-69ce09fc5700\") " pod="openstack/keystone-bootstrap-ljct7"
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.267608 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0f68a4dd-fec8-4e60-a89c-69ce09fc5700-fernet-keys\") pod \"keystone-bootstrap-ljct7\" (UID: \"0f68a4dd-fec8-4e60-a89c-69ce09fc5700\") " pod="openstack/keystone-bootstrap-ljct7"
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.272884 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f68a4dd-fec8-4e60-a89c-69ce09fc5700-config-data\") pod \"keystone-bootstrap-ljct7\" (UID: \"0f68a4dd-fec8-4e60-a89c-69ce09fc5700\") " pod="openstack/keystone-bootstrap-ljct7"
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.274530 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0f68a4dd-fec8-4e60-a89c-69ce09fc5700-fernet-keys\") pod \"keystone-bootstrap-ljct7\" (UID: \"0f68a4dd-fec8-4e60-a89c-69ce09fc5700\") " pod="openstack/keystone-bootstrap-ljct7"
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.279513 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f68a4dd-fec8-4e60-a89c-69ce09fc5700-combined-ca-bundle\") pod \"keystone-bootstrap-ljct7\" (UID: \"0f68a4dd-fec8-4e60-a89c-69ce09fc5700\") " pod="openstack/keystone-bootstrap-ljct7"
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.280004 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f68a4dd-fec8-4e60-a89c-69ce09fc5700-scripts\") pod \"keystone-bootstrap-ljct7\" (UID: \"0f68a4dd-fec8-4e60-a89c-69ce09fc5700\") " pod="openstack/keystone-bootstrap-ljct7"
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.297516 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0f68a4dd-fec8-4e60-a89c-69ce09fc5700-credential-keys\") pod \"keystone-bootstrap-ljct7\" (UID: \"0f68a4dd-fec8-4e60-a89c-69ce09fc5700\") " pod="openstack/keystone-bootstrap-ljct7"
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.297654 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.306038 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.324656 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9s6v\" (UniqueName: \"kubernetes.io/projected/0f68a4dd-fec8-4e60-a89c-69ce09fc5700-kube-api-access-j9s6v\") pod \"keystone-bootstrap-ljct7\" (UID: \"0f68a4dd-fec8-4e60-a89c-69ce09fc5700\") " pod="openstack/keystone-bootstrap-ljct7"
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.328287 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.339131 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.341474 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.344785 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-t4f8x"
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.345030 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.345292 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts"
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.345560 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.352517 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.366165 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.372205 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.374665 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.374879 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.382187 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.436219 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-ljct7"
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.472805 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-logs\") pod \"glance-default-internal-api-0\" (UID: \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\") " pod="openstack/glance-default-internal-api-0"
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.472874 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f772c247-f65b-4185-9c75-25d5894ada70-config-data\") pod \"glance-default-external-api-0\" (UID: \"f772c247-f65b-4185-9c75-25d5894ada70\") " pod="openstack/glance-default-external-api-0"
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.472922 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f772c247-f65b-4185-9c75-25d5894ada70-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f772c247-f65b-4185-9c75-25d5894ada70\") " pod="openstack/glance-default-external-api-0"
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.472968 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wpcb\" (UniqueName: \"kubernetes.io/projected/f772c247-f65b-4185-9c75-25d5894ada70-kube-api-access-4wpcb\") pod \"glance-default-external-api-0\" (UID: \"f772c247-f65b-4185-9c75-25d5894ada70\") " pod="openstack/glance-default-external-api-0"
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.473005 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f772c247-f65b-4185-9c75-25d5894ada70-logs\") pod \"glance-default-external-api-0\" (UID: \"f772c247-f65b-4185-9c75-25d5894ada70\") " pod="openstack/glance-default-external-api-0"
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.473040 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\") " pod="openstack/glance-default-internal-api-0"
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.473067 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-17b3b094-1a55-406a-a787-e0abb588e5b7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-17b3b094-1a55-406a-a787-e0abb588e5b7\") pod \"glance-default-external-api-0\" (UID: \"f772c247-f65b-4185-9c75-25d5894ada70\") " pod="openstack/glance-default-external-api-0"
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.473103 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f772c247-f65b-4185-9c75-25d5894ada70-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f772c247-f65b-4185-9c75-25d5894ada70\") " pod="openstack/glance-default-external-api-0"
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.473125 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\") " pod="openstack/glance-default-internal-api-0"
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.473157 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3\") pod \"glance-default-internal-api-0\" (UID: \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\") " pod="openstack/glance-default-internal-api-0"
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.473188 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\") " pod="openstack/glance-default-internal-api-0"
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.473213 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\") " pod="openstack/glance-default-internal-api-0"
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.473229 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f772c247-f65b-4185-9c75-25d5894ada70-scripts\") pod \"glance-default-external-api-0\" (UID: \"f772c247-f65b-4185-9c75-25d5894ada70\") " pod="openstack/glance-default-external-api-0"
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.473245 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9fs8\" (UniqueName: \"kubernetes.io/projected/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-kube-api-access-n9fs8\") pod \"glance-default-internal-api-0\" (UID: \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\") " pod="openstack/glance-default-internal-api-0"
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.473268 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\") " pod="openstack/glance-default-internal-api-0"
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.473286 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f772c247-f65b-4185-9c75-25d5894ada70-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f772c247-f65b-4185-9c75-25d5894ada70\") " pod="openstack/glance-default-external-api-0"
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.574558 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-logs\") pod \"glance-default-internal-api-0\" (UID: \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\") " pod="openstack/glance-default-internal-api-0"
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.574644 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f772c247-f65b-4185-9c75-25d5894ada70-config-data\") pod \"glance-default-external-api-0\" (UID: \"f772c247-f65b-4185-9c75-25d5894ada70\") " pod="openstack/glance-default-external-api-0"
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.574681 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f772c247-f65b-4185-9c75-25d5894ada70-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f772c247-f65b-4185-9c75-25d5894ada70\") " pod="openstack/glance-default-external-api-0"
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.574738 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wpcb\" (UniqueName: \"kubernetes.io/projected/f772c247-f65b-4185-9c75-25d5894ada70-kube-api-access-4wpcb\") pod \"glance-default-external-api-0\" (UID: \"f772c247-f65b-4185-9c75-25d5894ada70\") " pod="openstack/glance-default-external-api-0"
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.574767 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f772c247-f65b-4185-9c75-25d5894ada70-logs\") pod \"glance-default-external-api-0\" (UID: \"f772c247-f65b-4185-9c75-25d5894ada70\") " pod="openstack/glance-default-external-api-0"
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.574795 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\") " pod="openstack/glance-default-internal-api-0"
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.574815 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-17b3b094-1a55-406a-a787-e0abb588e5b7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-17b3b094-1a55-406a-a787-e0abb588e5b7\") pod \"glance-default-external-api-0\" (UID: \"f772c247-f65b-4185-9c75-25d5894ada70\") " pod="openstack/glance-default-external-api-0"
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.574843 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f772c247-f65b-4185-9c75-25d5894ada70-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f772c247-f65b-4185-9c75-25d5894ada70\") " pod="openstack/glance-default-external-api-0"
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.574872 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\") " pod="openstack/glance-default-internal-api-0"
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.574918 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3\") pod \"glance-default-internal-api-0\" (UID: \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\") " pod="openstack/glance-default-internal-api-0"
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.574948 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\") " pod="openstack/glance-default-internal-api-0"
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.574970 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\") " pod="openstack/glance-default-internal-api-0"
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.574987 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f772c247-f65b-4185-9c75-25d5894ada70-scripts\") pod \"glance-default-external-api-0\" (UID: \"f772c247-f65b-4185-9c75-25d5894ada70\") " pod="openstack/glance-default-external-api-0"
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.575004 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9fs8\" (UniqueName:
\"kubernetes.io/projected/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-kube-api-access-n9fs8\") pod \"glance-default-internal-api-0\" (UID: \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.575025 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.575046 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f772c247-f65b-4185-9c75-25d5894ada70-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f772c247-f65b-4185-9c75-25d5894ada70\") " pod="openstack/glance-default-external-api-0" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.575828 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f772c247-f65b-4185-9c75-25d5894ada70-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f772c247-f65b-4185-9c75-25d5894ada70\") " pod="openstack/glance-default-external-api-0" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.576409 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-logs\") pod \"glance-default-internal-api-0\" (UID: \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.577068 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f772c247-f65b-4185-9c75-25d5894ada70-logs\") pod 
\"glance-default-external-api-0\" (UID: \"f772c247-f65b-4185-9c75-25d5894ada70\") " pod="openstack/glance-default-external-api-0" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.577438 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.580836 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.581521 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f772c247-f65b-4185-9c75-25d5894ada70-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f772c247-f65b-4185-9c75-25d5894ada70\") " pod="openstack/glance-default-external-api-0" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.581612 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.581787 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\") " 
pod="openstack/glance-default-internal-api-0" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.581936 4898 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.582015 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3\") pod \"glance-default-internal-api-0\" (UID: \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/e32bfaa798492a506ddfd6dd81603c6b252f4a286c98ba8256226389647f45c3/globalmount\"" pod="openstack/glance-default-internal-api-0" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.582023 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.583025 4898 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.583048 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f772c247-f65b-4185-9c75-25d5894ada70-config-data\") pod \"glance-default-external-api-0\" (UID: \"f772c247-f65b-4185-9c75-25d5894ada70\") " pod="openstack/glance-default-external-api-0" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.583057 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-17b3b094-1a55-406a-a787-e0abb588e5b7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-17b3b094-1a55-406a-a787-e0abb588e5b7\") pod \"glance-default-external-api-0\" (UID: \"f772c247-f65b-4185-9c75-25d5894ada70\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1810628263decbcb8d9790a46f0a2a80fe37ecdd6e2a4c05137bd112c0de5f67/globalmount\"" pod="openstack/glance-default-external-api-0" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.583308 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f772c247-f65b-4185-9c75-25d5894ada70-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f772c247-f65b-4185-9c75-25d5894ada70\") " pod="openstack/glance-default-external-api-0" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.586865 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f772c247-f65b-4185-9c75-25d5894ada70-scripts\") pod \"glance-default-external-api-0\" (UID: \"f772c247-f65b-4185-9c75-25d5894ada70\") " pod="openstack/glance-default-external-api-0" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.596665 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9fs8\" (UniqueName: \"kubernetes.io/projected/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-kube-api-access-n9fs8\") pod 
\"glance-default-internal-api-0\" (UID: \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.601246 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wpcb\" (UniqueName: \"kubernetes.io/projected/f772c247-f65b-4185-9c75-25d5894ada70-kube-api-access-4wpcb\") pod \"glance-default-external-api-0\" (UID: \"f772c247-f65b-4185-9c75-25d5894ada70\") " pod="openstack/glance-default-external-api-0" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.626413 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-17b3b094-1a55-406a-a787-e0abb588e5b7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-17b3b094-1a55-406a-a787-e0abb588e5b7\") pod \"glance-default-external-api-0\" (UID: \"f772c247-f65b-4185-9c75-25d5894ada70\") " pod="openstack/glance-default-external-api-0" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.646116 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3\") pod \"glance-default-internal-api-0\" (UID: \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.667610 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.696125 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.755080 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f9ea679-73ec-46c5-b3b9-25d63398eb35" path="/var/lib/kubelet/pods/7f9ea679-73ec-46c5-b3b9-25d63398eb35/volumes" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.756273 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c" path="/var/lib/kubelet/pods/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c/volumes" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.756820 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed76acfb-bb3b-47de-ae85-23de2e792e7b" path="/var/lib/kubelet/pods/ed76acfb-bb3b-47de-ae85-23de2e792e7b/volumes" Mar 13 14:20:46 crc kubenswrapper[4898]: I0313 14:20:46.523788 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-785d8bcb8c-fbs4f" Mar 13 14:20:46 crc kubenswrapper[4898]: I0313 14:20:46.607680 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-cz6vf"] Mar 13 14:20:46 crc kubenswrapper[4898]: I0313 14:20:46.607977 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74f6bcbc87-cz6vf" podUID="9f1520e0-d7d9-4992-9ca5-1b2e98313d33" containerName="dnsmasq-dns" containerID="cri-o://ce178a8ab6289ecc30ada8030035df3e75b119b24ae01060c42a05e394597ae3" gracePeriod=10 Mar 13 14:20:46 crc kubenswrapper[4898]: I0313 14:20:46.930043 4898 generic.go:334] "Generic (PLEG): container finished" podID="9f1520e0-d7d9-4992-9ca5-1b2e98313d33" containerID="ce178a8ab6289ecc30ada8030035df3e75b119b24ae01060c42a05e394597ae3" exitCode=0 Mar 13 14:20:46 crc kubenswrapper[4898]: I0313 14:20:46.930085 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-cz6vf" 
event={"ID":"9f1520e0-d7d9-4992-9ca5-1b2e98313d33","Type":"ContainerDied","Data":"ce178a8ab6289ecc30ada8030035df3e75b119b24ae01060c42a05e394597ae3"} Mar 13 14:20:51 crc kubenswrapper[4898]: I0313 14:20:51.002417 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:51 crc kubenswrapper[4898]: I0313 14:20:51.007816 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:52 crc kubenswrapper[4898]: I0313 14:20:52.027603 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:54 crc kubenswrapper[4898]: I0313 14:20:54.086786 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-cz6vf" podUID="9f1520e0-d7d9-4992-9ca5-1b2e98313d33" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.176:5353: i/o timeout" Mar 13 14:20:56 crc kubenswrapper[4898]: E0313 14:20:56.436067 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Mar 13 14:20:56 crc kubenswrapper[4898]: E0313 14:20:56.436621 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cwfsv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-xq6ss_openstack(ac704482-c7a4-471c-b3c1-d1fdd7e0eb83): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 14:20:56 crc kubenswrapper[4898]: E0313 14:20:56.437802 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-xq6ss" 
podUID="ac704482-c7a4-471c-b3c1-d1fdd7e0eb83" Mar 13 14:20:57 crc kubenswrapper[4898]: E0313 14:20:57.096213 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-xq6ss" podUID="ac704482-c7a4-471c-b3c1-d1fdd7e0eb83" Mar 13 14:20:59 crc kubenswrapper[4898]: I0313 14:20:59.088288 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-cz6vf" podUID="9f1520e0-d7d9-4992-9ca5-1b2e98313d33" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.176:5353: i/o timeout" Mar 13 14:21:03 crc kubenswrapper[4898]: I0313 14:21:03.156313 4898 generic.go:334] "Generic (PLEG): container finished" podID="664deedc-3946-4205-98ad-21759d35d952" containerID="d1d45dfadc513f7c3e24c556ad6a45d1d7e050a0da971045c5c1181fc58bc3d2" exitCode=0 Mar 13 14:21:03 crc kubenswrapper[4898]: I0313 14:21:03.156881 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-hm77q" event={"ID":"664deedc-3946-4205-98ad-21759d35d952","Type":"ContainerDied","Data":"d1d45dfadc513f7c3e24c556ad6a45d1d7e050a0da971045c5c1181fc58bc3d2"} Mar 13 14:21:04 crc kubenswrapper[4898]: I0313 14:21:04.089308 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-cz6vf" podUID="9f1520e0-d7d9-4992-9ca5-1b2e98313d33" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.176:5353: i/o timeout" Mar 13 14:21:04 crc kubenswrapper[4898]: I0313 14:21:04.089553 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74f6bcbc87-cz6vf" Mar 13 14:21:04 crc kubenswrapper[4898]: I0313 14:21:04.588288 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-cz6vf" Mar 13 14:21:04 crc kubenswrapper[4898]: I0313 14:21:04.677321 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9dpzg\" (UniqueName: \"kubernetes.io/projected/9f1520e0-d7d9-4992-9ca5-1b2e98313d33-kube-api-access-9dpzg\") pod \"9f1520e0-d7d9-4992-9ca5-1b2e98313d33\" (UID: \"9f1520e0-d7d9-4992-9ca5-1b2e98313d33\") " Mar 13 14:21:04 crc kubenswrapper[4898]: I0313 14:21:04.677420 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9f1520e0-d7d9-4992-9ca5-1b2e98313d33-ovsdbserver-sb\") pod \"9f1520e0-d7d9-4992-9ca5-1b2e98313d33\" (UID: \"9f1520e0-d7d9-4992-9ca5-1b2e98313d33\") " Mar 13 14:21:04 crc kubenswrapper[4898]: I0313 14:21:04.677513 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f1520e0-d7d9-4992-9ca5-1b2e98313d33-config\") pod \"9f1520e0-d7d9-4992-9ca5-1b2e98313d33\" (UID: \"9f1520e0-d7d9-4992-9ca5-1b2e98313d33\") " Mar 13 14:21:04 crc kubenswrapper[4898]: I0313 14:21:04.677580 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f1520e0-d7d9-4992-9ca5-1b2e98313d33-dns-svc\") pod \"9f1520e0-d7d9-4992-9ca5-1b2e98313d33\" (UID: \"9f1520e0-d7d9-4992-9ca5-1b2e98313d33\") " Mar 13 14:21:04 crc kubenswrapper[4898]: I0313 14:21:04.677604 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9f1520e0-d7d9-4992-9ca5-1b2e98313d33-dns-swift-storage-0\") pod \"9f1520e0-d7d9-4992-9ca5-1b2e98313d33\" (UID: \"9f1520e0-d7d9-4992-9ca5-1b2e98313d33\") " Mar 13 14:21:04 crc kubenswrapper[4898]: I0313 14:21:04.677659 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/9f1520e0-d7d9-4992-9ca5-1b2e98313d33-ovsdbserver-nb\") pod \"9f1520e0-d7d9-4992-9ca5-1b2e98313d33\" (UID: \"9f1520e0-d7d9-4992-9ca5-1b2e98313d33\") " Mar 13 14:21:04 crc kubenswrapper[4898]: I0313 14:21:04.685094 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f1520e0-d7d9-4992-9ca5-1b2e98313d33-kube-api-access-9dpzg" (OuterVolumeSpecName: "kube-api-access-9dpzg") pod "9f1520e0-d7d9-4992-9ca5-1b2e98313d33" (UID: "9f1520e0-d7d9-4992-9ca5-1b2e98313d33"). InnerVolumeSpecName "kube-api-access-9dpzg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:21:04 crc kubenswrapper[4898]: I0313 14:21:04.734955 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f1520e0-d7d9-4992-9ca5-1b2e98313d33-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9f1520e0-d7d9-4992-9ca5-1b2e98313d33" (UID: "9f1520e0-d7d9-4992-9ca5-1b2e98313d33"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:21:04 crc kubenswrapper[4898]: I0313 14:21:04.736739 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f1520e0-d7d9-4992-9ca5-1b2e98313d33-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9f1520e0-d7d9-4992-9ca5-1b2e98313d33" (UID: "9f1520e0-d7d9-4992-9ca5-1b2e98313d33"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:21:04 crc kubenswrapper[4898]: I0313 14:21:04.749661 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f1520e0-d7d9-4992-9ca5-1b2e98313d33-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9f1520e0-d7d9-4992-9ca5-1b2e98313d33" (UID: "9f1520e0-d7d9-4992-9ca5-1b2e98313d33"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:21:04 crc kubenswrapper[4898]: I0313 14:21:04.751943 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f1520e0-d7d9-4992-9ca5-1b2e98313d33-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9f1520e0-d7d9-4992-9ca5-1b2e98313d33" (UID: "9f1520e0-d7d9-4992-9ca5-1b2e98313d33"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:21:04 crc kubenswrapper[4898]: I0313 14:21:04.763807 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f1520e0-d7d9-4992-9ca5-1b2e98313d33-config" (OuterVolumeSpecName: "config") pod "9f1520e0-d7d9-4992-9ca5-1b2e98313d33" (UID: "9f1520e0-d7d9-4992-9ca5-1b2e98313d33"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:21:04 crc kubenswrapper[4898]: I0313 14:21:04.781623 4898 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9f1520e0-d7d9-4992-9ca5-1b2e98313d33-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:04 crc kubenswrapper[4898]: I0313 14:21:04.781715 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9f1520e0-d7d9-4992-9ca5-1b2e98313d33-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:04 crc kubenswrapper[4898]: I0313 14:21:04.781735 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9dpzg\" (UniqueName: \"kubernetes.io/projected/9f1520e0-d7d9-4992-9ca5-1b2e98313d33-kube-api-access-9dpzg\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:04 crc kubenswrapper[4898]: I0313 14:21:04.781792 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9f1520e0-d7d9-4992-9ca5-1b2e98313d33-ovsdbserver-sb\") on 
node \"crc\" DevicePath \"\"" Mar 13 14:21:04 crc kubenswrapper[4898]: I0313 14:21:04.781817 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f1520e0-d7d9-4992-9ca5-1b2e98313d33-config\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:04 crc kubenswrapper[4898]: I0313 14:21:04.781834 4898 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f1520e0-d7d9-4992-9ca5-1b2e98313d33-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:05 crc kubenswrapper[4898]: E0313 14:21:05.018281 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Mar 13 14:21:05 crc kubenswrapper[4898]: E0313 14:21:05.018501 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nf6h584h58h5b7hbfh547hbbh5bfh5dfh686hfdhcch67dh648hddh75h58h9bhc8h654h6dh686h96h658h5f4h5f9hbbh667hb9hd4hf6h666q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ffnxt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(247749ae-204b-4e9c-ad1c-f5d924b6f211): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 14:21:05 crc kubenswrapper[4898]: I0313 14:21:05.180882 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-cz6vf" event={"ID":"9f1520e0-d7d9-4992-9ca5-1b2e98313d33","Type":"ContainerDied","Data":"e9c150eab9ccbad529dce767553bfd6f06b4b8420400e4a2a3ec291dc3f4b819"} Mar 13 14:21:05 crc kubenswrapper[4898]: I0313 14:21:05.180970 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-cz6vf" Mar 13 14:21:05 crc kubenswrapper[4898]: I0313 14:21:05.222142 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-cz6vf"] Mar 13 14:21:05 crc kubenswrapper[4898]: I0313 14:21:05.233782 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-cz6vf"] Mar 13 14:21:05 crc kubenswrapper[4898]: E0313 14:21:05.345257 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified" Mar 13 14:21:05 crc kubenswrapper[4898]: E0313 14:21:05.345717 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:heat-db-sync,Image:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,Command:[/bin/bash],Args:[-c /usr/bin/heat-manage --config-dir /etc/heat/heat.conf.d db_sync],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/heat/heat.conf.d/00-default.conf,SubPath:00-default.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/heat/heat.conf.d/01-custom.conf,SubPath:01-custom.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:
kube-api-access-28mft,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42418,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42418,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-db-sync-zgt75_openstack(84a7fd24-4320-4c0e-8ded-0d455252a549): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 14:21:05 crc kubenswrapper[4898]: E0313 14:21:05.347096 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/heat-db-sync-zgt75" podUID="84a7fd24-4320-4c0e-8ded-0d455252a549" Mar 13 14:21:05 crc kubenswrapper[4898]: I0313 14:21:05.364252 4898 scope.go:117] "RemoveContainer" containerID="5d3511c79cb347abef153850ae05dbbda4e475dc7ccff6ee262eef01f5d5c250" Mar 13 14:21:05 crc kubenswrapper[4898]: I0313 14:21:05.378558 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-hm77q" Mar 13 14:21:05 crc kubenswrapper[4898]: I0313 14:21:05.499215 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/664deedc-3946-4205-98ad-21759d35d952-config\") pod \"664deedc-3946-4205-98ad-21759d35d952\" (UID: \"664deedc-3946-4205-98ad-21759d35d952\") " Mar 13 14:21:05 crc kubenswrapper[4898]: I0313 14:21:05.499279 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2nsj\" (UniqueName: \"kubernetes.io/projected/664deedc-3946-4205-98ad-21759d35d952-kube-api-access-s2nsj\") pod \"664deedc-3946-4205-98ad-21759d35d952\" (UID: \"664deedc-3946-4205-98ad-21759d35d952\") " Mar 13 14:21:05 crc kubenswrapper[4898]: I0313 14:21:05.499334 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/664deedc-3946-4205-98ad-21759d35d952-combined-ca-bundle\") pod \"664deedc-3946-4205-98ad-21759d35d952\" (UID: \"664deedc-3946-4205-98ad-21759d35d952\") " Mar 13 14:21:05 crc kubenswrapper[4898]: I0313 14:21:05.502920 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/664deedc-3946-4205-98ad-21759d35d952-kube-api-access-s2nsj" (OuterVolumeSpecName: "kube-api-access-s2nsj") pod "664deedc-3946-4205-98ad-21759d35d952" (UID: "664deedc-3946-4205-98ad-21759d35d952"). InnerVolumeSpecName "kube-api-access-s2nsj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:21:05 crc kubenswrapper[4898]: I0313 14:21:05.526702 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/664deedc-3946-4205-98ad-21759d35d952-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "664deedc-3946-4205-98ad-21759d35d952" (UID: "664deedc-3946-4205-98ad-21759d35d952"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:21:05 crc kubenswrapper[4898]: I0313 14:21:05.535364 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/664deedc-3946-4205-98ad-21759d35d952-config" (OuterVolumeSpecName: "config") pod "664deedc-3946-4205-98ad-21759d35d952" (UID: "664deedc-3946-4205-98ad-21759d35d952"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:21:05 crc kubenswrapper[4898]: I0313 14:21:05.602458 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/664deedc-3946-4205-98ad-21759d35d952-config\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:05 crc kubenswrapper[4898]: I0313 14:21:05.602496 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2nsj\" (UniqueName: \"kubernetes.io/projected/664deedc-3946-4205-98ad-21759d35d952-kube-api-access-s2nsj\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:05 crc kubenswrapper[4898]: I0313 14:21:05.602508 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/664deedc-3946-4205-98ad-21759d35d952-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:05 crc kubenswrapper[4898]: I0313 14:21:05.757646 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f1520e0-d7d9-4992-9ca5-1b2e98313d33" path="/var/lib/kubelet/pods/9f1520e0-d7d9-4992-9ca5-1b2e98313d33/volumes" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.201006 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-hm77q" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.200987 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-hm77q" event={"ID":"664deedc-3946-4205-98ad-21759d35d952","Type":"ContainerDied","Data":"6d9f47030df3927c7cd3d661b2414ebcd6b45bbd0a337a5460ac32062ae1c8cc"} Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.201394 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d9f47030df3927c7cd3d661b2414ebcd6b45bbd0a337a5460ac32062ae1c8cc" Mar 13 14:21:06 crc kubenswrapper[4898]: E0313 14:21:06.204014 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified\\\"\"" pod="openstack/heat-db-sync-zgt75" podUID="84a7fd24-4320-4c0e-8ded-0d455252a549" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.450258 4898 scope.go:117] "RemoveContainer" containerID="dfc29ad43e0c709dcb1ef1785b269ec83a25e185502f0001b9412c1b48560e60" Mar 13 14:21:06 crc kubenswrapper[4898]: E0313 14:21:06.506049 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Mar 13 14:21:06 crc kubenswrapper[4898]: E0313 14:21:06.506482 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-82hs7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-ztp6c_openstack(193b05da-acb9-4512-a2ae-6c03450e6f05): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 14:21:06 crc kubenswrapper[4898]: E0313 14:21:06.509146 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-ztp6c" podUID="193b05da-acb9-4512-a2ae-6c03450e6f05" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.593794 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-7gvqf"] Mar 13 14:21:06 crc kubenswrapper[4898]: E0313 14:21:06.594243 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f1520e0-d7d9-4992-9ca5-1b2e98313d33" containerName="dnsmasq-dns" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.594263 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f1520e0-d7d9-4992-9ca5-1b2e98313d33" containerName="dnsmasq-dns" Mar 13 14:21:06 crc kubenswrapper[4898]: E0313 14:21:06.594287 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f1520e0-d7d9-4992-9ca5-1b2e98313d33" containerName="init" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.594294 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f1520e0-d7d9-4992-9ca5-1b2e98313d33" containerName="init" Mar 13 14:21:06 crc kubenswrapper[4898]: E0313 14:21:06.594333 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="664deedc-3946-4205-98ad-21759d35d952" containerName="neutron-db-sync" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.594339 4898 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="664deedc-3946-4205-98ad-21759d35d952" containerName="neutron-db-sync" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.594532 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="664deedc-3946-4205-98ad-21759d35d952" containerName="neutron-db-sync" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.594551 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f1520e0-d7d9-4992-9ca5-1b2e98313d33" containerName="dnsmasq-dns" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.602517 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-7gvqf" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.625728 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-7gvqf"] Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.693304 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-f97c64464-wmnph"] Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.695514 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-f97c64464-wmnph" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.699928 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.700005 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-db2fv" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.700304 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.700789 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.705941 4898 scope.go:117] "RemoveContainer" containerID="0df5bf75681ada8dd8a40c022de81b8eec9f36333dccaa2326ce28596ea32dd8" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.722788 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f97c64464-wmnph"] Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.746630 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da9c8289-b4cc-4259-a94e-fab15f437c67-dns-svc\") pod \"dnsmasq-dns-55f844cf75-7gvqf\" (UID: \"da9c8289-b4cc-4259-a94e-fab15f437c67\") " pod="openstack/dnsmasq-dns-55f844cf75-7gvqf" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.746771 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da9c8289-b4cc-4259-a94e-fab15f437c67-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-7gvqf\" (UID: \"da9c8289-b4cc-4259-a94e-fab15f437c67\") " pod="openstack/dnsmasq-dns-55f844cf75-7gvqf" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.747277 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da9c8289-b4cc-4259-a94e-fab15f437c67-config\") pod \"dnsmasq-dns-55f844cf75-7gvqf\" (UID: \"da9c8289-b4cc-4259-a94e-fab15f437c67\") " pod="openstack/dnsmasq-dns-55f844cf75-7gvqf" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.747340 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5758\" (UniqueName: \"kubernetes.io/projected/da9c8289-b4cc-4259-a94e-fab15f437c67-kube-api-access-q5758\") pod \"dnsmasq-dns-55f844cf75-7gvqf\" (UID: \"da9c8289-b4cc-4259-a94e-fab15f437c67\") " pod="openstack/dnsmasq-dns-55f844cf75-7gvqf" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.747453 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da9c8289-b4cc-4259-a94e-fab15f437c67-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-7gvqf\" (UID: \"da9c8289-b4cc-4259-a94e-fab15f437c67\") " pod="openstack/dnsmasq-dns-55f844cf75-7gvqf" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.747495 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/da9c8289-b4cc-4259-a94e-fab15f437c67-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-7gvqf\" (UID: \"da9c8289-b4cc-4259-a94e-fab15f437c67\") " pod="openstack/dnsmasq-dns-55f844cf75-7gvqf" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.767644 4898 scope.go:117] "RemoveContainer" containerID="ce178a8ab6289ecc30ada8030035df3e75b119b24ae01060c42a05e394597ae3" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.822881 4898 scope.go:117] "RemoveContainer" containerID="d7d9b7775bd2555a7c4636350359292fb8c65661ac123c18de4ec1ec3c6ad5d5" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.849438 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da9c8289-b4cc-4259-a94e-fab15f437c67-config\") pod \"dnsmasq-dns-55f844cf75-7gvqf\" (UID: \"da9c8289-b4cc-4259-a94e-fab15f437c67\") " pod="openstack/dnsmasq-dns-55f844cf75-7gvqf" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.849496 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5758\" (UniqueName: \"kubernetes.io/projected/da9c8289-b4cc-4259-a94e-fab15f437c67-kube-api-access-q5758\") pod \"dnsmasq-dns-55f844cf75-7gvqf\" (UID: \"da9c8289-b4cc-4259-a94e-fab15f437c67\") " pod="openstack/dnsmasq-dns-55f844cf75-7gvqf" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.849561 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/61f1f8bf-63eb-464c-9703-3d3db80ba0df-httpd-config\") pod \"neutron-f97c64464-wmnph\" (UID: \"61f1f8bf-63eb-464c-9703-3d3db80ba0df\") " pod="openstack/neutron-f97c64464-wmnph" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.849582 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da9c8289-b4cc-4259-a94e-fab15f437c67-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-7gvqf\" (UID: \"da9c8289-b4cc-4259-a94e-fab15f437c67\") " pod="openstack/dnsmasq-dns-55f844cf75-7gvqf" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.849601 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/da9c8289-b4cc-4259-a94e-fab15f437c67-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-7gvqf\" (UID: \"da9c8289-b4cc-4259-a94e-fab15f437c67\") " pod="openstack/dnsmasq-dns-55f844cf75-7gvqf" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.849644 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/61f1f8bf-63eb-464c-9703-3d3db80ba0df-config\") pod \"neutron-f97c64464-wmnph\" (UID: \"61f1f8bf-63eb-464c-9703-3d3db80ba0df\") " pod="openstack/neutron-f97c64464-wmnph" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.849666 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rskd\" (UniqueName: \"kubernetes.io/projected/61f1f8bf-63eb-464c-9703-3d3db80ba0df-kube-api-access-6rskd\") pod \"neutron-f97c64464-wmnph\" (UID: \"61f1f8bf-63eb-464c-9703-3d3db80ba0df\") " pod="openstack/neutron-f97c64464-wmnph" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.849739 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61f1f8bf-63eb-464c-9703-3d3db80ba0df-combined-ca-bundle\") pod \"neutron-f97c64464-wmnph\" (UID: \"61f1f8bf-63eb-464c-9703-3d3db80ba0df\") " pod="openstack/neutron-f97c64464-wmnph" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.849805 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da9c8289-b4cc-4259-a94e-fab15f437c67-dns-svc\") pod \"dnsmasq-dns-55f844cf75-7gvqf\" (UID: \"da9c8289-b4cc-4259-a94e-fab15f437c67\") " pod="openstack/dnsmasq-dns-55f844cf75-7gvqf" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.849842 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da9c8289-b4cc-4259-a94e-fab15f437c67-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-7gvqf\" (UID: \"da9c8289-b4cc-4259-a94e-fab15f437c67\") " pod="openstack/dnsmasq-dns-55f844cf75-7gvqf" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.849863 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/61f1f8bf-63eb-464c-9703-3d3db80ba0df-ovndb-tls-certs\") pod \"neutron-f97c64464-wmnph\" (UID: \"61f1f8bf-63eb-464c-9703-3d3db80ba0df\") " pod="openstack/neutron-f97c64464-wmnph" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.852995 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da9c8289-b4cc-4259-a94e-fab15f437c67-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-7gvqf\" (UID: \"da9c8289-b4cc-4259-a94e-fab15f437c67\") " pod="openstack/dnsmasq-dns-55f844cf75-7gvqf" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.854015 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/da9c8289-b4cc-4259-a94e-fab15f437c67-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-7gvqf\" (UID: \"da9c8289-b4cc-4259-a94e-fab15f437c67\") " pod="openstack/dnsmasq-dns-55f844cf75-7gvqf" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.854015 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da9c8289-b4cc-4259-a94e-fab15f437c67-dns-svc\") pod \"dnsmasq-dns-55f844cf75-7gvqf\" (UID: \"da9c8289-b4cc-4259-a94e-fab15f437c67\") " pod="openstack/dnsmasq-dns-55f844cf75-7gvqf" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.855208 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da9c8289-b4cc-4259-a94e-fab15f437c67-config\") pod \"dnsmasq-dns-55f844cf75-7gvqf\" (UID: \"da9c8289-b4cc-4259-a94e-fab15f437c67\") " pod="openstack/dnsmasq-dns-55f844cf75-7gvqf" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.857397 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/da9c8289-b4cc-4259-a94e-fab15f437c67-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-7gvqf\" (UID: \"da9c8289-b4cc-4259-a94e-fab15f437c67\") " pod="openstack/dnsmasq-dns-55f844cf75-7gvqf" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.872913 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5758\" (UniqueName: \"kubernetes.io/projected/da9c8289-b4cc-4259-a94e-fab15f437c67-kube-api-access-q5758\") pod \"dnsmasq-dns-55f844cf75-7gvqf\" (UID: \"da9c8289-b4cc-4259-a94e-fab15f437c67\") " pod="openstack/dnsmasq-dns-55f844cf75-7gvqf" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.952391 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/61f1f8bf-63eb-464c-9703-3d3db80ba0df-httpd-config\") pod \"neutron-f97c64464-wmnph\" (UID: \"61f1f8bf-63eb-464c-9703-3d3db80ba0df\") " pod="openstack/neutron-f97c64464-wmnph" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.952472 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/61f1f8bf-63eb-464c-9703-3d3db80ba0df-config\") pod \"neutron-f97c64464-wmnph\" (UID: \"61f1f8bf-63eb-464c-9703-3d3db80ba0df\") " pod="openstack/neutron-f97c64464-wmnph" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.952508 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rskd\" (UniqueName: \"kubernetes.io/projected/61f1f8bf-63eb-464c-9703-3d3db80ba0df-kube-api-access-6rskd\") pod \"neutron-f97c64464-wmnph\" (UID: \"61f1f8bf-63eb-464c-9703-3d3db80ba0df\") " pod="openstack/neutron-f97c64464-wmnph" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.952568 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61f1f8bf-63eb-464c-9703-3d3db80ba0df-combined-ca-bundle\") pod 
\"neutron-f97c64464-wmnph\" (UID: \"61f1f8bf-63eb-464c-9703-3d3db80ba0df\") " pod="openstack/neutron-f97c64464-wmnph" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.952664 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/61f1f8bf-63eb-464c-9703-3d3db80ba0df-ovndb-tls-certs\") pod \"neutron-f97c64464-wmnph\" (UID: \"61f1f8bf-63eb-464c-9703-3d3db80ba0df\") " pod="openstack/neutron-f97c64464-wmnph" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.953581 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-7gvqf" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.958031 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/61f1f8bf-63eb-464c-9703-3d3db80ba0df-ovndb-tls-certs\") pod \"neutron-f97c64464-wmnph\" (UID: \"61f1f8bf-63eb-464c-9703-3d3db80ba0df\") " pod="openstack/neutron-f97c64464-wmnph" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.959692 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/61f1f8bf-63eb-464c-9703-3d3db80ba0df-config\") pod \"neutron-f97c64464-wmnph\" (UID: \"61f1f8bf-63eb-464c-9703-3d3db80ba0df\") " pod="openstack/neutron-f97c64464-wmnph" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.961132 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/61f1f8bf-63eb-464c-9703-3d3db80ba0df-httpd-config\") pod \"neutron-f97c64464-wmnph\" (UID: \"61f1f8bf-63eb-464c-9703-3d3db80ba0df\") " pod="openstack/neutron-f97c64464-wmnph" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.971269 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/61f1f8bf-63eb-464c-9703-3d3db80ba0df-combined-ca-bundle\") pod \"neutron-f97c64464-wmnph\" (UID: \"61f1f8bf-63eb-464c-9703-3d3db80ba0df\") " pod="openstack/neutron-f97c64464-wmnph" Mar 13 14:21:07 crc kubenswrapper[4898]: I0313 14:21:06.999525 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rskd\" (UniqueName: \"kubernetes.io/projected/61f1f8bf-63eb-464c-9703-3d3db80ba0df-kube-api-access-6rskd\") pod \"neutron-f97c64464-wmnph\" (UID: \"61f1f8bf-63eb-464c-9703-3d3db80ba0df\") " pod="openstack/neutron-f97c64464-wmnph" Mar 13 14:21:07 crc kubenswrapper[4898]: I0313 14:21:07.023869 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f97c64464-wmnph" Mar 13 14:21:07 crc kubenswrapper[4898]: I0313 14:21:07.285593 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-dddqm" event={"ID":"51a3e0c5-0084-4216-a162-3614eafcc162","Type":"ContainerStarted","Data":"27760265b5d44dc57e3a3eecff9d010cc5fc5af8472653848b227f366d4e7a49"} Mar 13 14:21:07 crc kubenswrapper[4898]: E0313 14:21:07.290926 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-ztp6c" podUID="193b05da-acb9-4512-a2ae-6c03450e6f05" Mar 13 14:21:07 crc kubenswrapper[4898]: I0313 14:21:07.327957 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-dddqm" podStartSLOduration=4.672014055 podStartE2EDuration="32.327942241s" podCreationTimestamp="2026-03-13 14:20:35 +0000 UTC" firstStartedPulling="2026-03-13 14:20:37.69029243 +0000 UTC m=+1472.691880669" lastFinishedPulling="2026-03-13 14:21:05.346220616 +0000 UTC m=+1500.347808855" observedRunningTime="2026-03-13 14:21:07.327746385 +0000 UTC 
m=+1502.329334644" watchObservedRunningTime="2026-03-13 14:21:07.327942241 +0000 UTC m=+1502.329530480" Mar 13 14:21:07 crc kubenswrapper[4898]: I0313 14:21:07.407370 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-ljct7"] Mar 13 14:21:07 crc kubenswrapper[4898]: W0313 14:21:07.432537 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f68a4dd_fec8_4e60_a89c_69ce09fc5700.slice/crio-b5807007919517093647e5507bfed140e507af6921560a0e8eff012955dab574 WatchSource:0}: Error finding container b5807007919517093647e5507bfed140e507af6921560a0e8eff012955dab574: Status 404 returned error can't find the container with id b5807007919517093647e5507bfed140e507af6921560a0e8eff012955dab574 Mar 13 14:21:07 crc kubenswrapper[4898]: I0313 14:21:07.517033 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 14:21:07 crc kubenswrapper[4898]: I0313 14:21:07.611531 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 14:21:07 crc kubenswrapper[4898]: I0313 14:21:07.654170 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-7gvqf"] Mar 13 14:21:07 crc kubenswrapper[4898]: I0313 14:21:07.974253 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f97c64464-wmnph"] Mar 13 14:21:08 crc kubenswrapper[4898]: W0313 14:21:08.129522 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod61f1f8bf_63eb_464c_9703_3d3db80ba0df.slice/crio-0ed18fce368036b02906d995e0994f8f4a7d0e035afb43b24689faf9dc556daf WatchSource:0}: Error finding container 0ed18fce368036b02906d995e0994f8f4a7d0e035afb43b24689faf9dc556daf: Status 404 returned error can't find the container with id 0ed18fce368036b02906d995e0994f8f4a7d0e035afb43b24689faf9dc556daf 
Mar 13 14:21:08 crc kubenswrapper[4898]: W0313 14:21:08.131705 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda8312dc9_a2b4_4ee6_b34f_cb984c14ad21.slice/crio-8cae8ab663d1286eb25519e52a961629328c7b2280f2d0209ef1752767b57b37 WatchSource:0}: Error finding container 8cae8ab663d1286eb25519e52a961629328c7b2280f2d0209ef1752767b57b37: Status 404 returned error can't find the container with id 8cae8ab663d1286eb25519e52a961629328c7b2280f2d0209ef1752767b57b37 Mar 13 14:21:08 crc kubenswrapper[4898]: W0313 14:21:08.132968 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda9c8289_b4cc_4259_a94e_fab15f437c67.slice/crio-2e5e30a7ea9bdd40efe150ba12afa0ed754dfac9e9a1ebb3308dac4a35e5fade WatchSource:0}: Error finding container 2e5e30a7ea9bdd40efe150ba12afa0ed754dfac9e9a1ebb3308dac4a35e5fade: Status 404 returned error can't find the container with id 2e5e30a7ea9bdd40efe150ba12afa0ed754dfac9e9a1ebb3308dac4a35e5fade Mar 13 14:21:08 crc kubenswrapper[4898]: I0313 14:21:08.334133 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21","Type":"ContainerStarted","Data":"8cae8ab663d1286eb25519e52a961629328c7b2280f2d0209ef1752767b57b37"} Mar 13 14:21:08 crc kubenswrapper[4898]: I0313 14:21:08.353685 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f97c64464-wmnph" event={"ID":"61f1f8bf-63eb-464c-9703-3d3db80ba0df","Type":"ContainerStarted","Data":"0ed18fce368036b02906d995e0994f8f4a7d0e035afb43b24689faf9dc556daf"} Mar 13 14:21:08 crc kubenswrapper[4898]: I0313 14:21:08.356859 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-7gvqf" 
event={"ID":"da9c8289-b4cc-4259-a94e-fab15f437c67","Type":"ContainerStarted","Data":"2e5e30a7ea9bdd40efe150ba12afa0ed754dfac9e9a1ebb3308dac4a35e5fade"} Mar 13 14:21:08 crc kubenswrapper[4898]: I0313 14:21:08.358378 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ljct7" event={"ID":"0f68a4dd-fec8-4e60-a89c-69ce09fc5700","Type":"ContainerStarted","Data":"b5807007919517093647e5507bfed140e507af6921560a0e8eff012955dab574"} Mar 13 14:21:08 crc kubenswrapper[4898]: I0313 14:21:08.374621 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f772c247-f65b-4185-9c75-25d5894ada70","Type":"ContainerStarted","Data":"ddd1664b14e1ff4c8657d63bc705f6e2cc8530fd54bcfec783c314238117e1e0"} Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.089777 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-cz6vf" podUID="9f1520e0-d7d9-4992-9ca5-1b2e98313d33" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.176:5353: i/o timeout" Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.161330 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-z7ldc"] Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.164502 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-z7ldc" Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.171073 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2469\" (UniqueName: \"kubernetes.io/projected/b38f3681-6f2f-437f-9694-810d43921aa2-kube-api-access-v2469\") pod \"redhat-operators-z7ldc\" (UID: \"b38f3681-6f2f-437f-9694-810d43921aa2\") " pod="openshift-marketplace/redhat-operators-z7ldc" Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.171213 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b38f3681-6f2f-437f-9694-810d43921aa2-catalog-content\") pod \"redhat-operators-z7ldc\" (UID: \"b38f3681-6f2f-437f-9694-810d43921aa2\") " pod="openshift-marketplace/redhat-operators-z7ldc" Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.171400 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b38f3681-6f2f-437f-9694-810d43921aa2-utilities\") pod \"redhat-operators-z7ldc\" (UID: \"b38f3681-6f2f-437f-9694-810d43921aa2\") " pod="openshift-marketplace/redhat-operators-z7ldc" Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.196425 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z7ldc"] Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.258635 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-8495ffcdcc-j7d29"] Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.260959 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-8495ffcdcc-j7d29" Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.268543 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.268713 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.275790 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/194cc0b9-5fb1-492c-9df1-002f629cfb90-public-tls-certs\") pod \"neutron-8495ffcdcc-j7d29\" (UID: \"194cc0b9-5fb1-492c-9df1-002f629cfb90\") " pod="openstack/neutron-8495ffcdcc-j7d29" Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.275844 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/194cc0b9-5fb1-492c-9df1-002f629cfb90-config\") pod \"neutron-8495ffcdcc-j7d29\" (UID: \"194cc0b9-5fb1-492c-9df1-002f629cfb90\") " pod="openstack/neutron-8495ffcdcc-j7d29" Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.275913 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/194cc0b9-5fb1-492c-9df1-002f629cfb90-internal-tls-certs\") pod \"neutron-8495ffcdcc-j7d29\" (UID: \"194cc0b9-5fb1-492c-9df1-002f629cfb90\") " pod="openstack/neutron-8495ffcdcc-j7d29" Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.275941 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2469\" (UniqueName: \"kubernetes.io/projected/b38f3681-6f2f-437f-9694-810d43921aa2-kube-api-access-v2469\") pod \"redhat-operators-z7ldc\" (UID: \"b38f3681-6f2f-437f-9694-810d43921aa2\") " pod="openshift-marketplace/redhat-operators-z7ldc" Mar 
13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.276005 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvf8x\" (UniqueName: \"kubernetes.io/projected/194cc0b9-5fb1-492c-9df1-002f629cfb90-kube-api-access-pvf8x\") pod \"neutron-8495ffcdcc-j7d29\" (UID: \"194cc0b9-5fb1-492c-9df1-002f629cfb90\") " pod="openstack/neutron-8495ffcdcc-j7d29" Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.276041 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b38f3681-6f2f-437f-9694-810d43921aa2-catalog-content\") pod \"redhat-operators-z7ldc\" (UID: \"b38f3681-6f2f-437f-9694-810d43921aa2\") " pod="openshift-marketplace/redhat-operators-z7ldc" Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.276096 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/194cc0b9-5fb1-492c-9df1-002f629cfb90-combined-ca-bundle\") pod \"neutron-8495ffcdcc-j7d29\" (UID: \"194cc0b9-5fb1-492c-9df1-002f629cfb90\") " pod="openstack/neutron-8495ffcdcc-j7d29" Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.276133 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/194cc0b9-5fb1-492c-9df1-002f629cfb90-httpd-config\") pod \"neutron-8495ffcdcc-j7d29\" (UID: \"194cc0b9-5fb1-492c-9df1-002f629cfb90\") " pod="openstack/neutron-8495ffcdcc-j7d29" Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.276181 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b38f3681-6f2f-437f-9694-810d43921aa2-utilities\") pod \"redhat-operators-z7ldc\" (UID: \"b38f3681-6f2f-437f-9694-810d43921aa2\") " pod="openshift-marketplace/redhat-operators-z7ldc" Mar 13 14:21:09 crc 
kubenswrapper[4898]: I0313 14:21:09.276238 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/194cc0b9-5fb1-492c-9df1-002f629cfb90-ovndb-tls-certs\") pod \"neutron-8495ffcdcc-j7d29\" (UID: \"194cc0b9-5fb1-492c-9df1-002f629cfb90\") " pod="openstack/neutron-8495ffcdcc-j7d29" Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.277922 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b38f3681-6f2f-437f-9694-810d43921aa2-catalog-content\") pod \"redhat-operators-z7ldc\" (UID: \"b38f3681-6f2f-437f-9694-810d43921aa2\") " pod="openshift-marketplace/redhat-operators-z7ldc" Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.278065 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b38f3681-6f2f-437f-9694-810d43921aa2-utilities\") pod \"redhat-operators-z7ldc\" (UID: \"b38f3681-6f2f-437f-9694-810d43921aa2\") " pod="openshift-marketplace/redhat-operators-z7ldc" Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.294227 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-8495ffcdcc-j7d29"] Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.300793 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2469\" (UniqueName: \"kubernetes.io/projected/b38f3681-6f2f-437f-9694-810d43921aa2-kube-api-access-v2469\") pod \"redhat-operators-z7ldc\" (UID: \"b38f3681-6f2f-437f-9694-810d43921aa2\") " pod="openshift-marketplace/redhat-operators-z7ldc" Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.379858 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvf8x\" (UniqueName: \"kubernetes.io/projected/194cc0b9-5fb1-492c-9df1-002f629cfb90-kube-api-access-pvf8x\") pod \"neutron-8495ffcdcc-j7d29\" (UID: 
\"194cc0b9-5fb1-492c-9df1-002f629cfb90\") " pod="openstack/neutron-8495ffcdcc-j7d29" Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.380114 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/194cc0b9-5fb1-492c-9df1-002f629cfb90-combined-ca-bundle\") pod \"neutron-8495ffcdcc-j7d29\" (UID: \"194cc0b9-5fb1-492c-9df1-002f629cfb90\") " pod="openstack/neutron-8495ffcdcc-j7d29" Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.380263 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/194cc0b9-5fb1-492c-9df1-002f629cfb90-httpd-config\") pod \"neutron-8495ffcdcc-j7d29\" (UID: \"194cc0b9-5fb1-492c-9df1-002f629cfb90\") " pod="openstack/neutron-8495ffcdcc-j7d29" Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.380505 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/194cc0b9-5fb1-492c-9df1-002f629cfb90-ovndb-tls-certs\") pod \"neutron-8495ffcdcc-j7d29\" (UID: \"194cc0b9-5fb1-492c-9df1-002f629cfb90\") " pod="openstack/neutron-8495ffcdcc-j7d29" Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.380608 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/194cc0b9-5fb1-492c-9df1-002f629cfb90-public-tls-certs\") pod \"neutron-8495ffcdcc-j7d29\" (UID: \"194cc0b9-5fb1-492c-9df1-002f629cfb90\") " pod="openstack/neutron-8495ffcdcc-j7d29" Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.380640 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/194cc0b9-5fb1-492c-9df1-002f629cfb90-config\") pod \"neutron-8495ffcdcc-j7d29\" (UID: \"194cc0b9-5fb1-492c-9df1-002f629cfb90\") " pod="openstack/neutron-8495ffcdcc-j7d29" Mar 13 14:21:09 crc 
kubenswrapper[4898]: I0313 14:21:09.380696 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/194cc0b9-5fb1-492c-9df1-002f629cfb90-internal-tls-certs\") pod \"neutron-8495ffcdcc-j7d29\" (UID: \"194cc0b9-5fb1-492c-9df1-002f629cfb90\") " pod="openstack/neutron-8495ffcdcc-j7d29" Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.386485 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/194cc0b9-5fb1-492c-9df1-002f629cfb90-internal-tls-certs\") pod \"neutron-8495ffcdcc-j7d29\" (UID: \"194cc0b9-5fb1-492c-9df1-002f629cfb90\") " pod="openstack/neutron-8495ffcdcc-j7d29" Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.392285 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/194cc0b9-5fb1-492c-9df1-002f629cfb90-public-tls-certs\") pod \"neutron-8495ffcdcc-j7d29\" (UID: \"194cc0b9-5fb1-492c-9df1-002f629cfb90\") " pod="openstack/neutron-8495ffcdcc-j7d29" Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.395083 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/194cc0b9-5fb1-492c-9df1-002f629cfb90-ovndb-tls-certs\") pod \"neutron-8495ffcdcc-j7d29\" (UID: \"194cc0b9-5fb1-492c-9df1-002f629cfb90\") " pod="openstack/neutron-8495ffcdcc-j7d29" Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.397080 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/194cc0b9-5fb1-492c-9df1-002f629cfb90-combined-ca-bundle\") pod \"neutron-8495ffcdcc-j7d29\" (UID: \"194cc0b9-5fb1-492c-9df1-002f629cfb90\") " pod="openstack/neutron-8495ffcdcc-j7d29" Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.398226 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config\" (UniqueName: \"kubernetes.io/secret/194cc0b9-5fb1-492c-9df1-002f629cfb90-config\") pod \"neutron-8495ffcdcc-j7d29\" (UID: \"194cc0b9-5fb1-492c-9df1-002f629cfb90\") " pod="openstack/neutron-8495ffcdcc-j7d29" Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.398671 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/194cc0b9-5fb1-492c-9df1-002f629cfb90-httpd-config\") pod \"neutron-8495ffcdcc-j7d29\" (UID: \"194cc0b9-5fb1-492c-9df1-002f629cfb90\") " pod="openstack/neutron-8495ffcdcc-j7d29" Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.405753 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvf8x\" (UniqueName: \"kubernetes.io/projected/194cc0b9-5fb1-492c-9df1-002f629cfb90-kube-api-access-pvf8x\") pod \"neutron-8495ffcdcc-j7d29\" (UID: \"194cc0b9-5fb1-492c-9df1-002f629cfb90\") " pod="openstack/neutron-8495ffcdcc-j7d29" Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.417613 4898 generic.go:334] "Generic (PLEG): container finished" podID="da9c8289-b4cc-4259-a94e-fab15f437c67" containerID="ebec6588ea54fc0e12abfc618cba17e32b0384e26af2c4dc5438fd6e04229c34" exitCode=0 Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.417681 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-7gvqf" event={"ID":"da9c8289-b4cc-4259-a94e-fab15f437c67","Type":"ContainerDied","Data":"ebec6588ea54fc0e12abfc618cba17e32b0384e26af2c4dc5438fd6e04229c34"} Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.425103 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ljct7" event={"ID":"0f68a4dd-fec8-4e60-a89c-69ce09fc5700","Type":"ContainerStarted","Data":"ededb2c682deb7693ab4f5295aba59c30d96d3df7afffb98859fdbc9fc5b9e13"} Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.447668 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"247749ae-204b-4e9c-ad1c-f5d924b6f211","Type":"ContainerStarted","Data":"9dee453ab58c346884f20f75be928701596b9ae0bdbbb025979e1e2daaa907c1"} Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.457311 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f772c247-f65b-4185-9c75-25d5894ada70","Type":"ContainerStarted","Data":"3ed25c8fea8488646787dd34274ccdeef192849b7dfb335966843f92351f741e"} Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.459245 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21","Type":"ContainerStarted","Data":"e53933d3c42586f8c8f9ea54060e1af27f64d65b3366e751904358fe342bcf4d"} Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.477082 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f97c64464-wmnph" event={"ID":"61f1f8bf-63eb-464c-9703-3d3db80ba0df","Type":"ContainerStarted","Data":"fc322fcfa0d128d1cbcd5fd7cc8972df0d6b4b808a700c1468ceb21cb71602e4"} Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.477127 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f97c64464-wmnph" event={"ID":"61f1f8bf-63eb-464c-9703-3d3db80ba0df","Type":"ContainerStarted","Data":"6c47c58bce6d5a27ce8f0a9ec972dc3740b57217b054bd621f4319ad64cb9ded"} Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.477991 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-f97c64464-wmnph" Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.491404 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-ljct7" podStartSLOduration=25.491388752 podStartE2EDuration="25.491388752s" podCreationTimestamp="2026-03-13 14:20:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-03-13 14:21:09.485413157 +0000 UTC m=+1504.487001406" watchObservedRunningTime="2026-03-13 14:21:09.491388752 +0000 UTC m=+1504.492976991" Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.520163 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z7ldc" Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.581282 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-f97c64464-wmnph" podStartSLOduration=3.581256105 podStartE2EDuration="3.581256105s" podCreationTimestamp="2026-03-13 14:21:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:21:09.556469572 +0000 UTC m=+1504.558057821" watchObservedRunningTime="2026-03-13 14:21:09.581256105 +0000 UTC m=+1504.582844344" Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.607831 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-8495ffcdcc-j7d29" Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.818466 4898 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod10787742-bffc-4545-95cc-8f0354246d7c"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod10787742-bffc-4545-95cc-8f0354246d7c] : Timed out while waiting for systemd to remove kubepods-besteffort-pod10787742_bffc_4545_95cc_8f0354246d7c.slice" Mar 13 14:21:09 crc kubenswrapper[4898]: E0313 14:21:09.818506 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod10787742-bffc-4545-95cc-8f0354246d7c] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod10787742-bffc-4545-95cc-8f0354246d7c] : Timed out while waiting for systemd to remove kubepods-besteffort-pod10787742_bffc_4545_95cc_8f0354246d7c.slice" pod="openstack/dnsmasq-dns-847c4cc679-jcdtp" podUID="10787742-bffc-4545-95cc-8f0354246d7c" Mar 13 14:21:10 crc kubenswrapper[4898]: I0313 14:21:10.328252 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z7ldc"] Mar 13 14:21:10 crc kubenswrapper[4898]: W0313 14:21:10.355517 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb38f3681_6f2f_437f_9694_810d43921aa2.slice/crio-14f6fca3b948b0af4024656eeff03f1e8c8fe427562c97293c95f1bdd3284d24 WatchSource:0}: Error finding container 14f6fca3b948b0af4024656eeff03f1e8c8fe427562c97293c95f1bdd3284d24: Status 404 returned error can't find the container with id 14f6fca3b948b0af4024656eeff03f1e8c8fe427562c97293c95f1bdd3284d24 Mar 13 14:21:10 crc kubenswrapper[4898]: I0313 14:21:10.505102 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-7gvqf" 
event={"ID":"da9c8289-b4cc-4259-a94e-fab15f437c67","Type":"ContainerStarted","Data":"ec23679d2099538d875f32f1740477e86fa4744d0786a72fbd40a7500fbf13f8"} Mar 13 14:21:10 crc kubenswrapper[4898]: I0313 14:21:10.522934 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55f844cf75-7gvqf" Mar 13 14:21:10 crc kubenswrapper[4898]: I0313 14:21:10.545742 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55f844cf75-7gvqf" podStartSLOduration=4.545718503 podStartE2EDuration="4.545718503s" podCreationTimestamp="2026-03-13 14:21:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:21:10.522818398 +0000 UTC m=+1505.524406637" watchObservedRunningTime="2026-03-13 14:21:10.545718503 +0000 UTC m=+1505.547306742" Mar 13 14:21:10 crc kubenswrapper[4898]: I0313 14:21:10.558486 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f772c247-f65b-4185-9c75-25d5894ada70","Type":"ContainerStarted","Data":"53ad13c5a81b4a9991a57956cff297d785da3080bb5eafedd89860a32e28cd6a"} Mar 13 14:21:10 crc kubenswrapper[4898]: I0313 14:21:10.571140 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-8495ffcdcc-j7d29"] Mar 13 14:21:10 crc kubenswrapper[4898]: I0313 14:21:10.572222 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z7ldc" event={"ID":"b38f3681-6f2f-437f-9694-810d43921aa2","Type":"ContainerStarted","Data":"14f6fca3b948b0af4024656eeff03f1e8c8fe427562c97293c95f1bdd3284d24"} Mar 13 14:21:10 crc kubenswrapper[4898]: I0313 14:21:10.595005 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=25.594987401 podStartE2EDuration="25.594987401s" podCreationTimestamp="2026-03-13 14:20:45 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:21:10.58759576 +0000 UTC m=+1505.589184019" watchObservedRunningTime="2026-03-13 14:21:10.594987401 +0000 UTC m=+1505.596575640" Mar 13 14:21:10 crc kubenswrapper[4898]: I0313 14:21:10.603290 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-jcdtp" Mar 13 14:21:10 crc kubenswrapper[4898]: I0313 14:21:10.604456 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21","Type":"ContainerStarted","Data":"58217feb23785534a405127b1afdf5fb04709d2e86145617a9cb9b01bf24630f"} Mar 13 14:21:10 crc kubenswrapper[4898]: I0313 14:21:10.647147 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=25.647123115 podStartE2EDuration="25.647123115s" podCreationTimestamp="2026-03-13 14:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:21:10.635451612 +0000 UTC m=+1505.637039871" watchObservedRunningTime="2026-03-13 14:21:10.647123115 +0000 UTC m=+1505.648711354" Mar 13 14:21:10 crc kubenswrapper[4898]: I0313 14:21:10.861428 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-jcdtp"] Mar 13 14:21:10 crc kubenswrapper[4898]: I0313 14:21:10.897385 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-jcdtp"] Mar 13 14:21:11 crc kubenswrapper[4898]: I0313 14:21:11.615713 4898 generic.go:334] "Generic (PLEG): container finished" podID="b38f3681-6f2f-437f-9694-810d43921aa2" containerID="d6ed263f1fe660123646c8c6128f780dbe747c9b3a543fa08475d3acfc1517d5" exitCode=0 Mar 13 14:21:11 crc kubenswrapper[4898]: I0313 14:21:11.615801 4898 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z7ldc" event={"ID":"b38f3681-6f2f-437f-9694-810d43921aa2","Type":"ContainerDied","Data":"d6ed263f1fe660123646c8c6128f780dbe747c9b3a543fa08475d3acfc1517d5"} Mar 13 14:21:11 crc kubenswrapper[4898]: I0313 14:21:11.618232 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8495ffcdcc-j7d29" event={"ID":"194cc0b9-5fb1-492c-9df1-002f629cfb90","Type":"ContainerStarted","Data":"0cd127d8d8dce19d301ba8f94cb9ff0fc6150499598490bd11d503771e98d4fc"} Mar 13 14:21:11 crc kubenswrapper[4898]: I0313 14:21:11.618264 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8495ffcdcc-j7d29" event={"ID":"194cc0b9-5fb1-492c-9df1-002f629cfb90","Type":"ContainerStarted","Data":"0d21f5d009c2fbb3d9136543ddc9edaf66018231eb09d0ea5bf4fa35c2144f9b"} Mar 13 14:21:11 crc kubenswrapper[4898]: I0313 14:21:11.618274 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8495ffcdcc-j7d29" event={"ID":"194cc0b9-5fb1-492c-9df1-002f629cfb90","Type":"ContainerStarted","Data":"675cba3392edee8aa8b36b03aeac2453edffb374fe9e8c521c269c0464cb1478"} Mar 13 14:21:11 crc kubenswrapper[4898]: I0313 14:21:11.618379 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-8495ffcdcc-j7d29" Mar 13 14:21:11 crc kubenswrapper[4898]: I0313 14:21:11.619764 4898 generic.go:334] "Generic (PLEG): container finished" podID="51a3e0c5-0084-4216-a162-3614eafcc162" containerID="27760265b5d44dc57e3a3eecff9d010cc5fc5af8472653848b227f366d4e7a49" exitCode=0 Mar 13 14:21:11 crc kubenswrapper[4898]: I0313 14:21:11.620218 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-dddqm" event={"ID":"51a3e0c5-0084-4216-a162-3614eafcc162","Type":"ContainerDied","Data":"27760265b5d44dc57e3a3eecff9d010cc5fc5af8472653848b227f366d4e7a49"} Mar 13 14:21:11 crc kubenswrapper[4898]: I0313 14:21:11.669954 4898 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-8495ffcdcc-j7d29" podStartSLOduration=2.669626349 podStartE2EDuration="2.669626349s" podCreationTimestamp="2026-03-13 14:21:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:21:11.65426425 +0000 UTC m=+1506.655852489" watchObservedRunningTime="2026-03-13 14:21:11.669626349 +0000 UTC m=+1506.671214588" Mar 13 14:21:11 crc kubenswrapper[4898]: I0313 14:21:11.771135 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10787742-bffc-4545-95cc-8f0354246d7c" path="/var/lib/kubelet/pods/10787742-bffc-4545-95cc-8f0354246d7c/volumes" Mar 13 14:21:12 crc kubenswrapper[4898]: I0313 14:21:12.634454 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-xq6ss" event={"ID":"ac704482-c7a4-471c-b3c1-d1fdd7e0eb83","Type":"ContainerStarted","Data":"e195371c387c0ec69bbadee68addfd45715bbfc83433b2ba3c47b307af7325bd"} Mar 13 14:21:12 crc kubenswrapper[4898]: I0313 14:21:12.655079 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-xq6ss" podStartSLOduration=4.005064622 podStartE2EDuration="37.655060551s" podCreationTimestamp="2026-03-13 14:20:35 +0000 UTC" firstStartedPulling="2026-03-13 14:20:37.718328168 +0000 UTC m=+1472.719916407" lastFinishedPulling="2026-03-13 14:21:11.368324107 +0000 UTC m=+1506.369912336" observedRunningTime="2026-03-13 14:21:12.652283228 +0000 UTC m=+1507.653871477" watchObservedRunningTime="2026-03-13 14:21:12.655060551 +0000 UTC m=+1507.656648800" Mar 13 14:21:13 crc kubenswrapper[4898]: I0313 14:21:13.529845 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-dddqm" Mar 13 14:21:13 crc kubenswrapper[4898]: I0313 14:21:13.609671 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51a3e0c5-0084-4216-a162-3614eafcc162-logs\") pod \"51a3e0c5-0084-4216-a162-3614eafcc162\" (UID: \"51a3e0c5-0084-4216-a162-3614eafcc162\") " Mar 13 14:21:13 crc kubenswrapper[4898]: I0313 14:21:13.609746 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-st255\" (UniqueName: \"kubernetes.io/projected/51a3e0c5-0084-4216-a162-3614eafcc162-kube-api-access-st255\") pod \"51a3e0c5-0084-4216-a162-3614eafcc162\" (UID: \"51a3e0c5-0084-4216-a162-3614eafcc162\") " Mar 13 14:21:13 crc kubenswrapper[4898]: I0313 14:21:13.610113 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51a3e0c5-0084-4216-a162-3614eafcc162-scripts\") pod \"51a3e0c5-0084-4216-a162-3614eafcc162\" (UID: \"51a3e0c5-0084-4216-a162-3614eafcc162\") " Mar 13 14:21:13 crc kubenswrapper[4898]: I0313 14:21:13.610208 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51a3e0c5-0084-4216-a162-3614eafcc162-config-data\") pod \"51a3e0c5-0084-4216-a162-3614eafcc162\" (UID: \"51a3e0c5-0084-4216-a162-3614eafcc162\") " Mar 13 14:21:13 crc kubenswrapper[4898]: I0313 14:21:13.610240 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51a3e0c5-0084-4216-a162-3614eafcc162-logs" (OuterVolumeSpecName: "logs") pod "51a3e0c5-0084-4216-a162-3614eafcc162" (UID: "51a3e0c5-0084-4216-a162-3614eafcc162"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:21:13 crc kubenswrapper[4898]: I0313 14:21:13.610263 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51a3e0c5-0084-4216-a162-3614eafcc162-combined-ca-bundle\") pod \"51a3e0c5-0084-4216-a162-3614eafcc162\" (UID: \"51a3e0c5-0084-4216-a162-3614eafcc162\") " Mar 13 14:21:13 crc kubenswrapper[4898]: I0313 14:21:13.611033 4898 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51a3e0c5-0084-4216-a162-3614eafcc162-logs\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:13 crc kubenswrapper[4898]: I0313 14:21:13.616598 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51a3e0c5-0084-4216-a162-3614eafcc162-scripts" (OuterVolumeSpecName: "scripts") pod "51a3e0c5-0084-4216-a162-3614eafcc162" (UID: "51a3e0c5-0084-4216-a162-3614eafcc162"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:21:13 crc kubenswrapper[4898]: I0313 14:21:13.622088 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51a3e0c5-0084-4216-a162-3614eafcc162-kube-api-access-st255" (OuterVolumeSpecName: "kube-api-access-st255") pod "51a3e0c5-0084-4216-a162-3614eafcc162" (UID: "51a3e0c5-0084-4216-a162-3614eafcc162"). InnerVolumeSpecName "kube-api-access-st255". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:21:13 crc kubenswrapper[4898]: I0313 14:21:13.649574 4898 generic.go:334] "Generic (PLEG): container finished" podID="0f68a4dd-fec8-4e60-a89c-69ce09fc5700" containerID="ededb2c682deb7693ab4f5295aba59c30d96d3df7afffb98859fdbc9fc5b9e13" exitCode=0 Mar 13 14:21:13 crc kubenswrapper[4898]: I0313 14:21:13.649661 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ljct7" event={"ID":"0f68a4dd-fec8-4e60-a89c-69ce09fc5700","Type":"ContainerDied","Data":"ededb2c682deb7693ab4f5295aba59c30d96d3df7afffb98859fdbc9fc5b9e13"} Mar 13 14:21:13 crc kubenswrapper[4898]: I0313 14:21:13.651776 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51a3e0c5-0084-4216-a162-3614eafcc162-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "51a3e0c5-0084-4216-a162-3614eafcc162" (UID: "51a3e0c5-0084-4216-a162-3614eafcc162"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:21:13 crc kubenswrapper[4898]: I0313 14:21:13.660789 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-dddqm" event={"ID":"51a3e0c5-0084-4216-a162-3614eafcc162","Type":"ContainerDied","Data":"4656230f194edb795af2740a5c0bb83bbde4d4a8b2fd3cee4accb6bdc572ea5d"} Mar 13 14:21:13 crc kubenswrapper[4898]: I0313 14:21:13.660837 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4656230f194edb795af2740a5c0bb83bbde4d4a8b2fd3cee4accb6bdc572ea5d" Mar 13 14:21:13 crc kubenswrapper[4898]: I0313 14:21:13.661179 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-dddqm" Mar 13 14:21:13 crc kubenswrapper[4898]: I0313 14:21:13.672338 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51a3e0c5-0084-4216-a162-3614eafcc162-config-data" (OuterVolumeSpecName: "config-data") pod "51a3e0c5-0084-4216-a162-3614eafcc162" (UID: "51a3e0c5-0084-4216-a162-3614eafcc162"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:21:13 crc kubenswrapper[4898]: I0313 14:21:13.723517 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-st255\" (UniqueName: \"kubernetes.io/projected/51a3e0c5-0084-4216-a162-3614eafcc162-kube-api-access-st255\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:13 crc kubenswrapper[4898]: I0313 14:21:13.723552 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51a3e0c5-0084-4216-a162-3614eafcc162-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:13 crc kubenswrapper[4898]: I0313 14:21:13.723562 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51a3e0c5-0084-4216-a162-3614eafcc162-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:13 crc kubenswrapper[4898]: I0313 14:21:13.723573 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51a3e0c5-0084-4216-a162-3614eafcc162-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:13 crc kubenswrapper[4898]: I0313 14:21:13.844037 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-5bf5d8b7d4-4gwxr"] Mar 13 14:21:13 crc kubenswrapper[4898]: E0313 14:21:13.844589 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51a3e0c5-0084-4216-a162-3614eafcc162" containerName="placement-db-sync" Mar 13 14:21:13 crc kubenswrapper[4898]: I0313 14:21:13.844610 4898 
state_mem.go:107] "Deleted CPUSet assignment" podUID="51a3e0c5-0084-4216-a162-3614eafcc162" containerName="placement-db-sync" Mar 13 14:21:13 crc kubenswrapper[4898]: I0313 14:21:13.844806 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="51a3e0c5-0084-4216-a162-3614eafcc162" containerName="placement-db-sync" Mar 13 14:21:13 crc kubenswrapper[4898]: I0313 14:21:13.846056 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5bf5d8b7d4-4gwxr" Mar 13 14:21:13 crc kubenswrapper[4898]: I0313 14:21:13.850378 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Mar 13 14:21:13 crc kubenswrapper[4898]: I0313 14:21:13.850624 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Mar 13 14:21:13 crc kubenswrapper[4898]: I0313 14:21:13.876353 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5bf5d8b7d4-4gwxr"] Mar 13 14:21:13 crc kubenswrapper[4898]: I0313 14:21:13.935765 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/604a0205-6c18-4bff-929f-038524d62aeb-scripts\") pod \"placement-5bf5d8b7d4-4gwxr\" (UID: \"604a0205-6c18-4bff-929f-038524d62aeb\") " pod="openstack/placement-5bf5d8b7d4-4gwxr" Mar 13 14:21:13 crc kubenswrapper[4898]: I0313 14:21:13.936060 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/604a0205-6c18-4bff-929f-038524d62aeb-public-tls-certs\") pod \"placement-5bf5d8b7d4-4gwxr\" (UID: \"604a0205-6c18-4bff-929f-038524d62aeb\") " pod="openstack/placement-5bf5d8b7d4-4gwxr" Mar 13 14:21:13 crc kubenswrapper[4898]: I0313 14:21:13.936240 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/604a0205-6c18-4bff-929f-038524d62aeb-internal-tls-certs\") pod \"placement-5bf5d8b7d4-4gwxr\" (UID: \"604a0205-6c18-4bff-929f-038524d62aeb\") " pod="openstack/placement-5bf5d8b7d4-4gwxr" Mar 13 14:21:13 crc kubenswrapper[4898]: I0313 14:21:13.936286 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/604a0205-6c18-4bff-929f-038524d62aeb-config-data\") pod \"placement-5bf5d8b7d4-4gwxr\" (UID: \"604a0205-6c18-4bff-929f-038524d62aeb\") " pod="openstack/placement-5bf5d8b7d4-4gwxr" Mar 13 14:21:13 crc kubenswrapper[4898]: I0313 14:21:13.936306 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/604a0205-6c18-4bff-929f-038524d62aeb-combined-ca-bundle\") pod \"placement-5bf5d8b7d4-4gwxr\" (UID: \"604a0205-6c18-4bff-929f-038524d62aeb\") " pod="openstack/placement-5bf5d8b7d4-4gwxr" Mar 13 14:21:13 crc kubenswrapper[4898]: I0313 14:21:13.936436 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/604a0205-6c18-4bff-929f-038524d62aeb-logs\") pod \"placement-5bf5d8b7d4-4gwxr\" (UID: \"604a0205-6c18-4bff-929f-038524d62aeb\") " pod="openstack/placement-5bf5d8b7d4-4gwxr" Mar 13 14:21:13 crc kubenswrapper[4898]: I0313 14:21:13.936528 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lcxt\" (UniqueName: \"kubernetes.io/projected/604a0205-6c18-4bff-929f-038524d62aeb-kube-api-access-8lcxt\") pod \"placement-5bf5d8b7d4-4gwxr\" (UID: \"604a0205-6c18-4bff-929f-038524d62aeb\") " pod="openstack/placement-5bf5d8b7d4-4gwxr" Mar 13 14:21:14 crc kubenswrapper[4898]: I0313 14:21:14.037891 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/604a0205-6c18-4bff-929f-038524d62aeb-scripts\") pod \"placement-5bf5d8b7d4-4gwxr\" (UID: \"604a0205-6c18-4bff-929f-038524d62aeb\") " pod="openstack/placement-5bf5d8b7d4-4gwxr" Mar 13 14:21:14 crc kubenswrapper[4898]: I0313 14:21:14.037964 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/604a0205-6c18-4bff-929f-038524d62aeb-public-tls-certs\") pod \"placement-5bf5d8b7d4-4gwxr\" (UID: \"604a0205-6c18-4bff-929f-038524d62aeb\") " pod="openstack/placement-5bf5d8b7d4-4gwxr" Mar 13 14:21:14 crc kubenswrapper[4898]: I0313 14:21:14.038016 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/604a0205-6c18-4bff-929f-038524d62aeb-internal-tls-certs\") pod \"placement-5bf5d8b7d4-4gwxr\" (UID: \"604a0205-6c18-4bff-929f-038524d62aeb\") " pod="openstack/placement-5bf5d8b7d4-4gwxr" Mar 13 14:21:14 crc kubenswrapper[4898]: I0313 14:21:14.038037 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/604a0205-6c18-4bff-929f-038524d62aeb-combined-ca-bundle\") pod \"placement-5bf5d8b7d4-4gwxr\" (UID: \"604a0205-6c18-4bff-929f-038524d62aeb\") " pod="openstack/placement-5bf5d8b7d4-4gwxr" Mar 13 14:21:14 crc kubenswrapper[4898]: I0313 14:21:14.038053 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/604a0205-6c18-4bff-929f-038524d62aeb-config-data\") pod \"placement-5bf5d8b7d4-4gwxr\" (UID: \"604a0205-6c18-4bff-929f-038524d62aeb\") " pod="openstack/placement-5bf5d8b7d4-4gwxr" Mar 13 14:21:14 crc kubenswrapper[4898]: I0313 14:21:14.038103 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/604a0205-6c18-4bff-929f-038524d62aeb-logs\") pod 
\"placement-5bf5d8b7d4-4gwxr\" (UID: \"604a0205-6c18-4bff-929f-038524d62aeb\") " pod="openstack/placement-5bf5d8b7d4-4gwxr" Mar 13 14:21:14 crc kubenswrapper[4898]: I0313 14:21:14.038139 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lcxt\" (UniqueName: \"kubernetes.io/projected/604a0205-6c18-4bff-929f-038524d62aeb-kube-api-access-8lcxt\") pod \"placement-5bf5d8b7d4-4gwxr\" (UID: \"604a0205-6c18-4bff-929f-038524d62aeb\") " pod="openstack/placement-5bf5d8b7d4-4gwxr" Mar 13 14:21:14 crc kubenswrapper[4898]: I0313 14:21:14.039063 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/604a0205-6c18-4bff-929f-038524d62aeb-logs\") pod \"placement-5bf5d8b7d4-4gwxr\" (UID: \"604a0205-6c18-4bff-929f-038524d62aeb\") " pod="openstack/placement-5bf5d8b7d4-4gwxr" Mar 13 14:21:14 crc kubenswrapper[4898]: I0313 14:21:14.043800 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/604a0205-6c18-4bff-929f-038524d62aeb-config-data\") pod \"placement-5bf5d8b7d4-4gwxr\" (UID: \"604a0205-6c18-4bff-929f-038524d62aeb\") " pod="openstack/placement-5bf5d8b7d4-4gwxr" Mar 13 14:21:14 crc kubenswrapper[4898]: I0313 14:21:14.045008 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/604a0205-6c18-4bff-929f-038524d62aeb-scripts\") pod \"placement-5bf5d8b7d4-4gwxr\" (UID: \"604a0205-6c18-4bff-929f-038524d62aeb\") " pod="openstack/placement-5bf5d8b7d4-4gwxr" Mar 13 14:21:14 crc kubenswrapper[4898]: I0313 14:21:14.045488 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/604a0205-6c18-4bff-929f-038524d62aeb-internal-tls-certs\") pod \"placement-5bf5d8b7d4-4gwxr\" (UID: \"604a0205-6c18-4bff-929f-038524d62aeb\") " pod="openstack/placement-5bf5d8b7d4-4gwxr" Mar 13 
14:21:14 crc kubenswrapper[4898]: I0313 14:21:14.045712 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/604a0205-6c18-4bff-929f-038524d62aeb-combined-ca-bundle\") pod \"placement-5bf5d8b7d4-4gwxr\" (UID: \"604a0205-6c18-4bff-929f-038524d62aeb\") " pod="openstack/placement-5bf5d8b7d4-4gwxr" Mar 13 14:21:14 crc kubenswrapper[4898]: I0313 14:21:14.056161 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lcxt\" (UniqueName: \"kubernetes.io/projected/604a0205-6c18-4bff-929f-038524d62aeb-kube-api-access-8lcxt\") pod \"placement-5bf5d8b7d4-4gwxr\" (UID: \"604a0205-6c18-4bff-929f-038524d62aeb\") " pod="openstack/placement-5bf5d8b7d4-4gwxr" Mar 13 14:21:14 crc kubenswrapper[4898]: I0313 14:21:14.057411 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/604a0205-6c18-4bff-929f-038524d62aeb-public-tls-certs\") pod \"placement-5bf5d8b7d4-4gwxr\" (UID: \"604a0205-6c18-4bff-929f-038524d62aeb\") " pod="openstack/placement-5bf5d8b7d4-4gwxr" Mar 13 14:21:14 crc kubenswrapper[4898]: I0313 14:21:14.173421 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5bf5d8b7d4-4gwxr" Mar 13 14:21:15 crc kubenswrapper[4898]: I0313 14:21:15.668675 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 13 14:21:15 crc kubenswrapper[4898]: I0313 14:21:15.669020 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 13 14:21:15 crc kubenswrapper[4898]: I0313 14:21:15.669035 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 13 14:21:15 crc kubenswrapper[4898]: I0313 14:21:15.669047 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 13 14:21:15 crc kubenswrapper[4898]: I0313 14:21:15.682036 4898 generic.go:334] "Generic (PLEG): container finished" podID="ac704482-c7a4-471c-b3c1-d1fdd7e0eb83" containerID="e195371c387c0ec69bbadee68addfd45715bbfc83433b2ba3c47b307af7325bd" exitCode=0 Mar 13 14:21:15 crc kubenswrapper[4898]: I0313 14:21:15.682121 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-xq6ss" event={"ID":"ac704482-c7a4-471c-b3c1-d1fdd7e0eb83","Type":"ContainerDied","Data":"e195371c387c0ec69bbadee68addfd45715bbfc83433b2ba3c47b307af7325bd"} Mar 13 14:21:15 crc kubenswrapper[4898]: I0313 14:21:15.684407 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ljct7" event={"ID":"0f68a4dd-fec8-4e60-a89c-69ce09fc5700","Type":"ContainerDied","Data":"b5807007919517093647e5507bfed140e507af6921560a0e8eff012955dab574"} Mar 13 14:21:15 crc kubenswrapper[4898]: I0313 14:21:15.684456 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b5807007919517093647e5507bfed140e507af6921560a0e8eff012955dab574" Mar 13 14:21:15 crc kubenswrapper[4898]: I0313 14:21:15.703952 4898 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 13 14:21:15 crc kubenswrapper[4898]: I0313 14:21:15.704457 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 13 14:21:15 crc kubenswrapper[4898]: I0313 14:21:15.704473 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 13 14:21:15 crc kubenswrapper[4898]: I0313 14:21:15.704486 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 13 14:21:15 crc kubenswrapper[4898]: I0313 14:21:15.736496 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 13 14:21:15 crc kubenswrapper[4898]: I0313 14:21:15.736970 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 13 14:21:15 crc kubenswrapper[4898]: I0313 14:21:15.776705 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 13 14:21:15 crc kubenswrapper[4898]: I0313 14:21:15.776929 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 13 14:21:15 crc kubenswrapper[4898]: I0313 14:21:15.928077 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-ljct7" Mar 13 14:21:15 crc kubenswrapper[4898]: I0313 14:21:15.989435 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9s6v\" (UniqueName: \"kubernetes.io/projected/0f68a4dd-fec8-4e60-a89c-69ce09fc5700-kube-api-access-j9s6v\") pod \"0f68a4dd-fec8-4e60-a89c-69ce09fc5700\" (UID: \"0f68a4dd-fec8-4e60-a89c-69ce09fc5700\") " Mar 13 14:21:15 crc kubenswrapper[4898]: I0313 14:21:15.989557 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f68a4dd-fec8-4e60-a89c-69ce09fc5700-combined-ca-bundle\") pod \"0f68a4dd-fec8-4e60-a89c-69ce09fc5700\" (UID: \"0f68a4dd-fec8-4e60-a89c-69ce09fc5700\") " Mar 13 14:21:15 crc kubenswrapper[4898]: I0313 14:21:15.989645 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f68a4dd-fec8-4e60-a89c-69ce09fc5700-scripts\") pod \"0f68a4dd-fec8-4e60-a89c-69ce09fc5700\" (UID: \"0f68a4dd-fec8-4e60-a89c-69ce09fc5700\") " Mar 13 14:21:15 crc kubenswrapper[4898]: I0313 14:21:15.989716 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0f68a4dd-fec8-4e60-a89c-69ce09fc5700-credential-keys\") pod \"0f68a4dd-fec8-4e60-a89c-69ce09fc5700\" (UID: \"0f68a4dd-fec8-4e60-a89c-69ce09fc5700\") " Mar 13 14:21:15 crc kubenswrapper[4898]: I0313 14:21:15.989747 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0f68a4dd-fec8-4e60-a89c-69ce09fc5700-fernet-keys\") pod \"0f68a4dd-fec8-4e60-a89c-69ce09fc5700\" (UID: \"0f68a4dd-fec8-4e60-a89c-69ce09fc5700\") " Mar 13 14:21:15 crc kubenswrapper[4898]: I0313 14:21:15.989770 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/0f68a4dd-fec8-4e60-a89c-69ce09fc5700-config-data\") pod \"0f68a4dd-fec8-4e60-a89c-69ce09fc5700\" (UID: \"0f68a4dd-fec8-4e60-a89c-69ce09fc5700\") " Mar 13 14:21:15 crc kubenswrapper[4898]: I0313 14:21:15.995341 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f68a4dd-fec8-4e60-a89c-69ce09fc5700-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "0f68a4dd-fec8-4e60-a89c-69ce09fc5700" (UID: "0f68a4dd-fec8-4e60-a89c-69ce09fc5700"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:21:15 crc kubenswrapper[4898]: I0313 14:21:15.997084 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f68a4dd-fec8-4e60-a89c-69ce09fc5700-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "0f68a4dd-fec8-4e60-a89c-69ce09fc5700" (UID: "0f68a4dd-fec8-4e60-a89c-69ce09fc5700"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:21:15 crc kubenswrapper[4898]: I0313 14:21:15.997095 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f68a4dd-fec8-4e60-a89c-69ce09fc5700-kube-api-access-j9s6v" (OuterVolumeSpecName: "kube-api-access-j9s6v") pod "0f68a4dd-fec8-4e60-a89c-69ce09fc5700" (UID: "0f68a4dd-fec8-4e60-a89c-69ce09fc5700"). InnerVolumeSpecName "kube-api-access-j9s6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:21:15 crc kubenswrapper[4898]: I0313 14:21:15.997111 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f68a4dd-fec8-4e60-a89c-69ce09fc5700-scripts" (OuterVolumeSpecName: "scripts") pod "0f68a4dd-fec8-4e60-a89c-69ce09fc5700" (UID: "0f68a4dd-fec8-4e60-a89c-69ce09fc5700"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:21:16 crc kubenswrapper[4898]: I0313 14:21:16.067613 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f68a4dd-fec8-4e60-a89c-69ce09fc5700-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0f68a4dd-fec8-4e60-a89c-69ce09fc5700" (UID: "0f68a4dd-fec8-4e60-a89c-69ce09fc5700"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:21:16 crc kubenswrapper[4898]: I0313 14:21:16.078863 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f68a4dd-fec8-4e60-a89c-69ce09fc5700-config-data" (OuterVolumeSpecName: "config-data") pod "0f68a4dd-fec8-4e60-a89c-69ce09fc5700" (UID: "0f68a4dd-fec8-4e60-a89c-69ce09fc5700"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:21:16 crc kubenswrapper[4898]: I0313 14:21:16.092069 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f68a4dd-fec8-4e60-a89c-69ce09fc5700-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:16 crc kubenswrapper[4898]: I0313 14:21:16.092100 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9s6v\" (UniqueName: \"kubernetes.io/projected/0f68a4dd-fec8-4e60-a89c-69ce09fc5700-kube-api-access-j9s6v\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:16 crc kubenswrapper[4898]: I0313 14:21:16.092111 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f68a4dd-fec8-4e60-a89c-69ce09fc5700-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:16 crc kubenswrapper[4898]: I0313 14:21:16.092120 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f68a4dd-fec8-4e60-a89c-69ce09fc5700-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 
14:21:16 crc kubenswrapper[4898]: I0313 14:21:16.092128 4898 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0f68a4dd-fec8-4e60-a89c-69ce09fc5700-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:16 crc kubenswrapper[4898]: I0313 14:21:16.092136 4898 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0f68a4dd-fec8-4e60-a89c-69ce09fc5700-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:16 crc kubenswrapper[4898]: W0313 14:21:16.238942 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod604a0205_6c18_4bff_929f_038524d62aeb.slice/crio-762f4a73911e1feb65bcc4bfa54920b33c4f644904f274acd49cf5dac053c911 WatchSource:0}: Error finding container 762f4a73911e1feb65bcc4bfa54920b33c4f644904f274acd49cf5dac053c911: Status 404 returned error can't find the container with id 762f4a73911e1feb65bcc4bfa54920b33c4f644904f274acd49cf5dac053c911 Mar 13 14:21:16 crc kubenswrapper[4898]: I0313 14:21:16.245319 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5bf5d8b7d4-4gwxr"] Mar 13 14:21:16 crc kubenswrapper[4898]: I0313 14:21:16.699262 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"247749ae-204b-4e9c-ad1c-f5d924b6f211","Type":"ContainerStarted","Data":"54431181e3eb3b359d0852277dd7b5d798e5ebd64616fc6928567613ce28f709"} Mar 13 14:21:16 crc kubenswrapper[4898]: I0313 14:21:16.702061 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5bf5d8b7d4-4gwxr" event={"ID":"604a0205-6c18-4bff-929f-038524d62aeb","Type":"ContainerStarted","Data":"7d8e485964b16b478ba92c0abf89ef5c7fe78d1b940606e5a01f0ef264ccdb32"} Mar 13 14:21:16 crc kubenswrapper[4898]: I0313 14:21:16.702201 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5bf5d8b7d4-4gwxr" 
event={"ID":"604a0205-6c18-4bff-929f-038524d62aeb","Type":"ContainerStarted","Data":"762f4a73911e1feb65bcc4bfa54920b33c4f644904f274acd49cf5dac053c911"} Mar 13 14:21:16 crc kubenswrapper[4898]: I0313 14:21:16.705660 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z7ldc" event={"ID":"b38f3681-6f2f-437f-9694-810d43921aa2","Type":"ContainerStarted","Data":"c5557297cdf8c10622794d499e8dd04fb952d7400a53b9ebeaf103b83d901e50"} Mar 13 14:21:16 crc kubenswrapper[4898]: I0313 14:21:16.706142 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-ljct7" Mar 13 14:21:16 crc kubenswrapper[4898]: I0313 14:21:16.955057 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55f844cf75-7gvqf" Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.043427 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-fbs4f"] Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.043696 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-785d8bcb8c-fbs4f" podUID="04642207-fab0-47bf-9ac4-030bbe91b4f0" containerName="dnsmasq-dns" containerID="cri-o://b87af33e145b04333fd2674b6cd7ae8ffea23dd169547cec058f2a13473c72c5" gracePeriod=10 Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.145106 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-87574c74-kqmjb"] Mar 13 14:21:17 crc kubenswrapper[4898]: E0313 14:21:17.145548 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f68a4dd-fec8-4e60-a89c-69ce09fc5700" containerName="keystone-bootstrap" Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.145560 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f68a4dd-fec8-4e60-a89c-69ce09fc5700" containerName="keystone-bootstrap" Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.145772 4898 
memory_manager.go:354] "RemoveStaleState removing state" podUID="0f68a4dd-fec8-4e60-a89c-69ce09fc5700" containerName="keystone-bootstrap" Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.146474 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-87574c74-kqmjb" Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.152310 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.152366 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.152659 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-tdc5n" Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.152783 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.152883 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.153891 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-87574c74-kqmjb"] Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.154862 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.233315 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d149c7e3-df46-44b5-8a66-8a0fbb5a8554-credential-keys\") pod \"keystone-87574c74-kqmjb\" (UID: \"d149c7e3-df46-44b5-8a66-8a0fbb5a8554\") " pod="openstack/keystone-87574c74-kqmjb" Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.233365 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d149c7e3-df46-44b5-8a66-8a0fbb5a8554-fernet-keys\") pod \"keystone-87574c74-kqmjb\" (UID: \"d149c7e3-df46-44b5-8a66-8a0fbb5a8554\") " pod="openstack/keystone-87574c74-kqmjb" Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.233411 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d149c7e3-df46-44b5-8a66-8a0fbb5a8554-internal-tls-certs\") pod \"keystone-87574c74-kqmjb\" (UID: \"d149c7e3-df46-44b5-8a66-8a0fbb5a8554\") " pod="openstack/keystone-87574c74-kqmjb" Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.233432 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d149c7e3-df46-44b5-8a66-8a0fbb5a8554-config-data\") pod \"keystone-87574c74-kqmjb\" (UID: \"d149c7e3-df46-44b5-8a66-8a0fbb5a8554\") " pod="openstack/keystone-87574c74-kqmjb" Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.233471 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4m2t\" (UniqueName: \"kubernetes.io/projected/d149c7e3-df46-44b5-8a66-8a0fbb5a8554-kube-api-access-l4m2t\") pod \"keystone-87574c74-kqmjb\" (UID: \"d149c7e3-df46-44b5-8a66-8a0fbb5a8554\") " pod="openstack/keystone-87574c74-kqmjb" Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.233499 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d149c7e3-df46-44b5-8a66-8a0fbb5a8554-scripts\") pod \"keystone-87574c74-kqmjb\" (UID: \"d149c7e3-df46-44b5-8a66-8a0fbb5a8554\") " pod="openstack/keystone-87574c74-kqmjb" Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.233515 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d149c7e3-df46-44b5-8a66-8a0fbb5a8554-public-tls-certs\") pod \"keystone-87574c74-kqmjb\" (UID: \"d149c7e3-df46-44b5-8a66-8a0fbb5a8554\") " pod="openstack/keystone-87574c74-kqmjb" Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.233543 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d149c7e3-df46-44b5-8a66-8a0fbb5a8554-combined-ca-bundle\") pod \"keystone-87574c74-kqmjb\" (UID: \"d149c7e3-df46-44b5-8a66-8a0fbb5a8554\") " pod="openstack/keystone-87574c74-kqmjb" Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.258883 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-xq6ss" Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.334429 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac704482-c7a4-471c-b3c1-d1fdd7e0eb83-combined-ca-bundle\") pod \"ac704482-c7a4-471c-b3c1-d1fdd7e0eb83\" (UID: \"ac704482-c7a4-471c-b3c1-d1fdd7e0eb83\") " Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.334605 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cwfsv\" (UniqueName: \"kubernetes.io/projected/ac704482-c7a4-471c-b3c1-d1fdd7e0eb83-kube-api-access-cwfsv\") pod \"ac704482-c7a4-471c-b3c1-d1fdd7e0eb83\" (UID: \"ac704482-c7a4-471c-b3c1-d1fdd7e0eb83\") " Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.334736 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ac704482-c7a4-471c-b3c1-d1fdd7e0eb83-db-sync-config-data\") pod \"ac704482-c7a4-471c-b3c1-d1fdd7e0eb83\" (UID: \"ac704482-c7a4-471c-b3c1-d1fdd7e0eb83\") " Mar 13 14:21:17 crc 
kubenswrapper[4898]: I0313 14:21:17.335022 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d149c7e3-df46-44b5-8a66-8a0fbb5a8554-credential-keys\") pod \"keystone-87574c74-kqmjb\" (UID: \"d149c7e3-df46-44b5-8a66-8a0fbb5a8554\") " pod="openstack/keystone-87574c74-kqmjb" Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.335048 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d149c7e3-df46-44b5-8a66-8a0fbb5a8554-fernet-keys\") pod \"keystone-87574c74-kqmjb\" (UID: \"d149c7e3-df46-44b5-8a66-8a0fbb5a8554\") " pod="openstack/keystone-87574c74-kqmjb" Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.335097 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d149c7e3-df46-44b5-8a66-8a0fbb5a8554-internal-tls-certs\") pod \"keystone-87574c74-kqmjb\" (UID: \"d149c7e3-df46-44b5-8a66-8a0fbb5a8554\") " pod="openstack/keystone-87574c74-kqmjb" Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.335119 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d149c7e3-df46-44b5-8a66-8a0fbb5a8554-config-data\") pod \"keystone-87574c74-kqmjb\" (UID: \"d149c7e3-df46-44b5-8a66-8a0fbb5a8554\") " pod="openstack/keystone-87574c74-kqmjb" Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.335160 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4m2t\" (UniqueName: \"kubernetes.io/projected/d149c7e3-df46-44b5-8a66-8a0fbb5a8554-kube-api-access-l4m2t\") pod \"keystone-87574c74-kqmjb\" (UID: \"d149c7e3-df46-44b5-8a66-8a0fbb5a8554\") " pod="openstack/keystone-87574c74-kqmjb" Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.335191 4898 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d149c7e3-df46-44b5-8a66-8a0fbb5a8554-scripts\") pod \"keystone-87574c74-kqmjb\" (UID: \"d149c7e3-df46-44b5-8a66-8a0fbb5a8554\") " pod="openstack/keystone-87574c74-kqmjb" Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.335209 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d149c7e3-df46-44b5-8a66-8a0fbb5a8554-public-tls-certs\") pod \"keystone-87574c74-kqmjb\" (UID: \"d149c7e3-df46-44b5-8a66-8a0fbb5a8554\") " pod="openstack/keystone-87574c74-kqmjb" Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.335237 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d149c7e3-df46-44b5-8a66-8a0fbb5a8554-combined-ca-bundle\") pod \"keystone-87574c74-kqmjb\" (UID: \"d149c7e3-df46-44b5-8a66-8a0fbb5a8554\") " pod="openstack/keystone-87574c74-kqmjb" Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.356735 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d149c7e3-df46-44b5-8a66-8a0fbb5a8554-public-tls-certs\") pod \"keystone-87574c74-kqmjb\" (UID: \"d149c7e3-df46-44b5-8a66-8a0fbb5a8554\") " pod="openstack/keystone-87574c74-kqmjb" Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.366518 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4m2t\" (UniqueName: \"kubernetes.io/projected/d149c7e3-df46-44b5-8a66-8a0fbb5a8554-kube-api-access-l4m2t\") pod \"keystone-87574c74-kqmjb\" (UID: \"d149c7e3-df46-44b5-8a66-8a0fbb5a8554\") " pod="openstack/keystone-87574c74-kqmjb" Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.378492 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d149c7e3-df46-44b5-8a66-8a0fbb5a8554-scripts\") pod 
\"keystone-87574c74-kqmjb\" (UID: \"d149c7e3-df46-44b5-8a66-8a0fbb5a8554\") " pod="openstack/keystone-87574c74-kqmjb" Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.379761 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d149c7e3-df46-44b5-8a66-8a0fbb5a8554-config-data\") pod \"keystone-87574c74-kqmjb\" (UID: \"d149c7e3-df46-44b5-8a66-8a0fbb5a8554\") " pod="openstack/keystone-87574c74-kqmjb" Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.381359 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d149c7e3-df46-44b5-8a66-8a0fbb5a8554-fernet-keys\") pod \"keystone-87574c74-kqmjb\" (UID: \"d149c7e3-df46-44b5-8a66-8a0fbb5a8554\") " pod="openstack/keystone-87574c74-kqmjb" Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.381610 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d149c7e3-df46-44b5-8a66-8a0fbb5a8554-combined-ca-bundle\") pod \"keystone-87574c74-kqmjb\" (UID: \"d149c7e3-df46-44b5-8a66-8a0fbb5a8554\") " pod="openstack/keystone-87574c74-kqmjb" Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.382104 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac704482-c7a4-471c-b3c1-d1fdd7e0eb83-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "ac704482-c7a4-471c-b3c1-d1fdd7e0eb83" (UID: "ac704482-c7a4-471c-b3c1-d1fdd7e0eb83"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.382279 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac704482-c7a4-471c-b3c1-d1fdd7e0eb83-kube-api-access-cwfsv" (OuterVolumeSpecName: "kube-api-access-cwfsv") pod "ac704482-c7a4-471c-b3c1-d1fdd7e0eb83" (UID: "ac704482-c7a4-471c-b3c1-d1fdd7e0eb83"). InnerVolumeSpecName "kube-api-access-cwfsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.386456 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d149c7e3-df46-44b5-8a66-8a0fbb5a8554-internal-tls-certs\") pod \"keystone-87574c74-kqmjb\" (UID: \"d149c7e3-df46-44b5-8a66-8a0fbb5a8554\") " pod="openstack/keystone-87574c74-kqmjb" Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.390401 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d149c7e3-df46-44b5-8a66-8a0fbb5a8554-credential-keys\") pod \"keystone-87574c74-kqmjb\" (UID: \"d149c7e3-df46-44b5-8a66-8a0fbb5a8554\") " pod="openstack/keystone-87574c74-kqmjb" Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.437582 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cwfsv\" (UniqueName: \"kubernetes.io/projected/ac704482-c7a4-471c-b3c1-d1fdd7e0eb83-kube-api-access-cwfsv\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.437828 4898 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ac704482-c7a4-471c-b3c1-d1fdd7e0eb83-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.451093 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/ac704482-c7a4-471c-b3c1-d1fdd7e0eb83-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ac704482-c7a4-471c-b3c1-d1fdd7e0eb83" (UID: "ac704482-c7a4-471c-b3c1-d1fdd7e0eb83"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.540276 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac704482-c7a4-471c-b3c1-d1fdd7e0eb83-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.571739 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-87574c74-kqmjb" Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.720051 4898 generic.go:334] "Generic (PLEG): container finished" podID="b38f3681-6f2f-437f-9694-810d43921aa2" containerID="c5557297cdf8c10622794d499e8dd04fb952d7400a53b9ebeaf103b83d901e50" exitCode=0 Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.721669 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z7ldc" event={"ID":"b38f3681-6f2f-437f-9694-810d43921aa2","Type":"ContainerDied","Data":"c5557297cdf8c10622794d499e8dd04fb952d7400a53b9ebeaf103b83d901e50"} Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.742241 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-xq6ss" Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.761062 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-xq6ss" event={"ID":"ac704482-c7a4-471c-b3c1-d1fdd7e0eb83","Type":"ContainerDied","Data":"bafefd0ee86b1967f73e5ee3d1256b9f0f1d84430cc94cf6628cfc6e827e8aad"} Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.761105 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bafefd0ee86b1967f73e5ee3d1256b9f0f1d84430cc94cf6628cfc6e827e8aad" Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.785076 4898 generic.go:334] "Generic (PLEG): container finished" podID="04642207-fab0-47bf-9ac4-030bbe91b4f0" containerID="b87af33e145b04333fd2674b6cd7ae8ffea23dd169547cec058f2a13473c72c5" exitCode=0 Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.785167 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-fbs4f" event={"ID":"04642207-fab0-47bf-9ac4-030bbe91b4f0","Type":"ContainerDied","Data":"b87af33e145b04333fd2674b6cd7ae8ffea23dd169547cec058f2a13473c72c5"} Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.802838 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5bf5d8b7d4-4gwxr" event={"ID":"604a0205-6c18-4bff-929f-038524d62aeb","Type":"ContainerStarted","Data":"974d31c9fc27c07800ab40a1409496611af9fce07ca6cd7d36cd6abbd352b819"} Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.803467 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5bf5d8b7d4-4gwxr" Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.803508 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5bf5d8b7d4-4gwxr" Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.932280 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-5bf5d8b7d4-4gwxr" 
podStartSLOduration=4.932248543 podStartE2EDuration="4.932248543s" podCreationTimestamp="2026-03-13 14:21:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:21:17.892769448 +0000 UTC m=+1512.894357697" watchObservedRunningTime="2026-03-13 14:21:17.932248543 +0000 UTC m=+1512.933836782" Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.950485 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-795749dc8c-sm2hl"] Mar 13 14:21:17 crc kubenswrapper[4898]: E0313 14:21:17.950962 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac704482-c7a4-471c-b3c1-d1fdd7e0eb83" containerName="barbican-db-sync" Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.950978 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac704482-c7a4-471c-b3c1-d1fdd7e0eb83" containerName="barbican-db-sync" Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.951181 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac704482-c7a4-471c-b3c1-d1fdd7e0eb83" containerName="barbican-db-sync" Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.954648 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-795749dc8c-sm2hl" Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.959276 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.959499 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-rkpbr" Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.964662 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.001009 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-795749dc8c-sm2hl"] Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.069385 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b16e588-d353-4100-b143-b84420c42e30-logs\") pod \"barbican-worker-795749dc8c-sm2hl\" (UID: \"8b16e588-d353-4100-b143-b84420c42e30\") " pod="openstack/barbican-worker-795749dc8c-sm2hl" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.069431 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b16e588-d353-4100-b143-b84420c42e30-combined-ca-bundle\") pod \"barbican-worker-795749dc8c-sm2hl\" (UID: \"8b16e588-d353-4100-b143-b84420c42e30\") " pod="openstack/barbican-worker-795749dc8c-sm2hl" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.069601 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8b16e588-d353-4100-b143-b84420c42e30-config-data-custom\") pod \"barbican-worker-795749dc8c-sm2hl\" (UID: \"8b16e588-d353-4100-b143-b84420c42e30\") " pod="openstack/barbican-worker-795749dc8c-sm2hl" Mar 
13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.069625 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b16e588-d353-4100-b143-b84420c42e30-config-data\") pod \"barbican-worker-795749dc8c-sm2hl\" (UID: \"8b16e588-d353-4100-b143-b84420c42e30\") " pod="openstack/barbican-worker-795749dc8c-sm2hl" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.069647 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f995q\" (UniqueName: \"kubernetes.io/projected/8b16e588-d353-4100-b143-b84420c42e30-kube-api-access-f995q\") pod \"barbican-worker-795749dc8c-sm2hl\" (UID: \"8b16e588-d353-4100-b143-b84420c42e30\") " pod="openstack/barbican-worker-795749dc8c-sm2hl" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.090211 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-fcdc98bd8-xdl6x"] Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.092085 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-fcdc98bd8-xdl6x" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.095231 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.111091 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-fcdc98bd8-xdl6x"] Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.130943 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-8d6wc"] Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.133102 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-8d6wc" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.141135 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-8d6wc"] Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.172172 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gk59q\" (UniqueName: \"kubernetes.io/projected/272aa2e8-f1ed-4a08-b5a3-aecd06c4c6d2-kube-api-access-gk59q\") pod \"barbican-keystone-listener-fcdc98bd8-xdl6x\" (UID: \"272aa2e8-f1ed-4a08-b5a3-aecd06c4c6d2\") " pod="openstack/barbican-keystone-listener-fcdc98bd8-xdl6x" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.172236 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b16e588-d353-4100-b143-b84420c42e30-logs\") pod \"barbican-worker-795749dc8c-sm2hl\" (UID: \"8b16e588-d353-4100-b143-b84420c42e30\") " pod="openstack/barbican-worker-795749dc8c-sm2hl" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.172263 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b16e588-d353-4100-b143-b84420c42e30-combined-ca-bundle\") pod \"barbican-worker-795749dc8c-sm2hl\" (UID: \"8b16e588-d353-4100-b143-b84420c42e30\") " pod="openstack/barbican-worker-795749dc8c-sm2hl" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.172278 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/272aa2e8-f1ed-4a08-b5a3-aecd06c4c6d2-logs\") pod \"barbican-keystone-listener-fcdc98bd8-xdl6x\" (UID: \"272aa2e8-f1ed-4a08-b5a3-aecd06c4c6d2\") " pod="openstack/barbican-keystone-listener-fcdc98bd8-xdl6x" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.172299 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/272aa2e8-f1ed-4a08-b5a3-aecd06c4c6d2-config-data-custom\") pod \"barbican-keystone-listener-fcdc98bd8-xdl6x\" (UID: \"272aa2e8-f1ed-4a08-b5a3-aecd06c4c6d2\") " pod="openstack/barbican-keystone-listener-fcdc98bd8-xdl6x" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.172315 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/272aa2e8-f1ed-4a08-b5a3-aecd06c4c6d2-combined-ca-bundle\") pod \"barbican-keystone-listener-fcdc98bd8-xdl6x\" (UID: \"272aa2e8-f1ed-4a08-b5a3-aecd06c4c6d2\") " pod="openstack/barbican-keystone-listener-fcdc98bd8-xdl6x" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.172384 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/272aa2e8-f1ed-4a08-b5a3-aecd06c4c6d2-config-data\") pod \"barbican-keystone-listener-fcdc98bd8-xdl6x\" (UID: \"272aa2e8-f1ed-4a08-b5a3-aecd06c4c6d2\") " pod="openstack/barbican-keystone-listener-fcdc98bd8-xdl6x" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.172462 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8b16e588-d353-4100-b143-b84420c42e30-config-data-custom\") pod \"barbican-worker-795749dc8c-sm2hl\" (UID: \"8b16e588-d353-4100-b143-b84420c42e30\") " pod="openstack/barbican-worker-795749dc8c-sm2hl" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.172484 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b16e588-d353-4100-b143-b84420c42e30-config-data\") pod \"barbican-worker-795749dc8c-sm2hl\" (UID: \"8b16e588-d353-4100-b143-b84420c42e30\") " 
pod="openstack/barbican-worker-795749dc8c-sm2hl" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.172506 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f995q\" (UniqueName: \"kubernetes.io/projected/8b16e588-d353-4100-b143-b84420c42e30-kube-api-access-f995q\") pod \"barbican-worker-795749dc8c-sm2hl\" (UID: \"8b16e588-d353-4100-b143-b84420c42e30\") " pod="openstack/barbican-worker-795749dc8c-sm2hl" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.173215 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b16e588-d353-4100-b143-b84420c42e30-logs\") pod \"barbican-worker-795749dc8c-sm2hl\" (UID: \"8b16e588-d353-4100-b143-b84420c42e30\") " pod="openstack/barbican-worker-795749dc8c-sm2hl" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.181527 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b16e588-d353-4100-b143-b84420c42e30-combined-ca-bundle\") pod \"barbican-worker-795749dc8c-sm2hl\" (UID: \"8b16e588-d353-4100-b143-b84420c42e30\") " pod="openstack/barbican-worker-795749dc8c-sm2hl" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.189555 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8b16e588-d353-4100-b143-b84420c42e30-config-data-custom\") pod \"barbican-worker-795749dc8c-sm2hl\" (UID: \"8b16e588-d353-4100-b143-b84420c42e30\") " pod="openstack/barbican-worker-795749dc8c-sm2hl" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.203603 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b16e588-d353-4100-b143-b84420c42e30-config-data\") pod \"barbican-worker-795749dc8c-sm2hl\" (UID: \"8b16e588-d353-4100-b143-b84420c42e30\") " pod="openstack/barbican-worker-795749dc8c-sm2hl" 
Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.218481 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f995q\" (UniqueName: \"kubernetes.io/projected/8b16e588-d353-4100-b143-b84420c42e30-kube-api-access-f995q\") pod \"barbican-worker-795749dc8c-sm2hl\" (UID: \"8b16e588-d353-4100-b143-b84420c42e30\") " pod="openstack/barbican-worker-795749dc8c-sm2hl" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.276321 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45af301d-29c9-474d-be0d-4d91f6d0cb18-config\") pod \"dnsmasq-dns-85ff748b95-8d6wc\" (UID: \"45af301d-29c9-474d-be0d-4d91f6d0cb18\") " pod="openstack/dnsmasq-dns-85ff748b95-8d6wc" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.276440 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gk59q\" (UniqueName: \"kubernetes.io/projected/272aa2e8-f1ed-4a08-b5a3-aecd06c4c6d2-kube-api-access-gk59q\") pod \"barbican-keystone-listener-fcdc98bd8-xdl6x\" (UID: \"272aa2e8-f1ed-4a08-b5a3-aecd06c4c6d2\") " pod="openstack/barbican-keystone-listener-fcdc98bd8-xdl6x" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.276513 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/272aa2e8-f1ed-4a08-b5a3-aecd06c4c6d2-logs\") pod \"barbican-keystone-listener-fcdc98bd8-xdl6x\" (UID: \"272aa2e8-f1ed-4a08-b5a3-aecd06c4c6d2\") " pod="openstack/barbican-keystone-listener-fcdc98bd8-xdl6x" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.276532 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45af301d-29c9-474d-be0d-4d91f6d0cb18-dns-svc\") pod \"dnsmasq-dns-85ff748b95-8d6wc\" (UID: \"45af301d-29c9-474d-be0d-4d91f6d0cb18\") " 
pod="openstack/dnsmasq-dns-85ff748b95-8d6wc" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.276559 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/272aa2e8-f1ed-4a08-b5a3-aecd06c4c6d2-config-data-custom\") pod \"barbican-keystone-listener-fcdc98bd8-xdl6x\" (UID: \"272aa2e8-f1ed-4a08-b5a3-aecd06c4c6d2\") " pod="openstack/barbican-keystone-listener-fcdc98bd8-xdl6x" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.276575 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/272aa2e8-f1ed-4a08-b5a3-aecd06c4c6d2-combined-ca-bundle\") pod \"barbican-keystone-listener-fcdc98bd8-xdl6x\" (UID: \"272aa2e8-f1ed-4a08-b5a3-aecd06c4c6d2\") " pod="openstack/barbican-keystone-listener-fcdc98bd8-xdl6x" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.276612 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmplv\" (UniqueName: \"kubernetes.io/projected/45af301d-29c9-474d-be0d-4d91f6d0cb18-kube-api-access-hmplv\") pod \"dnsmasq-dns-85ff748b95-8d6wc\" (UID: \"45af301d-29c9-474d-be0d-4d91f6d0cb18\") " pod="openstack/dnsmasq-dns-85ff748b95-8d6wc" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.276658 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/45af301d-29c9-474d-be0d-4d91f6d0cb18-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-8d6wc\" (UID: \"45af301d-29c9-474d-be0d-4d91f6d0cb18\") " pod="openstack/dnsmasq-dns-85ff748b95-8d6wc" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.276680 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/272aa2e8-f1ed-4a08-b5a3-aecd06c4c6d2-config-data\") pod 
\"barbican-keystone-listener-fcdc98bd8-xdl6x\" (UID: \"272aa2e8-f1ed-4a08-b5a3-aecd06c4c6d2\") " pod="openstack/barbican-keystone-listener-fcdc98bd8-xdl6x" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.276719 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/45af301d-29c9-474d-be0d-4d91f6d0cb18-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-8d6wc\" (UID: \"45af301d-29c9-474d-be0d-4d91f6d0cb18\") " pod="openstack/dnsmasq-dns-85ff748b95-8d6wc" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.276745 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/45af301d-29c9-474d-be0d-4d91f6d0cb18-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-8d6wc\" (UID: \"45af301d-29c9-474d-be0d-4d91f6d0cb18\") " pod="openstack/dnsmasq-dns-85ff748b95-8d6wc" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.276851 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/272aa2e8-f1ed-4a08-b5a3-aecd06c4c6d2-logs\") pod \"barbican-keystone-listener-fcdc98bd8-xdl6x\" (UID: \"272aa2e8-f1ed-4a08-b5a3-aecd06c4c6d2\") " pod="openstack/barbican-keystone-listener-fcdc98bd8-xdl6x" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.282191 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-d74d977fd-v5m5s"] Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.286552 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-d74d977fd-v5m5s" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.288449 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/272aa2e8-f1ed-4a08-b5a3-aecd06c4c6d2-config-data-custom\") pod \"barbican-keystone-listener-fcdc98bd8-xdl6x\" (UID: \"272aa2e8-f1ed-4a08-b5a3-aecd06c4c6d2\") " pod="openstack/barbican-keystone-listener-fcdc98bd8-xdl6x" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.297960 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.300434 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-d74d977fd-v5m5s"] Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.312545 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-795749dc8c-sm2hl" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.321277 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/272aa2e8-f1ed-4a08-b5a3-aecd06c4c6d2-combined-ca-bundle\") pod \"barbican-keystone-listener-fcdc98bd8-xdl6x\" (UID: \"272aa2e8-f1ed-4a08-b5a3-aecd06c4c6d2\") " pod="openstack/barbican-keystone-listener-fcdc98bd8-xdl6x" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.321749 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/272aa2e8-f1ed-4a08-b5a3-aecd06c4c6d2-config-data\") pod \"barbican-keystone-listener-fcdc98bd8-xdl6x\" (UID: \"272aa2e8-f1ed-4a08-b5a3-aecd06c4c6d2\") " pod="openstack/barbican-keystone-listener-fcdc98bd8-xdl6x" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.334748 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gk59q\" (UniqueName: 
\"kubernetes.io/projected/272aa2e8-f1ed-4a08-b5a3-aecd06c4c6d2-kube-api-access-gk59q\") pod \"barbican-keystone-listener-fcdc98bd8-xdl6x\" (UID: \"272aa2e8-f1ed-4a08-b5a3-aecd06c4c6d2\") " pod="openstack/barbican-keystone-listener-fcdc98bd8-xdl6x" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.383932 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/45af301d-29c9-474d-be0d-4d91f6d0cb18-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-8d6wc\" (UID: \"45af301d-29c9-474d-be0d-4d91f6d0cb18\") " pod="openstack/dnsmasq-dns-85ff748b95-8d6wc" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.383994 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/45af301d-29c9-474d-be0d-4d91f6d0cb18-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-8d6wc\" (UID: \"45af301d-29c9-474d-be0d-4d91f6d0cb18\") " pod="openstack/dnsmasq-dns-85ff748b95-8d6wc" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.384063 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7dcea9de-db8a-42dd-958c-59df43a49ff3-config-data-custom\") pod \"barbican-api-d74d977fd-v5m5s\" (UID: \"7dcea9de-db8a-42dd-958c-59df43a49ff3\") " pod="openstack/barbican-api-d74d977fd-v5m5s" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.384086 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45af301d-29c9-474d-be0d-4d91f6d0cb18-config\") pod \"dnsmasq-dns-85ff748b95-8d6wc\" (UID: \"45af301d-29c9-474d-be0d-4d91f6d0cb18\") " pod="openstack/dnsmasq-dns-85ff748b95-8d6wc" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.384107 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dcea9de-db8a-42dd-958c-59df43a49ff3-combined-ca-bundle\") pod \"barbican-api-d74d977fd-v5m5s\" (UID: \"7dcea9de-db8a-42dd-958c-59df43a49ff3\") " pod="openstack/barbican-api-d74d977fd-v5m5s" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.384173 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45af301d-29c9-474d-be0d-4d91f6d0cb18-dns-svc\") pod \"dnsmasq-dns-85ff748b95-8d6wc\" (UID: \"45af301d-29c9-474d-be0d-4d91f6d0cb18\") " pod="openstack/dnsmasq-dns-85ff748b95-8d6wc" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.384205 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmplv\" (UniqueName: \"kubernetes.io/projected/45af301d-29c9-474d-be0d-4d91f6d0cb18-kube-api-access-hmplv\") pod \"dnsmasq-dns-85ff748b95-8d6wc\" (UID: \"45af301d-29c9-474d-be0d-4d91f6d0cb18\") " pod="openstack/dnsmasq-dns-85ff748b95-8d6wc" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.384250 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/45af301d-29c9-474d-be0d-4d91f6d0cb18-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-8d6wc\" (UID: \"45af301d-29c9-474d-be0d-4d91f6d0cb18\") " pod="openstack/dnsmasq-dns-85ff748b95-8d6wc" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.384269 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dcea9de-db8a-42dd-958c-59df43a49ff3-config-data\") pod \"barbican-api-d74d977fd-v5m5s\" (UID: \"7dcea9de-db8a-42dd-958c-59df43a49ff3\") " pod="openstack/barbican-api-d74d977fd-v5m5s" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.384288 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-vt6km\" (UniqueName: \"kubernetes.io/projected/7dcea9de-db8a-42dd-958c-59df43a49ff3-kube-api-access-vt6km\") pod \"barbican-api-d74d977fd-v5m5s\" (UID: \"7dcea9de-db8a-42dd-958c-59df43a49ff3\") " pod="openstack/barbican-api-d74d977fd-v5m5s" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.384321 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7dcea9de-db8a-42dd-958c-59df43a49ff3-logs\") pod \"barbican-api-d74d977fd-v5m5s\" (UID: \"7dcea9de-db8a-42dd-958c-59df43a49ff3\") " pod="openstack/barbican-api-d74d977fd-v5m5s" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.385289 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45af301d-29c9-474d-be0d-4d91f6d0cb18-config\") pod \"dnsmasq-dns-85ff748b95-8d6wc\" (UID: \"45af301d-29c9-474d-be0d-4d91f6d0cb18\") " pod="openstack/dnsmasq-dns-85ff748b95-8d6wc" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.385562 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/45af301d-29c9-474d-be0d-4d91f6d0cb18-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-8d6wc\" (UID: \"45af301d-29c9-474d-be0d-4d91f6d0cb18\") " pod="openstack/dnsmasq-dns-85ff748b95-8d6wc" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.385859 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45af301d-29c9-474d-be0d-4d91f6d0cb18-dns-svc\") pod \"dnsmasq-dns-85ff748b95-8d6wc\" (UID: \"45af301d-29c9-474d-be0d-4d91f6d0cb18\") " pod="openstack/dnsmasq-dns-85ff748b95-8d6wc" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.386424 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/45af301d-29c9-474d-be0d-4d91f6d0cb18-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-8d6wc\" (UID: \"45af301d-29c9-474d-be0d-4d91f6d0cb18\") " pod="openstack/dnsmasq-dns-85ff748b95-8d6wc" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.388253 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/45af301d-29c9-474d-be0d-4d91f6d0cb18-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-8d6wc\" (UID: \"45af301d-29c9-474d-be0d-4d91f6d0cb18\") " pod="openstack/dnsmasq-dns-85ff748b95-8d6wc" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.410566 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmplv\" (UniqueName: \"kubernetes.io/projected/45af301d-29c9-474d-be0d-4d91f6d0cb18-kube-api-access-hmplv\") pod \"dnsmasq-dns-85ff748b95-8d6wc\" (UID: \"45af301d-29c9-474d-be0d-4d91f6d0cb18\") " pod="openstack/dnsmasq-dns-85ff748b95-8d6wc" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.425277 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-87574c74-kqmjb"] Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.433783 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-fcdc98bd8-xdl6x" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.467748 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-8d6wc" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.488389 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dcea9de-db8a-42dd-958c-59df43a49ff3-config-data\") pod \"barbican-api-d74d977fd-v5m5s\" (UID: \"7dcea9de-db8a-42dd-958c-59df43a49ff3\") " pod="openstack/barbican-api-d74d977fd-v5m5s" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.488443 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vt6km\" (UniqueName: \"kubernetes.io/projected/7dcea9de-db8a-42dd-958c-59df43a49ff3-kube-api-access-vt6km\") pod \"barbican-api-d74d977fd-v5m5s\" (UID: \"7dcea9de-db8a-42dd-958c-59df43a49ff3\") " pod="openstack/barbican-api-d74d977fd-v5m5s" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.488585 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7dcea9de-db8a-42dd-958c-59df43a49ff3-logs\") pod \"barbican-api-d74d977fd-v5m5s\" (UID: \"7dcea9de-db8a-42dd-958c-59df43a49ff3\") " pod="openstack/barbican-api-d74d977fd-v5m5s" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.488825 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7dcea9de-db8a-42dd-958c-59df43a49ff3-config-data-custom\") pod \"barbican-api-d74d977fd-v5m5s\" (UID: \"7dcea9de-db8a-42dd-958c-59df43a49ff3\") " pod="openstack/barbican-api-d74d977fd-v5m5s" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.489030 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dcea9de-db8a-42dd-958c-59df43a49ff3-combined-ca-bundle\") pod \"barbican-api-d74d977fd-v5m5s\" (UID: \"7dcea9de-db8a-42dd-958c-59df43a49ff3\") " 
pod="openstack/barbican-api-d74d977fd-v5m5s" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.489321 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7dcea9de-db8a-42dd-958c-59df43a49ff3-logs\") pod \"barbican-api-d74d977fd-v5m5s\" (UID: \"7dcea9de-db8a-42dd-958c-59df43a49ff3\") " pod="openstack/barbican-api-d74d977fd-v5m5s" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.493141 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dcea9de-db8a-42dd-958c-59df43a49ff3-config-data\") pod \"barbican-api-d74d977fd-v5m5s\" (UID: \"7dcea9de-db8a-42dd-958c-59df43a49ff3\") " pod="openstack/barbican-api-d74d977fd-v5m5s" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.494691 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dcea9de-db8a-42dd-958c-59df43a49ff3-combined-ca-bundle\") pod \"barbican-api-d74d977fd-v5m5s\" (UID: \"7dcea9de-db8a-42dd-958c-59df43a49ff3\") " pod="openstack/barbican-api-d74d977fd-v5m5s" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.495181 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7dcea9de-db8a-42dd-958c-59df43a49ff3-config-data-custom\") pod \"barbican-api-d74d977fd-v5m5s\" (UID: \"7dcea9de-db8a-42dd-958c-59df43a49ff3\") " pod="openstack/barbican-api-d74d977fd-v5m5s" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.506693 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vt6km\" (UniqueName: \"kubernetes.io/projected/7dcea9de-db8a-42dd-958c-59df43a49ff3-kube-api-access-vt6km\") pod \"barbican-api-d74d977fd-v5m5s\" (UID: \"7dcea9de-db8a-42dd-958c-59df43a49ff3\") " pod="openstack/barbican-api-d74d977fd-v5m5s" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 
14:21:18.514520 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-fbs4f" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.590068 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04642207-fab0-47bf-9ac4-030bbe91b4f0-dns-svc\") pod \"04642207-fab0-47bf-9ac4-030bbe91b4f0\" (UID: \"04642207-fab0-47bf-9ac4-030bbe91b4f0\") " Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.590286 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqg7r\" (UniqueName: \"kubernetes.io/projected/04642207-fab0-47bf-9ac4-030bbe91b4f0-kube-api-access-cqg7r\") pod \"04642207-fab0-47bf-9ac4-030bbe91b4f0\" (UID: \"04642207-fab0-47bf-9ac4-030bbe91b4f0\") " Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.590334 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04642207-fab0-47bf-9ac4-030bbe91b4f0-ovsdbserver-sb\") pod \"04642207-fab0-47bf-9ac4-030bbe91b4f0\" (UID: \"04642207-fab0-47bf-9ac4-030bbe91b4f0\") " Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.590383 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/04642207-fab0-47bf-9ac4-030bbe91b4f0-dns-swift-storage-0\") pod \"04642207-fab0-47bf-9ac4-030bbe91b4f0\" (UID: \"04642207-fab0-47bf-9ac4-030bbe91b4f0\") " Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.590433 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04642207-fab0-47bf-9ac4-030bbe91b4f0-ovsdbserver-nb\") pod \"04642207-fab0-47bf-9ac4-030bbe91b4f0\" (UID: \"04642207-fab0-47bf-9ac4-030bbe91b4f0\") " Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.590480 4898 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04642207-fab0-47bf-9ac4-030bbe91b4f0-config\") pod \"04642207-fab0-47bf-9ac4-030bbe91b4f0\" (UID: \"04642207-fab0-47bf-9ac4-030bbe91b4f0\") " Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.603102 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04642207-fab0-47bf-9ac4-030bbe91b4f0-kube-api-access-cqg7r" (OuterVolumeSpecName: "kube-api-access-cqg7r") pod "04642207-fab0-47bf-9ac4-030bbe91b4f0" (UID: "04642207-fab0-47bf-9ac4-030bbe91b4f0"). InnerVolumeSpecName "kube-api-access-cqg7r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.693340 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqg7r\" (UniqueName: \"kubernetes.io/projected/04642207-fab0-47bf-9ac4-030bbe91b4f0-kube-api-access-cqg7r\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.702043 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04642207-fab0-47bf-9ac4-030bbe91b4f0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "04642207-fab0-47bf-9ac4-030bbe91b4f0" (UID: "04642207-fab0-47bf-9ac4-030bbe91b4f0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.703373 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-d74d977fd-v5m5s" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.716313 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04642207-fab0-47bf-9ac4-030bbe91b4f0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "04642207-fab0-47bf-9ac4-030bbe91b4f0" (UID: "04642207-fab0-47bf-9ac4-030bbe91b4f0"). 
InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.764965 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04642207-fab0-47bf-9ac4-030bbe91b4f0-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "04642207-fab0-47bf-9ac4-030bbe91b4f0" (UID: "04642207-fab0-47bf-9ac4-030bbe91b4f0"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.783888 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04642207-fab0-47bf-9ac4-030bbe91b4f0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "04642207-fab0-47bf-9ac4-030bbe91b4f0" (UID: "04642207-fab0-47bf-9ac4-030bbe91b4f0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.795386 4898 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04642207-fab0-47bf-9ac4-030bbe91b4f0-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.795416 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04642207-fab0-47bf-9ac4-030bbe91b4f0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.795428 4898 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/04642207-fab0-47bf-9ac4-030bbe91b4f0-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.795437 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/04642207-fab0-47bf-9ac4-030bbe91b4f0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.809403 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04642207-fab0-47bf-9ac4-030bbe91b4f0-config" (OuterVolumeSpecName: "config") pod "04642207-fab0-47bf-9ac4-030bbe91b4f0" (UID: "04642207-fab0-47bf-9ac4-030bbe91b4f0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.848108 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-fbs4f" event={"ID":"04642207-fab0-47bf-9ac4-030bbe91b4f0","Type":"ContainerDied","Data":"1f8aa6f56c769252ca6f9fa29c34832b03b0bd31e4320e8506b18e27d01c86a7"} Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.848173 4898 scope.go:117] "RemoveContainer" containerID="b87af33e145b04333fd2674b6cd7ae8ffea23dd169547cec058f2a13473c72c5" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.848339 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-fbs4f" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.860560 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-87574c74-kqmjb" event={"ID":"d149c7e3-df46-44b5-8a66-8a0fbb5a8554","Type":"ContainerStarted","Data":"324d55849ecdcac9eb7e6f876adb4165c9370aa4cdec80c831e30064c20235df"} Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.898639 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04642207-fab0-47bf-9ac4-030bbe91b4f0-config\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.933352 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-795749dc8c-sm2hl"] Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.946101 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-fbs4f"] Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.956196 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-fbs4f"] Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.962427 4898 scope.go:117] "RemoveContainer" containerID="a566cd3c2104d4cf2d4de94c8ef5556830e27cbf9385397fa0c4b4a48c1b947c" Mar 13 14:21:19 crc kubenswrapper[4898]: I0313 14:21:19.134475 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 14:21:19 crc kubenswrapper[4898]: I0313 14:21:19.134738 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": 
dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 14:21:19 crc kubenswrapper[4898]: I0313 14:21:19.246690 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-fcdc98bd8-xdl6x"] Mar 13 14:21:19 crc kubenswrapper[4898]: W0313 14:21:19.304793 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod272aa2e8_f1ed_4a08_b5a3_aecd06c4c6d2.slice/crio-c487f36110bc4a44edbb608c4532dc8ca0f578132086cb593f98d555564c06a7 WatchSource:0}: Error finding container c487f36110bc4a44edbb608c4532dc8ca0f578132086cb593f98d555564c06a7: Status 404 returned error can't find the container with id c487f36110bc4a44edbb608c4532dc8ca0f578132086cb593f98d555564c06a7 Mar 13 14:21:19 crc kubenswrapper[4898]: I0313 14:21:19.340750 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-8d6wc"] Mar 13 14:21:19 crc kubenswrapper[4898]: W0313 14:21:19.378151 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45af301d_29c9_474d_be0d_4d91f6d0cb18.slice/crio-54b39114a6c35a297bd4da40e16643230d03c158b2784fe677b7a7dcda81e6ec WatchSource:0}: Error finding container 54b39114a6c35a297bd4da40e16643230d03c158b2784fe677b7a7dcda81e6ec: Status 404 returned error can't find the container with id 54b39114a6c35a297bd4da40e16643230d03c158b2784fe677b7a7dcda81e6ec Mar 13 14:21:19 crc kubenswrapper[4898]: I0313 14:21:19.590479 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-d74d977fd-v5m5s"] Mar 13 14:21:19 crc kubenswrapper[4898]: W0313 14:21:19.605744 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7dcea9de_db8a_42dd_958c_59df43a49ff3.slice/crio-704f9e3ccfefb3b8a00bd2333c1274e405dc0d43a2c31a48365c27cab56dfc29 WatchSource:0}: Error finding container 
704f9e3ccfefb3b8a00bd2333c1274e405dc0d43a2c31a48365c27cab56dfc29: Status 404 returned error can't find the container with id 704f9e3ccfefb3b8a00bd2333c1274e405dc0d43a2c31a48365c27cab56dfc29 Mar 13 14:21:19 crc kubenswrapper[4898]: I0313 14:21:19.761729 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04642207-fab0-47bf-9ac4-030bbe91b4f0" path="/var/lib/kubelet/pods/04642207-fab0-47bf-9ac4-030bbe91b4f0/volumes" Mar 13 14:21:19 crc kubenswrapper[4898]: I0313 14:21:19.888609 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-795749dc8c-sm2hl" event={"ID":"8b16e588-d353-4100-b143-b84420c42e30","Type":"ContainerStarted","Data":"84f09917c0d799d211cb85a17e721fb1eb7ed97fe7a2778a5684cc54c1a89bb8"} Mar 13 14:21:19 crc kubenswrapper[4898]: I0313 14:21:19.897742 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d74d977fd-v5m5s" event={"ID":"7dcea9de-db8a-42dd-958c-59df43a49ff3","Type":"ContainerStarted","Data":"704f9e3ccfefb3b8a00bd2333c1274e405dc0d43a2c31a48365c27cab56dfc29"} Mar 13 14:21:19 crc kubenswrapper[4898]: I0313 14:21:19.900133 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-8d6wc" event={"ID":"45af301d-29c9-474d-be0d-4d91f6d0cb18","Type":"ContainerStarted","Data":"54b39114a6c35a297bd4da40e16643230d03c158b2784fe677b7a7dcda81e6ec"} Mar 13 14:21:19 crc kubenswrapper[4898]: I0313 14:21:19.903945 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-fcdc98bd8-xdl6x" event={"ID":"272aa2e8-f1ed-4a08-b5a3-aecd06c4c6d2","Type":"ContainerStarted","Data":"c487f36110bc4a44edbb608c4532dc8ca0f578132086cb593f98d555564c06a7"} Mar 13 14:21:19 crc kubenswrapper[4898]: I0313 14:21:19.908210 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-87574c74-kqmjb" 
event={"ID":"d149c7e3-df46-44b5-8a66-8a0fbb5a8554","Type":"ContainerStarted","Data":"f93b65d81f70cb8f8e5f58a2832c1ab7b3a3ee434167e6ca4438f287c2a4218e"} Mar 13 14:21:19 crc kubenswrapper[4898]: I0313 14:21:19.908503 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-87574c74-kqmjb" Mar 13 14:21:19 crc kubenswrapper[4898]: I0313 14:21:19.936236 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-87574c74-kqmjb" podStartSLOduration=2.936214345 podStartE2EDuration="2.936214345s" podCreationTimestamp="2026-03-13 14:21:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:21:19.925188019 +0000 UTC m=+1514.926776258" watchObservedRunningTime="2026-03-13 14:21:19.936214345 +0000 UTC m=+1514.937802584" Mar 13 14:21:21 crc kubenswrapper[4898]: I0313 14:21:21.168060 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-b9dc95d4b-bvhlz"] Mar 13 14:21:21 crc kubenswrapper[4898]: E0313 14:21:21.169307 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04642207-fab0-47bf-9ac4-030bbe91b4f0" containerName="init" Mar 13 14:21:21 crc kubenswrapper[4898]: I0313 14:21:21.169329 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="04642207-fab0-47bf-9ac4-030bbe91b4f0" containerName="init" Mar 13 14:21:21 crc kubenswrapper[4898]: E0313 14:21:21.169385 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04642207-fab0-47bf-9ac4-030bbe91b4f0" containerName="dnsmasq-dns" Mar 13 14:21:21 crc kubenswrapper[4898]: I0313 14:21:21.169393 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="04642207-fab0-47bf-9ac4-030bbe91b4f0" containerName="dnsmasq-dns" Mar 13 14:21:21 crc kubenswrapper[4898]: I0313 14:21:21.169649 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="04642207-fab0-47bf-9ac4-030bbe91b4f0" 
containerName="dnsmasq-dns" Mar 13 14:21:21 crc kubenswrapper[4898]: I0313 14:21:21.171361 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-b9dc95d4b-bvhlz" Mar 13 14:21:21 crc kubenswrapper[4898]: I0313 14:21:21.173882 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Mar 13 14:21:21 crc kubenswrapper[4898]: I0313 14:21:21.174987 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Mar 13 14:21:21 crc kubenswrapper[4898]: I0313 14:21:21.194069 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-b9dc95d4b-bvhlz"] Mar 13 14:21:21 crc kubenswrapper[4898]: I0313 14:21:21.268346 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd4bf680-c8b7-4721-9595-9a8ed40410d2-logs\") pod \"barbican-api-b9dc95d4b-bvhlz\" (UID: \"fd4bf680-c8b7-4721-9595-9a8ed40410d2\") " pod="openstack/barbican-api-b9dc95d4b-bvhlz" Mar 13 14:21:21 crc kubenswrapper[4898]: I0313 14:21:21.268404 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hn49\" (UniqueName: \"kubernetes.io/projected/fd4bf680-c8b7-4721-9595-9a8ed40410d2-kube-api-access-5hn49\") pod \"barbican-api-b9dc95d4b-bvhlz\" (UID: \"fd4bf680-c8b7-4721-9595-9a8ed40410d2\") " pod="openstack/barbican-api-b9dc95d4b-bvhlz" Mar 13 14:21:21 crc kubenswrapper[4898]: I0313 14:21:21.268464 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd4bf680-c8b7-4721-9595-9a8ed40410d2-internal-tls-certs\") pod \"barbican-api-b9dc95d4b-bvhlz\" (UID: \"fd4bf680-c8b7-4721-9595-9a8ed40410d2\") " pod="openstack/barbican-api-b9dc95d4b-bvhlz" Mar 13 14:21:21 crc kubenswrapper[4898]: I0313 
14:21:21.268495 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd4bf680-c8b7-4721-9595-9a8ed40410d2-public-tls-certs\") pod \"barbican-api-b9dc95d4b-bvhlz\" (UID: \"fd4bf680-c8b7-4721-9595-9a8ed40410d2\") " pod="openstack/barbican-api-b9dc95d4b-bvhlz" Mar 13 14:21:21 crc kubenswrapper[4898]: I0313 14:21:21.268570 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd4bf680-c8b7-4721-9595-9a8ed40410d2-config-data\") pod \"barbican-api-b9dc95d4b-bvhlz\" (UID: \"fd4bf680-c8b7-4721-9595-9a8ed40410d2\") " pod="openstack/barbican-api-b9dc95d4b-bvhlz" Mar 13 14:21:21 crc kubenswrapper[4898]: I0313 14:21:21.268610 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fd4bf680-c8b7-4721-9595-9a8ed40410d2-config-data-custom\") pod \"barbican-api-b9dc95d4b-bvhlz\" (UID: \"fd4bf680-c8b7-4721-9595-9a8ed40410d2\") " pod="openstack/barbican-api-b9dc95d4b-bvhlz" Mar 13 14:21:21 crc kubenswrapper[4898]: I0313 14:21:21.268748 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd4bf680-c8b7-4721-9595-9a8ed40410d2-combined-ca-bundle\") pod \"barbican-api-b9dc95d4b-bvhlz\" (UID: \"fd4bf680-c8b7-4721-9595-9a8ed40410d2\") " pod="openstack/barbican-api-b9dc95d4b-bvhlz" Mar 13 14:21:21 crc kubenswrapper[4898]: I0313 14:21:21.370590 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hn49\" (UniqueName: \"kubernetes.io/projected/fd4bf680-c8b7-4721-9595-9a8ed40410d2-kube-api-access-5hn49\") pod \"barbican-api-b9dc95d4b-bvhlz\" (UID: \"fd4bf680-c8b7-4721-9595-9a8ed40410d2\") " pod="openstack/barbican-api-b9dc95d4b-bvhlz" Mar 13 
14:21:21 crc kubenswrapper[4898]: I0313 14:21:21.370646 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd4bf680-c8b7-4721-9595-9a8ed40410d2-internal-tls-certs\") pod \"barbican-api-b9dc95d4b-bvhlz\" (UID: \"fd4bf680-c8b7-4721-9595-9a8ed40410d2\") " pod="openstack/barbican-api-b9dc95d4b-bvhlz" Mar 13 14:21:21 crc kubenswrapper[4898]: I0313 14:21:21.370673 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd4bf680-c8b7-4721-9595-9a8ed40410d2-public-tls-certs\") pod \"barbican-api-b9dc95d4b-bvhlz\" (UID: \"fd4bf680-c8b7-4721-9595-9a8ed40410d2\") " pod="openstack/barbican-api-b9dc95d4b-bvhlz" Mar 13 14:21:21 crc kubenswrapper[4898]: I0313 14:21:21.370752 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd4bf680-c8b7-4721-9595-9a8ed40410d2-config-data\") pod \"barbican-api-b9dc95d4b-bvhlz\" (UID: \"fd4bf680-c8b7-4721-9595-9a8ed40410d2\") " pod="openstack/barbican-api-b9dc95d4b-bvhlz" Mar 13 14:21:21 crc kubenswrapper[4898]: I0313 14:21:21.370814 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fd4bf680-c8b7-4721-9595-9a8ed40410d2-config-data-custom\") pod \"barbican-api-b9dc95d4b-bvhlz\" (UID: \"fd4bf680-c8b7-4721-9595-9a8ed40410d2\") " pod="openstack/barbican-api-b9dc95d4b-bvhlz" Mar 13 14:21:21 crc kubenswrapper[4898]: I0313 14:21:21.370947 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd4bf680-c8b7-4721-9595-9a8ed40410d2-combined-ca-bundle\") pod \"barbican-api-b9dc95d4b-bvhlz\" (UID: \"fd4bf680-c8b7-4721-9595-9a8ed40410d2\") " pod="openstack/barbican-api-b9dc95d4b-bvhlz" Mar 13 14:21:21 crc kubenswrapper[4898]: I0313 
14:21:21.371046 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd4bf680-c8b7-4721-9595-9a8ed40410d2-logs\") pod \"barbican-api-b9dc95d4b-bvhlz\" (UID: \"fd4bf680-c8b7-4721-9595-9a8ed40410d2\") " pod="openstack/barbican-api-b9dc95d4b-bvhlz" Mar 13 14:21:21 crc kubenswrapper[4898]: I0313 14:21:21.371635 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd4bf680-c8b7-4721-9595-9a8ed40410d2-logs\") pod \"barbican-api-b9dc95d4b-bvhlz\" (UID: \"fd4bf680-c8b7-4721-9595-9a8ed40410d2\") " pod="openstack/barbican-api-b9dc95d4b-bvhlz" Mar 13 14:21:21 crc kubenswrapper[4898]: I0313 14:21:21.377300 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd4bf680-c8b7-4721-9595-9a8ed40410d2-combined-ca-bundle\") pod \"barbican-api-b9dc95d4b-bvhlz\" (UID: \"fd4bf680-c8b7-4721-9595-9a8ed40410d2\") " pod="openstack/barbican-api-b9dc95d4b-bvhlz" Mar 13 14:21:21 crc kubenswrapper[4898]: I0313 14:21:21.377600 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd4bf680-c8b7-4721-9595-9a8ed40410d2-public-tls-certs\") pod \"barbican-api-b9dc95d4b-bvhlz\" (UID: \"fd4bf680-c8b7-4721-9595-9a8ed40410d2\") " pod="openstack/barbican-api-b9dc95d4b-bvhlz" Mar 13 14:21:21 crc kubenswrapper[4898]: I0313 14:21:21.377634 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd4bf680-c8b7-4721-9595-9a8ed40410d2-config-data\") pod \"barbican-api-b9dc95d4b-bvhlz\" (UID: \"fd4bf680-c8b7-4721-9595-9a8ed40410d2\") " pod="openstack/barbican-api-b9dc95d4b-bvhlz" Mar 13 14:21:21 crc kubenswrapper[4898]: I0313 14:21:21.378047 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/fd4bf680-c8b7-4721-9595-9a8ed40410d2-internal-tls-certs\") pod \"barbican-api-b9dc95d4b-bvhlz\" (UID: \"fd4bf680-c8b7-4721-9595-9a8ed40410d2\") " pod="openstack/barbican-api-b9dc95d4b-bvhlz" Mar 13 14:21:21 crc kubenswrapper[4898]: I0313 14:21:21.379798 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fd4bf680-c8b7-4721-9595-9a8ed40410d2-config-data-custom\") pod \"barbican-api-b9dc95d4b-bvhlz\" (UID: \"fd4bf680-c8b7-4721-9595-9a8ed40410d2\") " pod="openstack/barbican-api-b9dc95d4b-bvhlz" Mar 13 14:21:21 crc kubenswrapper[4898]: I0313 14:21:21.393278 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hn49\" (UniqueName: \"kubernetes.io/projected/fd4bf680-c8b7-4721-9595-9a8ed40410d2-kube-api-access-5hn49\") pod \"barbican-api-b9dc95d4b-bvhlz\" (UID: \"fd4bf680-c8b7-4721-9595-9a8ed40410d2\") " pod="openstack/barbican-api-b9dc95d4b-bvhlz" Mar 13 14:21:21 crc kubenswrapper[4898]: I0313 14:21:21.494250 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-b9dc95d4b-bvhlz" Mar 13 14:21:22 crc kubenswrapper[4898]: I0313 14:21:22.962051 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-b9dc95d4b-bvhlz"] Mar 13 14:21:22 crc kubenswrapper[4898]: I0313 14:21:22.967181 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d74d977fd-v5m5s" event={"ID":"7dcea9de-db8a-42dd-958c-59df43a49ff3","Type":"ContainerStarted","Data":"8d6bd5023b5a1087811735b70ac1c3323bdc9e802f224e7ede20161093a84221"} Mar 13 14:21:22 crc kubenswrapper[4898]: I0313 14:21:22.969235 4898 generic.go:334] "Generic (PLEG): container finished" podID="45af301d-29c9-474d-be0d-4d91f6d0cb18" containerID="686c9d5260a554140660fb899d995f27d4b2bd420d76c710ada3057e3122cfaf" exitCode=0 Mar 13 14:21:22 crc kubenswrapper[4898]: I0313 14:21:22.969287 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-8d6wc" event={"ID":"45af301d-29c9-474d-be0d-4d91f6d0cb18","Type":"ContainerDied","Data":"686c9d5260a554140660fb899d995f27d4b2bd420d76c710ada3057e3122cfaf"} Mar 13 14:21:25 crc kubenswrapper[4898]: I0313 14:21:25.231827 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 13 14:21:25 crc kubenswrapper[4898]: I0313 14:21:25.234805 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 13 14:21:25 crc kubenswrapper[4898]: I0313 14:21:25.243251 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 13 14:21:25 crc kubenswrapper[4898]: I0313 14:21:25.255471 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 13 14:21:28 crc kubenswrapper[4898]: I0313 14:21:28.036872 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-z7ldc" event={"ID":"b38f3681-6f2f-437f-9694-810d43921aa2","Type":"ContainerStarted","Data":"3cb8ede7b2e1e9a6a6b4976b023c84903fe921e5d4e530d62aef69fe59b03a0a"} Mar 13 14:21:28 crc kubenswrapper[4898]: I0313 14:21:28.067941 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-z7ldc" podStartSLOduration=8.270604717 podStartE2EDuration="19.06792003s" podCreationTimestamp="2026-03-13 14:21:09 +0000 UTC" firstStartedPulling="2026-03-13 14:21:11.620929625 +0000 UTC m=+1506.622517864" lastFinishedPulling="2026-03-13 14:21:22.418244948 +0000 UTC m=+1517.419833177" observedRunningTime="2026-03-13 14:21:28.0636582 +0000 UTC m=+1523.065246459" watchObservedRunningTime="2026-03-13 14:21:28.06792003 +0000 UTC m=+1523.069508269" Mar 13 14:21:28 crc kubenswrapper[4898]: W0313 14:21:28.640954 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd4bf680_c8b7_4721_9595_9a8ed40410d2.slice/crio-19f1a84b67106ffd000b350d027a716a224702ac470d9c975f0fe16d4891e019 WatchSource:0}: Error finding container 19f1a84b67106ffd000b350d027a716a224702ac470d9c975f0fe16d4891e019: Status 404 returned error can't find the container with id 19f1a84b67106ffd000b350d027a716a224702ac470d9c975f0fe16d4891e019 Mar 13 14:21:29 crc kubenswrapper[4898]: I0313 14:21:29.053226 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-b9dc95d4b-bvhlz" event={"ID":"fd4bf680-c8b7-4721-9595-9a8ed40410d2","Type":"ContainerStarted","Data":"19f1a84b67106ffd000b350d027a716a224702ac470d9c975f0fe16d4891e019"} Mar 13 14:21:29 crc kubenswrapper[4898]: I0313 14:21:29.522067 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-z7ldc" Mar 13 14:21:29 crc kubenswrapper[4898]: I0313 14:21:29.522426 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-operators-z7ldc" Mar 13 14:21:30 crc kubenswrapper[4898]: I0313 14:21:30.068915 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d74d977fd-v5m5s" event={"ID":"7dcea9de-db8a-42dd-958c-59df43a49ff3","Type":"ContainerStarted","Data":"e2a10450307b6355906b72fd4b0a882c5720f8e92bcad91beb1384ffe656972d"} Mar 13 14:21:30 crc kubenswrapper[4898]: I0313 14:21:30.098455 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-d74d977fd-v5m5s" podStartSLOduration=12.098435932 podStartE2EDuration="12.098435932s" podCreationTimestamp="2026-03-13 14:21:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:21:30.093451213 +0000 UTC m=+1525.095039462" watchObservedRunningTime="2026-03-13 14:21:30.098435932 +0000 UTC m=+1525.100024171" Mar 13 14:21:30 crc kubenswrapper[4898]: E0313 14:21:30.449418 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="247749ae-204b-4e9c-ad1c-f5d924b6f211" Mar 13 14:21:30 crc kubenswrapper[4898]: I0313 14:21:30.598882 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-z7ldc" podUID="b38f3681-6f2f-437f-9694-810d43921aa2" containerName="registry-server" probeResult="failure" output=< Mar 13 14:21:30 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 14:21:30 crc kubenswrapper[4898]: > Mar 13 14:21:31 crc kubenswrapper[4898]: I0313 14:21:31.082989 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-b9dc95d4b-bvhlz" 
event={"ID":"fd4bf680-c8b7-4721-9595-9a8ed40410d2","Type":"ContainerStarted","Data":"c626c10cd8100c4d3677f5d89f5e1aa068944a1acb09f9ce367478b3bbe23a6d"} Mar 13 14:21:31 crc kubenswrapper[4898]: I0313 14:21:31.083529 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-b9dc95d4b-bvhlz" event={"ID":"fd4bf680-c8b7-4721-9595-9a8ed40410d2","Type":"ContainerStarted","Data":"a7d90cf8e037e0421e0a6825fa627d8295d1c48cda92e5ed34783a1fd67ad1e5"} Mar 13 14:21:31 crc kubenswrapper[4898]: I0313 14:21:31.086015 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-b9dc95d4b-bvhlz" Mar 13 14:21:31 crc kubenswrapper[4898]: I0313 14:21:31.086054 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-b9dc95d4b-bvhlz" Mar 13 14:21:31 crc kubenswrapper[4898]: I0313 14:21:31.089344 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-795749dc8c-sm2hl" event={"ID":"8b16e588-d353-4100-b143-b84420c42e30","Type":"ContainerStarted","Data":"d22f944ac54c3ae0fe8ec3e0ace87927fe4fb2405e1116807ca8c64712a38e97"} Mar 13 14:21:31 crc kubenswrapper[4898]: I0313 14:21:31.089423 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-795749dc8c-sm2hl" event={"ID":"8b16e588-d353-4100-b143-b84420c42e30","Type":"ContainerStarted","Data":"e2eac0cc0a49aecc401f83d678bdc6912eb6f763fddf368e17e836b67cdbc37a"} Mar 13 14:21:31 crc kubenswrapper[4898]: I0313 14:21:31.092994 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-8d6wc" event={"ID":"45af301d-29c9-474d-be0d-4d91f6d0cb18","Type":"ContainerStarted","Data":"6584acdbfa3b269b10be5eacbee652dc5b87853d5dd4647683e70850466d55d5"} Mar 13 14:21:31 crc kubenswrapper[4898]: I0313 14:21:31.093168 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-85ff748b95-8d6wc" Mar 13 14:21:31 crc kubenswrapper[4898]: I0313 
14:21:31.096069 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-zgt75" event={"ID":"84a7fd24-4320-4c0e-8ded-0d455252a549","Type":"ContainerStarted","Data":"213db1fb491a2ed6d8dc3d15759978456b725ffef29e1be38661bf279db1daf8"} Mar 13 14:21:31 crc kubenswrapper[4898]: I0313 14:21:31.098829 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-ztp6c" event={"ID":"193b05da-acb9-4512-a2ae-6c03450e6f05","Type":"ContainerStarted","Data":"c911be7c2d6d8f32598481ced3a29ce9fc65efe653b0696b937918e79b814d51"} Mar 13 14:21:31 crc kubenswrapper[4898]: I0313 14:21:31.101543 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-fcdc98bd8-xdl6x" event={"ID":"272aa2e8-f1ed-4a08-b5a3-aecd06c4c6d2","Type":"ContainerStarted","Data":"7832e82576d1106230e9821b7a44384ec0152369113aeae90ea86d232dae5b4e"} Mar 13 14:21:31 crc kubenswrapper[4898]: I0313 14:21:31.101579 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-fcdc98bd8-xdl6x" event={"ID":"272aa2e8-f1ed-4a08-b5a3-aecd06c4c6d2","Type":"ContainerStarted","Data":"5a677d66116ea7f14947dff0fd6386945680544d68635c360dac0097cbf6589a"} Mar 13 14:21:31 crc kubenswrapper[4898]: I0313 14:21:31.105666 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"247749ae-204b-4e9c-ad1c-f5d924b6f211","Type":"ContainerStarted","Data":"8699a974ed046c0e546ec06ad74b2baeebb668b44968353911ae1775a60f7c87"} Mar 13 14:21:31 crc kubenswrapper[4898]: I0313 14:21:31.105707 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-d74d977fd-v5m5s" Mar 13 14:21:31 crc kubenswrapper[4898]: I0313 14:21:31.105735 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="247749ae-204b-4e9c-ad1c-f5d924b6f211" containerName="ceilometer-notification-agent" 
containerID="cri-o://9dee453ab58c346884f20f75be928701596b9ae0bdbbb025979e1e2daaa907c1" gracePeriod=30 Mar 13 14:21:31 crc kubenswrapper[4898]: I0313 14:21:31.105822 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="247749ae-204b-4e9c-ad1c-f5d924b6f211" containerName="proxy-httpd" containerID="cri-o://8699a974ed046c0e546ec06ad74b2baeebb668b44968353911ae1775a60f7c87" gracePeriod=30 Mar 13 14:21:31 crc kubenswrapper[4898]: I0313 14:21:31.105762 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-d74d977fd-v5m5s" Mar 13 14:21:31 crc kubenswrapper[4898]: I0313 14:21:31.105971 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="247749ae-204b-4e9c-ad1c-f5d924b6f211" containerName="sg-core" containerID="cri-o://54431181e3eb3b359d0852277dd7b5d798e5ebd64616fc6928567613ce28f709" gracePeriod=30 Mar 13 14:21:31 crc kubenswrapper[4898]: I0313 14:21:31.106005 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 13 14:21:31 crc kubenswrapper[4898]: I0313 14:21:31.132368 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-b9dc95d4b-bvhlz" podStartSLOduration=10.132339581 podStartE2EDuration="10.132339581s" podCreationTimestamp="2026-03-13 14:21:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:21:31.11151777 +0000 UTC m=+1526.113106009" watchObservedRunningTime="2026-03-13 14:21:31.132339581 +0000 UTC m=+1526.133927820" Mar 13 14:21:31 crc kubenswrapper[4898]: I0313 14:21:31.156961 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-ztp6c" podStartSLOduration=3.583169798 podStartE2EDuration="56.156925119s" podCreationTimestamp="2026-03-13 14:20:35 +0000 UTC" 
firstStartedPulling="2026-03-13 14:20:37.31855512 +0000 UTC m=+1472.320143369" lastFinishedPulling="2026-03-13 14:21:29.892310441 +0000 UTC m=+1524.893898690" observedRunningTime="2026-03-13 14:21:31.144265 +0000 UTC m=+1526.145853239" watchObservedRunningTime="2026-03-13 14:21:31.156925119 +0000 UTC m=+1526.158513348" Mar 13 14:21:31 crc kubenswrapper[4898]: I0313 14:21:31.178761 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-85ff748b95-8d6wc" podStartSLOduration=13.178735275 podStartE2EDuration="13.178735275s" podCreationTimestamp="2026-03-13 14:21:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:21:31.166418566 +0000 UTC m=+1526.168006805" watchObservedRunningTime="2026-03-13 14:21:31.178735275 +0000 UTC m=+1526.180323514" Mar 13 14:21:31 crc kubenswrapper[4898]: I0313 14:21:31.202634 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-zgt75" podStartSLOduration=3.215469283 podStartE2EDuration="56.202609775s" podCreationTimestamp="2026-03-13 14:20:35 +0000 UTC" firstStartedPulling="2026-03-13 14:20:37.161878893 +0000 UTC m=+1472.163467132" lastFinishedPulling="2026-03-13 14:21:30.149019385 +0000 UTC m=+1525.150607624" observedRunningTime="2026-03-13 14:21:31.192706418 +0000 UTC m=+1526.194294677" watchObservedRunningTime="2026-03-13 14:21:31.202609775 +0000 UTC m=+1526.204198014" Mar 13 14:21:31 crc kubenswrapper[4898]: I0313 14:21:31.224095 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-795749dc8c-sm2hl" podStartSLOduration=3.313369985 podStartE2EDuration="14.224066532s" podCreationTimestamp="2026-03-13 14:21:17 +0000 UTC" firstStartedPulling="2026-03-13 14:21:19.001986453 +0000 UTC m=+1514.003574692" lastFinishedPulling="2026-03-13 14:21:29.912683 +0000 UTC m=+1524.914271239" 
observedRunningTime="2026-03-13 14:21:31.214386751 +0000 UTC m=+1526.215974990" watchObservedRunningTime="2026-03-13 14:21:31.224066532 +0000 UTC m=+1526.225654771" Mar 13 14:21:31 crc kubenswrapper[4898]: I0313 14:21:31.290939 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-fcdc98bd8-xdl6x" podStartSLOduration=2.452733054 podStartE2EDuration="13.290850086s" podCreationTimestamp="2026-03-13 14:21:18 +0000 UTC" firstStartedPulling="2026-03-13 14:21:19.32003864 +0000 UTC m=+1514.321626879" lastFinishedPulling="2026-03-13 14:21:30.158155672 +0000 UTC m=+1525.159743911" observedRunningTime="2026-03-13 14:21:31.282721595 +0000 UTC m=+1526.284309834" watchObservedRunningTime="2026-03-13 14:21:31.290850086 +0000 UTC m=+1526.292438345" Mar 13 14:21:32 crc kubenswrapper[4898]: I0313 14:21:32.119945 4898 generic.go:334] "Generic (PLEG): container finished" podID="247749ae-204b-4e9c-ad1c-f5d924b6f211" containerID="8699a974ed046c0e546ec06ad74b2baeebb668b44968353911ae1775a60f7c87" exitCode=0 Mar 13 14:21:32 crc kubenswrapper[4898]: I0313 14:21:32.119988 4898 generic.go:334] "Generic (PLEG): container finished" podID="247749ae-204b-4e9c-ad1c-f5d924b6f211" containerID="54431181e3eb3b359d0852277dd7b5d798e5ebd64616fc6928567613ce28f709" exitCode=2 Mar 13 14:21:32 crc kubenswrapper[4898]: I0313 14:21:32.119999 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"247749ae-204b-4e9c-ad1c-f5d924b6f211","Type":"ContainerDied","Data":"8699a974ed046c0e546ec06ad74b2baeebb668b44968353911ae1775a60f7c87"} Mar 13 14:21:32 crc kubenswrapper[4898]: I0313 14:21:32.120072 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"247749ae-204b-4e9c-ad1c-f5d924b6f211","Type":"ContainerDied","Data":"54431181e3eb3b359d0852277dd7b5d798e5ebd64616fc6928567613ce28f709"} Mar 13 14:21:33 crc kubenswrapper[4898]: I0313 14:21:33.134907 4898 prober_manager.go:312] 
"Failed to trigger a manual run" probe="Readiness" Mar 13 14:21:33 crc kubenswrapper[4898]: I0313 14:21:33.588964 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-d74d977fd-v5m5s" Mar 13 14:21:35 crc kubenswrapper[4898]: I0313 14:21:35.164505 4898 generic.go:334] "Generic (PLEG): container finished" podID="247749ae-204b-4e9c-ad1c-f5d924b6f211" containerID="9dee453ab58c346884f20f75be928701596b9ae0bdbbb025979e1e2daaa907c1" exitCode=0 Mar 13 14:21:35 crc kubenswrapper[4898]: I0313 14:21:35.164888 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"247749ae-204b-4e9c-ad1c-f5d924b6f211","Type":"ContainerDied","Data":"9dee453ab58c346884f20f75be928701596b9ae0bdbbb025979e1e2daaa907c1"} Mar 13 14:21:35 crc kubenswrapper[4898]: I0313 14:21:35.165497 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"247749ae-204b-4e9c-ad1c-f5d924b6f211","Type":"ContainerDied","Data":"5fd3c6beb630c0a00d78ffbb0eaf96e2717f61918bd632f3618c99bd721d1714"} Mar 13 14:21:35 crc kubenswrapper[4898]: I0313 14:21:35.165525 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5fd3c6beb630c0a00d78ffbb0eaf96e2717f61918bd632f3618c99bd721d1714" Mar 13 14:21:35 crc kubenswrapper[4898]: I0313 14:21:35.271686 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:21:35 crc kubenswrapper[4898]: I0313 14:21:35.339411 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/247749ae-204b-4e9c-ad1c-f5d924b6f211-config-data\") pod \"247749ae-204b-4e9c-ad1c-f5d924b6f211\" (UID: \"247749ae-204b-4e9c-ad1c-f5d924b6f211\") " Mar 13 14:21:35 crc kubenswrapper[4898]: I0313 14:21:35.339732 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ffnxt\" (UniqueName: \"kubernetes.io/projected/247749ae-204b-4e9c-ad1c-f5d924b6f211-kube-api-access-ffnxt\") pod \"247749ae-204b-4e9c-ad1c-f5d924b6f211\" (UID: \"247749ae-204b-4e9c-ad1c-f5d924b6f211\") " Mar 13 14:21:35 crc kubenswrapper[4898]: I0313 14:21:35.339853 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/247749ae-204b-4e9c-ad1c-f5d924b6f211-run-httpd\") pod \"247749ae-204b-4e9c-ad1c-f5d924b6f211\" (UID: \"247749ae-204b-4e9c-ad1c-f5d924b6f211\") " Mar 13 14:21:35 crc kubenswrapper[4898]: I0313 14:21:35.340021 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/247749ae-204b-4e9c-ad1c-f5d924b6f211-sg-core-conf-yaml\") pod \"247749ae-204b-4e9c-ad1c-f5d924b6f211\" (UID: \"247749ae-204b-4e9c-ad1c-f5d924b6f211\") " Mar 13 14:21:35 crc kubenswrapper[4898]: I0313 14:21:35.340130 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/247749ae-204b-4e9c-ad1c-f5d924b6f211-scripts\") pod \"247749ae-204b-4e9c-ad1c-f5d924b6f211\" (UID: \"247749ae-204b-4e9c-ad1c-f5d924b6f211\") " Mar 13 14:21:35 crc kubenswrapper[4898]: I0313 14:21:35.340317 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/247749ae-204b-4e9c-ad1c-f5d924b6f211-combined-ca-bundle\") pod \"247749ae-204b-4e9c-ad1c-f5d924b6f211\" (UID: \"247749ae-204b-4e9c-ad1c-f5d924b6f211\") " Mar 13 14:21:35 crc kubenswrapper[4898]: I0313 14:21:35.340421 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/247749ae-204b-4e9c-ad1c-f5d924b6f211-log-httpd\") pod \"247749ae-204b-4e9c-ad1c-f5d924b6f211\" (UID: \"247749ae-204b-4e9c-ad1c-f5d924b6f211\") " Mar 13 14:21:35 crc kubenswrapper[4898]: I0313 14:21:35.340332 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/247749ae-204b-4e9c-ad1c-f5d924b6f211-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "247749ae-204b-4e9c-ad1c-f5d924b6f211" (UID: "247749ae-204b-4e9c-ad1c-f5d924b6f211"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:21:35 crc kubenswrapper[4898]: I0313 14:21:35.341347 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/247749ae-204b-4e9c-ad1c-f5d924b6f211-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "247749ae-204b-4e9c-ad1c-f5d924b6f211" (UID: "247749ae-204b-4e9c-ad1c-f5d924b6f211"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:21:35 crc kubenswrapper[4898]: I0313 14:21:35.345867 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/247749ae-204b-4e9c-ad1c-f5d924b6f211-kube-api-access-ffnxt" (OuterVolumeSpecName: "kube-api-access-ffnxt") pod "247749ae-204b-4e9c-ad1c-f5d924b6f211" (UID: "247749ae-204b-4e9c-ad1c-f5d924b6f211"). InnerVolumeSpecName "kube-api-access-ffnxt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:21:35 crc kubenswrapper[4898]: I0313 14:21:35.361160 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/247749ae-204b-4e9c-ad1c-f5d924b6f211-scripts" (OuterVolumeSpecName: "scripts") pod "247749ae-204b-4e9c-ad1c-f5d924b6f211" (UID: "247749ae-204b-4e9c-ad1c-f5d924b6f211"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:21:35 crc kubenswrapper[4898]: I0313 14:21:35.395868 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/247749ae-204b-4e9c-ad1c-f5d924b6f211-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "247749ae-204b-4e9c-ad1c-f5d924b6f211" (UID: "247749ae-204b-4e9c-ad1c-f5d924b6f211"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:21:35 crc kubenswrapper[4898]: I0313 14:21:35.425432 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/247749ae-204b-4e9c-ad1c-f5d924b6f211-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "247749ae-204b-4e9c-ad1c-f5d924b6f211" (UID: "247749ae-204b-4e9c-ad1c-f5d924b6f211"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:21:35 crc kubenswrapper[4898]: I0313 14:21:35.444462 4898 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/247749ae-204b-4e9c-ad1c-f5d924b6f211-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:35 crc kubenswrapper[4898]: I0313 14:21:35.444503 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/247749ae-204b-4e9c-ad1c-f5d924b6f211-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:35 crc kubenswrapper[4898]: I0313 14:21:35.444513 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/247749ae-204b-4e9c-ad1c-f5d924b6f211-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:35 crc kubenswrapper[4898]: I0313 14:21:35.444521 4898 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/247749ae-204b-4e9c-ad1c-f5d924b6f211-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:35 crc kubenswrapper[4898]: I0313 14:21:35.444530 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ffnxt\" (UniqueName: \"kubernetes.io/projected/247749ae-204b-4e9c-ad1c-f5d924b6f211-kube-api-access-ffnxt\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:35 crc kubenswrapper[4898]: I0313 14:21:35.444541 4898 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/247749ae-204b-4e9c-ad1c-f5d924b6f211-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:35 crc kubenswrapper[4898]: I0313 14:21:35.475784 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/247749ae-204b-4e9c-ad1c-f5d924b6f211-config-data" (OuterVolumeSpecName: "config-data") pod "247749ae-204b-4e9c-ad1c-f5d924b6f211" (UID: "247749ae-204b-4e9c-ad1c-f5d924b6f211"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:21:35 crc kubenswrapper[4898]: I0313 14:21:35.546346 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/247749ae-204b-4e9c-ad1c-f5d924b6f211-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:36 crc kubenswrapper[4898]: I0313 14:21:36.098557 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-d74d977fd-v5m5s" Mar 13 14:21:36 crc kubenswrapper[4898]: I0313 14:21:36.194543 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:21:36 crc kubenswrapper[4898]: I0313 14:21:36.298169 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:21:36 crc kubenswrapper[4898]: I0313 14:21:36.323953 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:21:36 crc kubenswrapper[4898]: I0313 14:21:36.342299 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:21:36 crc kubenswrapper[4898]: E0313 14:21:36.342926 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="247749ae-204b-4e9c-ad1c-f5d924b6f211" containerName="sg-core" Mar 13 14:21:36 crc kubenswrapper[4898]: I0313 14:21:36.342951 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="247749ae-204b-4e9c-ad1c-f5d924b6f211" containerName="sg-core" Mar 13 14:21:36 crc kubenswrapper[4898]: E0313 14:21:36.342990 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="247749ae-204b-4e9c-ad1c-f5d924b6f211" containerName="proxy-httpd" Mar 13 14:21:36 crc kubenswrapper[4898]: I0313 14:21:36.342999 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="247749ae-204b-4e9c-ad1c-f5d924b6f211" containerName="proxy-httpd" Mar 13 14:21:36 crc kubenswrapper[4898]: E0313 14:21:36.343033 4898 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="247749ae-204b-4e9c-ad1c-f5d924b6f211" containerName="ceilometer-notification-agent" Mar 13 14:21:36 crc kubenswrapper[4898]: I0313 14:21:36.343043 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="247749ae-204b-4e9c-ad1c-f5d924b6f211" containerName="ceilometer-notification-agent" Mar 13 14:21:36 crc kubenswrapper[4898]: I0313 14:21:36.343296 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="247749ae-204b-4e9c-ad1c-f5d924b6f211" containerName="ceilometer-notification-agent" Mar 13 14:21:36 crc kubenswrapper[4898]: I0313 14:21:36.343333 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="247749ae-204b-4e9c-ad1c-f5d924b6f211" containerName="sg-core" Mar 13 14:21:36 crc kubenswrapper[4898]: I0313 14:21:36.343364 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="247749ae-204b-4e9c-ad1c-f5d924b6f211" containerName="proxy-httpd" Mar 13 14:21:36 crc kubenswrapper[4898]: I0313 14:21:36.345592 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:21:36 crc kubenswrapper[4898]: I0313 14:21:36.348385 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 14:21:36 crc kubenswrapper[4898]: I0313 14:21:36.348396 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 14:21:36 crc kubenswrapper[4898]: I0313 14:21:36.359270 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:21:36 crc kubenswrapper[4898]: I0313 14:21:36.474285 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86c6c495-884b-4c92-949f-0159eb17e6a5-run-httpd\") pod \"ceilometer-0\" (UID: \"86c6c495-884b-4c92-949f-0159eb17e6a5\") " pod="openstack/ceilometer-0" Mar 13 14:21:36 crc kubenswrapper[4898]: I0313 14:21:36.474439 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tvsh\" (UniqueName: \"kubernetes.io/projected/86c6c495-884b-4c92-949f-0159eb17e6a5-kube-api-access-8tvsh\") pod \"ceilometer-0\" (UID: \"86c6c495-884b-4c92-949f-0159eb17e6a5\") " pod="openstack/ceilometer-0" Mar 13 14:21:36 crc kubenswrapper[4898]: I0313 14:21:36.474496 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86c6c495-884b-4c92-949f-0159eb17e6a5-log-httpd\") pod \"ceilometer-0\" (UID: \"86c6c495-884b-4c92-949f-0159eb17e6a5\") " pod="openstack/ceilometer-0" Mar 13 14:21:36 crc kubenswrapper[4898]: I0313 14:21:36.474538 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86c6c495-884b-4c92-949f-0159eb17e6a5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"86c6c495-884b-4c92-949f-0159eb17e6a5\") " pod="openstack/ceilometer-0" Mar 13 14:21:36 crc kubenswrapper[4898]: I0313 14:21:36.474608 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/86c6c495-884b-4c92-949f-0159eb17e6a5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"86c6c495-884b-4c92-949f-0159eb17e6a5\") " pod="openstack/ceilometer-0" Mar 13 14:21:36 crc kubenswrapper[4898]: I0313 14:21:36.474689 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86c6c495-884b-4c92-949f-0159eb17e6a5-scripts\") pod \"ceilometer-0\" (UID: \"86c6c495-884b-4c92-949f-0159eb17e6a5\") " pod="openstack/ceilometer-0" Mar 13 14:21:36 crc kubenswrapper[4898]: I0313 14:21:36.474728 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86c6c495-884b-4c92-949f-0159eb17e6a5-config-data\") pod \"ceilometer-0\" (UID: \"86c6c495-884b-4c92-949f-0159eb17e6a5\") " pod="openstack/ceilometer-0" Mar 13 14:21:36 crc kubenswrapper[4898]: I0313 14:21:36.576470 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86c6c495-884b-4c92-949f-0159eb17e6a5-config-data\") pod \"ceilometer-0\" (UID: \"86c6c495-884b-4c92-949f-0159eb17e6a5\") " pod="openstack/ceilometer-0" Mar 13 14:21:36 crc kubenswrapper[4898]: I0313 14:21:36.576605 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86c6c495-884b-4c92-949f-0159eb17e6a5-run-httpd\") pod \"ceilometer-0\" (UID: \"86c6c495-884b-4c92-949f-0159eb17e6a5\") " pod="openstack/ceilometer-0" Mar 13 14:21:36 crc kubenswrapper[4898]: I0313 14:21:36.576711 4898 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-8tvsh\" (UniqueName: \"kubernetes.io/projected/86c6c495-884b-4c92-949f-0159eb17e6a5-kube-api-access-8tvsh\") pod \"ceilometer-0\" (UID: \"86c6c495-884b-4c92-949f-0159eb17e6a5\") " pod="openstack/ceilometer-0" Mar 13 14:21:36 crc kubenswrapper[4898]: I0313 14:21:36.576747 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86c6c495-884b-4c92-949f-0159eb17e6a5-log-httpd\") pod \"ceilometer-0\" (UID: \"86c6c495-884b-4c92-949f-0159eb17e6a5\") " pod="openstack/ceilometer-0" Mar 13 14:21:36 crc kubenswrapper[4898]: I0313 14:21:36.576781 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86c6c495-884b-4c92-949f-0159eb17e6a5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"86c6c495-884b-4c92-949f-0159eb17e6a5\") " pod="openstack/ceilometer-0" Mar 13 14:21:36 crc kubenswrapper[4898]: I0313 14:21:36.576810 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/86c6c495-884b-4c92-949f-0159eb17e6a5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"86c6c495-884b-4c92-949f-0159eb17e6a5\") " pod="openstack/ceilometer-0" Mar 13 14:21:36 crc kubenswrapper[4898]: I0313 14:21:36.576874 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86c6c495-884b-4c92-949f-0159eb17e6a5-scripts\") pod \"ceilometer-0\" (UID: \"86c6c495-884b-4c92-949f-0159eb17e6a5\") " pod="openstack/ceilometer-0" Mar 13 14:21:36 crc kubenswrapper[4898]: I0313 14:21:36.577732 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86c6c495-884b-4c92-949f-0159eb17e6a5-run-httpd\") pod \"ceilometer-0\" (UID: \"86c6c495-884b-4c92-949f-0159eb17e6a5\") " pod="openstack/ceilometer-0" Mar 
13 14:21:36 crc kubenswrapper[4898]: I0313 14:21:36.579920 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86c6c495-884b-4c92-949f-0159eb17e6a5-log-httpd\") pod \"ceilometer-0\" (UID: \"86c6c495-884b-4c92-949f-0159eb17e6a5\") " pod="openstack/ceilometer-0" Mar 13 14:21:36 crc kubenswrapper[4898]: I0313 14:21:36.583117 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86c6c495-884b-4c92-949f-0159eb17e6a5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"86c6c495-884b-4c92-949f-0159eb17e6a5\") " pod="openstack/ceilometer-0" Mar 13 14:21:36 crc kubenswrapper[4898]: I0313 14:21:36.588653 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86c6c495-884b-4c92-949f-0159eb17e6a5-scripts\") pod \"ceilometer-0\" (UID: \"86c6c495-884b-4c92-949f-0159eb17e6a5\") " pod="openstack/ceilometer-0" Mar 13 14:21:36 crc kubenswrapper[4898]: I0313 14:21:36.589162 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86c6c495-884b-4c92-949f-0159eb17e6a5-config-data\") pod \"ceilometer-0\" (UID: \"86c6c495-884b-4c92-949f-0159eb17e6a5\") " pod="openstack/ceilometer-0" Mar 13 14:21:36 crc kubenswrapper[4898]: I0313 14:21:36.589720 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/86c6c495-884b-4c92-949f-0159eb17e6a5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"86c6c495-884b-4c92-949f-0159eb17e6a5\") " pod="openstack/ceilometer-0" Mar 13 14:21:36 crc kubenswrapper[4898]: I0313 14:21:36.611189 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tvsh\" (UniqueName: \"kubernetes.io/projected/86c6c495-884b-4c92-949f-0159eb17e6a5-kube-api-access-8tvsh\") pod \"ceilometer-0\" 
(UID: \"86c6c495-884b-4c92-949f-0159eb17e6a5\") " pod="openstack/ceilometer-0" Mar 13 14:21:36 crc kubenswrapper[4898]: I0313 14:21:36.674163 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:21:37 crc kubenswrapper[4898]: I0313 14:21:37.049771 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-f97c64464-wmnph" Mar 13 14:21:37 crc kubenswrapper[4898]: I0313 14:21:37.331694 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-8495ffcdcc-j7d29"] Mar 13 14:21:37 crc kubenswrapper[4898]: I0313 14:21:37.332092 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-8495ffcdcc-j7d29" podUID="194cc0b9-5fb1-492c-9df1-002f629cfb90" containerName="neutron-api" containerID="cri-o://0d21f5d009c2fbb3d9136543ddc9edaf66018231eb09d0ea5bf4fa35c2144f9b" gracePeriod=30 Mar 13 14:21:37 crc kubenswrapper[4898]: I0313 14:21:37.332820 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-8495ffcdcc-j7d29" podUID="194cc0b9-5fb1-492c-9df1-002f629cfb90" containerName="neutron-httpd" containerID="cri-o://0cd127d8d8dce19d301ba8f94cb9ff0fc6150499598490bd11d503771e98d4fc" gracePeriod=30 Mar 13 14:21:37 crc kubenswrapper[4898]: I0313 14:21:37.366619 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:21:37 crc kubenswrapper[4898]: I0313 14:21:37.396971 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-776df44c77-g64lv"] Mar 13 14:21:37 crc kubenswrapper[4898]: I0313 14:21:37.400757 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-776df44c77-g64lv" Mar 13 14:21:37 crc kubenswrapper[4898]: I0313 14:21:37.410923 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-776df44c77-g64lv"] Mar 13 14:21:37 crc kubenswrapper[4898]: I0313 14:21:37.441884 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-8495ffcdcc-j7d29" podUID="194cc0b9-5fb1-492c-9df1-002f629cfb90" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.205:9696/\": read tcp 10.217.0.2:55216->10.217.0.205:9696: read: connection reset by peer" Mar 13 14:21:37 crc kubenswrapper[4898]: I0313 14:21:37.516786 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a679fb4-8d85-4835-a048-08c4b61aa158-ovndb-tls-certs\") pod \"neutron-776df44c77-g64lv\" (UID: \"4a679fb4-8d85-4835-a048-08c4b61aa158\") " pod="openstack/neutron-776df44c77-g64lv" Mar 13 14:21:37 crc kubenswrapper[4898]: I0313 14:21:37.516844 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4a679fb4-8d85-4835-a048-08c4b61aa158-httpd-config\") pod \"neutron-776df44c77-g64lv\" (UID: \"4a679fb4-8d85-4835-a048-08c4b61aa158\") " pod="openstack/neutron-776df44c77-g64lv" Mar 13 14:21:37 crc kubenswrapper[4898]: I0313 14:21:37.516910 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a679fb4-8d85-4835-a048-08c4b61aa158-internal-tls-certs\") pod \"neutron-776df44c77-g64lv\" (UID: \"4a679fb4-8d85-4835-a048-08c4b61aa158\") " pod="openstack/neutron-776df44c77-g64lv" Mar 13 14:21:37 crc kubenswrapper[4898]: I0313 14:21:37.516944 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/4a679fb4-8d85-4835-a048-08c4b61aa158-public-tls-certs\") pod \"neutron-776df44c77-g64lv\" (UID: \"4a679fb4-8d85-4835-a048-08c4b61aa158\") " pod="openstack/neutron-776df44c77-g64lv" Mar 13 14:21:37 crc kubenswrapper[4898]: I0313 14:21:37.517036 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a679fb4-8d85-4835-a048-08c4b61aa158-combined-ca-bundle\") pod \"neutron-776df44c77-g64lv\" (UID: \"4a679fb4-8d85-4835-a048-08c4b61aa158\") " pod="openstack/neutron-776df44c77-g64lv" Mar 13 14:21:37 crc kubenswrapper[4898]: I0313 14:21:37.517063 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4a679fb4-8d85-4835-a048-08c4b61aa158-config\") pod \"neutron-776df44c77-g64lv\" (UID: \"4a679fb4-8d85-4835-a048-08c4b61aa158\") " pod="openstack/neutron-776df44c77-g64lv" Mar 13 14:21:37 crc kubenswrapper[4898]: I0313 14:21:37.517088 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d79cj\" (UniqueName: \"kubernetes.io/projected/4a679fb4-8d85-4835-a048-08c4b61aa158-kube-api-access-d79cj\") pod \"neutron-776df44c77-g64lv\" (UID: \"4a679fb4-8d85-4835-a048-08c4b61aa158\") " pod="openstack/neutron-776df44c77-g64lv" Mar 13 14:21:37 crc kubenswrapper[4898]: I0313 14:21:37.619381 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a679fb4-8d85-4835-a048-08c4b61aa158-public-tls-certs\") pod \"neutron-776df44c77-g64lv\" (UID: \"4a679fb4-8d85-4835-a048-08c4b61aa158\") " pod="openstack/neutron-776df44c77-g64lv" Mar 13 14:21:37 crc kubenswrapper[4898]: I0313 14:21:37.619529 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4a679fb4-8d85-4835-a048-08c4b61aa158-combined-ca-bundle\") pod \"neutron-776df44c77-g64lv\" (UID: \"4a679fb4-8d85-4835-a048-08c4b61aa158\") " pod="openstack/neutron-776df44c77-g64lv" Mar 13 14:21:37 crc kubenswrapper[4898]: I0313 14:21:37.619570 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4a679fb4-8d85-4835-a048-08c4b61aa158-config\") pod \"neutron-776df44c77-g64lv\" (UID: \"4a679fb4-8d85-4835-a048-08c4b61aa158\") " pod="openstack/neutron-776df44c77-g64lv" Mar 13 14:21:37 crc kubenswrapper[4898]: I0313 14:21:37.619602 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d79cj\" (UniqueName: \"kubernetes.io/projected/4a679fb4-8d85-4835-a048-08c4b61aa158-kube-api-access-d79cj\") pod \"neutron-776df44c77-g64lv\" (UID: \"4a679fb4-8d85-4835-a048-08c4b61aa158\") " pod="openstack/neutron-776df44c77-g64lv" Mar 13 14:21:37 crc kubenswrapper[4898]: I0313 14:21:37.619665 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a679fb4-8d85-4835-a048-08c4b61aa158-ovndb-tls-certs\") pod \"neutron-776df44c77-g64lv\" (UID: \"4a679fb4-8d85-4835-a048-08c4b61aa158\") " pod="openstack/neutron-776df44c77-g64lv" Mar 13 14:21:37 crc kubenswrapper[4898]: I0313 14:21:37.619695 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4a679fb4-8d85-4835-a048-08c4b61aa158-httpd-config\") pod \"neutron-776df44c77-g64lv\" (UID: \"4a679fb4-8d85-4835-a048-08c4b61aa158\") " pod="openstack/neutron-776df44c77-g64lv" Mar 13 14:21:37 crc kubenswrapper[4898]: I0313 14:21:37.619741 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a679fb4-8d85-4835-a048-08c4b61aa158-internal-tls-certs\") pod 
\"neutron-776df44c77-g64lv\" (UID: \"4a679fb4-8d85-4835-a048-08c4b61aa158\") " pod="openstack/neutron-776df44c77-g64lv" Mar 13 14:21:37 crc kubenswrapper[4898]: I0313 14:21:37.628881 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a679fb4-8d85-4835-a048-08c4b61aa158-internal-tls-certs\") pod \"neutron-776df44c77-g64lv\" (UID: \"4a679fb4-8d85-4835-a048-08c4b61aa158\") " pod="openstack/neutron-776df44c77-g64lv" Mar 13 14:21:37 crc kubenswrapper[4898]: I0313 14:21:37.630558 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a679fb4-8d85-4835-a048-08c4b61aa158-ovndb-tls-certs\") pod \"neutron-776df44c77-g64lv\" (UID: \"4a679fb4-8d85-4835-a048-08c4b61aa158\") " pod="openstack/neutron-776df44c77-g64lv" Mar 13 14:21:37 crc kubenswrapper[4898]: I0313 14:21:37.631397 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a679fb4-8d85-4835-a048-08c4b61aa158-combined-ca-bundle\") pod \"neutron-776df44c77-g64lv\" (UID: \"4a679fb4-8d85-4835-a048-08c4b61aa158\") " pod="openstack/neutron-776df44c77-g64lv" Mar 13 14:21:37 crc kubenswrapper[4898]: I0313 14:21:37.631731 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a679fb4-8d85-4835-a048-08c4b61aa158-public-tls-certs\") pod \"neutron-776df44c77-g64lv\" (UID: \"4a679fb4-8d85-4835-a048-08c4b61aa158\") " pod="openstack/neutron-776df44c77-g64lv" Mar 13 14:21:37 crc kubenswrapper[4898]: I0313 14:21:37.633072 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4a679fb4-8d85-4835-a048-08c4b61aa158-httpd-config\") pod \"neutron-776df44c77-g64lv\" (UID: \"4a679fb4-8d85-4835-a048-08c4b61aa158\") " pod="openstack/neutron-776df44c77-g64lv" Mar 13 
14:21:37 crc kubenswrapper[4898]: I0313 14:21:37.633646 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/4a679fb4-8d85-4835-a048-08c4b61aa158-config\") pod \"neutron-776df44c77-g64lv\" (UID: \"4a679fb4-8d85-4835-a048-08c4b61aa158\") " pod="openstack/neutron-776df44c77-g64lv" Mar 13 14:21:37 crc kubenswrapper[4898]: I0313 14:21:37.647874 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d79cj\" (UniqueName: \"kubernetes.io/projected/4a679fb4-8d85-4835-a048-08c4b61aa158-kube-api-access-d79cj\") pod \"neutron-776df44c77-g64lv\" (UID: \"4a679fb4-8d85-4835-a048-08c4b61aa158\") " pod="openstack/neutron-776df44c77-g64lv" Mar 13 14:21:37 crc kubenswrapper[4898]: I0313 14:21:37.763809 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-776df44c77-g64lv" Mar 13 14:21:37 crc kubenswrapper[4898]: I0313 14:21:37.764475 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="247749ae-204b-4e9c-ad1c-f5d924b6f211" path="/var/lib/kubelet/pods/247749ae-204b-4e9c-ad1c-f5d924b6f211/volumes" Mar 13 14:21:38 crc kubenswrapper[4898]: I0313 14:21:38.256062 4898 generic.go:334] "Generic (PLEG): container finished" podID="194cc0b9-5fb1-492c-9df1-002f629cfb90" containerID="0cd127d8d8dce19d301ba8f94cb9ff0fc6150499598490bd11d503771e98d4fc" exitCode=0 Mar 13 14:21:38 crc kubenswrapper[4898]: I0313 14:21:38.256127 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8495ffcdcc-j7d29" event={"ID":"194cc0b9-5fb1-492c-9df1-002f629cfb90","Type":"ContainerDied","Data":"0cd127d8d8dce19d301ba8f94cb9ff0fc6150499598490bd11d503771e98d4fc"} Mar 13 14:21:38 crc kubenswrapper[4898]: I0313 14:21:38.278180 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"86c6c495-884b-4c92-949f-0159eb17e6a5","Type":"ContainerStarted","Data":"2b43c112fe6b642ffc81d63835d5208293191491d23e2c20e5ef660540956b7c"} Mar 13 14:21:38 crc kubenswrapper[4898]: I0313 14:21:38.280987 4898 generic.go:334] "Generic (PLEG): container finished" podID="84a7fd24-4320-4c0e-8ded-0d455252a549" containerID="213db1fb491a2ed6d8dc3d15759978456b725ffef29e1be38661bf279db1daf8" exitCode=0 Mar 13 14:21:38 crc kubenswrapper[4898]: I0313 14:21:38.281053 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-zgt75" event={"ID":"84a7fd24-4320-4c0e-8ded-0d455252a549","Type":"ContainerDied","Data":"213db1fb491a2ed6d8dc3d15759978456b725ffef29e1be38661bf279db1daf8"} Mar 13 14:21:38 crc kubenswrapper[4898]: I0313 14:21:38.472209 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-85ff748b95-8d6wc" Mar 13 14:21:38 crc kubenswrapper[4898]: I0313 14:21:38.655629 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-7gvqf"] Mar 13 14:21:38 crc kubenswrapper[4898]: I0313 14:21:38.666115 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55f844cf75-7gvqf" podUID="da9c8289-b4cc-4259-a94e-fab15f437c67" containerName="dnsmasq-dns" containerID="cri-o://ec23679d2099538d875f32f1740477e86fa4744d0786a72fbd40a7500fbf13f8" gracePeriod=10 Mar 13 14:21:38 crc kubenswrapper[4898]: I0313 14:21:38.700977 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-776df44c77-g64lv"] Mar 13 14:21:39 crc kubenswrapper[4898]: I0313 14:21:39.322075 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-776df44c77-g64lv" event={"ID":"4a679fb4-8d85-4835-a048-08c4b61aa158","Type":"ContainerStarted","Data":"de506e417eaaea0c36429e77dfadc3b7be94f0b45c620428caf6a91bf38f1094"} Mar 13 14:21:39 crc kubenswrapper[4898]: I0313 14:21:39.322690 4898 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/neutron-776df44c77-g64lv" event={"ID":"4a679fb4-8d85-4835-a048-08c4b61aa158","Type":"ContainerStarted","Data":"b15282fd3af93bf6b4e8c3ab1dabdabfdbf27142041e4ad1d14fe6b0ae8e170d"} Mar 13 14:21:39 crc kubenswrapper[4898]: I0313 14:21:39.347318 4898 generic.go:334] "Generic (PLEG): container finished" podID="da9c8289-b4cc-4259-a94e-fab15f437c67" containerID="ec23679d2099538d875f32f1740477e86fa4744d0786a72fbd40a7500fbf13f8" exitCode=0 Mar 13 14:21:39 crc kubenswrapper[4898]: I0313 14:21:39.347435 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-7gvqf" event={"ID":"da9c8289-b4cc-4259-a94e-fab15f437c67","Type":"ContainerDied","Data":"ec23679d2099538d875f32f1740477e86fa4744d0786a72fbd40a7500fbf13f8"} Mar 13 14:21:39 crc kubenswrapper[4898]: I0313 14:21:39.360062 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"86c6c495-884b-4c92-949f-0159eb17e6a5","Type":"ContainerStarted","Data":"b62d7bd0ca3497c43d915b6212935946bc82ac1a3defe8c89eeb3779d6ce9770"} Mar 13 14:21:39 crc kubenswrapper[4898]: I0313 14:21:39.452602 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-7gvqf" Mar 13 14:21:39 crc kubenswrapper[4898]: I0313 14:21:39.477395 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da9c8289-b4cc-4259-a94e-fab15f437c67-dns-svc\") pod \"da9c8289-b4cc-4259-a94e-fab15f437c67\" (UID: \"da9c8289-b4cc-4259-a94e-fab15f437c67\") " Mar 13 14:21:39 crc kubenswrapper[4898]: I0313 14:21:39.477682 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5758\" (UniqueName: \"kubernetes.io/projected/da9c8289-b4cc-4259-a94e-fab15f437c67-kube-api-access-q5758\") pod \"da9c8289-b4cc-4259-a94e-fab15f437c67\" (UID: \"da9c8289-b4cc-4259-a94e-fab15f437c67\") " Mar 13 14:21:39 crc kubenswrapper[4898]: I0313 14:21:39.477799 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/da9c8289-b4cc-4259-a94e-fab15f437c67-dns-swift-storage-0\") pod \"da9c8289-b4cc-4259-a94e-fab15f437c67\" (UID: \"da9c8289-b4cc-4259-a94e-fab15f437c67\") " Mar 13 14:21:39 crc kubenswrapper[4898]: I0313 14:21:39.477849 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da9c8289-b4cc-4259-a94e-fab15f437c67-config\") pod \"da9c8289-b4cc-4259-a94e-fab15f437c67\" (UID: \"da9c8289-b4cc-4259-a94e-fab15f437c67\") " Mar 13 14:21:39 crc kubenswrapper[4898]: I0313 14:21:39.481154 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da9c8289-b4cc-4259-a94e-fab15f437c67-ovsdbserver-sb\") pod \"da9c8289-b4cc-4259-a94e-fab15f437c67\" (UID: \"da9c8289-b4cc-4259-a94e-fab15f437c67\") " Mar 13 14:21:39 crc kubenswrapper[4898]: I0313 14:21:39.481236 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/da9c8289-b4cc-4259-a94e-fab15f437c67-ovsdbserver-nb\") pod \"da9c8289-b4cc-4259-a94e-fab15f437c67\" (UID: \"da9c8289-b4cc-4259-a94e-fab15f437c67\") " Mar 13 14:21:39 crc kubenswrapper[4898]: I0313 14:21:39.485979 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da9c8289-b4cc-4259-a94e-fab15f437c67-kube-api-access-q5758" (OuterVolumeSpecName: "kube-api-access-q5758") pod "da9c8289-b4cc-4259-a94e-fab15f437c67" (UID: "da9c8289-b4cc-4259-a94e-fab15f437c67"). InnerVolumeSpecName "kube-api-access-q5758". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:21:39 crc kubenswrapper[4898]: I0313 14:21:39.599795 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5758\" (UniqueName: \"kubernetes.io/projected/da9c8289-b4cc-4259-a94e-fab15f437c67-kube-api-access-q5758\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:39 crc kubenswrapper[4898]: I0313 14:21:39.609569 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-8495ffcdcc-j7d29" podUID="194cc0b9-5fb1-492c-9df1-002f629cfb90" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.205:9696/\": dial tcp 10.217.0.205:9696: connect: connection refused" Mar 13 14:21:39 crc kubenswrapper[4898]: I0313 14:21:39.665563 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da9c8289-b4cc-4259-a94e-fab15f437c67-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "da9c8289-b4cc-4259-a94e-fab15f437c67" (UID: "da9c8289-b4cc-4259-a94e-fab15f437c67"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:21:39 crc kubenswrapper[4898]: I0313 14:21:39.678442 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da9c8289-b4cc-4259-a94e-fab15f437c67-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "da9c8289-b4cc-4259-a94e-fab15f437c67" (UID: "da9c8289-b4cc-4259-a94e-fab15f437c67"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:21:39 crc kubenswrapper[4898]: I0313 14:21:39.726228 4898 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/da9c8289-b4cc-4259-a94e-fab15f437c67-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:39 crc kubenswrapper[4898]: I0313 14:21:39.726272 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da9c8289-b4cc-4259-a94e-fab15f437c67-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:39 crc kubenswrapper[4898]: I0313 14:21:39.766630 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da9c8289-b4cc-4259-a94e-fab15f437c67-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "da9c8289-b4cc-4259-a94e-fab15f437c67" (UID: "da9c8289-b4cc-4259-a94e-fab15f437c67"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:21:39 crc kubenswrapper[4898]: I0313 14:21:39.769701 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da9c8289-b4cc-4259-a94e-fab15f437c67-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "da9c8289-b4cc-4259-a94e-fab15f437c67" (UID: "da9c8289-b4cc-4259-a94e-fab15f437c67"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:21:39 crc kubenswrapper[4898]: I0313 14:21:39.828952 4898 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da9c8289-b4cc-4259-a94e-fab15f437c67-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:39 crc kubenswrapper[4898]: I0313 14:21:39.828984 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da9c8289-b4cc-4259-a94e-fab15f437c67-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:39 crc kubenswrapper[4898]: I0313 14:21:39.915190 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da9c8289-b4cc-4259-a94e-fab15f437c67-config" (OuterVolumeSpecName: "config") pod "da9c8289-b4cc-4259-a94e-fab15f437c67" (UID: "da9c8289-b4cc-4259-a94e-fab15f437c67"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:21:39 crc kubenswrapper[4898]: I0313 14:21:39.933921 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da9c8289-b4cc-4259-a94e-fab15f437c67-config\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:40 crc kubenswrapper[4898]: I0313 14:21:40.232768 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-zgt75" Mar 13 14:21:40 crc kubenswrapper[4898]: I0313 14:21:40.257636 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-b9dc95d4b-bvhlz" Mar 13 14:21:40 crc kubenswrapper[4898]: I0313 14:21:40.348819 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28mft\" (UniqueName: \"kubernetes.io/projected/84a7fd24-4320-4c0e-8ded-0d455252a549-kube-api-access-28mft\") pod \"84a7fd24-4320-4c0e-8ded-0d455252a549\" (UID: \"84a7fd24-4320-4c0e-8ded-0d455252a549\") " Mar 13 14:21:40 crc kubenswrapper[4898]: I0313 14:21:40.348891 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84a7fd24-4320-4c0e-8ded-0d455252a549-combined-ca-bundle\") pod \"84a7fd24-4320-4c0e-8ded-0d455252a549\" (UID: \"84a7fd24-4320-4c0e-8ded-0d455252a549\") " Mar 13 14:21:40 crc kubenswrapper[4898]: I0313 14:21:40.349023 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84a7fd24-4320-4c0e-8ded-0d455252a549-config-data\") pod \"84a7fd24-4320-4c0e-8ded-0d455252a549\" (UID: \"84a7fd24-4320-4c0e-8ded-0d455252a549\") " Mar 13 14:21:40 crc kubenswrapper[4898]: I0313 14:21:40.490134 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84a7fd24-4320-4c0e-8ded-0d455252a549-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "84a7fd24-4320-4c0e-8ded-0d455252a549" (UID: "84a7fd24-4320-4c0e-8ded-0d455252a549"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:21:40 crc kubenswrapper[4898]: I0313 14:21:40.497098 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-7gvqf" event={"ID":"da9c8289-b4cc-4259-a94e-fab15f437c67","Type":"ContainerDied","Data":"2e5e30a7ea9bdd40efe150ba12afa0ed754dfac9e9a1ebb3308dac4a35e5fade"} Mar 13 14:21:40 crc kubenswrapper[4898]: I0313 14:21:40.497403 4898 scope.go:117] "RemoveContainer" containerID="ec23679d2099538d875f32f1740477e86fa4744d0786a72fbd40a7500fbf13f8" Mar 13 14:21:40 crc kubenswrapper[4898]: I0313 14:21:40.497647 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-7gvqf" Mar 13 14:21:40 crc kubenswrapper[4898]: I0313 14:21:40.499130 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84a7fd24-4320-4c0e-8ded-0d455252a549-kube-api-access-28mft" (OuterVolumeSpecName: "kube-api-access-28mft") pod "84a7fd24-4320-4c0e-8ded-0d455252a549" (UID: "84a7fd24-4320-4c0e-8ded-0d455252a549"). InnerVolumeSpecName "kube-api-access-28mft". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:21:40 crc kubenswrapper[4898]: I0313 14:21:40.525459 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-zgt75" event={"ID":"84a7fd24-4320-4c0e-8ded-0d455252a549","Type":"ContainerDied","Data":"ff8a73d5234eb1ed4542baaf925a5bae9eff511012c73d58ac5c330a7c07d613"} Mar 13 14:21:40 crc kubenswrapper[4898]: I0313 14:21:40.525537 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff8a73d5234eb1ed4542baaf925a5bae9eff511012c73d58ac5c330a7c07d613" Mar 13 14:21:40 crc kubenswrapper[4898]: I0313 14:21:40.525659 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-zgt75" Mar 13 14:21:40 crc kubenswrapper[4898]: I0313 14:21:40.562840 4898 generic.go:334] "Generic (PLEG): container finished" podID="194cc0b9-5fb1-492c-9df1-002f629cfb90" containerID="0d21f5d009c2fbb3d9136543ddc9edaf66018231eb09d0ea5bf4fa35c2144f9b" exitCode=0 Mar 13 14:21:40 crc kubenswrapper[4898]: I0313 14:21:40.563581 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8495ffcdcc-j7d29" event={"ID":"194cc0b9-5fb1-492c-9df1-002f629cfb90","Type":"ContainerDied","Data":"0d21f5d009c2fbb3d9136543ddc9edaf66018231eb09d0ea5bf4fa35c2144f9b"} Mar 13 14:21:40 crc kubenswrapper[4898]: I0313 14:21:40.566776 4898 scope.go:117] "RemoveContainer" containerID="ebec6588ea54fc0e12abfc618cba17e32b0384e26af2c4dc5438fd6e04229c34" Mar 13 14:21:40 crc kubenswrapper[4898]: I0313 14:21:40.582041 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28mft\" (UniqueName: \"kubernetes.io/projected/84a7fd24-4320-4c0e-8ded-0d455252a549-kube-api-access-28mft\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:40 crc kubenswrapper[4898]: I0313 14:21:40.582087 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84a7fd24-4320-4c0e-8ded-0d455252a549-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:40 crc kubenswrapper[4898]: I0313 14:21:40.589382 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-7gvqf"] Mar 13 14:21:40 crc kubenswrapper[4898]: I0313 14:21:40.593762 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-776df44c77-g64lv" event={"ID":"4a679fb4-8d85-4835-a048-08c4b61aa158","Type":"ContainerStarted","Data":"3df3fcd326f6b287405d8ae708899fcbf8ba1334da4351d589c918d2c483aab6"} Mar 13 14:21:40 crc kubenswrapper[4898]: I0313 14:21:40.595592 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/neutron-776df44c77-g64lv" Mar 13 14:21:40 crc kubenswrapper[4898]: I0313 14:21:40.643825 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-7gvqf"] Mar 13 14:21:40 crc kubenswrapper[4898]: I0313 14:21:40.652053 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-z7ldc" podUID="b38f3681-6f2f-437f-9694-810d43921aa2" containerName="registry-server" probeResult="failure" output=< Mar 13 14:21:40 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 14:21:40 crc kubenswrapper[4898]: > Mar 13 14:21:40 crc kubenswrapper[4898]: I0313 14:21:40.657272 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-776df44c77-g64lv" podStartSLOduration=3.657255404 podStartE2EDuration="3.657255404s" podCreationTimestamp="2026-03-13 14:21:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:21:40.619815242 +0000 UTC m=+1535.621403491" watchObservedRunningTime="2026-03-13 14:21:40.657255404 +0000 UTC m=+1535.658843633" Mar 13 14:21:40 crc kubenswrapper[4898]: I0313 14:21:40.699190 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-b9dc95d4b-bvhlz" Mar 13 14:21:40 crc kubenswrapper[4898]: I0313 14:21:40.791923 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84a7fd24-4320-4c0e-8ded-0d455252a549-config-data" (OuterVolumeSpecName: "config-data") pod "84a7fd24-4320-4c0e-8ded-0d455252a549" (UID: "84a7fd24-4320-4c0e-8ded-0d455252a549"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:21:40 crc kubenswrapper[4898]: I0313 14:21:40.801032 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84a7fd24-4320-4c0e-8ded-0d455252a549-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:40 crc kubenswrapper[4898]: I0313 14:21:40.815924 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-d74d977fd-v5m5s"] Mar 13 14:21:40 crc kubenswrapper[4898]: I0313 14:21:40.816142 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-d74d977fd-v5m5s" podUID="7dcea9de-db8a-42dd-958c-59df43a49ff3" containerName="barbican-api-log" containerID="cri-o://8d6bd5023b5a1087811735b70ac1c3323bdc9e802f224e7ede20161093a84221" gracePeriod=30 Mar 13 14:21:40 crc kubenswrapper[4898]: I0313 14:21:40.816633 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-d74d977fd-v5m5s" podUID="7dcea9de-db8a-42dd-958c-59df43a49ff3" containerName="barbican-api" containerID="cri-o://e2a10450307b6355906b72fd4b0a882c5720f8e92bcad91beb1384ffe656972d" gracePeriod=30 Mar 13 14:21:41 crc kubenswrapper[4898]: I0313 14:21:41.056797 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-8495ffcdcc-j7d29" Mar 13 14:21:41 crc kubenswrapper[4898]: I0313 14:21:41.212757 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/194cc0b9-5fb1-492c-9df1-002f629cfb90-httpd-config\") pod \"194cc0b9-5fb1-492c-9df1-002f629cfb90\" (UID: \"194cc0b9-5fb1-492c-9df1-002f629cfb90\") " Mar 13 14:21:41 crc kubenswrapper[4898]: I0313 14:21:41.212845 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/194cc0b9-5fb1-492c-9df1-002f629cfb90-public-tls-certs\") pod \"194cc0b9-5fb1-492c-9df1-002f629cfb90\" (UID: \"194cc0b9-5fb1-492c-9df1-002f629cfb90\") " Mar 13 14:21:41 crc kubenswrapper[4898]: I0313 14:21:41.213140 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/194cc0b9-5fb1-492c-9df1-002f629cfb90-config\") pod \"194cc0b9-5fb1-492c-9df1-002f629cfb90\" (UID: \"194cc0b9-5fb1-492c-9df1-002f629cfb90\") " Mar 13 14:21:41 crc kubenswrapper[4898]: I0313 14:21:41.213210 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/194cc0b9-5fb1-492c-9df1-002f629cfb90-internal-tls-certs\") pod \"194cc0b9-5fb1-492c-9df1-002f629cfb90\" (UID: \"194cc0b9-5fb1-492c-9df1-002f629cfb90\") " Mar 13 14:21:41 crc kubenswrapper[4898]: I0313 14:21:41.213295 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/194cc0b9-5fb1-492c-9df1-002f629cfb90-combined-ca-bundle\") pod \"194cc0b9-5fb1-492c-9df1-002f629cfb90\" (UID: \"194cc0b9-5fb1-492c-9df1-002f629cfb90\") " Mar 13 14:21:41 crc kubenswrapper[4898]: I0313 14:21:41.213375 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvf8x\" 
(UniqueName: \"kubernetes.io/projected/194cc0b9-5fb1-492c-9df1-002f629cfb90-kube-api-access-pvf8x\") pod \"194cc0b9-5fb1-492c-9df1-002f629cfb90\" (UID: \"194cc0b9-5fb1-492c-9df1-002f629cfb90\") " Mar 13 14:21:41 crc kubenswrapper[4898]: I0313 14:21:41.213538 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/194cc0b9-5fb1-492c-9df1-002f629cfb90-ovndb-tls-certs\") pod \"194cc0b9-5fb1-492c-9df1-002f629cfb90\" (UID: \"194cc0b9-5fb1-492c-9df1-002f629cfb90\") " Mar 13 14:21:41 crc kubenswrapper[4898]: I0313 14:21:41.229663 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/194cc0b9-5fb1-492c-9df1-002f629cfb90-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "194cc0b9-5fb1-492c-9df1-002f629cfb90" (UID: "194cc0b9-5fb1-492c-9df1-002f629cfb90"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:21:41 crc kubenswrapper[4898]: I0313 14:21:41.237643 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/194cc0b9-5fb1-492c-9df1-002f629cfb90-kube-api-access-pvf8x" (OuterVolumeSpecName: "kube-api-access-pvf8x") pod "194cc0b9-5fb1-492c-9df1-002f629cfb90" (UID: "194cc0b9-5fb1-492c-9df1-002f629cfb90"). InnerVolumeSpecName "kube-api-access-pvf8x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:21:41 crc kubenswrapper[4898]: I0313 14:21:41.317881 4898 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/194cc0b9-5fb1-492c-9df1-002f629cfb90-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:41 crc kubenswrapper[4898]: I0313 14:21:41.317946 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pvf8x\" (UniqueName: \"kubernetes.io/projected/194cc0b9-5fb1-492c-9df1-002f629cfb90-kube-api-access-pvf8x\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:41 crc kubenswrapper[4898]: I0313 14:21:41.330125 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/194cc0b9-5fb1-492c-9df1-002f629cfb90-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "194cc0b9-5fb1-492c-9df1-002f629cfb90" (UID: "194cc0b9-5fb1-492c-9df1-002f629cfb90"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:21:41 crc kubenswrapper[4898]: I0313 14:21:41.333008 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/194cc0b9-5fb1-492c-9df1-002f629cfb90-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "194cc0b9-5fb1-492c-9df1-002f629cfb90" (UID: "194cc0b9-5fb1-492c-9df1-002f629cfb90"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:21:41 crc kubenswrapper[4898]: I0313 14:21:41.363497 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/194cc0b9-5fb1-492c-9df1-002f629cfb90-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "194cc0b9-5fb1-492c-9df1-002f629cfb90" (UID: "194cc0b9-5fb1-492c-9df1-002f629cfb90"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:21:41 crc kubenswrapper[4898]: I0313 14:21:41.392129 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/194cc0b9-5fb1-492c-9df1-002f629cfb90-config" (OuterVolumeSpecName: "config") pod "194cc0b9-5fb1-492c-9df1-002f629cfb90" (UID: "194cc0b9-5fb1-492c-9df1-002f629cfb90"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:21:41 crc kubenswrapper[4898]: I0313 14:21:41.421881 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/194cc0b9-5fb1-492c-9df1-002f629cfb90-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:41 crc kubenswrapper[4898]: I0313 14:21:41.422705 4898 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/194cc0b9-5fb1-492c-9df1-002f629cfb90-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:41 crc kubenswrapper[4898]: I0313 14:21:41.422729 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/194cc0b9-5fb1-492c-9df1-002f629cfb90-config\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:41 crc kubenswrapper[4898]: I0313 14:21:41.422741 4898 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/194cc0b9-5fb1-492c-9df1-002f629cfb90-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:41 crc kubenswrapper[4898]: I0313 14:21:41.523076 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/194cc0b9-5fb1-492c-9df1-002f629cfb90-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "194cc0b9-5fb1-492c-9df1-002f629cfb90" (UID: "194cc0b9-5fb1-492c-9df1-002f629cfb90"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:21:41 crc kubenswrapper[4898]: I0313 14:21:41.525048 4898 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/194cc0b9-5fb1-492c-9df1-002f629cfb90-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:41 crc kubenswrapper[4898]: I0313 14:21:41.622522 4898 generic.go:334] "Generic (PLEG): container finished" podID="193b05da-acb9-4512-a2ae-6c03450e6f05" containerID="c911be7c2d6d8f32598481ced3a29ce9fc65efe653b0696b937918e79b814d51" exitCode=0 Mar 13 14:21:41 crc kubenswrapper[4898]: I0313 14:21:41.622659 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-ztp6c" event={"ID":"193b05da-acb9-4512-a2ae-6c03450e6f05","Type":"ContainerDied","Data":"c911be7c2d6d8f32598481ced3a29ce9fc65efe653b0696b937918e79b814d51"} Mar 13 14:21:41 crc kubenswrapper[4898]: I0313 14:21:41.628105 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8495ffcdcc-j7d29" event={"ID":"194cc0b9-5fb1-492c-9df1-002f629cfb90","Type":"ContainerDied","Data":"675cba3392edee8aa8b36b03aeac2453edffb374fe9e8c521c269c0464cb1478"} Mar 13 14:21:41 crc kubenswrapper[4898]: I0313 14:21:41.628177 4898 scope.go:117] "RemoveContainer" containerID="0cd127d8d8dce19d301ba8f94cb9ff0fc6150499598490bd11d503771e98d4fc" Mar 13 14:21:41 crc kubenswrapper[4898]: I0313 14:21:41.628335 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-8495ffcdcc-j7d29" Mar 13 14:21:41 crc kubenswrapper[4898]: I0313 14:21:41.667688 4898 generic.go:334] "Generic (PLEG): container finished" podID="7dcea9de-db8a-42dd-958c-59df43a49ff3" containerID="8d6bd5023b5a1087811735b70ac1c3323bdc9e802f224e7ede20161093a84221" exitCode=143 Mar 13 14:21:41 crc kubenswrapper[4898]: I0313 14:21:41.667756 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d74d977fd-v5m5s" event={"ID":"7dcea9de-db8a-42dd-958c-59df43a49ff3","Type":"ContainerDied","Data":"8d6bd5023b5a1087811735b70ac1c3323bdc9e802f224e7ede20161093a84221"} Mar 13 14:21:41 crc kubenswrapper[4898]: I0313 14:21:41.718333 4898 scope.go:117] "RemoveContainer" containerID="0d21f5d009c2fbb3d9136543ddc9edaf66018231eb09d0ea5bf4fa35c2144f9b" Mar 13 14:21:41 crc kubenswrapper[4898]: I0313 14:21:41.718725 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-8495ffcdcc-j7d29"] Mar 13 14:21:41 crc kubenswrapper[4898]: I0313 14:21:41.719800 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"86c6c495-884b-4c92-949f-0159eb17e6a5","Type":"ContainerStarted","Data":"ba94a825cfb36ee16c3e15907274f9276083ba448d310d471374f19c54cc116c"} Mar 13 14:21:41 crc kubenswrapper[4898]: I0313 14:21:41.719947 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"86c6c495-884b-4c92-949f-0159eb17e6a5","Type":"ContainerStarted","Data":"cea1936f2758016544cbefa24e4ca686c3e33acfdaf019898c501a90320d0242"} Mar 13 14:21:41 crc kubenswrapper[4898]: I0313 14:21:41.812254 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da9c8289-b4cc-4259-a94e-fab15f437c67" path="/var/lib/kubelet/pods/da9c8289-b4cc-4259-a94e-fab15f437c67/volumes" Mar 13 14:21:41 crc kubenswrapper[4898]: I0313 14:21:41.813620 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-8495ffcdcc-j7d29"] Mar 13 14:21:43 
crc kubenswrapper[4898]: I0313 14:21:43.328302 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-ztp6c" Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.406932 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/193b05da-acb9-4512-a2ae-6c03450e6f05-config-data\") pod \"193b05da-acb9-4512-a2ae-6c03450e6f05\" (UID: \"193b05da-acb9-4512-a2ae-6c03450e6f05\") " Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.407019 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/193b05da-acb9-4512-a2ae-6c03450e6f05-db-sync-config-data\") pod \"193b05da-acb9-4512-a2ae-6c03450e6f05\" (UID: \"193b05da-acb9-4512-a2ae-6c03450e6f05\") " Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.407166 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/193b05da-acb9-4512-a2ae-6c03450e6f05-etc-machine-id\") pod \"193b05da-acb9-4512-a2ae-6c03450e6f05\" (UID: \"193b05da-acb9-4512-a2ae-6c03450e6f05\") " Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.407195 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/193b05da-acb9-4512-a2ae-6c03450e6f05-scripts\") pod \"193b05da-acb9-4512-a2ae-6c03450e6f05\" (UID: \"193b05da-acb9-4512-a2ae-6c03450e6f05\") " Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.407222 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/193b05da-acb9-4512-a2ae-6c03450e6f05-combined-ca-bundle\") pod \"193b05da-acb9-4512-a2ae-6c03450e6f05\" (UID: \"193b05da-acb9-4512-a2ae-6c03450e6f05\") " Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.407247 4898 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82hs7\" (UniqueName: \"kubernetes.io/projected/193b05da-acb9-4512-a2ae-6c03450e6f05-kube-api-access-82hs7\") pod \"193b05da-acb9-4512-a2ae-6c03450e6f05\" (UID: \"193b05da-acb9-4512-a2ae-6c03450e6f05\") " Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.407893 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/193b05da-acb9-4512-a2ae-6c03450e6f05-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "193b05da-acb9-4512-a2ae-6c03450e6f05" (UID: "193b05da-acb9-4512-a2ae-6c03450e6f05"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.415067 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/193b05da-acb9-4512-a2ae-6c03450e6f05-scripts" (OuterVolumeSpecName: "scripts") pod "193b05da-acb9-4512-a2ae-6c03450e6f05" (UID: "193b05da-acb9-4512-a2ae-6c03450e6f05"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.419028 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/193b05da-acb9-4512-a2ae-6c03450e6f05-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "193b05da-acb9-4512-a2ae-6c03450e6f05" (UID: "193b05da-acb9-4512-a2ae-6c03450e6f05"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.428160 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/193b05da-acb9-4512-a2ae-6c03450e6f05-kube-api-access-82hs7" (OuterVolumeSpecName: "kube-api-access-82hs7") pod "193b05da-acb9-4512-a2ae-6c03450e6f05" (UID: "193b05da-acb9-4512-a2ae-6c03450e6f05"). 
InnerVolumeSpecName "kube-api-access-82hs7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.465571 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/193b05da-acb9-4512-a2ae-6c03450e6f05-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "193b05da-acb9-4512-a2ae-6c03450e6f05" (UID: "193b05da-acb9-4512-a2ae-6c03450e6f05"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.509779 4898 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/193b05da-acb9-4512-a2ae-6c03450e6f05-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.510133 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/193b05da-acb9-4512-a2ae-6c03450e6f05-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.510145 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/193b05da-acb9-4512-a2ae-6c03450e6f05-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.510154 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82hs7\" (UniqueName: \"kubernetes.io/projected/193b05da-acb9-4512-a2ae-6c03450e6f05-kube-api-access-82hs7\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.510165 4898 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/193b05da-acb9-4512-a2ae-6c03450e6f05-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.525025 4898 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/193b05da-acb9-4512-a2ae-6c03450e6f05-config-data" (OuterVolumeSpecName: "config-data") pod "193b05da-acb9-4512-a2ae-6c03450e6f05" (UID: "193b05da-acb9-4512-a2ae-6c03450e6f05"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.612667 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/193b05da-acb9-4512-a2ae-6c03450e6f05-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.762176 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="194cc0b9-5fb1-492c-9df1-002f629cfb90" path="/var/lib/kubelet/pods/194cc0b9-5fb1-492c-9df1-002f629cfb90/volumes" Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.766841 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"86c6c495-884b-4c92-949f-0159eb17e6a5","Type":"ContainerStarted","Data":"a55576c9a44e83505bf8757afc0e1e19424b4717e80f08c508a794c81f2cfdb0"} Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.769009 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.771691 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-ztp6c" event={"ID":"193b05da-acb9-4512-a2ae-6c03450e6f05","Type":"ContainerDied","Data":"bfe3b6cf0e5928312929ea860aeb7b7f643553f3479a3beac4f364f3ff4502ae"} Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.771722 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bfe3b6cf0e5928312929ea860aeb7b7f643553f3479a3beac4f364f3ff4502ae" Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.771776 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-ztp6c" Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.803456 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.017630371 podStartE2EDuration="7.803431187s" podCreationTimestamp="2026-03-13 14:21:36 +0000 UTC" firstStartedPulling="2026-03-13 14:21:37.372052012 +0000 UTC m=+1532.373640251" lastFinishedPulling="2026-03-13 14:21:43.157852838 +0000 UTC m=+1538.159441067" observedRunningTime="2026-03-13 14:21:43.797346459 +0000 UTC m=+1538.798934708" watchObservedRunningTime="2026-03-13 14:21:43.803431187 +0000 UTC m=+1538.805019426" Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.949433 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 14:21:43 crc kubenswrapper[4898]: E0313 14:21:43.950219 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="194cc0b9-5fb1-492c-9df1-002f629cfb90" containerName="neutron-api" Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.950246 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="194cc0b9-5fb1-492c-9df1-002f629cfb90" containerName="neutron-api" Mar 13 14:21:43 crc kubenswrapper[4898]: E0313 14:21:43.950268 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84a7fd24-4320-4c0e-8ded-0d455252a549" containerName="heat-db-sync" Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.950279 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="84a7fd24-4320-4c0e-8ded-0d455252a549" containerName="heat-db-sync" Mar 13 14:21:43 crc kubenswrapper[4898]: E0313 14:21:43.950307 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="194cc0b9-5fb1-492c-9df1-002f629cfb90" containerName="neutron-httpd" Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.950314 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="194cc0b9-5fb1-492c-9df1-002f629cfb90" containerName="neutron-httpd" Mar 
13 14:21:43 crc kubenswrapper[4898]: E0313 14:21:43.950340 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da9c8289-b4cc-4259-a94e-fab15f437c67" containerName="init" Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.950359 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="da9c8289-b4cc-4259-a94e-fab15f437c67" containerName="init" Mar 13 14:21:43 crc kubenswrapper[4898]: E0313 14:21:43.950382 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="193b05da-acb9-4512-a2ae-6c03450e6f05" containerName="cinder-db-sync" Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.950389 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="193b05da-acb9-4512-a2ae-6c03450e6f05" containerName="cinder-db-sync" Mar 13 14:21:43 crc kubenswrapper[4898]: E0313 14:21:43.950407 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da9c8289-b4cc-4259-a94e-fab15f437c67" containerName="dnsmasq-dns" Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.950414 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="da9c8289-b4cc-4259-a94e-fab15f437c67" containerName="dnsmasq-dns" Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.950708 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="194cc0b9-5fb1-492c-9df1-002f629cfb90" containerName="neutron-api" Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.950734 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="84a7fd24-4320-4c0e-8ded-0d455252a549" containerName="heat-db-sync" Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.950747 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="194cc0b9-5fb1-492c-9df1-002f629cfb90" containerName="neutron-httpd" Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.950767 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="da9c8289-b4cc-4259-a94e-fab15f437c67" containerName="dnsmasq-dns" Mar 13 14:21:43 crc kubenswrapper[4898]: 
I0313 14:21:43.950779 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="193b05da-acb9-4512-a2ae-6c03450e6f05" containerName="cinder-db-sync" Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.952514 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.956018 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-dcn2n" Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.956314 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.956502 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.959697 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.988198 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.028007 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0be2003-4a0d-4740-9b84-ab16bb27d5bb-scripts\") pod \"cinder-scheduler-0\" (UID: \"d0be2003-4a0d-4740-9b84-ab16bb27d5bb\") " pod="openstack/cinder-scheduler-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.028142 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d0be2003-4a0d-4740-9b84-ab16bb27d5bb-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d0be2003-4a0d-4740-9b84-ab16bb27d5bb\") " pod="openstack/cinder-scheduler-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.028199 4898 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0be2003-4a0d-4740-9b84-ab16bb27d5bb-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d0be2003-4a0d-4740-9b84-ab16bb27d5bb\") " pod="openstack/cinder-scheduler-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.028232 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d0be2003-4a0d-4740-9b84-ab16bb27d5bb-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d0be2003-4a0d-4740-9b84-ab16bb27d5bb\") " pod="openstack/cinder-scheduler-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.028339 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wbnw\" (UniqueName: \"kubernetes.io/projected/d0be2003-4a0d-4740-9b84-ab16bb27d5bb-kube-api-access-8wbnw\") pod \"cinder-scheduler-0\" (UID: \"d0be2003-4a0d-4740-9b84-ab16bb27d5bb\") " pod="openstack/cinder-scheduler-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.028370 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0be2003-4a0d-4740-9b84-ab16bb27d5bb-config-data\") pod \"cinder-scheduler-0\" (UID: \"d0be2003-4a0d-4740-9b84-ab16bb27d5bb\") " pod="openstack/cinder-scheduler-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.047108 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-gtnnh"] Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.049988 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-gtnnh" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.083697 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-gtnnh"] Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.173249 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d0be2003-4a0d-4740-9b84-ab16bb27d5bb-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d0be2003-4a0d-4740-9b84-ab16bb27d5bb\") " pod="openstack/cinder-scheduler-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.174385 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99ea68d3-f555-4779-90d0-d1f136ddadd2-config\") pod \"dnsmasq-dns-5c9776ccc5-gtnnh\" (UID: \"99ea68d3-f555-4779-90d0-d1f136ddadd2\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gtnnh" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.174670 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99ea68d3-f555-4779-90d0-d1f136ddadd2-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-gtnnh\" (UID: \"99ea68d3-f555-4779-90d0-d1f136ddadd2\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gtnnh" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.174732 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wbnw\" (UniqueName: \"kubernetes.io/projected/d0be2003-4a0d-4740-9b84-ab16bb27d5bb-kube-api-access-8wbnw\") pod \"cinder-scheduler-0\" (UID: \"d0be2003-4a0d-4740-9b84-ab16bb27d5bb\") " pod="openstack/cinder-scheduler-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.174765 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d0be2003-4a0d-4740-9b84-ab16bb27d5bb-config-data\") pod \"cinder-scheduler-0\" (UID: \"d0be2003-4a0d-4740-9b84-ab16bb27d5bb\") " pod="openstack/cinder-scheduler-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.175007 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0be2003-4a0d-4740-9b84-ab16bb27d5bb-scripts\") pod \"cinder-scheduler-0\" (UID: \"d0be2003-4a0d-4740-9b84-ab16bb27d5bb\") " pod="openstack/cinder-scheduler-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.175104 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktnr4\" (UniqueName: \"kubernetes.io/projected/99ea68d3-f555-4779-90d0-d1f136ddadd2-kube-api-access-ktnr4\") pod \"dnsmasq-dns-5c9776ccc5-gtnnh\" (UID: \"99ea68d3-f555-4779-90d0-d1f136ddadd2\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gtnnh" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.175188 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/99ea68d3-f555-4779-90d0-d1f136ddadd2-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-gtnnh\" (UID: \"99ea68d3-f555-4779-90d0-d1f136ddadd2\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gtnnh" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.175293 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d0be2003-4a0d-4740-9b84-ab16bb27d5bb-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d0be2003-4a0d-4740-9b84-ab16bb27d5bb\") " pod="openstack/cinder-scheduler-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.175412 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/99ea68d3-f555-4779-90d0-d1f136ddadd2-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-gtnnh\" (UID: \"99ea68d3-f555-4779-90d0-d1f136ddadd2\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gtnnh" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.175454 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0be2003-4a0d-4740-9b84-ab16bb27d5bb-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d0be2003-4a0d-4740-9b84-ab16bb27d5bb\") " pod="openstack/cinder-scheduler-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.175491 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/99ea68d3-f555-4779-90d0-d1f136ddadd2-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-gtnnh\" (UID: \"99ea68d3-f555-4779-90d0-d1f136ddadd2\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gtnnh" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.176029 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d0be2003-4a0d-4740-9b84-ab16bb27d5bb-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d0be2003-4a0d-4740-9b84-ab16bb27d5bb\") " pod="openstack/cinder-scheduler-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.218072 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d0be2003-4a0d-4740-9b84-ab16bb27d5bb-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d0be2003-4a0d-4740-9b84-ab16bb27d5bb\") " pod="openstack/cinder-scheduler-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.223785 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0be2003-4a0d-4740-9b84-ab16bb27d5bb-scripts\") pod \"cinder-scheduler-0\" (UID: 
\"d0be2003-4a0d-4740-9b84-ab16bb27d5bb\") " pod="openstack/cinder-scheduler-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.254807 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wbnw\" (UniqueName: \"kubernetes.io/projected/d0be2003-4a0d-4740-9b84-ab16bb27d5bb-kube-api-access-8wbnw\") pod \"cinder-scheduler-0\" (UID: \"d0be2003-4a0d-4740-9b84-ab16bb27d5bb\") " pod="openstack/cinder-scheduler-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.255573 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0be2003-4a0d-4740-9b84-ab16bb27d5bb-config-data\") pod \"cinder-scheduler-0\" (UID: \"d0be2003-4a0d-4740-9b84-ab16bb27d5bb\") " pod="openstack/cinder-scheduler-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.286824 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0be2003-4a0d-4740-9b84-ab16bb27d5bb-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d0be2003-4a0d-4740-9b84-ab16bb27d5bb\") " pod="openstack/cinder-scheduler-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.323749 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/99ea68d3-f555-4779-90d0-d1f136ddadd2-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-gtnnh\" (UID: \"99ea68d3-f555-4779-90d0-d1f136ddadd2\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gtnnh" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.323872 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/99ea68d3-f555-4779-90d0-d1f136ddadd2-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-gtnnh\" (UID: \"99ea68d3-f555-4779-90d0-d1f136ddadd2\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gtnnh" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 
14:21:44.324001 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99ea68d3-f555-4779-90d0-d1f136ddadd2-config\") pod \"dnsmasq-dns-5c9776ccc5-gtnnh\" (UID: \"99ea68d3-f555-4779-90d0-d1f136ddadd2\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gtnnh" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.324342 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99ea68d3-f555-4779-90d0-d1f136ddadd2-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-gtnnh\" (UID: \"99ea68d3-f555-4779-90d0-d1f136ddadd2\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gtnnh" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.324714 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktnr4\" (UniqueName: \"kubernetes.io/projected/99ea68d3-f555-4779-90d0-d1f136ddadd2-kube-api-access-ktnr4\") pod \"dnsmasq-dns-5c9776ccc5-gtnnh\" (UID: \"99ea68d3-f555-4779-90d0-d1f136ddadd2\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gtnnh" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.324814 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/99ea68d3-f555-4779-90d0-d1f136ddadd2-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-gtnnh\" (UID: \"99ea68d3-f555-4779-90d0-d1f136ddadd2\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gtnnh" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.327656 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/99ea68d3-f555-4779-90d0-d1f136ddadd2-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-gtnnh\" (UID: \"99ea68d3-f555-4779-90d0-d1f136ddadd2\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gtnnh" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.328127 4898 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/99ea68d3-f555-4779-90d0-d1f136ddadd2-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-gtnnh\" (UID: \"99ea68d3-f555-4779-90d0-d1f136ddadd2\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gtnnh" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.329226 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99ea68d3-f555-4779-90d0-d1f136ddadd2-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-gtnnh\" (UID: \"99ea68d3-f555-4779-90d0-d1f136ddadd2\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gtnnh" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.330090 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/99ea68d3-f555-4779-90d0-d1f136ddadd2-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-gtnnh\" (UID: \"99ea68d3-f555-4779-90d0-d1f136ddadd2\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gtnnh" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.330470 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99ea68d3-f555-4779-90d0-d1f136ddadd2-config\") pod \"dnsmasq-dns-5c9776ccc5-gtnnh\" (UID: \"99ea68d3-f555-4779-90d0-d1f136ddadd2\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gtnnh" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.363772 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktnr4\" (UniqueName: \"kubernetes.io/projected/99ea68d3-f555-4779-90d0-d1f136ddadd2-kube-api-access-ktnr4\") pod \"dnsmasq-dns-5c9776ccc5-gtnnh\" (UID: \"99ea68d3-f555-4779-90d0-d1f136ddadd2\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gtnnh" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.415137 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.417016 4898 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.422284 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.445795 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-gtnnh" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.500068 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.552551 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb7c4601-9945-444b-8a00-a671ce18bb1e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"cb7c4601-9945-444b-8a00-a671ce18bb1e\") " pod="openstack/cinder-api-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.552690 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cb7c4601-9945-444b-8a00-a671ce18bb1e-config-data-custom\") pod \"cinder-api-0\" (UID: \"cb7c4601-9945-444b-8a00-a671ce18bb1e\") " pod="openstack/cinder-api-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.552801 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb7c4601-9945-444b-8a00-a671ce18bb1e-config-data\") pod \"cinder-api-0\" (UID: \"cb7c4601-9945-444b-8a00-a671ce18bb1e\") " pod="openstack/cinder-api-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.552835 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb7c4601-9945-444b-8a00-a671ce18bb1e-scripts\") pod 
\"cinder-api-0\" (UID: \"cb7c4601-9945-444b-8a00-a671ce18bb1e\") " pod="openstack/cinder-api-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.552883 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwbrx\" (UniqueName: \"kubernetes.io/projected/cb7c4601-9945-444b-8a00-a671ce18bb1e-kube-api-access-zwbrx\") pod \"cinder-api-0\" (UID: \"cb7c4601-9945-444b-8a00-a671ce18bb1e\") " pod="openstack/cinder-api-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.552966 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cb7c4601-9945-444b-8a00-a671ce18bb1e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"cb7c4601-9945-444b-8a00-a671ce18bb1e\") " pod="openstack/cinder-api-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.552992 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb7c4601-9945-444b-8a00-a671ce18bb1e-logs\") pod \"cinder-api-0\" (UID: \"cb7c4601-9945-444b-8a00-a671ce18bb1e\") " pod="openstack/cinder-api-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.583434 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.656174 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwbrx\" (UniqueName: \"kubernetes.io/projected/cb7c4601-9945-444b-8a00-a671ce18bb1e-kube-api-access-zwbrx\") pod \"cinder-api-0\" (UID: \"cb7c4601-9945-444b-8a00-a671ce18bb1e\") " pod="openstack/cinder-api-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.656291 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cb7c4601-9945-444b-8a00-a671ce18bb1e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"cb7c4601-9945-444b-8a00-a671ce18bb1e\") " pod="openstack/cinder-api-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.656325 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb7c4601-9945-444b-8a00-a671ce18bb1e-logs\") pod \"cinder-api-0\" (UID: \"cb7c4601-9945-444b-8a00-a671ce18bb1e\") " pod="openstack/cinder-api-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.656416 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb7c4601-9945-444b-8a00-a671ce18bb1e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"cb7c4601-9945-444b-8a00-a671ce18bb1e\") " pod="openstack/cinder-api-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.656482 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cb7c4601-9945-444b-8a00-a671ce18bb1e-config-data-custom\") pod \"cinder-api-0\" (UID: \"cb7c4601-9945-444b-8a00-a671ce18bb1e\") " pod="openstack/cinder-api-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.656587 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/cb7c4601-9945-444b-8a00-a671ce18bb1e-config-data\") pod \"cinder-api-0\" (UID: \"cb7c4601-9945-444b-8a00-a671ce18bb1e\") " pod="openstack/cinder-api-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.656611 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb7c4601-9945-444b-8a00-a671ce18bb1e-scripts\") pod \"cinder-api-0\" (UID: \"cb7c4601-9945-444b-8a00-a671ce18bb1e\") " pod="openstack/cinder-api-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.657918 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb7c4601-9945-444b-8a00-a671ce18bb1e-logs\") pod \"cinder-api-0\" (UID: \"cb7c4601-9945-444b-8a00-a671ce18bb1e\") " pod="openstack/cinder-api-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.658255 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cb7c4601-9945-444b-8a00-a671ce18bb1e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"cb7c4601-9945-444b-8a00-a671ce18bb1e\") " pod="openstack/cinder-api-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.667295 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb7c4601-9945-444b-8a00-a671ce18bb1e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"cb7c4601-9945-444b-8a00-a671ce18bb1e\") " pod="openstack/cinder-api-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.667363 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb7c4601-9945-444b-8a00-a671ce18bb1e-scripts\") pod \"cinder-api-0\" (UID: \"cb7c4601-9945-444b-8a00-a671ce18bb1e\") " pod="openstack/cinder-api-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.668870 4898 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb7c4601-9945-444b-8a00-a671ce18bb1e-config-data\") pod \"cinder-api-0\" (UID: \"cb7c4601-9945-444b-8a00-a671ce18bb1e\") " pod="openstack/cinder-api-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.670572 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cb7c4601-9945-444b-8a00-a671ce18bb1e-config-data-custom\") pod \"cinder-api-0\" (UID: \"cb7c4601-9945-444b-8a00-a671ce18bb1e\") " pod="openstack/cinder-api-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.711708 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwbrx\" (UniqueName: \"kubernetes.io/projected/cb7c4601-9945-444b-8a00-a671ce18bb1e-kube-api-access-zwbrx\") pod \"cinder-api-0\" (UID: \"cb7c4601-9945-444b-8a00-a671ce18bb1e\") " pod="openstack/cinder-api-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.799447 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 13 14:21:45 crc kubenswrapper[4898]: I0313 14:21:45.040217 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-d74d977fd-v5m5s" podUID="7dcea9de-db8a-42dd-958c-59df43a49ff3" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.211:9311/healthcheck\": read tcp 10.217.0.2:52740->10.217.0.211:9311: read: connection reset by peer" Mar 13 14:21:45 crc kubenswrapper[4898]: I0313 14:21:45.040418 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-d74d977fd-v5m5s" podUID="7dcea9de-db8a-42dd-958c-59df43a49ff3" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.211:9311/healthcheck\": read tcp 10.217.0.2:52756->10.217.0.211:9311: read: connection reset by peer" Mar 13 14:21:45 crc kubenswrapper[4898]: I0313 14:21:45.251427 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-gtnnh"] Mar 13 14:21:45 crc kubenswrapper[4898]: I0313 14:21:45.439169 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 14:21:45 crc kubenswrapper[4898]: W0313 14:21:45.546749 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0be2003_4a0d_4740_9b84_ab16bb27d5bb.slice/crio-519cfe6e0c33250225df4155f05016fa2ba7c8a1bb229003e79f90a22978c180 WatchSource:0}: Error finding container 519cfe6e0c33250225df4155f05016fa2ba7c8a1bb229003e79f90a22978c180: Status 404 returned error can't find the container with id 519cfe6e0c33250225df4155f05016fa2ba7c8a1bb229003e79f90a22978c180 Mar 13 14:21:45 crc kubenswrapper[4898]: I0313 14:21:45.868158 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-d74d977fd-v5m5s" Mar 13 14:21:45 crc kubenswrapper[4898]: I0313 14:21:45.891085 4898 generic.go:334] "Generic (PLEG): container finished" podID="7dcea9de-db8a-42dd-958c-59df43a49ff3" containerID="e2a10450307b6355906b72fd4b0a882c5720f8e92bcad91beb1384ffe656972d" exitCode=0 Mar 13 14:21:45 crc kubenswrapper[4898]: I0313 14:21:45.891186 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-d74d977fd-v5m5s" Mar 13 14:21:45 crc kubenswrapper[4898]: I0313 14:21:45.891212 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d74d977fd-v5m5s" event={"ID":"7dcea9de-db8a-42dd-958c-59df43a49ff3","Type":"ContainerDied","Data":"e2a10450307b6355906b72fd4b0a882c5720f8e92bcad91beb1384ffe656972d"} Mar 13 14:21:45 crc kubenswrapper[4898]: I0313 14:21:45.892208 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d74d977fd-v5m5s" event={"ID":"7dcea9de-db8a-42dd-958c-59df43a49ff3","Type":"ContainerDied","Data":"704f9e3ccfefb3b8a00bd2333c1274e405dc0d43a2c31a48365c27cab56dfc29"} Mar 13 14:21:45 crc kubenswrapper[4898]: I0313 14:21:45.892234 4898 scope.go:117] "RemoveContainer" containerID="e2a10450307b6355906b72fd4b0a882c5720f8e92bcad91beb1384ffe656972d" Mar 13 14:21:45 crc kubenswrapper[4898]: I0313 14:21:45.905865 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 13 14:21:45 crc kubenswrapper[4898]: I0313 14:21:45.907541 4898 generic.go:334] "Generic (PLEG): container finished" podID="99ea68d3-f555-4779-90d0-d1f136ddadd2" containerID="6c42ee9c0a17acfdf5d9f3b6de5ee36bb640854b185b6dd5e7f1e7441cc93008" exitCode=0 Mar 13 14:21:45 crc kubenswrapper[4898]: I0313 14:21:45.907629 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-gtnnh" 
event={"ID":"99ea68d3-f555-4779-90d0-d1f136ddadd2","Type":"ContainerDied","Data":"6c42ee9c0a17acfdf5d9f3b6de5ee36bb640854b185b6dd5e7f1e7441cc93008"} Mar 13 14:21:45 crc kubenswrapper[4898]: I0313 14:21:45.907655 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-gtnnh" event={"ID":"99ea68d3-f555-4779-90d0-d1f136ddadd2","Type":"ContainerStarted","Data":"2f6cf6b2237006a47af92c80edb293fb5e39aa92cbe683d435727b4ad4952d2e"} Mar 13 14:21:45 crc kubenswrapper[4898]: I0313 14:21:45.911640 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d0be2003-4a0d-4740-9b84-ab16bb27d5bb","Type":"ContainerStarted","Data":"519cfe6e0c33250225df4155f05016fa2ba7c8a1bb229003e79f90a22978c180"} Mar 13 14:21:45 crc kubenswrapper[4898]: I0313 14:21:45.928856 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7dcea9de-db8a-42dd-958c-59df43a49ff3-logs\") pod \"7dcea9de-db8a-42dd-958c-59df43a49ff3\" (UID: \"7dcea9de-db8a-42dd-958c-59df43a49ff3\") " Mar 13 14:21:45 crc kubenswrapper[4898]: I0313 14:21:45.929213 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7dcea9de-db8a-42dd-958c-59df43a49ff3-config-data-custom\") pod \"7dcea9de-db8a-42dd-958c-59df43a49ff3\" (UID: \"7dcea9de-db8a-42dd-958c-59df43a49ff3\") " Mar 13 14:21:45 crc kubenswrapper[4898]: I0313 14:21:45.929268 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dcea9de-db8a-42dd-958c-59df43a49ff3-config-data\") pod \"7dcea9de-db8a-42dd-958c-59df43a49ff3\" (UID: \"7dcea9de-db8a-42dd-958c-59df43a49ff3\") " Mar 13 14:21:45 crc kubenswrapper[4898]: I0313 14:21:45.929432 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt6km\" (UniqueName: 
\"kubernetes.io/projected/7dcea9de-db8a-42dd-958c-59df43a49ff3-kube-api-access-vt6km\") pod \"7dcea9de-db8a-42dd-958c-59df43a49ff3\" (UID: \"7dcea9de-db8a-42dd-958c-59df43a49ff3\") " Mar 13 14:21:45 crc kubenswrapper[4898]: I0313 14:21:45.929559 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dcea9de-db8a-42dd-958c-59df43a49ff3-combined-ca-bundle\") pod \"7dcea9de-db8a-42dd-958c-59df43a49ff3\" (UID: \"7dcea9de-db8a-42dd-958c-59df43a49ff3\") " Mar 13 14:21:45 crc kubenswrapper[4898]: I0313 14:21:45.929804 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7dcea9de-db8a-42dd-958c-59df43a49ff3-logs" (OuterVolumeSpecName: "logs") pod "7dcea9de-db8a-42dd-958c-59df43a49ff3" (UID: "7dcea9de-db8a-42dd-958c-59df43a49ff3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:21:45 crc kubenswrapper[4898]: I0313 14:21:45.931527 4898 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7dcea9de-db8a-42dd-958c-59df43a49ff3-logs\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:45 crc kubenswrapper[4898]: I0313 14:21:45.950714 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7dcea9de-db8a-42dd-958c-59df43a49ff3-kube-api-access-vt6km" (OuterVolumeSpecName: "kube-api-access-vt6km") pod "7dcea9de-db8a-42dd-958c-59df43a49ff3" (UID: "7dcea9de-db8a-42dd-958c-59df43a49ff3"). InnerVolumeSpecName "kube-api-access-vt6km". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:21:45 crc kubenswrapper[4898]: I0313 14:21:45.966013 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dcea9de-db8a-42dd-958c-59df43a49ff3-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7dcea9de-db8a-42dd-958c-59df43a49ff3" (UID: "7dcea9de-db8a-42dd-958c-59df43a49ff3"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:21:45 crc kubenswrapper[4898]: I0313 14:21:45.969776 4898 scope.go:117] "RemoveContainer" containerID="8d6bd5023b5a1087811735b70ac1c3323bdc9e802f224e7ede20161093a84221" Mar 13 14:21:46 crc kubenswrapper[4898]: I0313 14:21:46.033669 4898 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7dcea9de-db8a-42dd-958c-59df43a49ff3-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:46 crc kubenswrapper[4898]: I0313 14:21:46.033699 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt6km\" (UniqueName: \"kubernetes.io/projected/7dcea9de-db8a-42dd-958c-59df43a49ff3-kube-api-access-vt6km\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:46 crc kubenswrapper[4898]: I0313 14:21:46.039770 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dcea9de-db8a-42dd-958c-59df43a49ff3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7dcea9de-db8a-42dd-958c-59df43a49ff3" (UID: "7dcea9de-db8a-42dd-958c-59df43a49ff3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:21:46 crc kubenswrapper[4898]: I0313 14:21:46.049161 4898 scope.go:117] "RemoveContainer" containerID="e2a10450307b6355906b72fd4b0a882c5720f8e92bcad91beb1384ffe656972d" Mar 13 14:21:46 crc kubenswrapper[4898]: E0313 14:21:46.055173 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2a10450307b6355906b72fd4b0a882c5720f8e92bcad91beb1384ffe656972d\": container with ID starting with e2a10450307b6355906b72fd4b0a882c5720f8e92bcad91beb1384ffe656972d not found: ID does not exist" containerID="e2a10450307b6355906b72fd4b0a882c5720f8e92bcad91beb1384ffe656972d" Mar 13 14:21:46 crc kubenswrapper[4898]: I0313 14:21:46.055240 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2a10450307b6355906b72fd4b0a882c5720f8e92bcad91beb1384ffe656972d"} err="failed to get container status \"e2a10450307b6355906b72fd4b0a882c5720f8e92bcad91beb1384ffe656972d\": rpc error: code = NotFound desc = could not find container \"e2a10450307b6355906b72fd4b0a882c5720f8e92bcad91beb1384ffe656972d\": container with ID starting with e2a10450307b6355906b72fd4b0a882c5720f8e92bcad91beb1384ffe656972d not found: ID does not exist" Mar 13 14:21:46 crc kubenswrapper[4898]: I0313 14:21:46.055270 4898 scope.go:117] "RemoveContainer" containerID="8d6bd5023b5a1087811735b70ac1c3323bdc9e802f224e7ede20161093a84221" Mar 13 14:21:46 crc kubenswrapper[4898]: E0313 14:21:46.060184 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d6bd5023b5a1087811735b70ac1c3323bdc9e802f224e7ede20161093a84221\": container with ID starting with 8d6bd5023b5a1087811735b70ac1c3323bdc9e802f224e7ede20161093a84221 not found: ID does not exist" containerID="8d6bd5023b5a1087811735b70ac1c3323bdc9e802f224e7ede20161093a84221" Mar 13 14:21:46 crc kubenswrapper[4898]: I0313 14:21:46.060234 
4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d6bd5023b5a1087811735b70ac1c3323bdc9e802f224e7ede20161093a84221"} err="failed to get container status \"8d6bd5023b5a1087811735b70ac1c3323bdc9e802f224e7ede20161093a84221\": rpc error: code = NotFound desc = could not find container \"8d6bd5023b5a1087811735b70ac1c3323bdc9e802f224e7ede20161093a84221\": container with ID starting with 8d6bd5023b5a1087811735b70ac1c3323bdc9e802f224e7ede20161093a84221 not found: ID does not exist" Mar 13 14:21:46 crc kubenswrapper[4898]: I0313 14:21:46.070299 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dcea9de-db8a-42dd-958c-59df43a49ff3-config-data" (OuterVolumeSpecName: "config-data") pod "7dcea9de-db8a-42dd-958c-59df43a49ff3" (UID: "7dcea9de-db8a-42dd-958c-59df43a49ff3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:21:46 crc kubenswrapper[4898]: I0313 14:21:46.143374 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dcea9de-db8a-42dd-958c-59df43a49ff3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:46 crc kubenswrapper[4898]: I0313 14:21:46.143433 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dcea9de-db8a-42dd-958c-59df43a49ff3-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:46 crc kubenswrapper[4898]: I0313 14:21:46.312576 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-d74d977fd-v5m5s"] Mar 13 14:21:46 crc kubenswrapper[4898]: I0313 14:21:46.328182 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-d74d977fd-v5m5s"] Mar 13 14:21:46 crc kubenswrapper[4898]: I0313 14:21:46.968199 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 13 14:21:46 crc 
kubenswrapper[4898]: I0313 14:21:46.968774 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-gtnnh" event={"ID":"99ea68d3-f555-4779-90d0-d1f136ddadd2","Type":"ContainerStarted","Data":"4a5c75faeafd5fd73d57b281c113bb58d89f329bfae70b866f45093f1de113f3"} Mar 13 14:21:46 crc kubenswrapper[4898]: I0313 14:21:46.969776 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9776ccc5-gtnnh" Mar 13 14:21:46 crc kubenswrapper[4898]: I0313 14:21:46.979079 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"cb7c4601-9945-444b-8a00-a671ce18bb1e","Type":"ContainerStarted","Data":"b5239aec37b0d4b7e60e804b0a75fb330f40a9a58da7e482e13786d0363b1b93"} Mar 13 14:21:47 crc kubenswrapper[4898]: I0313 14:21:47.756102 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7dcea9de-db8a-42dd-958c-59df43a49ff3" path="/var/lib/kubelet/pods/7dcea9de-db8a-42dd-958c-59df43a49ff3/volumes" Mar 13 14:21:47 crc kubenswrapper[4898]: I0313 14:21:47.831891 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5bf5d8b7d4-4gwxr" Mar 13 14:21:47 crc kubenswrapper[4898]: I0313 14:21:47.858757 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c9776ccc5-gtnnh" podStartSLOduration=4.858738161 podStartE2EDuration="4.858738161s" podCreationTimestamp="2026-03-13 14:21:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:21:47.008384696 +0000 UTC m=+1542.009972935" watchObservedRunningTime="2026-03-13 14:21:47.858738161 +0000 UTC m=+1542.860326400" Mar 13 14:21:47 crc kubenswrapper[4898]: I0313 14:21:47.960826 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5bf5d8b7d4-4gwxr" Mar 13 14:21:48 crc kubenswrapper[4898]: I0313 
14:21:48.055256 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"cb7c4601-9945-444b-8a00-a671ce18bb1e","Type":"ContainerStarted","Data":"d8870881d2d90e176629bb217b8c041bc075048a36f864f12b6235e672810e1c"} Mar 13 14:21:48 crc kubenswrapper[4898]: I0313 14:21:48.256653 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-647f998784-xvcjw"] Mar 13 14:21:48 crc kubenswrapper[4898]: E0313 14:21:48.257167 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dcea9de-db8a-42dd-958c-59df43a49ff3" containerName="barbican-api" Mar 13 14:21:48 crc kubenswrapper[4898]: I0313 14:21:48.257181 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dcea9de-db8a-42dd-958c-59df43a49ff3" containerName="barbican-api" Mar 13 14:21:48 crc kubenswrapper[4898]: E0313 14:21:48.257206 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dcea9de-db8a-42dd-958c-59df43a49ff3" containerName="barbican-api-log" Mar 13 14:21:48 crc kubenswrapper[4898]: I0313 14:21:48.257212 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dcea9de-db8a-42dd-958c-59df43a49ff3" containerName="barbican-api-log" Mar 13 14:21:48 crc kubenswrapper[4898]: I0313 14:21:48.257415 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="7dcea9de-db8a-42dd-958c-59df43a49ff3" containerName="barbican-api" Mar 13 14:21:48 crc kubenswrapper[4898]: I0313 14:21:48.257441 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="7dcea9de-db8a-42dd-958c-59df43a49ff3" containerName="barbican-api-log" Mar 13 14:21:48 crc kubenswrapper[4898]: I0313 14:21:48.259581 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-647f998784-xvcjw" Mar 13 14:21:48 crc kubenswrapper[4898]: I0313 14:21:48.281562 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-647f998784-xvcjw"] Mar 13 14:21:48 crc kubenswrapper[4898]: I0313 14:21:48.453469 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa7825b5-b19b-44bb-8d23-bb121e669780-scripts\") pod \"placement-647f998784-xvcjw\" (UID: \"fa7825b5-b19b-44bb-8d23-bb121e669780\") " pod="openstack/placement-647f998784-xvcjw" Mar 13 14:21:48 crc kubenswrapper[4898]: I0313 14:21:48.453766 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa7825b5-b19b-44bb-8d23-bb121e669780-logs\") pod \"placement-647f998784-xvcjw\" (UID: \"fa7825b5-b19b-44bb-8d23-bb121e669780\") " pod="openstack/placement-647f998784-xvcjw" Mar 13 14:21:48 crc kubenswrapper[4898]: I0313 14:21:48.453814 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ms92j\" (UniqueName: \"kubernetes.io/projected/fa7825b5-b19b-44bb-8d23-bb121e669780-kube-api-access-ms92j\") pod \"placement-647f998784-xvcjw\" (UID: \"fa7825b5-b19b-44bb-8d23-bb121e669780\") " pod="openstack/placement-647f998784-xvcjw" Mar 13 14:21:48 crc kubenswrapper[4898]: I0313 14:21:48.453966 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa7825b5-b19b-44bb-8d23-bb121e669780-internal-tls-certs\") pod \"placement-647f998784-xvcjw\" (UID: \"fa7825b5-b19b-44bb-8d23-bb121e669780\") " pod="openstack/placement-647f998784-xvcjw" Mar 13 14:21:48 crc kubenswrapper[4898]: I0313 14:21:48.454246 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/fa7825b5-b19b-44bb-8d23-bb121e669780-config-data\") pod \"placement-647f998784-xvcjw\" (UID: \"fa7825b5-b19b-44bb-8d23-bb121e669780\") " pod="openstack/placement-647f998784-xvcjw" Mar 13 14:21:48 crc kubenswrapper[4898]: I0313 14:21:48.454333 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa7825b5-b19b-44bb-8d23-bb121e669780-combined-ca-bundle\") pod \"placement-647f998784-xvcjw\" (UID: \"fa7825b5-b19b-44bb-8d23-bb121e669780\") " pod="openstack/placement-647f998784-xvcjw" Mar 13 14:21:48 crc kubenswrapper[4898]: I0313 14:21:48.454411 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa7825b5-b19b-44bb-8d23-bb121e669780-public-tls-certs\") pod \"placement-647f998784-xvcjw\" (UID: \"fa7825b5-b19b-44bb-8d23-bb121e669780\") " pod="openstack/placement-647f998784-xvcjw" Mar 13 14:21:48 crc kubenswrapper[4898]: I0313 14:21:48.557268 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa7825b5-b19b-44bb-8d23-bb121e669780-public-tls-certs\") pod \"placement-647f998784-xvcjw\" (UID: \"fa7825b5-b19b-44bb-8d23-bb121e669780\") " pod="openstack/placement-647f998784-xvcjw" Mar 13 14:21:48 crc kubenswrapper[4898]: I0313 14:21:48.558304 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa7825b5-b19b-44bb-8d23-bb121e669780-scripts\") pod \"placement-647f998784-xvcjw\" (UID: \"fa7825b5-b19b-44bb-8d23-bb121e669780\") " pod="openstack/placement-647f998784-xvcjw" Mar 13 14:21:48 crc kubenswrapper[4898]: I0313 14:21:48.558375 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/fa7825b5-b19b-44bb-8d23-bb121e669780-logs\") pod \"placement-647f998784-xvcjw\" (UID: \"fa7825b5-b19b-44bb-8d23-bb121e669780\") " pod="openstack/placement-647f998784-xvcjw" Mar 13 14:21:48 crc kubenswrapper[4898]: I0313 14:21:48.558820 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa7825b5-b19b-44bb-8d23-bb121e669780-logs\") pod \"placement-647f998784-xvcjw\" (UID: \"fa7825b5-b19b-44bb-8d23-bb121e669780\") " pod="openstack/placement-647f998784-xvcjw" Mar 13 14:21:48 crc kubenswrapper[4898]: I0313 14:21:48.558942 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ms92j\" (UniqueName: \"kubernetes.io/projected/fa7825b5-b19b-44bb-8d23-bb121e669780-kube-api-access-ms92j\") pod \"placement-647f998784-xvcjw\" (UID: \"fa7825b5-b19b-44bb-8d23-bb121e669780\") " pod="openstack/placement-647f998784-xvcjw" Mar 13 14:21:48 crc kubenswrapper[4898]: I0313 14:21:48.559119 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa7825b5-b19b-44bb-8d23-bb121e669780-internal-tls-certs\") pod \"placement-647f998784-xvcjw\" (UID: \"fa7825b5-b19b-44bb-8d23-bb121e669780\") " pod="openstack/placement-647f998784-xvcjw" Mar 13 14:21:48 crc kubenswrapper[4898]: I0313 14:21:48.559167 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa7825b5-b19b-44bb-8d23-bb121e669780-config-data\") pod \"placement-647f998784-xvcjw\" (UID: \"fa7825b5-b19b-44bb-8d23-bb121e669780\") " pod="openstack/placement-647f998784-xvcjw" Mar 13 14:21:48 crc kubenswrapper[4898]: I0313 14:21:48.559182 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa7825b5-b19b-44bb-8d23-bb121e669780-combined-ca-bundle\") pod 
\"placement-647f998784-xvcjw\" (UID: \"fa7825b5-b19b-44bb-8d23-bb121e669780\") " pod="openstack/placement-647f998784-xvcjw" Mar 13 14:21:48 crc kubenswrapper[4898]: I0313 14:21:48.564418 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa7825b5-b19b-44bb-8d23-bb121e669780-scripts\") pod \"placement-647f998784-xvcjw\" (UID: \"fa7825b5-b19b-44bb-8d23-bb121e669780\") " pod="openstack/placement-647f998784-xvcjw" Mar 13 14:21:48 crc kubenswrapper[4898]: I0313 14:21:48.564799 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa7825b5-b19b-44bb-8d23-bb121e669780-public-tls-certs\") pod \"placement-647f998784-xvcjw\" (UID: \"fa7825b5-b19b-44bb-8d23-bb121e669780\") " pod="openstack/placement-647f998784-xvcjw" Mar 13 14:21:48 crc kubenswrapper[4898]: I0313 14:21:48.567115 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa7825b5-b19b-44bb-8d23-bb121e669780-config-data\") pod \"placement-647f998784-xvcjw\" (UID: \"fa7825b5-b19b-44bb-8d23-bb121e669780\") " pod="openstack/placement-647f998784-xvcjw" Mar 13 14:21:48 crc kubenswrapper[4898]: I0313 14:21:48.571593 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa7825b5-b19b-44bb-8d23-bb121e669780-internal-tls-certs\") pod \"placement-647f998784-xvcjw\" (UID: \"fa7825b5-b19b-44bb-8d23-bb121e669780\") " pod="openstack/placement-647f998784-xvcjw" Mar 13 14:21:48 crc kubenswrapper[4898]: I0313 14:21:48.571672 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa7825b5-b19b-44bb-8d23-bb121e669780-combined-ca-bundle\") pod \"placement-647f998784-xvcjw\" (UID: \"fa7825b5-b19b-44bb-8d23-bb121e669780\") " pod="openstack/placement-647f998784-xvcjw" Mar 
13 14:21:48 crc kubenswrapper[4898]: I0313 14:21:48.590001 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ms92j\" (UniqueName: \"kubernetes.io/projected/fa7825b5-b19b-44bb-8d23-bb121e669780-kube-api-access-ms92j\") pod \"placement-647f998784-xvcjw\" (UID: \"fa7825b5-b19b-44bb-8d23-bb121e669780\") " pod="openstack/placement-647f998784-xvcjw" Mar 13 14:21:48 crc kubenswrapper[4898]: I0313 14:21:48.635148 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-647f998784-xvcjw" Mar 13 14:21:49 crc kubenswrapper[4898]: I0313 14:21:49.085435 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"cb7c4601-9945-444b-8a00-a671ce18bb1e","Type":"ContainerStarted","Data":"6da96ed31599ca832cee856f0c94ff97183486a2078689f1689d4f88f2dd2323"} Mar 13 14:21:49 crc kubenswrapper[4898]: I0313 14:21:49.085712 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="cb7c4601-9945-444b-8a00-a671ce18bb1e" containerName="cinder-api-log" containerID="cri-o://d8870881d2d90e176629bb217b8c041bc075048a36f864f12b6235e672810e1c" gracePeriod=30 Mar 13 14:21:49 crc kubenswrapper[4898]: I0313 14:21:49.086195 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="cb7c4601-9945-444b-8a00-a671ce18bb1e" containerName="cinder-api" containerID="cri-o://6da96ed31599ca832cee856f0c94ff97183486a2078689f1689d4f88f2dd2323" gracePeriod=30 Mar 13 14:21:49 crc kubenswrapper[4898]: I0313 14:21:49.086257 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 13 14:21:49 crc kubenswrapper[4898]: I0313 14:21:49.098790 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"d0be2003-4a0d-4740-9b84-ab16bb27d5bb","Type":"ContainerStarted","Data":"bafd55f18270e2838b59b970f12a6aafedd22b09a08b2d2305599aac961e6911"} Mar 13 14:21:49 crc kubenswrapper[4898]: I0313 14:21:49.098821 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d0be2003-4a0d-4740-9b84-ab16bb27d5bb","Type":"ContainerStarted","Data":"7f741df9b30455c96ea278c501c4d63ab1af4bf960776e0726cfee685819c6bd"} Mar 13 14:21:49 crc kubenswrapper[4898]: I0313 14:21:49.119697 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.119668624 podStartE2EDuration="5.119668624s" podCreationTimestamp="2026-03-13 14:21:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:21:49.113110913 +0000 UTC m=+1544.114699172" watchObservedRunningTime="2026-03-13 14:21:49.119668624 +0000 UTC m=+1544.121256863" Mar 13 14:21:49 crc kubenswrapper[4898]: I0313 14:21:49.139585 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 14:21:49 crc kubenswrapper[4898]: I0313 14:21:49.139651 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 14:21:49 crc kubenswrapper[4898]: I0313 14:21:49.163085 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=5.080350573 podStartE2EDuration="6.16305711s" 
podCreationTimestamp="2026-03-13 14:21:43 +0000 UTC" firstStartedPulling="2026-03-13 14:21:45.56477082 +0000 UTC m=+1540.566359059" lastFinishedPulling="2026-03-13 14:21:46.647477357 +0000 UTC m=+1541.649065596" observedRunningTime="2026-03-13 14:21:49.140479204 +0000 UTC m=+1544.142067463" watchObservedRunningTime="2026-03-13 14:21:49.16305711 +0000 UTC m=+1544.164645359" Mar 13 14:21:49 crc kubenswrapper[4898]: I0313 14:21:49.319729 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-647f998784-xvcjw"] Mar 13 14:21:49 crc kubenswrapper[4898]: I0313 14:21:49.584557 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.002370 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.110048 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb7c4601-9945-444b-8a00-a671ce18bb1e-combined-ca-bundle\") pod \"cb7c4601-9945-444b-8a00-a671ce18bb1e\" (UID: \"cb7c4601-9945-444b-8a00-a671ce18bb1e\") " Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.110092 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cb7c4601-9945-444b-8a00-a671ce18bb1e-config-data-custom\") pod \"cb7c4601-9945-444b-8a00-a671ce18bb1e\" (UID: \"cb7c4601-9945-444b-8a00-a671ce18bb1e\") " Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.110219 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb7c4601-9945-444b-8a00-a671ce18bb1e-scripts\") pod \"cb7c4601-9945-444b-8a00-a671ce18bb1e\" (UID: \"cb7c4601-9945-444b-8a00-a671ce18bb1e\") " Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 
14:21:50.110279 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb7c4601-9945-444b-8a00-a671ce18bb1e-config-data\") pod \"cb7c4601-9945-444b-8a00-a671ce18bb1e\" (UID: \"cb7c4601-9945-444b-8a00-a671ce18bb1e\") " Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.110514 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb7c4601-9945-444b-8a00-a671ce18bb1e-logs\") pod \"cb7c4601-9945-444b-8a00-a671ce18bb1e\" (UID: \"cb7c4601-9945-444b-8a00-a671ce18bb1e\") " Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.110539 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cb7c4601-9945-444b-8a00-a671ce18bb1e-etc-machine-id\") pod \"cb7c4601-9945-444b-8a00-a671ce18bb1e\" (UID: \"cb7c4601-9945-444b-8a00-a671ce18bb1e\") " Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.110556 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwbrx\" (UniqueName: \"kubernetes.io/projected/cb7c4601-9945-444b-8a00-a671ce18bb1e-kube-api-access-zwbrx\") pod \"cb7c4601-9945-444b-8a00-a671ce18bb1e\" (UID: \"cb7c4601-9945-444b-8a00-a671ce18bb1e\") " Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.113057 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb7c4601-9945-444b-8a00-a671ce18bb1e-logs" (OuterVolumeSpecName: "logs") pod "cb7c4601-9945-444b-8a00-a671ce18bb1e" (UID: "cb7c4601-9945-444b-8a00-a671ce18bb1e"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.113616 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cb7c4601-9945-444b-8a00-a671ce18bb1e-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "cb7c4601-9945-444b-8a00-a671ce18bb1e" (UID: "cb7c4601-9945-444b-8a00-a671ce18bb1e"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.118103 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb7c4601-9945-444b-8a00-a671ce18bb1e-kube-api-access-zwbrx" (OuterVolumeSpecName: "kube-api-access-zwbrx") pod "cb7c4601-9945-444b-8a00-a671ce18bb1e" (UID: "cb7c4601-9945-444b-8a00-a671ce18bb1e"). InnerVolumeSpecName "kube-api-access-zwbrx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.118585 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb7c4601-9945-444b-8a00-a671ce18bb1e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "cb7c4601-9945-444b-8a00-a671ce18bb1e" (UID: "cb7c4601-9945-444b-8a00-a671ce18bb1e"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.118870 4898 generic.go:334] "Generic (PLEG): container finished" podID="cb7c4601-9945-444b-8a00-a671ce18bb1e" containerID="6da96ed31599ca832cee856f0c94ff97183486a2078689f1689d4f88f2dd2323" exitCode=0 Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.119000 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"cb7c4601-9945-444b-8a00-a671ce18bb1e","Type":"ContainerDied","Data":"6da96ed31599ca832cee856f0c94ff97183486a2078689f1689d4f88f2dd2323"} Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.119061 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.119082 4898 scope.go:117] "RemoveContainer" containerID="6da96ed31599ca832cee856f0c94ff97183486a2078689f1689d4f88f2dd2323" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.119021 4898 generic.go:334] "Generic (PLEG): container finished" podID="cb7c4601-9945-444b-8a00-a671ce18bb1e" containerID="d8870881d2d90e176629bb217b8c041bc075048a36f864f12b6235e672810e1c" exitCode=143 Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.119066 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"cb7c4601-9945-444b-8a00-a671ce18bb1e","Type":"ContainerDied","Data":"d8870881d2d90e176629bb217b8c041bc075048a36f864f12b6235e672810e1c"} Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.119738 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"cb7c4601-9945-444b-8a00-a671ce18bb1e","Type":"ContainerDied","Data":"b5239aec37b0d4b7e60e804b0a75fb330f40a9a58da7e482e13786d0363b1b93"} Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.124154 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-647f998784-xvcjw" 
event={"ID":"fa7825b5-b19b-44bb-8d23-bb121e669780","Type":"ContainerStarted","Data":"e7d8a6f5bcff44a554794a7336db9d2235c8a1237b68565c7974ac14bf58a726"} Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.124390 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-647f998784-xvcjw" event={"ID":"fa7825b5-b19b-44bb-8d23-bb121e669780","Type":"ContainerStarted","Data":"7c2e07ab88f2b898d60cd6d882f9de9a202887ed9045a7d1964343ba89d7b9b6"} Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.127224 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb7c4601-9945-444b-8a00-a671ce18bb1e-scripts" (OuterVolumeSpecName: "scripts") pod "cb7c4601-9945-444b-8a00-a671ce18bb1e" (UID: "cb7c4601-9945-444b-8a00-a671ce18bb1e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.144858 4898 scope.go:117] "RemoveContainer" containerID="d8870881d2d90e176629bb217b8c041bc075048a36f864f12b6235e672810e1c" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.164614 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb7c4601-9945-444b-8a00-a671ce18bb1e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cb7c4601-9945-444b-8a00-a671ce18bb1e" (UID: "cb7c4601-9945-444b-8a00-a671ce18bb1e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.165708 4898 scope.go:117] "RemoveContainer" containerID="6da96ed31599ca832cee856f0c94ff97183486a2078689f1689d4f88f2dd2323" Mar 13 14:21:50 crc kubenswrapper[4898]: E0313 14:21:50.166227 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6da96ed31599ca832cee856f0c94ff97183486a2078689f1689d4f88f2dd2323\": container with ID starting with 6da96ed31599ca832cee856f0c94ff97183486a2078689f1689d4f88f2dd2323 not found: ID does not exist" containerID="6da96ed31599ca832cee856f0c94ff97183486a2078689f1689d4f88f2dd2323" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.166285 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6da96ed31599ca832cee856f0c94ff97183486a2078689f1689d4f88f2dd2323"} err="failed to get container status \"6da96ed31599ca832cee856f0c94ff97183486a2078689f1689d4f88f2dd2323\": rpc error: code = NotFound desc = could not find container \"6da96ed31599ca832cee856f0c94ff97183486a2078689f1689d4f88f2dd2323\": container with ID starting with 6da96ed31599ca832cee856f0c94ff97183486a2078689f1689d4f88f2dd2323 not found: ID does not exist" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.166321 4898 scope.go:117] "RemoveContainer" containerID="d8870881d2d90e176629bb217b8c041bc075048a36f864f12b6235e672810e1c" Mar 13 14:21:50 crc kubenswrapper[4898]: E0313 14:21:50.166650 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8870881d2d90e176629bb217b8c041bc075048a36f864f12b6235e672810e1c\": container with ID starting with d8870881d2d90e176629bb217b8c041bc075048a36f864f12b6235e672810e1c not found: ID does not exist" containerID="d8870881d2d90e176629bb217b8c041bc075048a36f864f12b6235e672810e1c" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.166675 
4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8870881d2d90e176629bb217b8c041bc075048a36f864f12b6235e672810e1c"} err="failed to get container status \"d8870881d2d90e176629bb217b8c041bc075048a36f864f12b6235e672810e1c\": rpc error: code = NotFound desc = could not find container \"d8870881d2d90e176629bb217b8c041bc075048a36f864f12b6235e672810e1c\": container with ID starting with d8870881d2d90e176629bb217b8c041bc075048a36f864f12b6235e672810e1c not found: ID does not exist" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.166690 4898 scope.go:117] "RemoveContainer" containerID="6da96ed31599ca832cee856f0c94ff97183486a2078689f1689d4f88f2dd2323" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.166913 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6da96ed31599ca832cee856f0c94ff97183486a2078689f1689d4f88f2dd2323"} err="failed to get container status \"6da96ed31599ca832cee856f0c94ff97183486a2078689f1689d4f88f2dd2323\": rpc error: code = NotFound desc = could not find container \"6da96ed31599ca832cee856f0c94ff97183486a2078689f1689d4f88f2dd2323\": container with ID starting with 6da96ed31599ca832cee856f0c94ff97183486a2078689f1689d4f88f2dd2323 not found: ID does not exist" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.166941 4898 scope.go:117] "RemoveContainer" containerID="d8870881d2d90e176629bb217b8c041bc075048a36f864f12b6235e672810e1c" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.167183 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8870881d2d90e176629bb217b8c041bc075048a36f864f12b6235e672810e1c"} err="failed to get container status \"d8870881d2d90e176629bb217b8c041bc075048a36f864f12b6235e672810e1c\": rpc error: code = NotFound desc = could not find container \"d8870881d2d90e176629bb217b8c041bc075048a36f864f12b6235e672810e1c\": container with ID starting with 
d8870881d2d90e176629bb217b8c041bc075048a36f864f12b6235e672810e1c not found: ID does not exist" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.203402 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb7c4601-9945-444b-8a00-a671ce18bb1e-config-data" (OuterVolumeSpecName: "config-data") pod "cb7c4601-9945-444b-8a00-a671ce18bb1e" (UID: "cb7c4601-9945-444b-8a00-a671ce18bb1e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.214369 4898 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb7c4601-9945-444b-8a00-a671ce18bb1e-logs\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.214791 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zwbrx\" (UniqueName: \"kubernetes.io/projected/cb7c4601-9945-444b-8a00-a671ce18bb1e-kube-api-access-zwbrx\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.214807 4898 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cb7c4601-9945-444b-8a00-a671ce18bb1e-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.214823 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb7c4601-9945-444b-8a00-a671ce18bb1e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.214836 4898 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cb7c4601-9945-444b-8a00-a671ce18bb1e-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.214848 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/cb7c4601-9945-444b-8a00-a671ce18bb1e-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.214863 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb7c4601-9945-444b-8a00-a671ce18bb1e-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.467089 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.489609 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.506450 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 13 14:21:50 crc kubenswrapper[4898]: E0313 14:21:50.507027 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb7c4601-9945-444b-8a00-a671ce18bb1e" containerName="cinder-api-log" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.507047 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb7c4601-9945-444b-8a00-a671ce18bb1e" containerName="cinder-api-log" Mar 13 14:21:50 crc kubenswrapper[4898]: E0313 14:21:50.507069 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb7c4601-9945-444b-8a00-a671ce18bb1e" containerName="cinder-api" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.507076 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb7c4601-9945-444b-8a00-a671ce18bb1e" containerName="cinder-api" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.507324 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb7c4601-9945-444b-8a00-a671ce18bb1e" containerName="cinder-api-log" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.507360 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb7c4601-9945-444b-8a00-a671ce18bb1e" containerName="cinder-api" 
Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.508751 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.517397 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.532166 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.532367 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.532553 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.580271 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-87574c74-kqmjb" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.623449 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bda33d23-490a-4099-954b-c613ab5d5c73-config-data-custom\") pod \"cinder-api-0\" (UID: \"bda33d23-490a-4099-954b-c613ab5d5c73\") " pod="openstack/cinder-api-0" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.623515 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcrhs\" (UniqueName: \"kubernetes.io/projected/bda33d23-490a-4099-954b-c613ab5d5c73-kube-api-access-hcrhs\") pod \"cinder-api-0\" (UID: \"bda33d23-490a-4099-954b-c613ab5d5c73\") " pod="openstack/cinder-api-0" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.623563 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bda33d23-490a-4099-954b-c613ab5d5c73-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"bda33d23-490a-4099-954b-c613ab5d5c73\") " pod="openstack/cinder-api-0" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.623653 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bda33d23-490a-4099-954b-c613ab5d5c73-config-data\") pod \"cinder-api-0\" (UID: \"bda33d23-490a-4099-954b-c613ab5d5c73\") " pod="openstack/cinder-api-0" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.623676 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bda33d23-490a-4099-954b-c613ab5d5c73-logs\") pod \"cinder-api-0\" (UID: \"bda33d23-490a-4099-954b-c613ab5d5c73\") " pod="openstack/cinder-api-0" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.623726 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bda33d23-490a-4099-954b-c613ab5d5c73-scripts\") pod \"cinder-api-0\" (UID: \"bda33d23-490a-4099-954b-c613ab5d5c73\") " pod="openstack/cinder-api-0" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.623753 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bda33d23-490a-4099-954b-c613ab5d5c73-public-tls-certs\") pod \"cinder-api-0\" (UID: \"bda33d23-490a-4099-954b-c613ab5d5c73\") " pod="openstack/cinder-api-0" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.623802 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bda33d23-490a-4099-954b-c613ab5d5c73-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"bda33d23-490a-4099-954b-c613ab5d5c73\") " 
pod="openstack/cinder-api-0" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.623879 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bda33d23-490a-4099-954b-c613ab5d5c73-etc-machine-id\") pod \"cinder-api-0\" (UID: \"bda33d23-490a-4099-954b-c613ab5d5c73\") " pod="openstack/cinder-api-0" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.632088 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-z7ldc" podUID="b38f3681-6f2f-437f-9694-810d43921aa2" containerName="registry-server" probeResult="failure" output=< Mar 13 14:21:50 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 14:21:50 crc kubenswrapper[4898]: > Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.725245 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bda33d23-490a-4099-954b-c613ab5d5c73-config-data\") pod \"cinder-api-0\" (UID: \"bda33d23-490a-4099-954b-c613ab5d5c73\") " pod="openstack/cinder-api-0" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.725303 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bda33d23-490a-4099-954b-c613ab5d5c73-logs\") pod \"cinder-api-0\" (UID: \"bda33d23-490a-4099-954b-c613ab5d5c73\") " pod="openstack/cinder-api-0" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.725355 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bda33d23-490a-4099-954b-c613ab5d5c73-scripts\") pod \"cinder-api-0\" (UID: \"bda33d23-490a-4099-954b-c613ab5d5c73\") " pod="openstack/cinder-api-0" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.725399 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bda33d23-490a-4099-954b-c613ab5d5c73-public-tls-certs\") pod \"cinder-api-0\" (UID: \"bda33d23-490a-4099-954b-c613ab5d5c73\") " pod="openstack/cinder-api-0" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.725453 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bda33d23-490a-4099-954b-c613ab5d5c73-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"bda33d23-490a-4099-954b-c613ab5d5c73\") " pod="openstack/cinder-api-0" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.725583 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bda33d23-490a-4099-954b-c613ab5d5c73-etc-machine-id\") pod \"cinder-api-0\" (UID: \"bda33d23-490a-4099-954b-c613ab5d5c73\") " pod="openstack/cinder-api-0" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.725646 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bda33d23-490a-4099-954b-c613ab5d5c73-config-data-custom\") pod \"cinder-api-0\" (UID: \"bda33d23-490a-4099-954b-c613ab5d5c73\") " pod="openstack/cinder-api-0" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.725673 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcrhs\" (UniqueName: \"kubernetes.io/projected/bda33d23-490a-4099-954b-c613ab5d5c73-kube-api-access-hcrhs\") pod \"cinder-api-0\" (UID: \"bda33d23-490a-4099-954b-c613ab5d5c73\") " pod="openstack/cinder-api-0" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.725710 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bda33d23-490a-4099-954b-c613ab5d5c73-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"bda33d23-490a-4099-954b-c613ab5d5c73\") " 
pod="openstack/cinder-api-0" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.734509 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bda33d23-490a-4099-954b-c613ab5d5c73-etc-machine-id\") pod \"cinder-api-0\" (UID: \"bda33d23-490a-4099-954b-c613ab5d5c73\") " pod="openstack/cinder-api-0" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.735080 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bda33d23-490a-4099-954b-c613ab5d5c73-logs\") pod \"cinder-api-0\" (UID: \"bda33d23-490a-4099-954b-c613ab5d5c73\") " pod="openstack/cinder-api-0" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.751481 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bda33d23-490a-4099-954b-c613ab5d5c73-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"bda33d23-490a-4099-954b-c613ab5d5c73\") " pod="openstack/cinder-api-0" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.753830 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bda33d23-490a-4099-954b-c613ab5d5c73-config-data-custom\") pod \"cinder-api-0\" (UID: \"bda33d23-490a-4099-954b-c613ab5d5c73\") " pod="openstack/cinder-api-0" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.767505 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcrhs\" (UniqueName: \"kubernetes.io/projected/bda33d23-490a-4099-954b-c613ab5d5c73-kube-api-access-hcrhs\") pod \"cinder-api-0\" (UID: \"bda33d23-490a-4099-954b-c613ab5d5c73\") " pod="openstack/cinder-api-0" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.770001 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/bda33d23-490a-4099-954b-c613ab5d5c73-config-data\") pod \"cinder-api-0\" (UID: \"bda33d23-490a-4099-954b-c613ab5d5c73\") " pod="openstack/cinder-api-0" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.774845 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bda33d23-490a-4099-954b-c613ab5d5c73-public-tls-certs\") pod \"cinder-api-0\" (UID: \"bda33d23-490a-4099-954b-c613ab5d5c73\") " pod="openstack/cinder-api-0" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.776478 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bda33d23-490a-4099-954b-c613ab5d5c73-scripts\") pod \"cinder-api-0\" (UID: \"bda33d23-490a-4099-954b-c613ab5d5c73\") " pod="openstack/cinder-api-0" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.782287 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bda33d23-490a-4099-954b-c613ab5d5c73-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"bda33d23-490a-4099-954b-c613ab5d5c73\") " pod="openstack/cinder-api-0" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.882096 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 13 14:21:51 crc kubenswrapper[4898]: I0313 14:21:51.172702 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-647f998784-xvcjw" event={"ID":"fa7825b5-b19b-44bb-8d23-bb121e669780","Type":"ContainerStarted","Data":"d5182af4e2015471f641e75a51f4b3c088c6d29f03d19e8012d182b598de84fd"} Mar 13 14:21:51 crc kubenswrapper[4898]: I0313 14:21:51.231631 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-647f998784-xvcjw" podStartSLOduration=3.231612419 podStartE2EDuration="3.231612419s" podCreationTimestamp="2026-03-13 14:21:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:21:51.22277733 +0000 UTC m=+1546.224365579" watchObservedRunningTime="2026-03-13 14:21:51.231612419 +0000 UTC m=+1546.233200658" Mar 13 14:21:51 crc kubenswrapper[4898]: I0313 14:21:51.648830 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 13 14:21:51 crc kubenswrapper[4898]: I0313 14:21:51.762377 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb7c4601-9945-444b-8a00-a671ce18bb1e" path="/var/lib/kubelet/pods/cb7c4601-9945-444b-8a00-a671ce18bb1e/volumes" Mar 13 14:21:51 crc kubenswrapper[4898]: I0313 14:21:51.764090 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 13 14:21:51 crc kubenswrapper[4898]: I0313 14:21:51.768884 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 13 14:21:51 crc kubenswrapper[4898]: I0313 14:21:51.774369 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 13 14:21:51 crc kubenswrapper[4898]: I0313 14:21:51.774608 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 13 14:21:51 crc kubenswrapper[4898]: I0313 14:21:51.774855 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-zrf8c" Mar 13 14:21:51 crc kubenswrapper[4898]: I0313 14:21:51.790806 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 13 14:21:51 crc kubenswrapper[4898]: I0313 14:21:51.854384 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2b5z\" (UniqueName: \"kubernetes.io/projected/124bd4ee-d9f0-408f-a46e-4d143e8ab02a-kube-api-access-m2b5z\") pod \"openstackclient\" (UID: \"124bd4ee-d9f0-408f-a46e-4d143e8ab02a\") " pod="openstack/openstackclient" Mar 13 14:21:51 crc kubenswrapper[4898]: I0313 14:21:51.854751 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/124bd4ee-d9f0-408f-a46e-4d143e8ab02a-openstack-config\") pod \"openstackclient\" (UID: \"124bd4ee-d9f0-408f-a46e-4d143e8ab02a\") " pod="openstack/openstackclient" Mar 13 14:21:51 crc kubenswrapper[4898]: I0313 14:21:51.855017 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/124bd4ee-d9f0-408f-a46e-4d143e8ab02a-openstack-config-secret\") pod \"openstackclient\" (UID: \"124bd4ee-d9f0-408f-a46e-4d143e8ab02a\") " pod="openstack/openstackclient" Mar 13 14:21:51 crc kubenswrapper[4898]: I0313 14:21:51.855322 4898 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/124bd4ee-d9f0-408f-a46e-4d143e8ab02a-combined-ca-bundle\") pod \"openstackclient\" (UID: \"124bd4ee-d9f0-408f-a46e-4d143e8ab02a\") " pod="openstack/openstackclient" Mar 13 14:21:51 crc kubenswrapper[4898]: I0313 14:21:51.956939 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/124bd4ee-d9f0-408f-a46e-4d143e8ab02a-openstack-config\") pod \"openstackclient\" (UID: \"124bd4ee-d9f0-408f-a46e-4d143e8ab02a\") " pod="openstack/openstackclient" Mar 13 14:21:51 crc kubenswrapper[4898]: I0313 14:21:51.957015 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/124bd4ee-d9f0-408f-a46e-4d143e8ab02a-openstack-config-secret\") pod \"openstackclient\" (UID: \"124bd4ee-d9f0-408f-a46e-4d143e8ab02a\") " pod="openstack/openstackclient" Mar 13 14:21:51 crc kubenswrapper[4898]: I0313 14:21:51.957123 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/124bd4ee-d9f0-408f-a46e-4d143e8ab02a-combined-ca-bundle\") pod \"openstackclient\" (UID: \"124bd4ee-d9f0-408f-a46e-4d143e8ab02a\") " pod="openstack/openstackclient" Mar 13 14:21:51 crc kubenswrapper[4898]: I0313 14:21:51.957165 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2b5z\" (UniqueName: \"kubernetes.io/projected/124bd4ee-d9f0-408f-a46e-4d143e8ab02a-kube-api-access-m2b5z\") pod \"openstackclient\" (UID: \"124bd4ee-d9f0-408f-a46e-4d143e8ab02a\") " pod="openstack/openstackclient" Mar 13 14:21:51 crc kubenswrapper[4898]: I0313 14:21:51.957926 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/124bd4ee-d9f0-408f-a46e-4d143e8ab02a-openstack-config\") pod \"openstackclient\" (UID: \"124bd4ee-d9f0-408f-a46e-4d143e8ab02a\") " pod="openstack/openstackclient" Mar 13 14:21:51 crc kubenswrapper[4898]: I0313 14:21:51.963067 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/124bd4ee-d9f0-408f-a46e-4d143e8ab02a-openstack-config-secret\") pod \"openstackclient\" (UID: \"124bd4ee-d9f0-408f-a46e-4d143e8ab02a\") " pod="openstack/openstackclient" Mar 13 14:21:51 crc kubenswrapper[4898]: I0313 14:21:51.973453 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/124bd4ee-d9f0-408f-a46e-4d143e8ab02a-combined-ca-bundle\") pod \"openstackclient\" (UID: \"124bd4ee-d9f0-408f-a46e-4d143e8ab02a\") " pod="openstack/openstackclient" Mar 13 14:21:51 crc kubenswrapper[4898]: I0313 14:21:51.974104 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2b5z\" (UniqueName: \"kubernetes.io/projected/124bd4ee-d9f0-408f-a46e-4d143e8ab02a-kube-api-access-m2b5z\") pod \"openstackclient\" (UID: \"124bd4ee-d9f0-408f-a46e-4d143e8ab02a\") " pod="openstack/openstackclient" Mar 13 14:21:52 crc kubenswrapper[4898]: I0313 14:21:52.105268 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 13 14:21:52 crc kubenswrapper[4898]: I0313 14:21:52.277334 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"bda33d23-490a-4099-954b-c613ab5d5c73","Type":"ContainerStarted","Data":"262dcf4e15f8ce792ae3571e3d3a1ec83ec0fd3784f8077ccfb70ec4bfa3c297"} Mar 13 14:21:52 crc kubenswrapper[4898]: I0313 14:21:52.278375 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-647f998784-xvcjw" Mar 13 14:21:52 crc kubenswrapper[4898]: I0313 14:21:52.278461 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-647f998784-xvcjw" Mar 13 14:21:52 crc kubenswrapper[4898]: W0313 14:21:52.632327 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod124bd4ee_d9f0_408f_a46e_4d143e8ab02a.slice/crio-fdc74456375dc14a674a713f421a590760eba34f5944defa77d2619e4b04d312 WatchSource:0}: Error finding container fdc74456375dc14a674a713f421a590760eba34f5944defa77d2619e4b04d312: Status 404 returned error can't find the container with id fdc74456375dc14a674a713f421a590760eba34f5944defa77d2619e4b04d312 Mar 13 14:21:52 crc kubenswrapper[4898]: I0313 14:21:52.635557 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 13 14:21:53 crc kubenswrapper[4898]: I0313 14:21:53.293507 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"bda33d23-490a-4099-954b-c613ab5d5c73","Type":"ContainerStarted","Data":"2c4982813ff025725e309ae1fb5c221c16834addb31306bfb34d1f38cc6f5b58"} Mar 13 14:21:53 crc kubenswrapper[4898]: I0313 14:21:53.300545 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"124bd4ee-d9f0-408f-a46e-4d143e8ab02a","Type":"ContainerStarted","Data":"fdc74456375dc14a674a713f421a590760eba34f5944defa77d2619e4b04d312"} 
Mar 13 14:21:54 crc kubenswrapper[4898]: I0313 14:21:54.308118 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"bda33d23-490a-4099-954b-c613ab5d5c73","Type":"ContainerStarted","Data":"9df64f67ff0ac5343dcc1f672969f9ff840a1bd2e72ed128eee29135694d9522"} Mar 13 14:21:54 crc kubenswrapper[4898]: I0313 14:21:54.335350 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.33532748 podStartE2EDuration="4.33532748s" podCreationTimestamp="2026-03-13 14:21:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:21:54.328604905 +0000 UTC m=+1549.330193154" watchObservedRunningTime="2026-03-13 14:21:54.33532748 +0000 UTC m=+1549.336915709" Mar 13 14:21:54 crc kubenswrapper[4898]: I0313 14:21:54.450804 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c9776ccc5-gtnnh" Mar 13 14:21:54 crc kubenswrapper[4898]: I0313 14:21:54.544201 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-8d6wc"] Mar 13 14:21:54 crc kubenswrapper[4898]: I0313 14:21:54.544435 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-85ff748b95-8d6wc" podUID="45af301d-29c9-474d-be0d-4d91f6d0cb18" containerName="dnsmasq-dns" containerID="cri-o://6584acdbfa3b269b10be5eacbee652dc5b87853d5dd4647683e70850466d55d5" gracePeriod=10 Mar 13 14:21:54 crc kubenswrapper[4898]: I0313 14:21:54.872038 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 13 14:21:54 crc kubenswrapper[4898]: I0313 14:21:54.931006 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 14:21:55 crc kubenswrapper[4898]: I0313 14:21:55.322143 4898 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openstack/cinder-scheduler-0" podUID="d0be2003-4a0d-4740-9b84-ab16bb27d5bb" containerName="cinder-scheduler" containerID="cri-o://7f741df9b30455c96ea278c501c4d63ab1af4bf960776e0726cfee685819c6bd" gracePeriod=30 Mar 13 14:21:55 crc kubenswrapper[4898]: I0313 14:21:55.322315 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="d0be2003-4a0d-4740-9b84-ab16bb27d5bb" containerName="probe" containerID="cri-o://bafd55f18270e2838b59b970f12a6aafedd22b09a08b2d2305599aac961e6911" gracePeriod=30 Mar 13 14:21:55 crc kubenswrapper[4898]: I0313 14:21:55.323359 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 13 14:21:55 crc kubenswrapper[4898]: I0313 14:21:55.580667 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-647f998784-xvcjw" Mar 13 14:21:56 crc kubenswrapper[4898]: I0313 14:21:56.349210 4898 generic.go:334] "Generic (PLEG): container finished" podID="d0be2003-4a0d-4740-9b84-ab16bb27d5bb" containerID="bafd55f18270e2838b59b970f12a6aafedd22b09a08b2d2305599aac961e6911" exitCode=0 Mar 13 14:21:56 crc kubenswrapper[4898]: I0313 14:21:56.350768 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d0be2003-4a0d-4740-9b84-ab16bb27d5bb","Type":"ContainerDied","Data":"bafd55f18270e2838b59b970f12a6aafedd22b09a08b2d2305599aac961e6911"} Mar 13 14:21:56 crc kubenswrapper[4898]: I0313 14:21:56.354842 4898 generic.go:334] "Generic (PLEG): container finished" podID="45af301d-29c9-474d-be0d-4d91f6d0cb18" containerID="6584acdbfa3b269b10be5eacbee652dc5b87853d5dd4647683e70850466d55d5" exitCode=0 Mar 13 14:21:56 crc kubenswrapper[4898]: I0313 14:21:56.356186 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-8d6wc" 
event={"ID":"45af301d-29c9-474d-be0d-4d91f6d0cb18","Type":"ContainerDied","Data":"6584acdbfa3b269b10be5eacbee652dc5b87853d5dd4647683e70850466d55d5"} Mar 13 14:21:56 crc kubenswrapper[4898]: I0313 14:21:56.356211 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-8d6wc" event={"ID":"45af301d-29c9-474d-be0d-4d91f6d0cb18","Type":"ContainerDied","Data":"54b39114a6c35a297bd4da40e16643230d03c158b2784fe677b7a7dcda81e6ec"} Mar 13 14:21:56 crc kubenswrapper[4898]: I0313 14:21:56.356220 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54b39114a6c35a297bd4da40e16643230d03c158b2784fe677b7a7dcda81e6ec" Mar 13 14:21:56 crc kubenswrapper[4898]: I0313 14:21:56.402364 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-8d6wc" Mar 13 14:21:56 crc kubenswrapper[4898]: I0313 14:21:56.588230 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/45af301d-29c9-474d-be0d-4d91f6d0cb18-dns-swift-storage-0\") pod \"45af301d-29c9-474d-be0d-4d91f6d0cb18\" (UID: \"45af301d-29c9-474d-be0d-4d91f6d0cb18\") " Mar 13 14:21:56 crc kubenswrapper[4898]: I0313 14:21:56.588714 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45af301d-29c9-474d-be0d-4d91f6d0cb18-dns-svc\") pod \"45af301d-29c9-474d-be0d-4d91f6d0cb18\" (UID: \"45af301d-29c9-474d-be0d-4d91f6d0cb18\") " Mar 13 14:21:56 crc kubenswrapper[4898]: I0313 14:21:56.588760 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/45af301d-29c9-474d-be0d-4d91f6d0cb18-ovsdbserver-nb\") pod \"45af301d-29c9-474d-be0d-4d91f6d0cb18\" (UID: \"45af301d-29c9-474d-be0d-4d91f6d0cb18\") " Mar 13 14:21:56 crc kubenswrapper[4898]: I0313 14:21:56.588787 4898 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmplv\" (UniqueName: \"kubernetes.io/projected/45af301d-29c9-474d-be0d-4d91f6d0cb18-kube-api-access-hmplv\") pod \"45af301d-29c9-474d-be0d-4d91f6d0cb18\" (UID: \"45af301d-29c9-474d-be0d-4d91f6d0cb18\") " Mar 13 14:21:56 crc kubenswrapper[4898]: I0313 14:21:56.588843 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/45af301d-29c9-474d-be0d-4d91f6d0cb18-ovsdbserver-sb\") pod \"45af301d-29c9-474d-be0d-4d91f6d0cb18\" (UID: \"45af301d-29c9-474d-be0d-4d91f6d0cb18\") " Mar 13 14:21:56 crc kubenswrapper[4898]: I0313 14:21:56.588860 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45af301d-29c9-474d-be0d-4d91f6d0cb18-config\") pod \"45af301d-29c9-474d-be0d-4d91f6d0cb18\" (UID: \"45af301d-29c9-474d-be0d-4d91f6d0cb18\") " Mar 13 14:21:56 crc kubenswrapper[4898]: I0313 14:21:56.596315 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45af301d-29c9-474d-be0d-4d91f6d0cb18-kube-api-access-hmplv" (OuterVolumeSpecName: "kube-api-access-hmplv") pod "45af301d-29c9-474d-be0d-4d91f6d0cb18" (UID: "45af301d-29c9-474d-be0d-4d91f6d0cb18"). InnerVolumeSpecName "kube-api-access-hmplv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:21:56 crc kubenswrapper[4898]: I0313 14:21:56.632654 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-647f998784-xvcjw" Mar 13 14:21:56 crc kubenswrapper[4898]: I0313 14:21:56.670980 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45af301d-29c9-474d-be0d-4d91f6d0cb18-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "45af301d-29c9-474d-be0d-4d91f6d0cb18" (UID: "45af301d-29c9-474d-be0d-4d91f6d0cb18"). 
InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:21:56 crc kubenswrapper[4898]: I0313 14:21:56.671528 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45af301d-29c9-474d-be0d-4d91f6d0cb18-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "45af301d-29c9-474d-be0d-4d91f6d0cb18" (UID: "45af301d-29c9-474d-be0d-4d91f6d0cb18"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:21:56 crc kubenswrapper[4898]: I0313 14:21:56.692347 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/45af301d-29c9-474d-be0d-4d91f6d0cb18-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:56 crc kubenswrapper[4898]: I0313 14:21:56.692381 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmplv\" (UniqueName: \"kubernetes.io/projected/45af301d-29c9-474d-be0d-4d91f6d0cb18-kube-api-access-hmplv\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:56 crc kubenswrapper[4898]: I0313 14:21:56.692392 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/45af301d-29c9-474d-be0d-4d91f6d0cb18-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:56 crc kubenswrapper[4898]: I0313 14:21:56.698416 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45af301d-29c9-474d-be0d-4d91f6d0cb18-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "45af301d-29c9-474d-be0d-4d91f6d0cb18" (UID: "45af301d-29c9-474d-be0d-4d91f6d0cb18"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:21:56 crc kubenswrapper[4898]: I0313 14:21:56.714817 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45af301d-29c9-474d-be0d-4d91f6d0cb18-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "45af301d-29c9-474d-be0d-4d91f6d0cb18" (UID: "45af301d-29c9-474d-be0d-4d91f6d0cb18"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:21:56 crc kubenswrapper[4898]: I0313 14:21:56.737832 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-5bf5d8b7d4-4gwxr"] Mar 13 14:21:56 crc kubenswrapper[4898]: I0313 14:21:56.743349 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-5bf5d8b7d4-4gwxr" podUID="604a0205-6c18-4bff-929f-038524d62aeb" containerName="placement-log" containerID="cri-o://7d8e485964b16b478ba92c0abf89ef5c7fe78d1b940606e5a01f0ef264ccdb32" gracePeriod=30 Mar 13 14:21:56 crc kubenswrapper[4898]: I0313 14:21:56.743475 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-5bf5d8b7d4-4gwxr" podUID="604a0205-6c18-4bff-929f-038524d62aeb" containerName="placement-api" containerID="cri-o://974d31c9fc27c07800ab40a1409496611af9fce07ca6cd7d36cd6abbd352b819" gracePeriod=30 Mar 13 14:21:56 crc kubenswrapper[4898]: I0313 14:21:56.753713 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45af301d-29c9-474d-be0d-4d91f6d0cb18-config" (OuterVolumeSpecName: "config") pod "45af301d-29c9-474d-be0d-4d91f6d0cb18" (UID: "45af301d-29c9-474d-be0d-4d91f6d0cb18"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:21:56 crc kubenswrapper[4898]: I0313 14:21:56.794669 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45af301d-29c9-474d-be0d-4d91f6d0cb18-config\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:56 crc kubenswrapper[4898]: I0313 14:21:56.794695 4898 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/45af301d-29c9-474d-be0d-4d91f6d0cb18-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:56 crc kubenswrapper[4898]: I0313 14:21:56.794704 4898 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45af301d-29c9-474d-be0d-4d91f6d0cb18-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:57 crc kubenswrapper[4898]: I0313 14:21:57.370434 4898 generic.go:334] "Generic (PLEG): container finished" podID="604a0205-6c18-4bff-929f-038524d62aeb" containerID="7d8e485964b16b478ba92c0abf89ef5c7fe78d1b940606e5a01f0ef264ccdb32" exitCode=143 Mar 13 14:21:57 crc kubenswrapper[4898]: I0313 14:21:57.370525 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-8d6wc" Mar 13 14:21:57 crc kubenswrapper[4898]: I0313 14:21:57.370809 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5bf5d8b7d4-4gwxr" event={"ID":"604a0205-6c18-4bff-929f-038524d62aeb","Type":"ContainerDied","Data":"7d8e485964b16b478ba92c0abf89ef5c7fe78d1b940606e5a01f0ef264ccdb32"} Mar 13 14:21:57 crc kubenswrapper[4898]: I0313 14:21:57.424444 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-8d6wc"] Mar 13 14:21:57 crc kubenswrapper[4898]: I0313 14:21:57.444801 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-8d6wc"] Mar 13 14:21:57 crc kubenswrapper[4898]: I0313 14:21:57.780625 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45af301d-29c9-474d-be0d-4d91f6d0cb18" path="/var/lib/kubelet/pods/45af301d-29c9-474d-be0d-4d91f6d0cb18/volumes" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.255710 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-7f9cbdc5df-5tx5z"] Mar 13 14:21:59 crc kubenswrapper[4898]: E0313 14:21:59.256504 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45af301d-29c9-474d-be0d-4d91f6d0cb18" containerName="dnsmasq-dns" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.256519 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="45af301d-29c9-474d-be0d-4d91f6d0cb18" containerName="dnsmasq-dns" Mar 13 14:21:59 crc kubenswrapper[4898]: E0313 14:21:59.256565 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45af301d-29c9-474d-be0d-4d91f6d0cb18" containerName="init" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.256572 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="45af301d-29c9-474d-be0d-4d91f6d0cb18" containerName="init" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.256805 4898 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="45af301d-29c9-474d-be0d-4d91f6d0cb18" containerName="dnsmasq-dns" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.257943 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-7f9cbdc5df-5tx5z" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.261841 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.262497 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.262619 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.274640 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7f9cbdc5df-5tx5z"] Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.371131 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-6b86699784-tf822"] Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.372682 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-6b86699784-tf822" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.375412 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a57db04-0dc9-4d63-8d08-dd4309b19496-public-tls-certs\") pod \"swift-proxy-7f9cbdc5df-5tx5z\" (UID: \"1a57db04-0dc9-4d63-8d08-dd4309b19496\") " pod="openstack/swift-proxy-7f9cbdc5df-5tx5z" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.375480 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1a57db04-0dc9-4d63-8d08-dd4309b19496-etc-swift\") pod \"swift-proxy-7f9cbdc5df-5tx5z\" (UID: \"1a57db04-0dc9-4d63-8d08-dd4309b19496\") " pod="openstack/swift-proxy-7f9cbdc5df-5tx5z" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.375601 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88ab3ad2-782a-4c21-8104-1b80468dbca0-combined-ca-bundle\") pod \"heat-engine-6b86699784-tf822\" (UID: \"88ab3ad2-782a-4c21-8104-1b80468dbca0\") " pod="openstack/heat-engine-6b86699784-tf822" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.375659 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a57db04-0dc9-4d63-8d08-dd4309b19496-config-data\") pod \"swift-proxy-7f9cbdc5df-5tx5z\" (UID: \"1a57db04-0dc9-4d63-8d08-dd4309b19496\") " pod="openstack/swift-proxy-7f9cbdc5df-5tx5z" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.375706 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88ab3ad2-782a-4c21-8104-1b80468dbca0-config-data\") pod \"heat-engine-6b86699784-tf822\" (UID: 
\"88ab3ad2-782a-4c21-8104-1b80468dbca0\") " pod="openstack/heat-engine-6b86699784-tf822" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.375745 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a57db04-0dc9-4d63-8d08-dd4309b19496-run-httpd\") pod \"swift-proxy-7f9cbdc5df-5tx5z\" (UID: \"1a57db04-0dc9-4d63-8d08-dd4309b19496\") " pod="openstack/swift-proxy-7f9cbdc5df-5tx5z" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.375805 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a57db04-0dc9-4d63-8d08-dd4309b19496-combined-ca-bundle\") pod \"swift-proxy-7f9cbdc5df-5tx5z\" (UID: \"1a57db04-0dc9-4d63-8d08-dd4309b19496\") " pod="openstack/swift-proxy-7f9cbdc5df-5tx5z" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.375852 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a57db04-0dc9-4d63-8d08-dd4309b19496-log-httpd\") pod \"swift-proxy-7f9cbdc5df-5tx5z\" (UID: \"1a57db04-0dc9-4d63-8d08-dd4309b19496\") " pod="openstack/swift-proxy-7f9cbdc5df-5tx5z" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.375877 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a57db04-0dc9-4d63-8d08-dd4309b19496-internal-tls-certs\") pod \"swift-proxy-7f9cbdc5df-5tx5z\" (UID: \"1a57db04-0dc9-4d63-8d08-dd4309b19496\") " pod="openstack/swift-proxy-7f9cbdc5df-5tx5z" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.375954 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snc8w\" (UniqueName: 
\"kubernetes.io/projected/88ab3ad2-782a-4c21-8104-1b80468dbca0-kube-api-access-snc8w\") pod \"heat-engine-6b86699784-tf822\" (UID: \"88ab3ad2-782a-4c21-8104-1b80468dbca0\") " pod="openstack/heat-engine-6b86699784-tf822" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.376106 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrpt2\" (UniqueName: \"kubernetes.io/projected/1a57db04-0dc9-4d63-8d08-dd4309b19496-kube-api-access-rrpt2\") pod \"swift-proxy-7f9cbdc5df-5tx5z\" (UID: \"1a57db04-0dc9-4d63-8d08-dd4309b19496\") " pod="openstack/swift-proxy-7f9cbdc5df-5tx5z" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.376279 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/88ab3ad2-782a-4c21-8104-1b80468dbca0-config-data-custom\") pod \"heat-engine-6b86699784-tf822\" (UID: \"88ab3ad2-782a-4c21-8104-1b80468dbca0\") " pod="openstack/heat-engine-6b86699784-tf822" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.378999 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-d5fsz" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.379263 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.379429 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.384576 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-6b86699784-tf822"] Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.482136 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a57db04-0dc9-4d63-8d08-dd4309b19496-public-tls-certs\") pod 
\"swift-proxy-7f9cbdc5df-5tx5z\" (UID: \"1a57db04-0dc9-4d63-8d08-dd4309b19496\") " pod="openstack/swift-proxy-7f9cbdc5df-5tx5z" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.482186 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1a57db04-0dc9-4d63-8d08-dd4309b19496-etc-swift\") pod \"swift-proxy-7f9cbdc5df-5tx5z\" (UID: \"1a57db04-0dc9-4d63-8d08-dd4309b19496\") " pod="openstack/swift-proxy-7f9cbdc5df-5tx5z" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.482216 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88ab3ad2-782a-4c21-8104-1b80468dbca0-combined-ca-bundle\") pod \"heat-engine-6b86699784-tf822\" (UID: \"88ab3ad2-782a-4c21-8104-1b80468dbca0\") " pod="openstack/heat-engine-6b86699784-tf822" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.482233 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a57db04-0dc9-4d63-8d08-dd4309b19496-config-data\") pod \"swift-proxy-7f9cbdc5df-5tx5z\" (UID: \"1a57db04-0dc9-4d63-8d08-dd4309b19496\") " pod="openstack/swift-proxy-7f9cbdc5df-5tx5z" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.482257 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88ab3ad2-782a-4c21-8104-1b80468dbca0-config-data\") pod \"heat-engine-6b86699784-tf822\" (UID: \"88ab3ad2-782a-4c21-8104-1b80468dbca0\") " pod="openstack/heat-engine-6b86699784-tf822" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.482279 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a57db04-0dc9-4d63-8d08-dd4309b19496-run-httpd\") pod \"swift-proxy-7f9cbdc5df-5tx5z\" (UID: \"1a57db04-0dc9-4d63-8d08-dd4309b19496\") " 
pod="openstack/swift-proxy-7f9cbdc5df-5tx5z" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.482310 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a57db04-0dc9-4d63-8d08-dd4309b19496-combined-ca-bundle\") pod \"swift-proxy-7f9cbdc5df-5tx5z\" (UID: \"1a57db04-0dc9-4d63-8d08-dd4309b19496\") " pod="openstack/swift-proxy-7f9cbdc5df-5tx5z" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.482332 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a57db04-0dc9-4d63-8d08-dd4309b19496-log-httpd\") pod \"swift-proxy-7f9cbdc5df-5tx5z\" (UID: \"1a57db04-0dc9-4d63-8d08-dd4309b19496\") " pod="openstack/swift-proxy-7f9cbdc5df-5tx5z" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.482349 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a57db04-0dc9-4d63-8d08-dd4309b19496-internal-tls-certs\") pod \"swift-proxy-7f9cbdc5df-5tx5z\" (UID: \"1a57db04-0dc9-4d63-8d08-dd4309b19496\") " pod="openstack/swift-proxy-7f9cbdc5df-5tx5z" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.482378 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snc8w\" (UniqueName: \"kubernetes.io/projected/88ab3ad2-782a-4c21-8104-1b80468dbca0-kube-api-access-snc8w\") pod \"heat-engine-6b86699784-tf822\" (UID: \"88ab3ad2-782a-4c21-8104-1b80468dbca0\") " pod="openstack/heat-engine-6b86699784-tf822" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.482440 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrpt2\" (UniqueName: \"kubernetes.io/projected/1a57db04-0dc9-4d63-8d08-dd4309b19496-kube-api-access-rrpt2\") pod \"swift-proxy-7f9cbdc5df-5tx5z\" (UID: \"1a57db04-0dc9-4d63-8d08-dd4309b19496\") " 
pod="openstack/swift-proxy-7f9cbdc5df-5tx5z" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.482498 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/88ab3ad2-782a-4c21-8104-1b80468dbca0-config-data-custom\") pod \"heat-engine-6b86699784-tf822\" (UID: \"88ab3ad2-782a-4c21-8104-1b80468dbca0\") " pod="openstack/heat-engine-6b86699784-tf822" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.490194 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a57db04-0dc9-4d63-8d08-dd4309b19496-run-httpd\") pod \"swift-proxy-7f9cbdc5df-5tx5z\" (UID: \"1a57db04-0dc9-4d63-8d08-dd4309b19496\") " pod="openstack/swift-proxy-7f9cbdc5df-5tx5z" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.492075 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88ab3ad2-782a-4c21-8104-1b80468dbca0-config-data\") pod \"heat-engine-6b86699784-tf822\" (UID: \"88ab3ad2-782a-4c21-8104-1b80468dbca0\") " pod="openstack/heat-engine-6b86699784-tf822" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.492671 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/88ab3ad2-782a-4c21-8104-1b80468dbca0-config-data-custom\") pod \"heat-engine-6b86699784-tf822\" (UID: \"88ab3ad2-782a-4c21-8104-1b80468dbca0\") " pod="openstack/heat-engine-6b86699784-tf822" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.492776 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a57db04-0dc9-4d63-8d08-dd4309b19496-public-tls-certs\") pod \"swift-proxy-7f9cbdc5df-5tx5z\" (UID: \"1a57db04-0dc9-4d63-8d08-dd4309b19496\") " pod="openstack/swift-proxy-7f9cbdc5df-5tx5z" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 
14:21:59.494412 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a57db04-0dc9-4d63-8d08-dd4309b19496-log-httpd\") pod \"swift-proxy-7f9cbdc5df-5tx5z\" (UID: \"1a57db04-0dc9-4d63-8d08-dd4309b19496\") " pod="openstack/swift-proxy-7f9cbdc5df-5tx5z" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.496954 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-xntfr"] Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.499953 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7756b9d78c-xntfr" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.516777 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88ab3ad2-782a-4c21-8104-1b80468dbca0-combined-ca-bundle\") pod \"heat-engine-6b86699784-tf822\" (UID: \"88ab3ad2-782a-4c21-8104-1b80468dbca0\") " pod="openstack/heat-engine-6b86699784-tf822" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.516864 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a57db04-0dc9-4d63-8d08-dd4309b19496-config-data\") pod \"swift-proxy-7f9cbdc5df-5tx5z\" (UID: \"1a57db04-0dc9-4d63-8d08-dd4309b19496\") " pod="openstack/swift-proxy-7f9cbdc5df-5tx5z" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.517057 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-xntfr"] Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.524007 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a57db04-0dc9-4d63-8d08-dd4309b19496-combined-ca-bundle\") pod \"swift-proxy-7f9cbdc5df-5tx5z\" (UID: \"1a57db04-0dc9-4d63-8d08-dd4309b19496\") " pod="openstack/swift-proxy-7f9cbdc5df-5tx5z" Mar 13 14:21:59 crc 
kubenswrapper[4898]: I0313 14:21:59.538295 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a57db04-0dc9-4d63-8d08-dd4309b19496-internal-tls-certs\") pod \"swift-proxy-7f9cbdc5df-5tx5z\" (UID: \"1a57db04-0dc9-4d63-8d08-dd4309b19496\") " pod="openstack/swift-proxy-7f9cbdc5df-5tx5z" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.539532 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snc8w\" (UniqueName: \"kubernetes.io/projected/88ab3ad2-782a-4c21-8104-1b80468dbca0-kube-api-access-snc8w\") pod \"heat-engine-6b86699784-tf822\" (UID: \"88ab3ad2-782a-4c21-8104-1b80468dbca0\") " pod="openstack/heat-engine-6b86699784-tf822" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.540467 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1a57db04-0dc9-4d63-8d08-dd4309b19496-etc-swift\") pod \"swift-proxy-7f9cbdc5df-5tx5z\" (UID: \"1a57db04-0dc9-4d63-8d08-dd4309b19496\") " pod="openstack/swift-proxy-7f9cbdc5df-5tx5z" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.542132 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrpt2\" (UniqueName: \"kubernetes.io/projected/1a57db04-0dc9-4d63-8d08-dd4309b19496-kube-api-access-rrpt2\") pod \"swift-proxy-7f9cbdc5df-5tx5z\" (UID: \"1a57db04-0dc9-4d63-8d08-dd4309b19496\") " pod="openstack/swift-proxy-7f9cbdc5df-5tx5z" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.584384 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08964a7d-6cae-4d8d-8dc7-8828bb55c6b6-config\") pod \"dnsmasq-dns-7756b9d78c-xntfr\" (UID: \"08964a7d-6cae-4d8d-8dc7-8828bb55c6b6\") " pod="openstack/dnsmasq-dns-7756b9d78c-xntfr" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.584929 4898 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zd2mk\" (UniqueName: \"kubernetes.io/projected/08964a7d-6cae-4d8d-8dc7-8828bb55c6b6-kube-api-access-zd2mk\") pod \"dnsmasq-dns-7756b9d78c-xntfr\" (UID: \"08964a7d-6cae-4d8d-8dc7-8828bb55c6b6\") " pod="openstack/dnsmasq-dns-7756b9d78c-xntfr" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.584960 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/08964a7d-6cae-4d8d-8dc7-8828bb55c6b6-dns-swift-storage-0\") pod \"dnsmasq-dns-7756b9d78c-xntfr\" (UID: \"08964a7d-6cae-4d8d-8dc7-8828bb55c6b6\") " pod="openstack/dnsmasq-dns-7756b9d78c-xntfr" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.584977 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/08964a7d-6cae-4d8d-8dc7-8828bb55c6b6-dns-svc\") pod \"dnsmasq-dns-7756b9d78c-xntfr\" (UID: \"08964a7d-6cae-4d8d-8dc7-8828bb55c6b6\") " pod="openstack/dnsmasq-dns-7756b9d78c-xntfr" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.585034 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/08964a7d-6cae-4d8d-8dc7-8828bb55c6b6-ovsdbserver-nb\") pod \"dnsmasq-dns-7756b9d78c-xntfr\" (UID: \"08964a7d-6cae-4d8d-8dc7-8828bb55c6b6\") " pod="openstack/dnsmasq-dns-7756b9d78c-xntfr" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.585055 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/08964a7d-6cae-4d8d-8dc7-8828bb55c6b6-ovsdbserver-sb\") pod \"dnsmasq-dns-7756b9d78c-xntfr\" (UID: \"08964a7d-6cae-4d8d-8dc7-8828bb55c6b6\") " pod="openstack/dnsmasq-dns-7756b9d78c-xntfr" Mar 13 14:21:59 
crc kubenswrapper[4898]: I0313 14:21:59.606186 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-7f9cbdc5df-5tx5z" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.634494 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-9b6c99f6d-7zgm5"] Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.638320 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-9b6c99f6d-7zgm5" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.645856 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.669091 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-9b6c99f6d-7zgm5"] Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.687854 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/08964a7d-6cae-4d8d-8dc7-8828bb55c6b6-dns-swift-storage-0\") pod \"dnsmasq-dns-7756b9d78c-xntfr\" (UID: \"08964a7d-6cae-4d8d-8dc7-8828bb55c6b6\") " pod="openstack/dnsmasq-dns-7756b9d78c-xntfr" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.687939 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/08964a7d-6cae-4d8d-8dc7-8828bb55c6b6-dns-svc\") pod \"dnsmasq-dns-7756b9d78c-xntfr\" (UID: \"08964a7d-6cae-4d8d-8dc7-8828bb55c6b6\") " pod="openstack/dnsmasq-dns-7756b9d78c-xntfr" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.688037 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/08964a7d-6cae-4d8d-8dc7-8828bb55c6b6-ovsdbserver-nb\") pod \"dnsmasq-dns-7756b9d78c-xntfr\" (UID: \"08964a7d-6cae-4d8d-8dc7-8828bb55c6b6\") " pod="openstack/dnsmasq-dns-7756b9d78c-xntfr" Mar 13 
14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.688067 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/08964a7d-6cae-4d8d-8dc7-8828bb55c6b6-ovsdbserver-sb\") pod \"dnsmasq-dns-7756b9d78c-xntfr\" (UID: \"08964a7d-6cae-4d8d-8dc7-8828bb55c6b6\") " pod="openstack/dnsmasq-dns-7756b9d78c-xntfr" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.688138 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08964a7d-6cae-4d8d-8dc7-8828bb55c6b6-config\") pod \"dnsmasq-dns-7756b9d78c-xntfr\" (UID: \"08964a7d-6cae-4d8d-8dc7-8828bb55c6b6\") " pod="openstack/dnsmasq-dns-7756b9d78c-xntfr" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.689507 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zd2mk\" (UniqueName: \"kubernetes.io/projected/08964a7d-6cae-4d8d-8dc7-8828bb55c6b6-kube-api-access-zd2mk\") pod \"dnsmasq-dns-7756b9d78c-xntfr\" (UID: \"08964a7d-6cae-4d8d-8dc7-8828bb55c6b6\") " pod="openstack/dnsmasq-dns-7756b9d78c-xntfr" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.690505 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/08964a7d-6cae-4d8d-8dc7-8828bb55c6b6-dns-svc\") pod \"dnsmasq-dns-7756b9d78c-xntfr\" (UID: \"08964a7d-6cae-4d8d-8dc7-8828bb55c6b6\") " pod="openstack/dnsmasq-dns-7756b9d78c-xntfr" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.691021 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/08964a7d-6cae-4d8d-8dc7-8828bb55c6b6-ovsdbserver-nb\") pod \"dnsmasq-dns-7756b9d78c-xntfr\" (UID: \"08964a7d-6cae-4d8d-8dc7-8828bb55c6b6\") " pod="openstack/dnsmasq-dns-7756b9d78c-xntfr" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.693854 4898 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/08964a7d-6cae-4d8d-8dc7-8828bb55c6b6-ovsdbserver-sb\") pod \"dnsmasq-dns-7756b9d78c-xntfr\" (UID: \"08964a7d-6cae-4d8d-8dc7-8828bb55c6b6\") " pod="openstack/dnsmasq-dns-7756b9d78c-xntfr" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.693946 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08964a7d-6cae-4d8d-8dc7-8828bb55c6b6-config\") pod \"dnsmasq-dns-7756b9d78c-xntfr\" (UID: \"08964a7d-6cae-4d8d-8dc7-8828bb55c6b6\") " pod="openstack/dnsmasq-dns-7756b9d78c-xntfr" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.697146 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/08964a7d-6cae-4d8d-8dc7-8828bb55c6b6-dns-swift-storage-0\") pod \"dnsmasq-dns-7756b9d78c-xntfr\" (UID: \"08964a7d-6cae-4d8d-8dc7-8828bb55c6b6\") " pod="openstack/dnsmasq-dns-7756b9d78c-xntfr" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.700953 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-759d64ffd4-kzp67"] Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.702542 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-759d64ffd4-kzp67" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.704641 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.708241 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-6b86699784-tf822" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.734035 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zd2mk\" (UniqueName: \"kubernetes.io/projected/08964a7d-6cae-4d8d-8dc7-8828bb55c6b6-kube-api-access-zd2mk\") pod \"dnsmasq-dns-7756b9d78c-xntfr\" (UID: \"08964a7d-6cae-4d8d-8dc7-8828bb55c6b6\") " pod="openstack/dnsmasq-dns-7756b9d78c-xntfr" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.737655 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-759d64ffd4-kzp67"] Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.792472 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ad3d61d7-d777-4115-92c7-e4e3125c5260-config-data-custom\") pod \"heat-api-9b6c99f6d-7zgm5\" (UID: \"ad3d61d7-d777-4115-92c7-e4e3125c5260\") " pod="openstack/heat-api-9b6c99f6d-7zgm5" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.792678 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad3d61d7-d777-4115-92c7-e4e3125c5260-combined-ca-bundle\") pod \"heat-api-9b6c99f6d-7zgm5\" (UID: \"ad3d61d7-d777-4115-92c7-e4e3125c5260\") " pod="openstack/heat-api-9b6c99f6d-7zgm5" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.792710 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad3d61d7-d777-4115-92c7-e4e3125c5260-config-data\") pod \"heat-api-9b6c99f6d-7zgm5\" (UID: \"ad3d61d7-d777-4115-92c7-e4e3125c5260\") " pod="openstack/heat-api-9b6c99f6d-7zgm5" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.792726 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-bjv6n\" (UniqueName: \"kubernetes.io/projected/ad3d61d7-d777-4115-92c7-e4e3125c5260-kube-api-access-bjv6n\") pod \"heat-api-9b6c99f6d-7zgm5\" (UID: \"ad3d61d7-d777-4115-92c7-e4e3125c5260\") " pod="openstack/heat-api-9b6c99f6d-7zgm5" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.894774 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e53d1b61-e0c8-4c10-85bf-1c0f67009a24-config-data\") pod \"heat-cfnapi-759d64ffd4-kzp67\" (UID: \"e53d1b61-e0c8-4c10-85bf-1c0f67009a24\") " pod="openstack/heat-cfnapi-759d64ffd4-kzp67" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.894917 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cg2l6\" (UniqueName: \"kubernetes.io/projected/e53d1b61-e0c8-4c10-85bf-1c0f67009a24-kube-api-access-cg2l6\") pod \"heat-cfnapi-759d64ffd4-kzp67\" (UID: \"e53d1b61-e0c8-4c10-85bf-1c0f67009a24\") " pod="openstack/heat-cfnapi-759d64ffd4-kzp67" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.894986 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e53d1b61-e0c8-4c10-85bf-1c0f67009a24-config-data-custom\") pod \"heat-cfnapi-759d64ffd4-kzp67\" (UID: \"e53d1b61-e0c8-4c10-85bf-1c0f67009a24\") " pod="openstack/heat-cfnapi-759d64ffd4-kzp67" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.895052 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad3d61d7-d777-4115-92c7-e4e3125c5260-combined-ca-bundle\") pod \"heat-api-9b6c99f6d-7zgm5\" (UID: \"ad3d61d7-d777-4115-92c7-e4e3125c5260\") " pod="openstack/heat-api-9b6c99f6d-7zgm5" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.895101 4898 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad3d61d7-d777-4115-92c7-e4e3125c5260-config-data\") pod \"heat-api-9b6c99f6d-7zgm5\" (UID: \"ad3d61d7-d777-4115-92c7-e4e3125c5260\") " pod="openstack/heat-api-9b6c99f6d-7zgm5" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.895126 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjv6n\" (UniqueName: \"kubernetes.io/projected/ad3d61d7-d777-4115-92c7-e4e3125c5260-kube-api-access-bjv6n\") pod \"heat-api-9b6c99f6d-7zgm5\" (UID: \"ad3d61d7-d777-4115-92c7-e4e3125c5260\") " pod="openstack/heat-api-9b6c99f6d-7zgm5" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.895310 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ad3d61d7-d777-4115-92c7-e4e3125c5260-config-data-custom\") pod \"heat-api-9b6c99f6d-7zgm5\" (UID: \"ad3d61d7-d777-4115-92c7-e4e3125c5260\") " pod="openstack/heat-api-9b6c99f6d-7zgm5" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.895603 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e53d1b61-e0c8-4c10-85bf-1c0f67009a24-combined-ca-bundle\") pod \"heat-cfnapi-759d64ffd4-kzp67\" (UID: \"e53d1b61-e0c8-4c10-85bf-1c0f67009a24\") " pod="openstack/heat-cfnapi-759d64ffd4-kzp67" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.902633 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad3d61d7-d777-4115-92c7-e4e3125c5260-config-data\") pod \"heat-api-9b6c99f6d-7zgm5\" (UID: \"ad3d61d7-d777-4115-92c7-e4e3125c5260\") " pod="openstack/heat-api-9b6c99f6d-7zgm5" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.904038 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ad3d61d7-d777-4115-92c7-e4e3125c5260-combined-ca-bundle\") pod \"heat-api-9b6c99f6d-7zgm5\" (UID: \"ad3d61d7-d777-4115-92c7-e4e3125c5260\") " pod="openstack/heat-api-9b6c99f6d-7zgm5" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.911365 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ad3d61d7-d777-4115-92c7-e4e3125c5260-config-data-custom\") pod \"heat-api-9b6c99f6d-7zgm5\" (UID: \"ad3d61d7-d777-4115-92c7-e4e3125c5260\") " pod="openstack/heat-api-9b6c99f6d-7zgm5" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.914369 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjv6n\" (UniqueName: \"kubernetes.io/projected/ad3d61d7-d777-4115-92c7-e4e3125c5260-kube-api-access-bjv6n\") pod \"heat-api-9b6c99f6d-7zgm5\" (UID: \"ad3d61d7-d777-4115-92c7-e4e3125c5260\") " pod="openstack/heat-api-9b6c99f6d-7zgm5" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.978539 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7756b9d78c-xntfr" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.997991 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cg2l6\" (UniqueName: \"kubernetes.io/projected/e53d1b61-e0c8-4c10-85bf-1c0f67009a24-kube-api-access-cg2l6\") pod \"heat-cfnapi-759d64ffd4-kzp67\" (UID: \"e53d1b61-e0c8-4c10-85bf-1c0f67009a24\") " pod="openstack/heat-cfnapi-759d64ffd4-kzp67" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.998106 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e53d1b61-e0c8-4c10-85bf-1c0f67009a24-config-data-custom\") pod \"heat-cfnapi-759d64ffd4-kzp67\" (UID: \"e53d1b61-e0c8-4c10-85bf-1c0f67009a24\") " pod="openstack/heat-cfnapi-759d64ffd4-kzp67" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.998891 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-9b6c99f6d-7zgm5" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.999210 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e53d1b61-e0c8-4c10-85bf-1c0f67009a24-combined-ca-bundle\") pod \"heat-cfnapi-759d64ffd4-kzp67\" (UID: \"e53d1b61-e0c8-4c10-85bf-1c0f67009a24\") " pod="openstack/heat-cfnapi-759d64ffd4-kzp67" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.999288 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e53d1b61-e0c8-4c10-85bf-1c0f67009a24-config-data\") pod \"heat-cfnapi-759d64ffd4-kzp67\" (UID: \"e53d1b61-e0c8-4c10-85bf-1c0f67009a24\") " pod="openstack/heat-cfnapi-759d64ffd4-kzp67" Mar 13 14:22:00 crc kubenswrapper[4898]: I0313 14:22:00.001582 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/e53d1b61-e0c8-4c10-85bf-1c0f67009a24-config-data-custom\") pod \"heat-cfnapi-759d64ffd4-kzp67\" (UID: \"e53d1b61-e0c8-4c10-85bf-1c0f67009a24\") " pod="openstack/heat-cfnapi-759d64ffd4-kzp67" Mar 13 14:22:00 crc kubenswrapper[4898]: I0313 14:22:00.003945 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e53d1b61-e0c8-4c10-85bf-1c0f67009a24-combined-ca-bundle\") pod \"heat-cfnapi-759d64ffd4-kzp67\" (UID: \"e53d1b61-e0c8-4c10-85bf-1c0f67009a24\") " pod="openstack/heat-cfnapi-759d64ffd4-kzp67" Mar 13 14:22:00 crc kubenswrapper[4898]: I0313 14:22:00.004268 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e53d1b61-e0c8-4c10-85bf-1c0f67009a24-config-data\") pod \"heat-cfnapi-759d64ffd4-kzp67\" (UID: \"e53d1b61-e0c8-4c10-85bf-1c0f67009a24\") " pod="openstack/heat-cfnapi-759d64ffd4-kzp67" Mar 13 14:22:00 crc kubenswrapper[4898]: I0313 14:22:00.019848 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cg2l6\" (UniqueName: \"kubernetes.io/projected/e53d1b61-e0c8-4c10-85bf-1c0f67009a24-kube-api-access-cg2l6\") pod \"heat-cfnapi-759d64ffd4-kzp67\" (UID: \"e53d1b61-e0c8-4c10-85bf-1c0f67009a24\") " pod="openstack/heat-cfnapi-759d64ffd4-kzp67" Mar 13 14:22:00 crc kubenswrapper[4898]: I0313 14:22:00.076312 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-759d64ffd4-kzp67" Mar 13 14:22:00 crc kubenswrapper[4898]: I0313 14:22:00.148953 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556862-mpx4w"] Mar 13 14:22:00 crc kubenswrapper[4898]: I0313 14:22:00.151238 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556862-mpx4w" Mar 13 14:22:00 crc kubenswrapper[4898]: I0313 14:22:00.167512 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps" Mar 13 14:22:00 crc kubenswrapper[4898]: I0313 14:22:00.168991 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556862-mpx4w"] Mar 13 14:22:00 crc kubenswrapper[4898]: I0313 14:22:00.169541 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 14:22:00 crc kubenswrapper[4898]: I0313 14:22:00.170052 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 14:22:00 crc kubenswrapper[4898]: I0313 14:22:00.205366 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tmtl\" (UniqueName: \"kubernetes.io/projected/3c85cb04-363e-45d6-a14b-79c249e8f469-kube-api-access-6tmtl\") pod \"auto-csr-approver-29556862-mpx4w\" (UID: \"3c85cb04-363e-45d6-a14b-79c249e8f469\") " pod="openshift-infra/auto-csr-approver-29556862-mpx4w" Mar 13 14:22:00 crc kubenswrapper[4898]: I0313 14:22:00.310866 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tmtl\" (UniqueName: \"kubernetes.io/projected/3c85cb04-363e-45d6-a14b-79c249e8f469-kube-api-access-6tmtl\") pod \"auto-csr-approver-29556862-mpx4w\" (UID: \"3c85cb04-363e-45d6-a14b-79c249e8f469\") " pod="openshift-infra/auto-csr-approver-29556862-mpx4w" Mar 13 14:22:00 crc kubenswrapper[4898]: I0313 14:22:00.330651 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tmtl\" (UniqueName: \"kubernetes.io/projected/3c85cb04-363e-45d6-a14b-79c249e8f469-kube-api-access-6tmtl\") pod \"auto-csr-approver-29556862-mpx4w\" (UID: \"3c85cb04-363e-45d6-a14b-79c249e8f469\") " 
pod="openshift-infra/auto-csr-approver-29556862-mpx4w" Mar 13 14:22:00 crc kubenswrapper[4898]: I0313 14:22:00.461369 4898 generic.go:334] "Generic (PLEG): container finished" podID="604a0205-6c18-4bff-929f-038524d62aeb" containerID="974d31c9fc27c07800ab40a1409496611af9fce07ca6cd7d36cd6abbd352b819" exitCode=0 Mar 13 14:22:00 crc kubenswrapper[4898]: I0313 14:22:00.461441 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5bf5d8b7d4-4gwxr" event={"ID":"604a0205-6c18-4bff-929f-038524d62aeb","Type":"ContainerDied","Data":"974d31c9fc27c07800ab40a1409496611af9fce07ca6cd7d36cd6abbd352b819"} Mar 13 14:22:00 crc kubenswrapper[4898]: I0313 14:22:00.508224 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556862-mpx4w" Mar 13 14:22:00 crc kubenswrapper[4898]: I0313 14:22:00.682277 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-z7ldc" podUID="b38f3681-6f2f-437f-9694-810d43921aa2" containerName="registry-server" probeResult="failure" output=< Mar 13 14:22:00 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 14:22:00 crc kubenswrapper[4898]: > Mar 13 14:22:02 crc kubenswrapper[4898]: I0313 14:22:02.486603 4898 generic.go:334] "Generic (PLEG): container finished" podID="d0be2003-4a0d-4740-9b84-ab16bb27d5bb" containerID="7f741df9b30455c96ea278c501c4d63ab1af4bf960776e0726cfee685819c6bd" exitCode=0 Mar 13 14:22:02 crc kubenswrapper[4898]: I0313 14:22:02.486677 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d0be2003-4a0d-4740-9b84-ab16bb27d5bb","Type":"ContainerDied","Data":"7f741df9b30455c96ea278c501c4d63ab1af4bf960776e0726cfee685819c6bd"} Mar 13 14:22:04 crc kubenswrapper[4898]: I0313 14:22:04.893121 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-api-0" podUID="bda33d23-490a-4099-954b-c613ab5d5c73" 
containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.219:8776/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 14:22:05 crc kubenswrapper[4898]: I0313 14:22:05.888087 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="bda33d23-490a-4099-954b-c613ab5d5c73" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.219:8776/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.422422 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-5b6c75676b-jx6kl"] Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.423986 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-5b6c75676b-jx6kl" Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.438478 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-5978dd6d84-pnknr"] Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.440254 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5978dd6d84-pnknr" Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.451227 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-5b6c75676b-jx6kl"] Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.467351 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-5978dd6d84-pnknr"] Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.482551 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-76545f46cd-qk7nm"] Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.487941 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-76545f46cd-qk7nm"
Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.502721 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-76545f46cd-qk7nm"]
Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.525618 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6f42d66e-f331-4c05-a4fb-d6208b4493fb-config-data-custom\") pod \"heat-api-5978dd6d84-pnknr\" (UID: \"6f42d66e-f331-4c05-a4fb-d6208b4493fb\") " pod="openstack/heat-api-5978dd6d84-pnknr"
Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.526184 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f42d66e-f331-4c05-a4fb-d6208b4493fb-config-data\") pod \"heat-api-5978dd6d84-pnknr\" (UID: \"6f42d66e-f331-4c05-a4fb-d6208b4493fb\") " pod="openstack/heat-api-5978dd6d84-pnknr"
Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.526311 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f42d66e-f331-4c05-a4fb-d6208b4493fb-combined-ca-bundle\") pod \"heat-api-5978dd6d84-pnknr\" (UID: \"6f42d66e-f331-4c05-a4fb-d6208b4493fb\") " pod="openstack/heat-api-5978dd6d84-pnknr"
Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.526499 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ad94280e-6f02-4129-9cdc-c35499f5d5e4-config-data-custom\") pod \"heat-engine-5b6c75676b-jx6kl\" (UID: \"ad94280e-6f02-4129-9cdc-c35499f5d5e4\") " pod="openstack/heat-engine-5b6c75676b-jx6kl"
Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.526646 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad94280e-6f02-4129-9cdc-c35499f5d5e4-config-data\") pod \"heat-engine-5b6c75676b-jx6kl\" (UID: \"ad94280e-6f02-4129-9cdc-c35499f5d5e4\") " pod="openstack/heat-engine-5b6c75676b-jx6kl"
Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.526924 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kskc6\" (UniqueName: \"kubernetes.io/projected/ad94280e-6f02-4129-9cdc-c35499f5d5e4-kube-api-access-kskc6\") pod \"heat-engine-5b6c75676b-jx6kl\" (UID: \"ad94280e-6f02-4129-9cdc-c35499f5d5e4\") " pod="openstack/heat-engine-5b6c75676b-jx6kl"
Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.527125 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbntq\" (UniqueName: \"kubernetes.io/projected/6f42d66e-f331-4c05-a4fb-d6208b4493fb-kube-api-access-jbntq\") pod \"heat-api-5978dd6d84-pnknr\" (UID: \"6f42d66e-f331-4c05-a4fb-d6208b4493fb\") " pod="openstack/heat-api-5978dd6d84-pnknr"
Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.527320 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad94280e-6f02-4129-9cdc-c35499f5d5e4-combined-ca-bundle\") pod \"heat-engine-5b6c75676b-jx6kl\" (UID: \"ad94280e-6f02-4129-9cdc-c35499f5d5e4\") " pod="openstack/heat-engine-5b6c75676b-jx6kl"
Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.632524 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p272j\" (UniqueName: \"kubernetes.io/projected/8c6a61ba-babd-4bc2-922a-99b00c2af057-kube-api-access-p272j\") pod \"heat-cfnapi-76545f46cd-qk7nm\" (UID: \"8c6a61ba-babd-4bc2-922a-99b00c2af057\") " pod="openstack/heat-cfnapi-76545f46cd-qk7nm"
Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.632577 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c6a61ba-babd-4bc2-922a-99b00c2af057-config-data\") pod \"heat-cfnapi-76545f46cd-qk7nm\" (UID: \"8c6a61ba-babd-4bc2-922a-99b00c2af057\") " pod="openstack/heat-cfnapi-76545f46cd-qk7nm"
Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.632601 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6a61ba-babd-4bc2-922a-99b00c2af057-combined-ca-bundle\") pod \"heat-cfnapi-76545f46cd-qk7nm\" (UID: \"8c6a61ba-babd-4bc2-922a-99b00c2af057\") " pod="openstack/heat-cfnapi-76545f46cd-qk7nm"
Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.632740 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6f42d66e-f331-4c05-a4fb-d6208b4493fb-config-data-custom\") pod \"heat-api-5978dd6d84-pnknr\" (UID: \"6f42d66e-f331-4c05-a4fb-d6208b4493fb\") " pod="openstack/heat-api-5978dd6d84-pnknr"
Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.632861 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f42d66e-f331-4c05-a4fb-d6208b4493fb-config-data\") pod \"heat-api-5978dd6d84-pnknr\" (UID: \"6f42d66e-f331-4c05-a4fb-d6208b4493fb\") " pod="openstack/heat-api-5978dd6d84-pnknr"
Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.632889 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f42d66e-f331-4c05-a4fb-d6208b4493fb-combined-ca-bundle\") pod \"heat-api-5978dd6d84-pnknr\" (UID: \"6f42d66e-f331-4c05-a4fb-d6208b4493fb\") " pod="openstack/heat-api-5978dd6d84-pnknr"
Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.633022 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ad94280e-6f02-4129-9cdc-c35499f5d5e4-config-data-custom\") pod \"heat-engine-5b6c75676b-jx6kl\" (UID: \"ad94280e-6f02-4129-9cdc-c35499f5d5e4\") " pod="openstack/heat-engine-5b6c75676b-jx6kl"
Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.633054 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad94280e-6f02-4129-9cdc-c35499f5d5e4-config-data\") pod \"heat-engine-5b6c75676b-jx6kl\" (UID: \"ad94280e-6f02-4129-9cdc-c35499f5d5e4\") " pod="openstack/heat-engine-5b6c75676b-jx6kl"
Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.633392 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kskc6\" (UniqueName: \"kubernetes.io/projected/ad94280e-6f02-4129-9cdc-c35499f5d5e4-kube-api-access-kskc6\") pod \"heat-engine-5b6c75676b-jx6kl\" (UID: \"ad94280e-6f02-4129-9cdc-c35499f5d5e4\") " pod="openstack/heat-engine-5b6c75676b-jx6kl"
Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.633464 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8c6a61ba-babd-4bc2-922a-99b00c2af057-config-data-custom\") pod \"heat-cfnapi-76545f46cd-qk7nm\" (UID: \"8c6a61ba-babd-4bc2-922a-99b00c2af057\") " pod="openstack/heat-cfnapi-76545f46cd-qk7nm"
Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.633557 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad94280e-6f02-4129-9cdc-c35499f5d5e4-combined-ca-bundle\") pod \"heat-engine-5b6c75676b-jx6kl\" (UID: \"ad94280e-6f02-4129-9cdc-c35499f5d5e4\") " pod="openstack/heat-engine-5b6c75676b-jx6kl"
Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.633578 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbntq\" (UniqueName: \"kubernetes.io/projected/6f42d66e-f331-4c05-a4fb-d6208b4493fb-kube-api-access-jbntq\") pod \"heat-api-5978dd6d84-pnknr\" (UID: \"6f42d66e-f331-4c05-a4fb-d6208b4493fb\") " pod="openstack/heat-api-5978dd6d84-pnknr"
Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.644503 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f42d66e-f331-4c05-a4fb-d6208b4493fb-config-data\") pod \"heat-api-5978dd6d84-pnknr\" (UID: \"6f42d66e-f331-4c05-a4fb-d6208b4493fb\") " pod="openstack/heat-api-5978dd6d84-pnknr"
Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.646598 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6f42d66e-f331-4c05-a4fb-d6208b4493fb-config-data-custom\") pod \"heat-api-5978dd6d84-pnknr\" (UID: \"6f42d66e-f331-4c05-a4fb-d6208b4493fb\") " pod="openstack/heat-api-5978dd6d84-pnknr"
Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.647178 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad94280e-6f02-4129-9cdc-c35499f5d5e4-combined-ca-bundle\") pod \"heat-engine-5b6c75676b-jx6kl\" (UID: \"ad94280e-6f02-4129-9cdc-c35499f5d5e4\") " pod="openstack/heat-engine-5b6c75676b-jx6kl"
Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.655266 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ad94280e-6f02-4129-9cdc-c35499f5d5e4-config-data-custom\") pod \"heat-engine-5b6c75676b-jx6kl\" (UID: \"ad94280e-6f02-4129-9cdc-c35499f5d5e4\") " pod="openstack/heat-engine-5b6c75676b-jx6kl"
Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.656017 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f42d66e-f331-4c05-a4fb-d6208b4493fb-combined-ca-bundle\") pod \"heat-api-5978dd6d84-pnknr\" (UID: \"6f42d66e-f331-4c05-a4fb-d6208b4493fb\") " pod="openstack/heat-api-5978dd6d84-pnknr"
Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.658579 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbntq\" (UniqueName: \"kubernetes.io/projected/6f42d66e-f331-4c05-a4fb-d6208b4493fb-kube-api-access-jbntq\") pod \"heat-api-5978dd6d84-pnknr\" (UID: \"6f42d66e-f331-4c05-a4fb-d6208b4493fb\") " pod="openstack/heat-api-5978dd6d84-pnknr"
Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.663722 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kskc6\" (UniqueName: \"kubernetes.io/projected/ad94280e-6f02-4129-9cdc-c35499f5d5e4-kube-api-access-kskc6\") pod \"heat-engine-5b6c75676b-jx6kl\" (UID: \"ad94280e-6f02-4129-9cdc-c35499f5d5e4\") " pod="openstack/heat-engine-5b6c75676b-jx6kl"
Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.665878 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad94280e-6f02-4129-9cdc-c35499f5d5e4-config-data\") pod \"heat-engine-5b6c75676b-jx6kl\" (UID: \"ad94280e-6f02-4129-9cdc-c35499f5d5e4\") " pod="openstack/heat-engine-5b6c75676b-jx6kl"
Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.735148 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6a61ba-babd-4bc2-922a-99b00c2af057-combined-ca-bundle\") pod \"heat-cfnapi-76545f46cd-qk7nm\" (UID: \"8c6a61ba-babd-4bc2-922a-99b00c2af057\") " pod="openstack/heat-cfnapi-76545f46cd-qk7nm"
Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.735378 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8c6a61ba-babd-4bc2-922a-99b00c2af057-config-data-custom\") pod \"heat-cfnapi-76545f46cd-qk7nm\" (UID: \"8c6a61ba-babd-4bc2-922a-99b00c2af057\") " pod="openstack/heat-cfnapi-76545f46cd-qk7nm"
Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.735438 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p272j\" (UniqueName: \"kubernetes.io/projected/8c6a61ba-babd-4bc2-922a-99b00c2af057-kube-api-access-p272j\") pod \"heat-cfnapi-76545f46cd-qk7nm\" (UID: \"8c6a61ba-babd-4bc2-922a-99b00c2af057\") " pod="openstack/heat-cfnapi-76545f46cd-qk7nm"
Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.735463 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c6a61ba-babd-4bc2-922a-99b00c2af057-config-data\") pod \"heat-cfnapi-76545f46cd-qk7nm\" (UID: \"8c6a61ba-babd-4bc2-922a-99b00c2af057\") " pod="openstack/heat-cfnapi-76545f46cd-qk7nm"
Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.741192 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c6a61ba-babd-4bc2-922a-99b00c2af057-config-data\") pod \"heat-cfnapi-76545f46cd-qk7nm\" (UID: \"8c6a61ba-babd-4bc2-922a-99b00c2af057\") " pod="openstack/heat-cfnapi-76545f46cd-qk7nm"
Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.749888 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6a61ba-babd-4bc2-922a-99b00c2af057-combined-ca-bundle\") pod \"heat-cfnapi-76545f46cd-qk7nm\" (UID: \"8c6a61ba-babd-4bc2-922a-99b00c2af057\") " pod="openstack/heat-cfnapi-76545f46cd-qk7nm"
Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.752642 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8c6a61ba-babd-4bc2-922a-99b00c2af057-config-data-custom\") pod \"heat-cfnapi-76545f46cd-qk7nm\" (UID: \"8c6a61ba-babd-4bc2-922a-99b00c2af057\") " pod="openstack/heat-cfnapi-76545f46cd-qk7nm"
Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.760573 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p272j\" (UniqueName: \"kubernetes.io/projected/8c6a61ba-babd-4bc2-922a-99b00c2af057-kube-api-access-p272j\") pod \"heat-cfnapi-76545f46cd-qk7nm\" (UID: \"8c6a61ba-babd-4bc2-922a-99b00c2af057\") " pod="openstack/heat-cfnapi-76545f46cd-qk7nm"
Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.769201 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.810954 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-5b6c75676b-jx6kl"
Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.849140 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5978dd6d84-pnknr"
Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.872254 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-76545f46cd-qk7nm"
Mar 13 14:22:07 crc kubenswrapper[4898]: I0313 14:22:07.796493 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-776df44c77-g64lv"
Mar 13 14:22:07 crc kubenswrapper[4898]: I0313 14:22:07.890465 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-f97c64464-wmnph"]
Mar 13 14:22:07 crc kubenswrapper[4898]: I0313 14:22:07.890763 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-f97c64464-wmnph" podUID="61f1f8bf-63eb-464c-9703-3d3db80ba0df" containerName="neutron-api" containerID="cri-o://6c47c58bce6d5a27ce8f0a9ec972dc3740b57217b054bd621f4319ad64cb9ded" gracePeriod=30
Mar 13 14:22:07 crc kubenswrapper[4898]: I0313 14:22:07.892337 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-f97c64464-wmnph" podUID="61f1f8bf-63eb-464c-9703-3d3db80ba0df" containerName="neutron-httpd" containerID="cri-o://fc322fcfa0d128d1cbcd5fd7cc8972df0d6b4b808a700c1468ceb21cb71602e4" gracePeriod=30
Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.112057 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5bf5d8b7d4-4gwxr"
Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.306533 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.313756 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0be2003-4a0d-4740-9b84-ab16bb27d5bb-config-data\") pod \"d0be2003-4a0d-4740-9b84-ab16bb27d5bb\" (UID: \"d0be2003-4a0d-4740-9b84-ab16bb27d5bb\") "
Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.313828 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/604a0205-6c18-4bff-929f-038524d62aeb-internal-tls-certs\") pod \"604a0205-6c18-4bff-929f-038524d62aeb\" (UID: \"604a0205-6c18-4bff-929f-038524d62aeb\") "
Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.313864 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/604a0205-6c18-4bff-929f-038524d62aeb-logs\") pod \"604a0205-6c18-4bff-929f-038524d62aeb\" (UID: \"604a0205-6c18-4bff-929f-038524d62aeb\") "
Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.313923 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d0be2003-4a0d-4740-9b84-ab16bb27d5bb-config-data-custom\") pod \"d0be2003-4a0d-4740-9b84-ab16bb27d5bb\" (UID: \"d0be2003-4a0d-4740-9b84-ab16bb27d5bb\") "
Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.313945 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/604a0205-6c18-4bff-929f-038524d62aeb-public-tls-certs\") pod \"604a0205-6c18-4bff-929f-038524d62aeb\" (UID: \"604a0205-6c18-4bff-929f-038524d62aeb\") "
Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.313989 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8lcxt\" (UniqueName: \"kubernetes.io/projected/604a0205-6c18-4bff-929f-038524d62aeb-kube-api-access-8lcxt\") pod \"604a0205-6c18-4bff-929f-038524d62aeb\" (UID: \"604a0205-6c18-4bff-929f-038524d62aeb\") "
Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.314071 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0be2003-4a0d-4740-9b84-ab16bb27d5bb-scripts\") pod \"d0be2003-4a0d-4740-9b84-ab16bb27d5bb\" (UID: \"d0be2003-4a0d-4740-9b84-ab16bb27d5bb\") "
Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.314100 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/604a0205-6c18-4bff-929f-038524d62aeb-scripts\") pod \"604a0205-6c18-4bff-929f-038524d62aeb\" (UID: \"604a0205-6c18-4bff-929f-038524d62aeb\") "
Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.314161 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d0be2003-4a0d-4740-9b84-ab16bb27d5bb-etc-machine-id\") pod \"d0be2003-4a0d-4740-9b84-ab16bb27d5bb\" (UID: \"d0be2003-4a0d-4740-9b84-ab16bb27d5bb\") "
Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.314186 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wbnw\" (UniqueName: \"kubernetes.io/projected/d0be2003-4a0d-4740-9b84-ab16bb27d5bb-kube-api-access-8wbnw\") pod \"d0be2003-4a0d-4740-9b84-ab16bb27d5bb\" (UID: \"d0be2003-4a0d-4740-9b84-ab16bb27d5bb\") "
Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.314249 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/604a0205-6c18-4bff-929f-038524d62aeb-config-data\") pod \"604a0205-6c18-4bff-929f-038524d62aeb\" (UID: \"604a0205-6c18-4bff-929f-038524d62aeb\") "
Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.314278 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/604a0205-6c18-4bff-929f-038524d62aeb-combined-ca-bundle\") pod \"604a0205-6c18-4bff-929f-038524d62aeb\" (UID: \"604a0205-6c18-4bff-929f-038524d62aeb\") "
Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.314307 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0be2003-4a0d-4740-9b84-ab16bb27d5bb-combined-ca-bundle\") pod \"d0be2003-4a0d-4740-9b84-ab16bb27d5bb\" (UID: \"d0be2003-4a0d-4740-9b84-ab16bb27d5bb\") "
Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.326922 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/604a0205-6c18-4bff-929f-038524d62aeb-logs" (OuterVolumeSpecName: "logs") pod "604a0205-6c18-4bff-929f-038524d62aeb" (UID: "604a0205-6c18-4bff-929f-038524d62aeb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.327277 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d0be2003-4a0d-4740-9b84-ab16bb27d5bb-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "d0be2003-4a0d-4740-9b84-ab16bb27d5bb" (UID: "d0be2003-4a0d-4740-9b84-ab16bb27d5bb"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.348923 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/604a0205-6c18-4bff-929f-038524d62aeb-kube-api-access-8lcxt" (OuterVolumeSpecName: "kube-api-access-8lcxt") pod "604a0205-6c18-4bff-929f-038524d62aeb" (UID: "604a0205-6c18-4bff-929f-038524d62aeb"). InnerVolumeSpecName "kube-api-access-8lcxt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.349424 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/604a0205-6c18-4bff-929f-038524d62aeb-scripts" (OuterVolumeSpecName: "scripts") pod "604a0205-6c18-4bff-929f-038524d62aeb" (UID: "604a0205-6c18-4bff-929f-038524d62aeb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.349843 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0be2003-4a0d-4740-9b84-ab16bb27d5bb-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d0be2003-4a0d-4740-9b84-ab16bb27d5bb" (UID: "d0be2003-4a0d-4740-9b84-ab16bb27d5bb"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.362164 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0be2003-4a0d-4740-9b84-ab16bb27d5bb-scripts" (OuterVolumeSpecName: "scripts") pod "d0be2003-4a0d-4740-9b84-ab16bb27d5bb" (UID: "d0be2003-4a0d-4740-9b84-ab16bb27d5bb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.392028 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0be2003-4a0d-4740-9b84-ab16bb27d5bb-kube-api-access-8wbnw" (OuterVolumeSpecName: "kube-api-access-8wbnw") pod "d0be2003-4a0d-4740-9b84-ab16bb27d5bb" (UID: "d0be2003-4a0d-4740-9b84-ab16bb27d5bb"). InnerVolumeSpecName "kube-api-access-8wbnw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.417067 4898 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d0be2003-4a0d-4740-9b84-ab16bb27d5bb-etc-machine-id\") on node \"crc\" DevicePath \"\""
Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.417108 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wbnw\" (UniqueName: \"kubernetes.io/projected/d0be2003-4a0d-4740-9b84-ab16bb27d5bb-kube-api-access-8wbnw\") on node \"crc\" DevicePath \"\""
Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.417124 4898 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/604a0205-6c18-4bff-929f-038524d62aeb-logs\") on node \"crc\" DevicePath \"\""
Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.417136 4898 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d0be2003-4a0d-4740-9b84-ab16bb27d5bb-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.417147 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8lcxt\" (UniqueName: \"kubernetes.io/projected/604a0205-6c18-4bff-929f-038524d62aeb-kube-api-access-8lcxt\") on node \"crc\" DevicePath \"\""
Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.417156 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0be2003-4a0d-4740-9b84-ab16bb27d5bb-scripts\") on node \"crc\" DevicePath \"\""
Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.417166 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/604a0205-6c18-4bff-929f-038524d62aeb-scripts\") on node \"crc\" DevicePath \"\""
Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.586304 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0be2003-4a0d-4740-9b84-ab16bb27d5bb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d0be2003-4a0d-4740-9b84-ab16bb27d5bb" (UID: "d0be2003-4a0d-4740-9b84-ab16bb27d5bb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.629059 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0be2003-4a0d-4740-9b84-ab16bb27d5bb-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.632785 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d0be2003-4a0d-4740-9b84-ab16bb27d5bb","Type":"ContainerDied","Data":"519cfe6e0c33250225df4155f05016fa2ba7c8a1bb229003e79f90a22978c180"}
Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.632849 4898 scope.go:117] "RemoveContainer" containerID="bafd55f18270e2838b59b970f12a6aafedd22b09a08b2d2305599aac961e6911"
Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.633042 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.638283 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5bf5d8b7d4-4gwxr" event={"ID":"604a0205-6c18-4bff-929f-038524d62aeb","Type":"ContainerDied","Data":"762f4a73911e1feb65bcc4bfa54920b33c4f644904f274acd49cf5dac053c911"}
Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.638405 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5bf5d8b7d4-4gwxr"
Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.640341 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.640765 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="86c6c495-884b-4c92-949f-0159eb17e6a5" containerName="ceilometer-central-agent" containerID="cri-o://b62d7bd0ca3497c43d915b6212935946bc82ac1a3defe8c89eeb3779d6ce9770" gracePeriod=30
Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.641094 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="86c6c495-884b-4c92-949f-0159eb17e6a5" containerName="proxy-httpd" containerID="cri-o://a55576c9a44e83505bf8757afc0e1e19424b4717e80f08c508a794c81f2cfdb0" gracePeriod=30
Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.641155 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="86c6c495-884b-4c92-949f-0159eb17e6a5" containerName="sg-core" containerID="cri-o://ba94a825cfb36ee16c3e15907274f9276083ba448d310d471374f19c54cc116c" gracePeriod=30
Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.641201 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="86c6c495-884b-4c92-949f-0159eb17e6a5" containerName="ceilometer-notification-agent" containerID="cri-o://cea1936f2758016544cbefa24e4ca686c3e33acfdaf019898c501a90320d0242" gracePeriod=30
Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.657655 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"124bd4ee-d9f0-408f-a46e-4d143e8ab02a","Type":"ContainerStarted","Data":"2280ba58973731a979fad5257ad117a923cdfbcb04a055a7d907232e58c71a1c"}
Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.664621 4898 generic.go:334] "Generic (PLEG): container finished" podID="61f1f8bf-63eb-464c-9703-3d3db80ba0df" containerID="fc322fcfa0d128d1cbcd5fd7cc8972df0d6b4b808a700c1468ceb21cb71602e4" exitCode=0
Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.664875 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f97c64464-wmnph" event={"ID":"61f1f8bf-63eb-464c-9703-3d3db80ba0df","Type":"ContainerDied","Data":"fc322fcfa0d128d1cbcd5fd7cc8972df0d6b4b808a700c1468ceb21cb71602e4"}
Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.665988 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/604a0205-6c18-4bff-929f-038524d62aeb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "604a0205-6c18-4bff-929f-038524d62aeb" (UID: "604a0205-6c18-4bff-929f-038524d62aeb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.694093 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/604a0205-6c18-4bff-929f-038524d62aeb-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "604a0205-6c18-4bff-929f-038524d62aeb" (UID: "604a0205-6c18-4bff-929f-038524d62aeb"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.696068 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/604a0205-6c18-4bff-929f-038524d62aeb-config-data" (OuterVolumeSpecName: "config-data") pod "604a0205-6c18-4bff-929f-038524d62aeb" (UID: "604a0205-6c18-4bff-929f-038524d62aeb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.733346 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/604a0205-6c18-4bff-929f-038524d62aeb-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.733392 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/604a0205-6c18-4bff-929f-038524d62aeb-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.733404 4898 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/604a0205-6c18-4bff-929f-038524d62aeb-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.735385 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-9b6c99f6d-7zgm5"]
Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.767185 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0be2003-4a0d-4740-9b84-ab16bb27d5bb-config-data" (OuterVolumeSpecName: "config-data") pod "d0be2003-4a0d-4740-9b84-ab16bb27d5bb" (UID: "d0be2003-4a0d-4740-9b84-ab16bb27d5bb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.813868 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/604a0205-6c18-4bff-929f-038524d62aeb-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "604a0205-6c18-4bff-929f-038524d62aeb" (UID: "604a0205-6c18-4bff-929f-038524d62aeb"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.837039 4898 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/604a0205-6c18-4bff-929f-038524d62aeb-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.837078 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0be2003-4a0d-4740-9b84-ab16bb27d5bb-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.958160 4898 scope.go:117] "RemoveContainer" containerID="7f741df9b30455c96ea278c501c4d63ab1af4bf960776e0726cfee685819c6bd"
Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.018965 4898 scope.go:117] "RemoveContainer" containerID="974d31c9fc27c07800ab40a1409496611af9fce07ca6cd7d36cd6abbd352b819"
Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.039264 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-5bf5d8b7d4-4gwxr"]
Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.065859 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-5bf5d8b7d4-4gwxr"]
Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.068677 4898 scope.go:117] "RemoveContainer" containerID="7d8e485964b16b478ba92c0abf89ef5c7fe78d1b940606e5a01f0ef264ccdb32"
Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.070116 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0"
Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.084166 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.100952 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.122968 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 13 14:22:09 crc kubenswrapper[4898]: E0313 14:22:09.123530 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="604a0205-6c18-4bff-929f-038524d62aeb" containerName="placement-log"
Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.123548 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="604a0205-6c18-4bff-929f-038524d62aeb" containerName="placement-log"
Mar 13 14:22:09 crc kubenswrapper[4898]: E0313 14:22:09.123569 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0be2003-4a0d-4740-9b84-ab16bb27d5bb" containerName="probe"
Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.123576 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0be2003-4a0d-4740-9b84-ab16bb27d5bb" containerName="probe"
Mar 13 14:22:09 crc kubenswrapper[4898]: E0313 14:22:09.123615 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0be2003-4a0d-4740-9b84-ab16bb27d5bb" containerName="cinder-scheduler"
Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.123623 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0be2003-4a0d-4740-9b84-ab16bb27d5bb" containerName="cinder-scheduler"
Mar 13 14:22:09 crc kubenswrapper[4898]: E0313 14:22:09.123634 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="604a0205-6c18-4bff-929f-038524d62aeb" containerName="placement-api"
Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.123639 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="604a0205-6c18-4bff-929f-038524d62aeb" containerName="placement-api"
Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.123839 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0be2003-4a0d-4740-9b84-ab16bb27d5bb" containerName="probe"
Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.123856 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="604a0205-6c18-4bff-929f-038524d62aeb" containerName="placement-api"
Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.123869 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="604a0205-6c18-4bff-929f-038524d62aeb" containerName="placement-log"
Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.123887 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0be2003-4a0d-4740-9b84-ab16bb27d5bb" containerName="cinder-scheduler"
Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.132778 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.137663 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.142361 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.329366 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-6b86699784-tf822"]
Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.336312 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a9a7064c-4ed5-4948-9e7e-7d40794e371e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a9a7064c-4ed5-4948-9e7e-7d40794e371e\") " pod="openstack/cinder-scheduler-0"
Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.348681 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9a7064c-4ed5-4948-9e7e-7d40794e371e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a9a7064c-4ed5-4948-9e7e-7d40794e371e\") " pod="openstack/cinder-scheduler-0"
Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.352483 4898
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9a7064c-4ed5-4948-9e7e-7d40794e371e-config-data\") pod \"cinder-scheduler-0\" (UID: \"a9a7064c-4ed5-4948-9e7e-7d40794e371e\") " pod="openstack/cinder-scheduler-0" Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.352952 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j59jd\" (UniqueName: \"kubernetes.io/projected/a9a7064c-4ed5-4948-9e7e-7d40794e371e-kube-api-access-j59jd\") pod \"cinder-scheduler-0\" (UID: \"a9a7064c-4ed5-4948-9e7e-7d40794e371e\") " pod="openstack/cinder-scheduler-0" Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.353223 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a9a7064c-4ed5-4948-9e7e-7d40794e371e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a9a7064c-4ed5-4948-9e7e-7d40794e371e\") " pod="openstack/cinder-scheduler-0" Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.353481 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9a7064c-4ed5-4948-9e7e-7d40794e371e-scripts\") pod \"cinder-scheduler-0\" (UID: \"a9a7064c-4ed5-4948-9e7e-7d40794e371e\") " pod="openstack/cinder-scheduler-0" Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.338179 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-5978dd6d84-pnknr"] Mar 13 14:22:09 crc kubenswrapper[4898]: W0313 14:22:09.349078 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88ab3ad2_782a_4c21_8104_1b80468dbca0.slice/crio-92091403be154312d0e01cc88ae4975c8ee84b62f142756e9f0eee0701f6969b WatchSource:0}: Error finding container 
92091403be154312d0e01cc88ae4975c8ee84b62f142756e9f0eee0701f6969b: Status 404 returned error can't find the container with id 92091403be154312d0e01cc88ae4975c8ee84b62f142756e9f0eee0701f6969b Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.388057 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-759d64ffd4-kzp67"] Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.460274 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j59jd\" (UniqueName: \"kubernetes.io/projected/a9a7064c-4ed5-4948-9e7e-7d40794e371e-kube-api-access-j59jd\") pod \"cinder-scheduler-0\" (UID: \"a9a7064c-4ed5-4948-9e7e-7d40794e371e\") " pod="openstack/cinder-scheduler-0" Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.461285 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a9a7064c-4ed5-4948-9e7e-7d40794e371e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a9a7064c-4ed5-4948-9e7e-7d40794e371e\") " pod="openstack/cinder-scheduler-0" Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.461438 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9a7064c-4ed5-4948-9e7e-7d40794e371e-scripts\") pod \"cinder-scheduler-0\" (UID: \"a9a7064c-4ed5-4948-9e7e-7d40794e371e\") " pod="openstack/cinder-scheduler-0" Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.461645 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a9a7064c-4ed5-4948-9e7e-7d40794e371e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a9a7064c-4ed5-4948-9e7e-7d40794e371e\") " pod="openstack/cinder-scheduler-0" Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.461744 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/a9a7064c-4ed5-4948-9e7e-7d40794e371e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a9a7064c-4ed5-4948-9e7e-7d40794e371e\") " pod="openstack/cinder-scheduler-0" Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.461858 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9a7064c-4ed5-4948-9e7e-7d40794e371e-config-data\") pod \"cinder-scheduler-0\" (UID: \"a9a7064c-4ed5-4948-9e7e-7d40794e371e\") " pod="openstack/cinder-scheduler-0" Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.467018 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a9a7064c-4ed5-4948-9e7e-7d40794e371e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a9a7064c-4ed5-4948-9e7e-7d40794e371e\") " pod="openstack/cinder-scheduler-0" Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.470437 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a9a7064c-4ed5-4948-9e7e-7d40794e371e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a9a7064c-4ed5-4948-9e7e-7d40794e371e\") " pod="openstack/cinder-scheduler-0" Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.487338 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9a7064c-4ed5-4948-9e7e-7d40794e371e-scripts\") pod \"cinder-scheduler-0\" (UID: \"a9a7064c-4ed5-4948-9e7e-7d40794e371e\") " pod="openstack/cinder-scheduler-0" Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.488486 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9a7064c-4ed5-4948-9e7e-7d40794e371e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a9a7064c-4ed5-4948-9e7e-7d40794e371e\") " 
pod="openstack/cinder-scheduler-0" Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.489475 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9a7064c-4ed5-4948-9e7e-7d40794e371e-config-data\") pod \"cinder-scheduler-0\" (UID: \"a9a7064c-4ed5-4948-9e7e-7d40794e371e\") " pod="openstack/cinder-scheduler-0" Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.512581 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j59jd\" (UniqueName: \"kubernetes.io/projected/a9a7064c-4ed5-4948-9e7e-7d40794e371e-kube-api-access-j59jd\") pod \"cinder-scheduler-0\" (UID: \"a9a7064c-4ed5-4948-9e7e-7d40794e371e\") " pod="openstack/cinder-scheduler-0" Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.579723 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7f9cbdc5df-5tx5z"] Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.607481 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.815433 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="604a0205-6c18-4bff-929f-038524d62aeb" path="/var/lib/kubelet/pods/604a0205-6c18-4bff-929f-038524d62aeb/volumes" Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.822244 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0be2003-4a0d-4740-9b84-ab16bb27d5bb" path="/var/lib/kubelet/pods/d0be2003-4a0d-4740-9b84-ab16bb27d5bb/volumes" Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.834792 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-9b6c99f6d-7zgm5"] Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.837556 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7f9cbdc5df-5tx5z" event={"ID":"1a57db04-0dc9-4d63-8d08-dd4309b19496","Type":"ContainerStarted","Data":"2ecb041f0f57627a4628f79981fbe3e37f771ddc6fb2d41c9c87e318fbbafd91"} Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.859450 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-759d64ffd4-kzp67"] Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.868520 4898 generic.go:334] "Generic (PLEG): container finished" podID="86c6c495-884b-4c92-949f-0159eb17e6a5" containerID="a55576c9a44e83505bf8757afc0e1e19424b4717e80f08c508a794c81f2cfdb0" exitCode=0 Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.868559 4898 generic.go:334] "Generic (PLEG): container finished" podID="86c6c495-884b-4c92-949f-0159eb17e6a5" containerID="ba94a825cfb36ee16c3e15907274f9276083ba448d310d471374f19c54cc116c" exitCode=2 Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.868569 4898 generic.go:334] "Generic (PLEG): container finished" podID="86c6c495-884b-4c92-949f-0159eb17e6a5" containerID="b62d7bd0ca3497c43d915b6212935946bc82ac1a3defe8c89eeb3779d6ce9770" exitCode=0 Mar 13 14:22:09 crc 
kubenswrapper[4898]: I0313 14:22:09.868657 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"86c6c495-884b-4c92-949f-0159eb17e6a5","Type":"ContainerDied","Data":"a55576c9a44e83505bf8757afc0e1e19424b4717e80f08c508a794c81f2cfdb0"} Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.868692 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"86c6c495-884b-4c92-949f-0159eb17e6a5","Type":"ContainerDied","Data":"ba94a825cfb36ee16c3e15907274f9276083ba448d310d471374f19c54cc116c"} Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.868704 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"86c6c495-884b-4c92-949f-0159eb17e6a5","Type":"ContainerDied","Data":"b62d7bd0ca3497c43d915b6212935946bc82ac1a3defe8c89eeb3779d6ce9770"} Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.872735 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-5f97b49ff6-67dbr"] Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.880191 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-5f97b49ff6-67dbr" Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.890470 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-public-svc" Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.891842 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-internal-svc" Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.898200 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5978dd6d84-pnknr" event={"ID":"6f42d66e-f331-4c05-a4fb-d6208b4493fb","Type":"ContainerStarted","Data":"83eeb629576ceac1e4c3211f16c303bcac0592684b9a1724a45b87fae4f69938"} Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.914218 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-759d64ffd4-kzp67" event={"ID":"e53d1b61-e0c8-4c10-85bf-1c0f67009a24","Type":"ContainerStarted","Data":"4e540b110f512e47246a7e1e8d332b0a3a04ba375a44ae46bbb913a91b51ab5a"} Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.915733 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-5f97b49ff6-67dbr"] Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.933226 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-76b5758c54-vpp67"] Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.934838 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-76b5758c54-vpp67" Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.939486 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-9b6c99f6d-7zgm5" event={"ID":"ad3d61d7-d777-4115-92c7-e4e3125c5260","Type":"ContainerStarted","Data":"9c7395afa41324a0f82874b3c28b6ce2289ed61f6812ec39a8b7176eb1dd6a99"} Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.940388 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-internal-svc" Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.946225 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6b86699784-tf822" event={"ID":"88ab3ad2-782a-4c21-8104-1b80468dbca0","Type":"ContainerStarted","Data":"92091403be154312d0e01cc88ae4975c8ee84b62f142756e9f0eee0701f6969b"} Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.947931 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-public-svc" Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.984180 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a9180e2-91e9-4063-83a5-5b4ba75ca011-internal-tls-certs\") pod \"heat-api-5f97b49ff6-67dbr\" (UID: \"0a9180e2-91e9-4063-83a5-5b4ba75ca011\") " pod="openstack/heat-api-5f97b49ff6-67dbr" Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.984224 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a9180e2-91e9-4063-83a5-5b4ba75ca011-public-tls-certs\") pod \"heat-api-5f97b49ff6-67dbr\" (UID: \"0a9180e2-91e9-4063-83a5-5b4ba75ca011\") " pod="openstack/heat-api-5f97b49ff6-67dbr" Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.984310 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a9180e2-91e9-4063-83a5-5b4ba75ca011-config-data\") pod \"heat-api-5f97b49ff6-67dbr\" (UID: \"0a9180e2-91e9-4063-83a5-5b4ba75ca011\") " pod="openstack/heat-api-5f97b49ff6-67dbr" Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.984396 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a9180e2-91e9-4063-83a5-5b4ba75ca011-combined-ca-bundle\") pod \"heat-api-5f97b49ff6-67dbr\" (UID: \"0a9180e2-91e9-4063-83a5-5b4ba75ca011\") " pod="openstack/heat-api-5f97b49ff6-67dbr" Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.984444 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bz9q2\" (UniqueName: \"kubernetes.io/projected/0a9180e2-91e9-4063-83a5-5b4ba75ca011-kube-api-access-bz9q2\") pod \"heat-api-5f97b49ff6-67dbr\" (UID: \"0a9180e2-91e9-4063-83a5-5b4ba75ca011\") " pod="openstack/heat-api-5f97b49ff6-67dbr" Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.984941 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0a9180e2-91e9-4063-83a5-5b4ba75ca011-config-data-custom\") pod \"heat-api-5f97b49ff6-67dbr\" (UID: \"0a9180e2-91e9-4063-83a5-5b4ba75ca011\") " pod="openstack/heat-api-5f97b49ff6-67dbr" Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.990816 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-76b5758c54-vpp67"] Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.023486 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556862-mpx4w"] Mar 13 14:22:10 crc kubenswrapper[4898]: W0313 14:22:10.051205 4898 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c85cb04_363e_45d6_a14b_79c249e8f469.slice/crio-f45ac4cfc3eacf3c82e4d4e57524fced35ccc73e61bc0b3e80601e75041e8b40 WatchSource:0}: Error finding container f45ac4cfc3eacf3c82e4d4e57524fced35ccc73e61bc0b3e80601e75041e8b40: Status 404 returned error can't find the container with id f45ac4cfc3eacf3c82e4d4e57524fced35ccc73e61bc0b3e80601e75041e8b40 Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.070569 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-5b6c75676b-jx6kl"] Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.087856 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bz9q2\" (UniqueName: \"kubernetes.io/projected/0a9180e2-91e9-4063-83a5-5b4ba75ca011-kube-api-access-bz9q2\") pod \"heat-api-5f97b49ff6-67dbr\" (UID: \"0a9180e2-91e9-4063-83a5-5b4ba75ca011\") " pod="openstack/heat-api-5f97b49ff6-67dbr" Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.092594 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd18ec2e-1196-4e66-a1c5-9e3daefd7171-internal-tls-certs\") pod \"heat-cfnapi-76b5758c54-vpp67\" (UID: \"bd18ec2e-1196-4e66-a1c5-9e3daefd7171\") " pod="openstack/heat-cfnapi-76b5758c54-vpp67" Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.092808 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bd18ec2e-1196-4e66-a1c5-9e3daefd7171-config-data-custom\") pod \"heat-cfnapi-76b5758c54-vpp67\" (UID: \"bd18ec2e-1196-4e66-a1c5-9e3daefd7171\") " pod="openstack/heat-cfnapi-76b5758c54-vpp67" Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.093008 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/bd18ec2e-1196-4e66-a1c5-9e3daefd7171-config-data\") pod \"heat-cfnapi-76b5758c54-vpp67\" (UID: \"bd18ec2e-1196-4e66-a1c5-9e3daefd7171\") " pod="openstack/heat-cfnapi-76b5758c54-vpp67" Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.095547 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkpj4\" (UniqueName: \"kubernetes.io/projected/bd18ec2e-1196-4e66-a1c5-9e3daefd7171-kube-api-access-tkpj4\") pod \"heat-cfnapi-76b5758c54-vpp67\" (UID: \"bd18ec2e-1196-4e66-a1c5-9e3daefd7171\") " pod="openstack/heat-cfnapi-76b5758c54-vpp67" Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.095782 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0a9180e2-91e9-4063-83a5-5b4ba75ca011-config-data-custom\") pod \"heat-api-5f97b49ff6-67dbr\" (UID: \"0a9180e2-91e9-4063-83a5-5b4ba75ca011\") " pod="openstack/heat-api-5f97b49ff6-67dbr" Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.095962 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a9180e2-91e9-4063-83a5-5b4ba75ca011-internal-tls-certs\") pod \"heat-api-5f97b49ff6-67dbr\" (UID: \"0a9180e2-91e9-4063-83a5-5b4ba75ca011\") " pod="openstack/heat-api-5f97b49ff6-67dbr" Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.095989 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a9180e2-91e9-4063-83a5-5b4ba75ca011-public-tls-certs\") pod \"heat-api-5f97b49ff6-67dbr\" (UID: \"0a9180e2-91e9-4063-83a5-5b4ba75ca011\") " pod="openstack/heat-api-5f97b49ff6-67dbr" Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.096046 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bd18ec2e-1196-4e66-a1c5-9e3daefd7171-combined-ca-bundle\") pod \"heat-cfnapi-76b5758c54-vpp67\" (UID: \"bd18ec2e-1196-4e66-a1c5-9e3daefd7171\") " pod="openstack/heat-cfnapi-76b5758c54-vpp67" Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.096120 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd18ec2e-1196-4e66-a1c5-9e3daefd7171-public-tls-certs\") pod \"heat-cfnapi-76b5758c54-vpp67\" (UID: \"bd18ec2e-1196-4e66-a1c5-9e3daefd7171\") " pod="openstack/heat-cfnapi-76b5758c54-vpp67" Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.098097 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a9180e2-91e9-4063-83a5-5b4ba75ca011-config-data\") pod \"heat-api-5f97b49ff6-67dbr\" (UID: \"0a9180e2-91e9-4063-83a5-5b4ba75ca011\") " pod="openstack/heat-api-5f97b49ff6-67dbr" Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.098320 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a9180e2-91e9-4063-83a5-5b4ba75ca011-combined-ca-bundle\") pod \"heat-api-5f97b49ff6-67dbr\" (UID: \"0a9180e2-91e9-4063-83a5-5b4ba75ca011\") " pod="openstack/heat-api-5f97b49ff6-67dbr" Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.102463 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a9180e2-91e9-4063-83a5-5b4ba75ca011-public-tls-certs\") pod \"heat-api-5f97b49ff6-67dbr\" (UID: \"0a9180e2-91e9-4063-83a5-5b4ba75ca011\") " pod="openstack/heat-api-5f97b49ff6-67dbr" Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.118098 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bz9q2\" (UniqueName: 
\"kubernetes.io/projected/0a9180e2-91e9-4063-83a5-5b4ba75ca011-kube-api-access-bz9q2\") pod \"heat-api-5f97b49ff6-67dbr\" (UID: \"0a9180e2-91e9-4063-83a5-5b4ba75ca011\") " pod="openstack/heat-api-5f97b49ff6-67dbr" Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.137404 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=4.039866858 podStartE2EDuration="19.137371682s" podCreationTimestamp="2026-03-13 14:21:51 +0000 UTC" firstStartedPulling="2026-03-13 14:21:52.635438901 +0000 UTC m=+1547.637027140" lastFinishedPulling="2026-03-13 14:22:07.732943725 +0000 UTC m=+1562.734531964" observedRunningTime="2026-03-13 14:22:09.986322241 +0000 UTC m=+1564.987910480" watchObservedRunningTime="2026-03-13 14:22:10.137371682 +0000 UTC m=+1565.138959911" Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.178550 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a9180e2-91e9-4063-83a5-5b4ba75ca011-internal-tls-certs\") pod \"heat-api-5f97b49ff6-67dbr\" (UID: \"0a9180e2-91e9-4063-83a5-5b4ba75ca011\") " pod="openstack/heat-api-5f97b49ff6-67dbr" Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.181753 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-xntfr"] Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.187485 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a9180e2-91e9-4063-83a5-5b4ba75ca011-combined-ca-bundle\") pod \"heat-api-5f97b49ff6-67dbr\" (UID: \"0a9180e2-91e9-4063-83a5-5b4ba75ca011\") " pod="openstack/heat-api-5f97b49ff6-67dbr" Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.187647 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a9180e2-91e9-4063-83a5-5b4ba75ca011-config-data\") pod 
\"heat-api-5f97b49ff6-67dbr\" (UID: \"0a9180e2-91e9-4063-83a5-5b4ba75ca011\") " pod="openstack/heat-api-5f97b49ff6-67dbr" Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.187770 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0a9180e2-91e9-4063-83a5-5b4ba75ca011-config-data-custom\") pod \"heat-api-5f97b49ff6-67dbr\" (UID: \"0a9180e2-91e9-4063-83a5-5b4ba75ca011\") " pod="openstack/heat-api-5f97b49ff6-67dbr" Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.204857 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-76545f46cd-qk7nm"] Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.206470 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd18ec2e-1196-4e66-a1c5-9e3daefd7171-internal-tls-certs\") pod \"heat-cfnapi-76b5758c54-vpp67\" (UID: \"bd18ec2e-1196-4e66-a1c5-9e3daefd7171\") " pod="openstack/heat-cfnapi-76b5758c54-vpp67" Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.206598 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bd18ec2e-1196-4e66-a1c5-9e3daefd7171-config-data-custom\") pod \"heat-cfnapi-76b5758c54-vpp67\" (UID: \"bd18ec2e-1196-4e66-a1c5-9e3daefd7171\") " pod="openstack/heat-cfnapi-76b5758c54-vpp67" Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.206706 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd18ec2e-1196-4e66-a1c5-9e3daefd7171-config-data\") pod \"heat-cfnapi-76b5758c54-vpp67\" (UID: \"bd18ec2e-1196-4e66-a1c5-9e3daefd7171\") " pod="openstack/heat-cfnapi-76b5758c54-vpp67" Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.206831 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-tkpj4\" (UniqueName: \"kubernetes.io/projected/bd18ec2e-1196-4e66-a1c5-9e3daefd7171-kube-api-access-tkpj4\") pod \"heat-cfnapi-76b5758c54-vpp67\" (UID: \"bd18ec2e-1196-4e66-a1c5-9e3daefd7171\") " pod="openstack/heat-cfnapi-76b5758c54-vpp67" Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.207014 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd18ec2e-1196-4e66-a1c5-9e3daefd7171-combined-ca-bundle\") pod \"heat-cfnapi-76b5758c54-vpp67\" (UID: \"bd18ec2e-1196-4e66-a1c5-9e3daefd7171\") " pod="openstack/heat-cfnapi-76b5758c54-vpp67" Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.207115 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd18ec2e-1196-4e66-a1c5-9e3daefd7171-public-tls-certs\") pod \"heat-cfnapi-76b5758c54-vpp67\" (UID: \"bd18ec2e-1196-4e66-a1c5-9e3daefd7171\") " pod="openstack/heat-cfnapi-76b5758c54-vpp67" Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.219465 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bd18ec2e-1196-4e66-a1c5-9e3daefd7171-config-data-custom\") pod \"heat-cfnapi-76b5758c54-vpp67\" (UID: \"bd18ec2e-1196-4e66-a1c5-9e3daefd7171\") " pod="openstack/heat-cfnapi-76b5758c54-vpp67" Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.225421 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd18ec2e-1196-4e66-a1c5-9e3daefd7171-combined-ca-bundle\") pod \"heat-cfnapi-76b5758c54-vpp67\" (UID: \"bd18ec2e-1196-4e66-a1c5-9e3daefd7171\") " pod="openstack/heat-cfnapi-76b5758c54-vpp67" Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.225638 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/bd18ec2e-1196-4e66-a1c5-9e3daefd7171-config-data\") pod \"heat-cfnapi-76b5758c54-vpp67\" (UID: \"bd18ec2e-1196-4e66-a1c5-9e3daefd7171\") " pod="openstack/heat-cfnapi-76b5758c54-vpp67" Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.226070 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd18ec2e-1196-4e66-a1c5-9e3daefd7171-public-tls-certs\") pod \"heat-cfnapi-76b5758c54-vpp67\" (UID: \"bd18ec2e-1196-4e66-a1c5-9e3daefd7171\") " pod="openstack/heat-cfnapi-76b5758c54-vpp67" Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.233270 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkpj4\" (UniqueName: \"kubernetes.io/projected/bd18ec2e-1196-4e66-a1c5-9e3daefd7171-kube-api-access-tkpj4\") pod \"heat-cfnapi-76b5758c54-vpp67\" (UID: \"bd18ec2e-1196-4e66-a1c5-9e3daefd7171\") " pod="openstack/heat-cfnapi-76b5758c54-vpp67" Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.245067 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd18ec2e-1196-4e66-a1c5-9e3daefd7171-internal-tls-certs\") pod \"heat-cfnapi-76b5758c54-vpp67\" (UID: \"bd18ec2e-1196-4e66-a1c5-9e3daefd7171\") " pod="openstack/heat-cfnapi-76b5758c54-vpp67" Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.250772 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5f97b49ff6-67dbr" Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.275585 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-76b5758c54-vpp67" Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.640396 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.641443 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-z7ldc" podUID="b38f3681-6f2f-437f-9694-810d43921aa2" containerName="registry-server" probeResult="failure" output=< Mar 13 14:22:10 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 14:22:10 crc kubenswrapper[4898]: > Mar 13 14:22:10 crc kubenswrapper[4898]: W0313 14:22:10.657118 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda9a7064c_4ed5_4948_9e7e_7d40794e371e.slice/crio-fae467477196fd229044597b43b86230463815cebabe0d5642c9fe93f9d884e6 WatchSource:0}: Error finding container fae467477196fd229044597b43b86230463815cebabe0d5642c9fe93f9d884e6: Status 404 returned error can't find the container with id fae467477196fd229044597b43b86230463815cebabe0d5642c9fe93f9d884e6 Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.973121 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-76545f46cd-qk7nm" event={"ID":"8c6a61ba-babd-4bc2-922a-99b00c2af057","Type":"ContainerStarted","Data":"703728a9002cd85f33faecdfc398f12cdcb1c38f0ab174f12467daf84a0e062a"} Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:10.981672 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-5b6c75676b-jx6kl" event={"ID":"ad94280e-6f02-4129-9cdc-c35499f5d5e4","Type":"ContainerStarted","Data":"75f3f93f088797c8af38650d54ae73681b337d83c946b53b20ea72e33b4509c0"} Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:10.981749 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-5b6c75676b-jx6kl" 
event={"ID":"ad94280e-6f02-4129-9cdc-c35499f5d5e4","Type":"ContainerStarted","Data":"49d0d8e35c38306e9d9d2a68f113990c44c74cb3b5a7d200ee672f1ee07d5629"} Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:10.984036 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-5b6c75676b-jx6kl" Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:10.991657 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7f9cbdc5df-5tx5z" event={"ID":"1a57db04-0dc9-4d63-8d08-dd4309b19496","Type":"ContainerStarted","Data":"6cc3f5f6bfa1471670ea1bda49b78865e22324a8edcd3e45cb92e533fe2a84f1"} Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:11.015508 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-5b6c75676b-jx6kl" podStartSLOduration=5.015469417 podStartE2EDuration="5.015469417s" podCreationTimestamp="2026-03-13 14:22:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:22:11.00249073 +0000 UTC m=+1566.004078989" watchObservedRunningTime="2026-03-13 14:22:11.015469417 +0000 UTC m=+1566.017057656" Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:11.024739 4898 generic.go:334] "Generic (PLEG): container finished" podID="86c6c495-884b-4c92-949f-0159eb17e6a5" containerID="cea1936f2758016544cbefa24e4ca686c3e33acfdaf019898c501a90320d0242" exitCode=0 Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:11.024814 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"86c6c495-884b-4c92-949f-0159eb17e6a5","Type":"ContainerDied","Data":"cea1936f2758016544cbefa24e4ca686c3e33acfdaf019898c501a90320d0242"} Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:11.024851 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"86c6c495-884b-4c92-949f-0159eb17e6a5","Type":"ContainerDied","Data":"2b43c112fe6b642ffc81d63835d5208293191491d23e2c20e5ef660540956b7c"} Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:11.024866 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b43c112fe6b642ffc81d63835d5208293191491d23e2c20e5ef660540956b7c" Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:11.031389 4898 generic.go:334] "Generic (PLEG): container finished" podID="08964a7d-6cae-4d8d-8dc7-8828bb55c6b6" containerID="1580c7298d78acec59a4cdc1605e81e7b25fff7c075f4562e2ffb1fcb708ba61" exitCode=0 Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:11.031528 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-xntfr" event={"ID":"08964a7d-6cae-4d8d-8dc7-8828bb55c6b6","Type":"ContainerDied","Data":"1580c7298d78acec59a4cdc1605e81e7b25fff7c075f4562e2ffb1fcb708ba61"} Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:11.031558 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-xntfr" event={"ID":"08964a7d-6cae-4d8d-8dc7-8828bb55c6b6","Type":"ContainerStarted","Data":"c39986f2a0c08da0dd84aef4031cbde75ea3fecd89eca753618403b386b96b49"} Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:11.045932 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556862-mpx4w" event={"ID":"3c85cb04-363e-45d6-a14b-79c249e8f469","Type":"ContainerStarted","Data":"f45ac4cfc3eacf3c82e4d4e57524fced35ccc73e61bc0b3e80601e75041e8b40"} Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:11.060351 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6b86699784-tf822" event={"ID":"88ab3ad2-782a-4c21-8104-1b80468dbca0","Type":"ContainerStarted","Data":"99ddc8edc7229de6f1b448d188e98b54d68182339989edb36f76125f198ea2d9"} Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:11.060998 4898 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/heat-engine-6b86699784-tf822" Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:11.068793 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a9a7064c-4ed5-4948-9e7e-7d40794e371e","Type":"ContainerStarted","Data":"fae467477196fd229044597b43b86230463815cebabe0d5642c9fe93f9d884e6"} Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:11.105717 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:11.120890 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-6b86699784-tf822" podStartSLOduration=12.120863463 podStartE2EDuration="12.120863463s" podCreationTimestamp="2026-03-13 14:21:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:22:11.073636997 +0000 UTC m=+1566.075225246" watchObservedRunningTime="2026-03-13 14:22:11.120863463 +0000 UTC m=+1566.122451702" Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:11.237292 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86c6c495-884b-4c92-949f-0159eb17e6a5-combined-ca-bundle\") pod \"86c6c495-884b-4c92-949f-0159eb17e6a5\" (UID: \"86c6c495-884b-4c92-949f-0159eb17e6a5\") " Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:11.237464 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86c6c495-884b-4c92-949f-0159eb17e6a5-scripts\") pod \"86c6c495-884b-4c92-949f-0159eb17e6a5\" (UID: \"86c6c495-884b-4c92-949f-0159eb17e6a5\") " Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:11.237494 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" 
(UniqueName: \"kubernetes.io/secret/86c6c495-884b-4c92-949f-0159eb17e6a5-sg-core-conf-yaml\") pod \"86c6c495-884b-4c92-949f-0159eb17e6a5\" (UID: \"86c6c495-884b-4c92-949f-0159eb17e6a5\") " Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:11.237555 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86c6c495-884b-4c92-949f-0159eb17e6a5-log-httpd\") pod \"86c6c495-884b-4c92-949f-0159eb17e6a5\" (UID: \"86c6c495-884b-4c92-949f-0159eb17e6a5\") " Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:11.237581 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86c6c495-884b-4c92-949f-0159eb17e6a5-config-data\") pod \"86c6c495-884b-4c92-949f-0159eb17e6a5\" (UID: \"86c6c495-884b-4c92-949f-0159eb17e6a5\") " Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:11.237624 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tvsh\" (UniqueName: \"kubernetes.io/projected/86c6c495-884b-4c92-949f-0159eb17e6a5-kube-api-access-8tvsh\") pod \"86c6c495-884b-4c92-949f-0159eb17e6a5\" (UID: \"86c6c495-884b-4c92-949f-0159eb17e6a5\") " Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:11.237664 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86c6c495-884b-4c92-949f-0159eb17e6a5-run-httpd\") pod \"86c6c495-884b-4c92-949f-0159eb17e6a5\" (UID: \"86c6c495-884b-4c92-949f-0159eb17e6a5\") " Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:11.240107 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86c6c495-884b-4c92-949f-0159eb17e6a5-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "86c6c495-884b-4c92-949f-0159eb17e6a5" (UID: "86c6c495-884b-4c92-949f-0159eb17e6a5"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:11.240479 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86c6c495-884b-4c92-949f-0159eb17e6a5-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "86c6c495-884b-4c92-949f-0159eb17e6a5" (UID: "86c6c495-884b-4c92-949f-0159eb17e6a5"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:11.242118 4898 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86c6c495-884b-4c92-949f-0159eb17e6a5-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:11.242598 4898 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86c6c495-884b-4c92-949f-0159eb17e6a5-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:11.252023 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86c6c495-884b-4c92-949f-0159eb17e6a5-kube-api-access-8tvsh" (OuterVolumeSpecName: "kube-api-access-8tvsh") pod "86c6c495-884b-4c92-949f-0159eb17e6a5" (UID: "86c6c495-884b-4c92-949f-0159eb17e6a5"). InnerVolumeSpecName "kube-api-access-8tvsh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:11.256140 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86c6c495-884b-4c92-949f-0159eb17e6a5-scripts" (OuterVolumeSpecName: "scripts") pod "86c6c495-884b-4c92-949f-0159eb17e6a5" (UID: "86c6c495-884b-4c92-949f-0159eb17e6a5"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:11.264340 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-76b5758c54-vpp67"] Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:11.307954 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-5f97b49ff6-67dbr"] Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:11.324860 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86c6c495-884b-4c92-949f-0159eb17e6a5-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "86c6c495-884b-4c92-949f-0159eb17e6a5" (UID: "86c6c495-884b-4c92-949f-0159eb17e6a5"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:11.348923 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86c6c495-884b-4c92-949f-0159eb17e6a5-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:11.348955 4898 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/86c6c495-884b-4c92-949f-0159eb17e6a5-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:11.348964 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tvsh\" (UniqueName: \"kubernetes.io/projected/86c6c495-884b-4c92-949f-0159eb17e6a5-kube-api-access-8tvsh\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:11.442477 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86c6c495-884b-4c92-949f-0159eb17e6a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "86c6c495-884b-4c92-949f-0159eb17e6a5" (UID: "86c6c495-884b-4c92-949f-0159eb17e6a5"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:11.460671 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86c6c495-884b-4c92-949f-0159eb17e6a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:11.507022 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86c6c495-884b-4c92-949f-0159eb17e6a5-config-data" (OuterVolumeSpecName: "config-data") pod "86c6c495-884b-4c92-949f-0159eb17e6a5" (UID: "86c6c495-884b-4c92-949f-0159eb17e6a5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:11.564321 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86c6c495-884b-4c92-949f-0159eb17e6a5-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.082758 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-76b5758c54-vpp67" event={"ID":"bd18ec2e-1196-4e66-a1c5-9e3daefd7171","Type":"ContainerStarted","Data":"910b4f0096d337a99b11342bf407ba83fa521765a628b5f816cab829a8a6b279"} Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.086229 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a9a7064c-4ed5-4948-9e7e-7d40794e371e","Type":"ContainerStarted","Data":"6b88f7ff52506e35b8cbb7e231ccb1285ae276c1a9d69c02b14ca5282f74157d"} Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.087990 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7f9cbdc5df-5tx5z" event={"ID":"1a57db04-0dc9-4d63-8d08-dd4309b19496","Type":"ContainerStarted","Data":"89d1231d67e96e72d5e3c165afd5a70443483d484b8a2a3b799a394c1438df35"} Mar 13 
14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.088180 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7f9cbdc5df-5tx5z" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.093028 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5f97b49ff6-67dbr" event={"ID":"0a9180e2-91e9-4063-83a5-5b4ba75ca011","Type":"ContainerStarted","Data":"f8532eef577636e4f0bae3c5d04fb7af834ed690d3b4263f9713e1e00c75cbcb"} Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.097858 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-xntfr" event={"ID":"08964a7d-6cae-4d8d-8dc7-8828bb55c6b6","Type":"ContainerStarted","Data":"12cbc4c8fa51ad14fc2097d92198ad03764ae0a7349dafd16f0ac9e66b1aa21a"} Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.098635 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7756b9d78c-xntfr" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.104547 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556862-mpx4w" event={"ID":"3c85cb04-363e-45d6-a14b-79c249e8f469","Type":"ContainerStarted","Data":"0d9a86d7e906b1015484fb8d8fc360af30fd3aeeed5cfca12314a74be70b110a"} Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.105173 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.133217 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-7f9cbdc5df-5tx5z" podStartSLOduration=13.133173743 podStartE2EDuration="13.133173743s" podCreationTimestamp="2026-03-13 14:21:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:22:12.116685205 +0000 UTC m=+1567.118273454" watchObservedRunningTime="2026-03-13 14:22:12.133173743 +0000 UTC m=+1567.134761982" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.134966 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556862-mpx4w" podStartSLOduration=11.158422718 podStartE2EDuration="12.134954869s" podCreationTimestamp="2026-03-13 14:22:00 +0000 UTC" firstStartedPulling="2026-03-13 14:22:10.046693458 +0000 UTC m=+1565.048281697" lastFinishedPulling="2026-03-13 14:22:11.023225609 +0000 UTC m=+1566.024813848" observedRunningTime="2026-03-13 14:22:12.132411583 +0000 UTC m=+1567.133999822" watchObservedRunningTime="2026-03-13 14:22:12.134954869 +0000 UTC m=+1567.136543108" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.167589 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7756b9d78c-xntfr" podStartSLOduration=13.167568006 podStartE2EDuration="13.167568006s" podCreationTimestamp="2026-03-13 14:21:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:22:12.153409488 +0000 UTC m=+1567.154997747" watchObservedRunningTime="2026-03-13 14:22:12.167568006 +0000 UTC m=+1567.169156255" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.212322 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:22:12 
crc kubenswrapper[4898]: I0313 14:22:12.248112 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.260737 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:22:12 crc kubenswrapper[4898]: E0313 14:22:12.261295 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86c6c495-884b-4c92-949f-0159eb17e6a5" containerName="ceilometer-central-agent" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.261316 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="86c6c495-884b-4c92-949f-0159eb17e6a5" containerName="ceilometer-central-agent" Mar 13 14:22:12 crc kubenswrapper[4898]: E0313 14:22:12.261329 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86c6c495-884b-4c92-949f-0159eb17e6a5" containerName="proxy-httpd" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.261336 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="86c6c495-884b-4c92-949f-0159eb17e6a5" containerName="proxy-httpd" Mar 13 14:22:12 crc kubenswrapper[4898]: E0313 14:22:12.261351 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86c6c495-884b-4c92-949f-0159eb17e6a5" containerName="ceilometer-notification-agent" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.261357 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="86c6c495-884b-4c92-949f-0159eb17e6a5" containerName="ceilometer-notification-agent" Mar 13 14:22:12 crc kubenswrapper[4898]: E0313 14:22:12.261385 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86c6c495-884b-4c92-949f-0159eb17e6a5" containerName="sg-core" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.261390 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="86c6c495-884b-4c92-949f-0159eb17e6a5" containerName="sg-core" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.261608 4898 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="86c6c495-884b-4c92-949f-0159eb17e6a5" containerName="ceilometer-notification-agent" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.261628 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="86c6c495-884b-4c92-949f-0159eb17e6a5" containerName="sg-core" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.261639 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="86c6c495-884b-4c92-949f-0159eb17e6a5" containerName="proxy-httpd" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.261652 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="86c6c495-884b-4c92-949f-0159eb17e6a5" containerName="ceilometer-central-agent" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.263600 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.266798 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.268569 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.275250 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.307486 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e251995e-609a-4f0e-83f3-7f856e58a598-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e251995e-609a-4f0e-83f3-7f856e58a598\") " pod="openstack/ceilometer-0" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.307548 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjr2t\" (UniqueName: 
\"kubernetes.io/projected/e251995e-609a-4f0e-83f3-7f856e58a598-kube-api-access-hjr2t\") pod \"ceilometer-0\" (UID: \"e251995e-609a-4f0e-83f3-7f856e58a598\") " pod="openstack/ceilometer-0" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.307598 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e251995e-609a-4f0e-83f3-7f856e58a598-scripts\") pod \"ceilometer-0\" (UID: \"e251995e-609a-4f0e-83f3-7f856e58a598\") " pod="openstack/ceilometer-0" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.307671 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e251995e-609a-4f0e-83f3-7f856e58a598-config-data\") pod \"ceilometer-0\" (UID: \"e251995e-609a-4f0e-83f3-7f856e58a598\") " pod="openstack/ceilometer-0" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.307760 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e251995e-609a-4f0e-83f3-7f856e58a598-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e251995e-609a-4f0e-83f3-7f856e58a598\") " pod="openstack/ceilometer-0" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.307787 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e251995e-609a-4f0e-83f3-7f856e58a598-run-httpd\") pod \"ceilometer-0\" (UID: \"e251995e-609a-4f0e-83f3-7f856e58a598\") " pod="openstack/ceilometer-0" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.307832 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e251995e-609a-4f0e-83f3-7f856e58a598-log-httpd\") pod \"ceilometer-0\" (UID: \"e251995e-609a-4f0e-83f3-7f856e58a598\") " 
pod="openstack/ceilometer-0" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.410674 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e251995e-609a-4f0e-83f3-7f856e58a598-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e251995e-609a-4f0e-83f3-7f856e58a598\") " pod="openstack/ceilometer-0" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.410745 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e251995e-609a-4f0e-83f3-7f856e58a598-run-httpd\") pod \"ceilometer-0\" (UID: \"e251995e-609a-4f0e-83f3-7f856e58a598\") " pod="openstack/ceilometer-0" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.410818 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e251995e-609a-4f0e-83f3-7f856e58a598-log-httpd\") pod \"ceilometer-0\" (UID: \"e251995e-609a-4f0e-83f3-7f856e58a598\") " pod="openstack/ceilometer-0" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.410958 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e251995e-609a-4f0e-83f3-7f856e58a598-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e251995e-609a-4f0e-83f3-7f856e58a598\") " pod="openstack/ceilometer-0" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.410996 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjr2t\" (UniqueName: \"kubernetes.io/projected/e251995e-609a-4f0e-83f3-7f856e58a598-kube-api-access-hjr2t\") pod \"ceilometer-0\" (UID: \"e251995e-609a-4f0e-83f3-7f856e58a598\") " pod="openstack/ceilometer-0" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.411120 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/e251995e-609a-4f0e-83f3-7f856e58a598-scripts\") pod \"ceilometer-0\" (UID: \"e251995e-609a-4f0e-83f3-7f856e58a598\") " pod="openstack/ceilometer-0" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.411257 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e251995e-609a-4f0e-83f3-7f856e58a598-config-data\") pod \"ceilometer-0\" (UID: \"e251995e-609a-4f0e-83f3-7f856e58a598\") " pod="openstack/ceilometer-0" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.411699 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e251995e-609a-4f0e-83f3-7f856e58a598-log-httpd\") pod \"ceilometer-0\" (UID: \"e251995e-609a-4f0e-83f3-7f856e58a598\") " pod="openstack/ceilometer-0" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.412178 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e251995e-609a-4f0e-83f3-7f856e58a598-run-httpd\") pod \"ceilometer-0\" (UID: \"e251995e-609a-4f0e-83f3-7f856e58a598\") " pod="openstack/ceilometer-0" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.417811 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e251995e-609a-4f0e-83f3-7f856e58a598-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e251995e-609a-4f0e-83f3-7f856e58a598\") " pod="openstack/ceilometer-0" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.422976 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e251995e-609a-4f0e-83f3-7f856e58a598-config-data\") pod \"ceilometer-0\" (UID: \"e251995e-609a-4f0e-83f3-7f856e58a598\") " pod="openstack/ceilometer-0" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.426526 4898 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e251995e-609a-4f0e-83f3-7f856e58a598-scripts\") pod \"ceilometer-0\" (UID: \"e251995e-609a-4f0e-83f3-7f856e58a598\") " pod="openstack/ceilometer-0" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.431542 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjr2t\" (UniqueName: \"kubernetes.io/projected/e251995e-609a-4f0e-83f3-7f856e58a598-kube-api-access-hjr2t\") pod \"ceilometer-0\" (UID: \"e251995e-609a-4f0e-83f3-7f856e58a598\") " pod="openstack/ceilometer-0" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.439353 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e251995e-609a-4f0e-83f3-7f856e58a598-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e251995e-609a-4f0e-83f3-7f856e58a598\") " pod="openstack/ceilometer-0" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.637543 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:22:13 crc kubenswrapper[4898]: I0313 14:22:13.131727 4898 generic.go:334] "Generic (PLEG): container finished" podID="3c85cb04-363e-45d6-a14b-79c249e8f469" containerID="0d9a86d7e906b1015484fb8d8fc360af30fd3aeeed5cfca12314a74be70b110a" exitCode=0 Mar 13 14:22:13 crc kubenswrapper[4898]: I0313 14:22:13.132072 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556862-mpx4w" event={"ID":"3c85cb04-363e-45d6-a14b-79c249e8f469","Type":"ContainerDied","Data":"0d9a86d7e906b1015484fb8d8fc360af30fd3aeeed5cfca12314a74be70b110a"} Mar 13 14:22:13 crc kubenswrapper[4898]: I0313 14:22:13.132447 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7f9cbdc5df-5tx5z" Mar 13 14:22:13 crc kubenswrapper[4898]: I0313 14:22:13.753432 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86c6c495-884b-4c92-949f-0159eb17e6a5" path="/var/lib/kubelet/pods/86c6c495-884b-4c92-949f-0159eb17e6a5/volumes" Mar 13 14:22:14 crc kubenswrapper[4898]: I0313 14:22:14.711066 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556862-mpx4w" Mar 13 14:22:14 crc kubenswrapper[4898]: I0313 14:22:14.806623 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tmtl\" (UniqueName: \"kubernetes.io/projected/3c85cb04-363e-45d6-a14b-79c249e8f469-kube-api-access-6tmtl\") pod \"3c85cb04-363e-45d6-a14b-79c249e8f469\" (UID: \"3c85cb04-363e-45d6-a14b-79c249e8f469\") " Mar 13 14:22:14 crc kubenswrapper[4898]: I0313 14:22:14.815069 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c85cb04-363e-45d6-a14b-79c249e8f469-kube-api-access-6tmtl" (OuterVolumeSpecName: "kube-api-access-6tmtl") pod "3c85cb04-363e-45d6-a14b-79c249e8f469" (UID: "3c85cb04-363e-45d6-a14b-79c249e8f469"). 
InnerVolumeSpecName "kube-api-access-6tmtl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:22:14 crc kubenswrapper[4898]: I0313 14:22:14.909629 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6tmtl\" (UniqueName: \"kubernetes.io/projected/3c85cb04-363e-45d6-a14b-79c249e8f469-kube-api-access-6tmtl\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 14:22:15.125967 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f97c64464-wmnph" Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 14:22:15.181278 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5978dd6d84-pnknr" event={"ID":"6f42d66e-f331-4c05-a4fb-d6208b4493fb","Type":"ContainerStarted","Data":"8f11edb4f03060d32d47258f0f26105368d5a72b3961df0611efa132599db56c"} Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 14:22:15.181382 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-5978dd6d84-pnknr" Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 14:22:15.216380 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/61f1f8bf-63eb-464c-9703-3d3db80ba0df-httpd-config\") pod \"61f1f8bf-63eb-464c-9703-3d3db80ba0df\" (UID: \"61f1f8bf-63eb-464c-9703-3d3db80ba0df\") " Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 14:22:15.218962 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-759d64ffd4-kzp67" event={"ID":"e53d1b61-e0c8-4c10-85bf-1c0f67009a24","Type":"ContainerStarted","Data":"c6cbd243a4ab0ae3ee88d2b14b07e0b9c8fda594949edb73fc92248b8f25ddf9"} Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 14:22:15.219113 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-759d64ffd4-kzp67" podUID="e53d1b61-e0c8-4c10-85bf-1c0f67009a24" containerName="heat-cfnapi" 
containerID="cri-o://c6cbd243a4ab0ae3ee88d2b14b07e0b9c8fda594949edb73fc92248b8f25ddf9" gracePeriod=60 Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 14:22:15.219288 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-759d64ffd4-kzp67" Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 14:22:15.216510 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/61f1f8bf-63eb-464c-9703-3d3db80ba0df-ovndb-tls-certs\") pod \"61f1f8bf-63eb-464c-9703-3d3db80ba0df\" (UID: \"61f1f8bf-63eb-464c-9703-3d3db80ba0df\") " Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 14:22:15.236857 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61f1f8bf-63eb-464c-9703-3d3db80ba0df-combined-ca-bundle\") pod \"61f1f8bf-63eb-464c-9703-3d3db80ba0df\" (UID: \"61f1f8bf-63eb-464c-9703-3d3db80ba0df\") " Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 14:22:15.242115 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rskd\" (UniqueName: \"kubernetes.io/projected/61f1f8bf-63eb-464c-9703-3d3db80ba0df-kube-api-access-6rskd\") pod \"61f1f8bf-63eb-464c-9703-3d3db80ba0df\" (UID: \"61f1f8bf-63eb-464c-9703-3d3db80ba0df\") " Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 14:22:15.242180 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/61f1f8bf-63eb-464c-9703-3d3db80ba0df-config\") pod \"61f1f8bf-63eb-464c-9703-3d3db80ba0df\" (UID: \"61f1f8bf-63eb-464c-9703-3d3db80ba0df\") " Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 14:22:15.262442 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-api-9b6c99f6d-7zgm5" podUID="ad3d61d7-d777-4115-92c7-e4e3125c5260" containerName="heat-api" 
containerID="cri-o://954199ed87793fe013823cc99558f408a84d0a4c4745073ecc940b8eedc022cd" gracePeriod=60 Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 14:22:15.262613 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-9b6c99f6d-7zgm5" event={"ID":"ad3d61d7-d777-4115-92c7-e4e3125c5260","Type":"ContainerStarted","Data":"954199ed87793fe013823cc99558f408a84d0a4c4745073ecc940b8eedc022cd"} Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 14:22:15.262673 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-9b6c99f6d-7zgm5" Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 14:22:15.280135 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556862-mpx4w" Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 14:22:15.281164 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556862-mpx4w" event={"ID":"3c85cb04-363e-45d6-a14b-79c249e8f469","Type":"ContainerDied","Data":"f45ac4cfc3eacf3c82e4d4e57524fced35ccc73e61bc0b3e80601e75041e8b40"} Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 14:22:15.281207 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f45ac4cfc3eacf3c82e4d4e57524fced35ccc73e61bc0b3e80601e75041e8b40" Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 14:22:15.283620 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-5978dd6d84-pnknr" podStartSLOduration=4.185198646 podStartE2EDuration="9.283604597s" podCreationTimestamp="2026-03-13 14:22:06 +0000 UTC" firstStartedPulling="2026-03-13 14:22:09.395585667 +0000 UTC m=+1564.397173906" lastFinishedPulling="2026-03-13 14:22:14.493991618 +0000 UTC m=+1569.495579857" observedRunningTime="2026-03-13 14:22:15.217288115 +0000 UTC m=+1570.218876354" watchObservedRunningTime="2026-03-13 14:22:15.283604597 +0000 UTC m=+1570.285192826" Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 
14:22:15.294468 4898 generic.go:334] "Generic (PLEG): container finished" podID="61f1f8bf-63eb-464c-9703-3d3db80ba0df" containerID="6c47c58bce6d5a27ce8f0a9ec972dc3740b57217b054bd621f4319ad64cb9ded" exitCode=0 Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 14:22:15.294528 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f97c64464-wmnph" event={"ID":"61f1f8bf-63eb-464c-9703-3d3db80ba0df","Type":"ContainerDied","Data":"6c47c58bce6d5a27ce8f0a9ec972dc3740b57217b054bd621f4319ad64cb9ded"} Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 14:22:15.294564 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f97c64464-wmnph" event={"ID":"61f1f8bf-63eb-464c-9703-3d3db80ba0df","Type":"ContainerDied","Data":"0ed18fce368036b02906d995e0994f8f4a7d0e035afb43b24689faf9dc556daf"} Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 14:22:15.294588 4898 scope.go:117] "RemoveContainer" containerID="fc322fcfa0d128d1cbcd5fd7cc8972df0d6b4b808a700c1468ceb21cb71602e4" Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 14:22:15.294803 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f97c64464-wmnph" Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 14:22:15.296206 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61f1f8bf-63eb-464c-9703-3d3db80ba0df-kube-api-access-6rskd" (OuterVolumeSpecName: "kube-api-access-6rskd") pod "61f1f8bf-63eb-464c-9703-3d3db80ba0df" (UID: "61f1f8bf-63eb-464c-9703-3d3db80ba0df"). InnerVolumeSpecName "kube-api-access-6rskd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 14:22:15.313536 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61f1f8bf-63eb-464c-9703-3d3db80ba0df-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "61f1f8bf-63eb-464c-9703-3d3db80ba0df" (UID: "61f1f8bf-63eb-464c-9703-3d3db80ba0df"). 
InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 14:22:15.346319 4898 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/61f1f8bf-63eb-464c-9703-3d3db80ba0df-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 14:22:15.346349 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rskd\" (UniqueName: \"kubernetes.io/projected/61f1f8bf-63eb-464c-9703-3d3db80ba0df-kube-api-access-6rskd\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 14:22:15.357795 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 14:22:15.419772 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556856-z8p4g"] Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 14:22:15.450808 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556856-z8p4g"] Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 14:22:15.470994 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-759d64ffd4-kzp67" podStartSLOduration=11.430071402 podStartE2EDuration="16.470976701s" podCreationTimestamp="2026-03-13 14:21:59 +0000 UTC" firstStartedPulling="2026-03-13 14:22:09.418209974 +0000 UTC m=+1564.419798213" lastFinishedPulling="2026-03-13 14:22:14.459115273 +0000 UTC m=+1569.460703512" observedRunningTime="2026-03-13 14:22:15.268616537 +0000 UTC m=+1570.270204776" watchObservedRunningTime="2026-03-13 14:22:15.470976701 +0000 UTC m=+1570.472564940" Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 14:22:15.491192 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-9b6c99f6d-7zgm5" podStartSLOduration=10.818390143 
podStartE2EDuration="16.491169075s" podCreationTimestamp="2026-03-13 14:21:59 +0000 UTC" firstStartedPulling="2026-03-13 14:22:08.786529196 +0000 UTC m=+1563.788117435" lastFinishedPulling="2026-03-13 14:22:14.459308118 +0000 UTC m=+1569.460896367" observedRunningTime="2026-03-13 14:22:15.288497374 +0000 UTC m=+1570.290085613" watchObservedRunningTime="2026-03-13 14:22:15.491169075 +0000 UTC m=+1570.492757314" Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 14:22:15.638216 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61f1f8bf-63eb-464c-9703-3d3db80ba0df-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "61f1f8bf-63eb-464c-9703-3d3db80ba0df" (UID: "61f1f8bf-63eb-464c-9703-3d3db80ba0df"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 14:22:15.657007 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61f1f8bf-63eb-464c-9703-3d3db80ba0df-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 14:22:15.723650 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61f1f8bf-63eb-464c-9703-3d3db80ba0df-config" (OuterVolumeSpecName: "config") pod "61f1f8bf-63eb-464c-9703-3d3db80ba0df" (UID: "61f1f8bf-63eb-464c-9703-3d3db80ba0df"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 14:22:15.760235 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/61f1f8bf-63eb-464c-9703-3d3db80ba0df-config\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 14:22:15.775535 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b81468c-e1ac-4515-837d-993e3c5108c9" path="/var/lib/kubelet/pods/5b81468c-e1ac-4515-837d-993e3c5108c9/volumes" Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 14:22:15.933770 4898 scope.go:117] "RemoveContainer" containerID="6c47c58bce6d5a27ce8f0a9ec972dc3740b57217b054bd621f4319ad64cb9ded" Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 14:22:15.941680 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61f1f8bf-63eb-464c-9703-3d3db80ba0df-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "61f1f8bf-63eb-464c-9703-3d3db80ba0df" (UID: "61f1f8bf-63eb-464c-9703-3d3db80ba0df"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 14:22:15.967742 4898 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/61f1f8bf-63eb-464c-9703-3d3db80ba0df-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:16 crc kubenswrapper[4898]: I0313 14:22:16.022810 4898 scope.go:117] "RemoveContainer" containerID="fc322fcfa0d128d1cbcd5fd7cc8972df0d6b4b808a700c1468ceb21cb71602e4" Mar 13 14:22:16 crc kubenswrapper[4898]: E0313 14:22:16.023276 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc322fcfa0d128d1cbcd5fd7cc8972df0d6b4b808a700c1468ceb21cb71602e4\": container with ID starting with fc322fcfa0d128d1cbcd5fd7cc8972df0d6b4b808a700c1468ceb21cb71602e4 not found: ID does not exist" containerID="fc322fcfa0d128d1cbcd5fd7cc8972df0d6b4b808a700c1468ceb21cb71602e4" Mar 13 14:22:16 crc kubenswrapper[4898]: I0313 14:22:16.023319 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc322fcfa0d128d1cbcd5fd7cc8972df0d6b4b808a700c1468ceb21cb71602e4"} err="failed to get container status \"fc322fcfa0d128d1cbcd5fd7cc8972df0d6b4b808a700c1468ceb21cb71602e4\": rpc error: code = NotFound desc = could not find container \"fc322fcfa0d128d1cbcd5fd7cc8972df0d6b4b808a700c1468ceb21cb71602e4\": container with ID starting with fc322fcfa0d128d1cbcd5fd7cc8972df0d6b4b808a700c1468ceb21cb71602e4 not found: ID does not exist" Mar 13 14:22:16 crc kubenswrapper[4898]: I0313 14:22:16.023339 4898 scope.go:117] "RemoveContainer" containerID="6c47c58bce6d5a27ce8f0a9ec972dc3740b57217b054bd621f4319ad64cb9ded" Mar 13 14:22:16 crc kubenswrapper[4898]: E0313 14:22:16.023532 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c47c58bce6d5a27ce8f0a9ec972dc3740b57217b054bd621f4319ad64cb9ded\": 
container with ID starting with 6c47c58bce6d5a27ce8f0a9ec972dc3740b57217b054bd621f4319ad64cb9ded not found: ID does not exist" containerID="6c47c58bce6d5a27ce8f0a9ec972dc3740b57217b054bd621f4319ad64cb9ded" Mar 13 14:22:16 crc kubenswrapper[4898]: I0313 14:22:16.023556 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c47c58bce6d5a27ce8f0a9ec972dc3740b57217b054bd621f4319ad64cb9ded"} err="failed to get container status \"6c47c58bce6d5a27ce8f0a9ec972dc3740b57217b054bd621f4319ad64cb9ded\": rpc error: code = NotFound desc = could not find container \"6c47c58bce6d5a27ce8f0a9ec972dc3740b57217b054bd621f4319ad64cb9ded\": container with ID starting with 6c47c58bce6d5a27ce8f0a9ec972dc3740b57217b054bd621f4319ad64cb9ded not found: ID does not exist" Mar 13 14:22:16 crc kubenswrapper[4898]: I0313 14:22:16.309482 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5f97b49ff6-67dbr" event={"ID":"0a9180e2-91e9-4063-83a5-5b4ba75ca011","Type":"ContainerStarted","Data":"357d95fc5d8e0d7678f3ea6e54b764f618b159ca6eaa129aa3499f515c52ea42"} Mar 13 14:22:16 crc kubenswrapper[4898]: I0313 14:22:16.309863 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-5f97b49ff6-67dbr" Mar 13 14:22:16 crc kubenswrapper[4898]: I0313 14:22:16.311599 4898 generic.go:334] "Generic (PLEG): container finished" podID="6f42d66e-f331-4c05-a4fb-d6208b4493fb" containerID="8f11edb4f03060d32d47258f0f26105368d5a72b3961df0611efa132599db56c" exitCode=1 Mar 13 14:22:16 crc kubenswrapper[4898]: I0313 14:22:16.311831 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5978dd6d84-pnknr" event={"ID":"6f42d66e-f331-4c05-a4fb-d6208b4493fb","Type":"ContainerDied","Data":"8f11edb4f03060d32d47258f0f26105368d5a72b3961df0611efa132599db56c"} Mar 13 14:22:16 crc kubenswrapper[4898]: I0313 14:22:16.312354 4898 scope.go:117] "RemoveContainer" 
containerID="8f11edb4f03060d32d47258f0f26105368d5a72b3961df0611efa132599db56c" Mar 13 14:22:16 crc kubenswrapper[4898]: I0313 14:22:16.315767 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-76b5758c54-vpp67" event={"ID":"bd18ec2e-1196-4e66-a1c5-9e3daefd7171","Type":"ContainerStarted","Data":"458d14b015605aca08fe7cf01f1621cdf8583aae425994a9a4ecae32ee37064e"} Mar 13 14:22:16 crc kubenswrapper[4898]: I0313 14:22:16.315889 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-76b5758c54-vpp67" Mar 13 14:22:16 crc kubenswrapper[4898]: I0313 14:22:16.323442 4898 generic.go:334] "Generic (PLEG): container finished" podID="8c6a61ba-babd-4bc2-922a-99b00c2af057" containerID="ef85f89a78107532e3b7ef83d63a9d6a9985b54d2525db305b02b87cede351c6" exitCode=1 Mar 13 14:22:16 crc kubenswrapper[4898]: I0313 14:22:16.323496 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-76545f46cd-qk7nm" event={"ID":"8c6a61ba-babd-4bc2-922a-99b00c2af057","Type":"ContainerDied","Data":"ef85f89a78107532e3b7ef83d63a9d6a9985b54d2525db305b02b87cede351c6"} Mar 13 14:22:16 crc kubenswrapper[4898]: I0313 14:22:16.324309 4898 scope.go:117] "RemoveContainer" containerID="ef85f89a78107532e3b7ef83d63a9d6a9985b54d2525db305b02b87cede351c6" Mar 13 14:22:16 crc kubenswrapper[4898]: I0313 14:22:16.328488 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e251995e-609a-4f0e-83f3-7f856e58a598","Type":"ContainerStarted","Data":"eb628c158bccb3144900d410778ea134ed3ea0eddc185afc79f0a4381f9e188c"} Mar 13 14:22:16 crc kubenswrapper[4898]: I0313 14:22:16.407542 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-5f97b49ff6-67dbr" podStartSLOduration=4.169061834 podStartE2EDuration="7.407520643s" podCreationTimestamp="2026-03-13 14:22:09 +0000 UTC" firstStartedPulling="2026-03-13 14:22:11.289599424 +0000 UTC m=+1566.291187663" 
lastFinishedPulling="2026-03-13 14:22:14.528058233 +0000 UTC m=+1569.529646472" observedRunningTime="2026-03-13 14:22:16.345602236 +0000 UTC m=+1571.347190475" watchObservedRunningTime="2026-03-13 14:22:16.407520643 +0000 UTC m=+1571.409108882" Mar 13 14:22:16 crc kubenswrapper[4898]: I0313 14:22:16.443379 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-76b5758c54-vpp67" podStartSLOduration=4.208154039 podStartE2EDuration="7.443145928s" podCreationTimestamp="2026-03-13 14:22:09 +0000 UTC" firstStartedPulling="2026-03-13 14:22:11.315062165 +0000 UTC m=+1566.316650404" lastFinishedPulling="2026-03-13 14:22:14.550054054 +0000 UTC m=+1569.551642293" observedRunningTime="2026-03-13 14:22:16.396313442 +0000 UTC m=+1571.397901691" watchObservedRunningTime="2026-03-13 14:22:16.443145928 +0000 UTC m=+1571.444734167" Mar 13 14:22:16 crc kubenswrapper[4898]: I0313 14:22:16.475433 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-f97c64464-wmnph"] Mar 13 14:22:16 crc kubenswrapper[4898]: I0313 14:22:16.486557 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-f97c64464-wmnph"] Mar 13 14:22:16 crc kubenswrapper[4898]: I0313 14:22:16.854727 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-api-5978dd6d84-pnknr" Mar 13 14:22:16 crc kubenswrapper[4898]: I0313 14:22:16.875998 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-cfnapi-76545f46cd-qk7nm" Mar 13 14:22:16 crc kubenswrapper[4898]: I0313 14:22:16.876050 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-76545f46cd-qk7nm" Mar 13 14:22:17 crc kubenswrapper[4898]: I0313 14:22:17.341495 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"a9a7064c-4ed5-4948-9e7e-7d40794e371e","Type":"ContainerStarted","Data":"715462d9bf65ff0e13b9e6ca02779b8f60f62b7fdf58a8171a568c3f23c25ccb"} Mar 13 14:22:17 crc kubenswrapper[4898]: I0313 14:22:17.344892 4898 generic.go:334] "Generic (PLEG): container finished" podID="8c6a61ba-babd-4bc2-922a-99b00c2af057" containerID="ee395bb97804b9c181c98691f0ac177fe6c008f2b77b6a0f5cb7a636bfe0789a" exitCode=1 Mar 13 14:22:17 crc kubenswrapper[4898]: I0313 14:22:17.345069 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-76545f46cd-qk7nm" event={"ID":"8c6a61ba-babd-4bc2-922a-99b00c2af057","Type":"ContainerDied","Data":"ee395bb97804b9c181c98691f0ac177fe6c008f2b77b6a0f5cb7a636bfe0789a"} Mar 13 14:22:17 crc kubenswrapper[4898]: I0313 14:22:17.345107 4898 scope.go:117] "RemoveContainer" containerID="ef85f89a78107532e3b7ef83d63a9d6a9985b54d2525db305b02b87cede351c6" Mar 13 14:22:17 crc kubenswrapper[4898]: I0313 14:22:17.345998 4898 scope.go:117] "RemoveContainer" containerID="ee395bb97804b9c181c98691f0ac177fe6c008f2b77b6a0f5cb7a636bfe0789a" Mar 13 14:22:17 crc kubenswrapper[4898]: E0313 14:22:17.346251 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-76545f46cd-qk7nm_openstack(8c6a61ba-babd-4bc2-922a-99b00c2af057)\"" pod="openstack/heat-cfnapi-76545f46cd-qk7nm" podUID="8c6a61ba-babd-4bc2-922a-99b00c2af057" Mar 13 14:22:17 crc kubenswrapper[4898]: I0313 14:22:17.350417 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e251995e-609a-4f0e-83f3-7f856e58a598","Type":"ContainerStarted","Data":"8aff6294a407bae6a2eb1e2dc4f0f935834538560ddb004e22ea984aea78200b"} Mar 13 14:22:17 crc kubenswrapper[4898]: I0313 14:22:17.357987 4898 generic.go:334] "Generic (PLEG): container finished" podID="6f42d66e-f331-4c05-a4fb-d6208b4493fb" 
containerID="d08fae9298b9b14b84a2c4d5726d6db0f8d8453e2834240ede2ff7d43ec4ebb3" exitCode=1 Mar 13 14:22:17 crc kubenswrapper[4898]: I0313 14:22:17.359321 4898 scope.go:117] "RemoveContainer" containerID="d08fae9298b9b14b84a2c4d5726d6db0f8d8453e2834240ede2ff7d43ec4ebb3" Mar 13 14:22:17 crc kubenswrapper[4898]: I0313 14:22:17.359510 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5978dd6d84-pnknr" event={"ID":"6f42d66e-f331-4c05-a4fb-d6208b4493fb","Type":"ContainerDied","Data":"d08fae9298b9b14b84a2c4d5726d6db0f8d8453e2834240ede2ff7d43ec4ebb3"} Mar 13 14:22:17 crc kubenswrapper[4898]: E0313 14:22:17.359632 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-5978dd6d84-pnknr_openstack(6f42d66e-f331-4c05-a4fb-d6208b4493fb)\"" pod="openstack/heat-api-5978dd6d84-pnknr" podUID="6f42d66e-f331-4c05-a4fb-d6208b4493fb" Mar 13 14:22:17 crc kubenswrapper[4898]: I0313 14:22:17.380934 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=8.3808735 podStartE2EDuration="8.3808735s" podCreationTimestamp="2026-03-13 14:22:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:22:17.37548816 +0000 UTC m=+1572.377076429" watchObservedRunningTime="2026-03-13 14:22:17.3808735 +0000 UTC m=+1572.382461759" Mar 13 14:22:17 crc kubenswrapper[4898]: I0313 14:22:17.482270 4898 scope.go:117] "RemoveContainer" containerID="8f11edb4f03060d32d47258f0f26105368d5a72b3961df0611efa132599db56c" Mar 13 14:22:17 crc kubenswrapper[4898]: I0313 14:22:17.737213 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-sqtw8"] Mar 13 14:22:17 crc kubenswrapper[4898]: E0313 14:22:17.737712 4898 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="61f1f8bf-63eb-464c-9703-3d3db80ba0df" containerName="neutron-httpd" Mar 13 14:22:17 crc kubenswrapper[4898]: I0313 14:22:17.737729 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="61f1f8bf-63eb-464c-9703-3d3db80ba0df" containerName="neutron-httpd" Mar 13 14:22:17 crc kubenswrapper[4898]: E0313 14:22:17.737770 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c85cb04-363e-45d6-a14b-79c249e8f469" containerName="oc" Mar 13 14:22:17 crc kubenswrapper[4898]: I0313 14:22:17.737776 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c85cb04-363e-45d6-a14b-79c249e8f469" containerName="oc" Mar 13 14:22:17 crc kubenswrapper[4898]: E0313 14:22:17.737785 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61f1f8bf-63eb-464c-9703-3d3db80ba0df" containerName="neutron-api" Mar 13 14:22:17 crc kubenswrapper[4898]: I0313 14:22:17.737790 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="61f1f8bf-63eb-464c-9703-3d3db80ba0df" containerName="neutron-api" Mar 13 14:22:17 crc kubenswrapper[4898]: I0313 14:22:17.738021 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="61f1f8bf-63eb-464c-9703-3d3db80ba0df" containerName="neutron-httpd" Mar 13 14:22:17 crc kubenswrapper[4898]: I0313 14:22:17.738046 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c85cb04-363e-45d6-a14b-79c249e8f469" containerName="oc" Mar 13 14:22:17 crc kubenswrapper[4898]: I0313 14:22:17.738059 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="61f1f8bf-63eb-464c-9703-3d3db80ba0df" containerName="neutron-api" Mar 13 14:22:17 crc kubenswrapper[4898]: I0313 14:22:17.738822 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-sqtw8" Mar 13 14:22:17 crc kubenswrapper[4898]: I0313 14:22:17.757018 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61f1f8bf-63eb-464c-9703-3d3db80ba0df" path="/var/lib/kubelet/pods/61f1f8bf-63eb-464c-9703-3d3db80ba0df/volumes" Mar 13 14:22:17 crc kubenswrapper[4898]: I0313 14:22:17.760769 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-sqtw8"] Mar 13 14:22:17 crc kubenswrapper[4898]: I0313 14:22:17.815941 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-qlgbl"] Mar 13 14:22:17 crc kubenswrapper[4898]: I0313 14:22:17.817546 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-qlgbl" Mar 13 14:22:17 crc kubenswrapper[4898]: I0313 14:22:17.842980 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-qlgbl"] Mar 13 14:22:17 crc kubenswrapper[4898]: I0313 14:22:17.901258 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-978d-account-create-update-tkc22"] Mar 13 14:22:17 crc kubenswrapper[4898]: I0313 14:22:17.902871 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-978d-account-create-update-tkc22" Mar 13 14:22:17 crc kubenswrapper[4898]: I0313 14:22:17.912289 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-978d-account-create-update-tkc22"] Mar 13 14:22:17 crc kubenswrapper[4898]: I0313 14:22:17.912679 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 13 14:22:17 crc kubenswrapper[4898]: I0313 14:22:17.924189 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/068b0856-126d-487c-9c1d-50299bf90d3a-operator-scripts\") pod \"nova-api-db-create-sqtw8\" (UID: \"068b0856-126d-487c-9c1d-50299bf90d3a\") " pod="openstack/nova-api-db-create-sqtw8" Mar 13 14:22:17 crc kubenswrapper[4898]: I0313 14:22:17.924264 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nw6kd\" (UniqueName: \"kubernetes.io/projected/068b0856-126d-487c-9c1d-50299bf90d3a-kube-api-access-nw6kd\") pod \"nova-api-db-create-sqtw8\" (UID: \"068b0856-126d-487c-9c1d-50299bf90d3a\") " pod="openstack/nova-api-db-create-sqtw8" Mar 13 14:22:17 crc kubenswrapper[4898]: I0313 14:22:17.924355 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44f1f531-99d1-4b97-bd08-6bf94a7afd92-operator-scripts\") pod \"nova-cell0-db-create-qlgbl\" (UID: \"44f1f531-99d1-4b97-bd08-6bf94a7afd92\") " pod="openstack/nova-cell0-db-create-qlgbl" Mar 13 14:22:17 crc kubenswrapper[4898]: I0313 14:22:17.924376 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4bb2\" (UniqueName: \"kubernetes.io/projected/44f1f531-99d1-4b97-bd08-6bf94a7afd92-kube-api-access-m4bb2\") pod \"nova-cell0-db-create-qlgbl\" (UID: 
\"44f1f531-99d1-4b97-bd08-6bf94a7afd92\") " pod="openstack/nova-cell0-db-create-qlgbl" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.003375 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-zhx84"] Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.004854 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-zhx84" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.012732 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-zhx84"] Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.027004 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nw6kd\" (UniqueName: \"kubernetes.io/projected/068b0856-126d-487c-9c1d-50299bf90d3a-kube-api-access-nw6kd\") pod \"nova-api-db-create-sqtw8\" (UID: \"068b0856-126d-487c-9c1d-50299bf90d3a\") " pod="openstack/nova-api-db-create-sqtw8" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.027107 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44f1f531-99d1-4b97-bd08-6bf94a7afd92-operator-scripts\") pod \"nova-cell0-db-create-qlgbl\" (UID: \"44f1f531-99d1-4b97-bd08-6bf94a7afd92\") " pod="openstack/nova-cell0-db-create-qlgbl" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.027127 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4bb2\" (UniqueName: \"kubernetes.io/projected/44f1f531-99d1-4b97-bd08-6bf94a7afd92-kube-api-access-m4bb2\") pod \"nova-cell0-db-create-qlgbl\" (UID: \"44f1f531-99d1-4b97-bd08-6bf94a7afd92\") " pod="openstack/nova-cell0-db-create-qlgbl" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.027196 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/29dbeb8a-611d-4513-a063-06d8f865ea93-operator-scripts\") pod \"nova-api-978d-account-create-update-tkc22\" (UID: \"29dbeb8a-611d-4513-a063-06d8f865ea93\") " pod="openstack/nova-api-978d-account-create-update-tkc22" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.027263 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkf6x\" (UniqueName: \"kubernetes.io/projected/29dbeb8a-611d-4513-a063-06d8f865ea93-kube-api-access-bkf6x\") pod \"nova-api-978d-account-create-update-tkc22\" (UID: \"29dbeb8a-611d-4513-a063-06d8f865ea93\") " pod="openstack/nova-api-978d-account-create-update-tkc22" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.027326 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/068b0856-126d-487c-9c1d-50299bf90d3a-operator-scripts\") pod \"nova-api-db-create-sqtw8\" (UID: \"068b0856-126d-487c-9c1d-50299bf90d3a\") " pod="openstack/nova-api-db-create-sqtw8" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.028034 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/068b0856-126d-487c-9c1d-50299bf90d3a-operator-scripts\") pod \"nova-api-db-create-sqtw8\" (UID: \"068b0856-126d-487c-9c1d-50299bf90d3a\") " pod="openstack/nova-api-db-create-sqtw8" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.029488 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44f1f531-99d1-4b97-bd08-6bf94a7afd92-operator-scripts\") pod \"nova-cell0-db-create-qlgbl\" (UID: \"44f1f531-99d1-4b97-bd08-6bf94a7afd92\") " pod="openstack/nova-cell0-db-create-qlgbl" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.050075 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4bb2\" 
(UniqueName: \"kubernetes.io/projected/44f1f531-99d1-4b97-bd08-6bf94a7afd92-kube-api-access-m4bb2\") pod \"nova-cell0-db-create-qlgbl\" (UID: \"44f1f531-99d1-4b97-bd08-6bf94a7afd92\") " pod="openstack/nova-cell0-db-create-qlgbl" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.052349 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nw6kd\" (UniqueName: \"kubernetes.io/projected/068b0856-126d-487c-9c1d-50299bf90d3a-kube-api-access-nw6kd\") pod \"nova-api-db-create-sqtw8\" (UID: \"068b0856-126d-487c-9c1d-50299bf90d3a\") " pod="openstack/nova-api-db-create-sqtw8" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.063648 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-sqtw8" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.123252 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-b9c7-account-create-update-l6h97"] Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.125939 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-b9c7-account-create-update-l6h97" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.128013 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.131107 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1abedb18-bf27-42d9-b809-f7226b603a0d-operator-scripts\") pod \"nova-cell1-db-create-zhx84\" (UID: \"1abedb18-bf27-42d9-b809-f7226b603a0d\") " pod="openstack/nova-cell1-db-create-zhx84" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.131241 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8s4j\" (UniqueName: \"kubernetes.io/projected/1abedb18-bf27-42d9-b809-f7226b603a0d-kube-api-access-s8s4j\") pod \"nova-cell1-db-create-zhx84\" (UID: \"1abedb18-bf27-42d9-b809-f7226b603a0d\") " pod="openstack/nova-cell1-db-create-zhx84" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.131340 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29dbeb8a-611d-4513-a063-06d8f865ea93-operator-scripts\") pod \"nova-api-978d-account-create-update-tkc22\" (UID: \"29dbeb8a-611d-4513-a063-06d8f865ea93\") " pod="openstack/nova-api-978d-account-create-update-tkc22" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.131409 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkf6x\" (UniqueName: \"kubernetes.io/projected/29dbeb8a-611d-4513-a063-06d8f865ea93-kube-api-access-bkf6x\") pod \"nova-api-978d-account-create-update-tkc22\" (UID: \"29dbeb8a-611d-4513-a063-06d8f865ea93\") " pod="openstack/nova-api-978d-account-create-update-tkc22" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.132515 4898 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29dbeb8a-611d-4513-a063-06d8f865ea93-operator-scripts\") pod \"nova-api-978d-account-create-update-tkc22\" (UID: \"29dbeb8a-611d-4513-a063-06d8f865ea93\") " pod="openstack/nova-api-978d-account-create-update-tkc22" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.141731 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-b9c7-account-create-update-l6h97"] Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.166689 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-qlgbl" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.181442 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkf6x\" (UniqueName: \"kubernetes.io/projected/29dbeb8a-611d-4513-a063-06d8f865ea93-kube-api-access-bkf6x\") pod \"nova-api-978d-account-create-update-tkc22\" (UID: \"29dbeb8a-611d-4513-a063-06d8f865ea93\") " pod="openstack/nova-api-978d-account-create-update-tkc22" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.229696 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-978d-account-create-update-tkc22" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.233414 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzzw4\" (UniqueName: \"kubernetes.io/projected/e516311e-fb5c-4901-aaf7-67793ffb5fa2-kube-api-access-mzzw4\") pod \"nova-cell0-b9c7-account-create-update-l6h97\" (UID: \"e516311e-fb5c-4901-aaf7-67793ffb5fa2\") " pod="openstack/nova-cell0-b9c7-account-create-update-l6h97" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.233465 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8s4j\" (UniqueName: \"kubernetes.io/projected/1abedb18-bf27-42d9-b809-f7226b603a0d-kube-api-access-s8s4j\") pod \"nova-cell1-db-create-zhx84\" (UID: \"1abedb18-bf27-42d9-b809-f7226b603a0d\") " pod="openstack/nova-cell1-db-create-zhx84" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.233540 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e516311e-fb5c-4901-aaf7-67793ffb5fa2-operator-scripts\") pod \"nova-cell0-b9c7-account-create-update-l6h97\" (UID: \"e516311e-fb5c-4901-aaf7-67793ffb5fa2\") " pod="openstack/nova-cell0-b9c7-account-create-update-l6h97" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.233622 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1abedb18-bf27-42d9-b809-f7226b603a0d-operator-scripts\") pod \"nova-cell1-db-create-zhx84\" (UID: \"1abedb18-bf27-42d9-b809-f7226b603a0d\") " pod="openstack/nova-cell1-db-create-zhx84" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.238946 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/1abedb18-bf27-42d9-b809-f7226b603a0d-operator-scripts\") pod \"nova-cell1-db-create-zhx84\" (UID: \"1abedb18-bf27-42d9-b809-f7226b603a0d\") " pod="openstack/nova-cell1-db-create-zhx84" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.253946 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8s4j\" (UniqueName: \"kubernetes.io/projected/1abedb18-bf27-42d9-b809-f7226b603a0d-kube-api-access-s8s4j\") pod \"nova-cell1-db-create-zhx84\" (UID: \"1abedb18-bf27-42d9-b809-f7226b603a0d\") " pod="openstack/nova-cell1-db-create-zhx84" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.320258 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-96c4-account-create-update-zsh5t"] Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.321747 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-96c4-account-create-update-zsh5t" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.332219 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.337469 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzzw4\" (UniqueName: \"kubernetes.io/projected/e516311e-fb5c-4901-aaf7-67793ffb5fa2-kube-api-access-mzzw4\") pod \"nova-cell0-b9c7-account-create-update-l6h97\" (UID: \"e516311e-fb5c-4901-aaf7-67793ffb5fa2\") " pod="openstack/nova-cell0-b9c7-account-create-update-l6h97" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.337621 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e516311e-fb5c-4901-aaf7-67793ffb5fa2-operator-scripts\") pod \"nova-cell0-b9c7-account-create-update-l6h97\" (UID: \"e516311e-fb5c-4901-aaf7-67793ffb5fa2\") " pod="openstack/nova-cell0-b9c7-account-create-update-l6h97" 
Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.360698 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e516311e-fb5c-4901-aaf7-67793ffb5fa2-operator-scripts\") pod \"nova-cell0-b9c7-account-create-update-l6h97\" (UID: \"e516311e-fb5c-4901-aaf7-67793ffb5fa2\") " pod="openstack/nova-cell0-b9c7-account-create-update-l6h97" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.372521 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzzw4\" (UniqueName: \"kubernetes.io/projected/e516311e-fb5c-4901-aaf7-67793ffb5fa2-kube-api-access-mzzw4\") pod \"nova-cell0-b9c7-account-create-update-l6h97\" (UID: \"e516311e-fb5c-4901-aaf7-67793ffb5fa2\") " pod="openstack/nova-cell0-b9c7-account-create-update-l6h97" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.386513 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-96c4-account-create-update-zsh5t"] Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.394591 4898 scope.go:117] "RemoveContainer" containerID="ee395bb97804b9c181c98691f0ac177fe6c008f2b77b6a0f5cb7a636bfe0789a" Mar 13 14:22:18 crc kubenswrapper[4898]: E0313 14:22:18.394948 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-76545f46cd-qk7nm_openstack(8c6a61ba-babd-4bc2-922a-99b00c2af057)\"" pod="openstack/heat-cfnapi-76545f46cd-qk7nm" podUID="8c6a61ba-babd-4bc2-922a-99b00c2af057" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.401520 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e251995e-609a-4f0e-83f3-7f856e58a598","Type":"ContainerStarted","Data":"d8d1f76b83bf115de46d0b110c2cf03b3ebde454f4675c453b288cf8a3d3f58a"} Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.415630 4898 
scope.go:117] "RemoveContainer" containerID="d08fae9298b9b14b84a2c4d5726d6db0f8d8453e2834240ede2ff7d43ec4ebb3" Mar 13 14:22:18 crc kubenswrapper[4898]: E0313 14:22:18.415834 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-5978dd6d84-pnknr_openstack(6f42d66e-f331-4c05-a4fb-d6208b4493fb)\"" pod="openstack/heat-api-5978dd6d84-pnknr" podUID="6f42d66e-f331-4c05-a4fb-d6208b4493fb" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.441298 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/485200a5-cd75-45ac-b93a-b003158132c4-operator-scripts\") pod \"nova-cell1-96c4-account-create-update-zsh5t\" (UID: \"485200a5-cd75-45ac-b93a-b003158132c4\") " pod="openstack/nova-cell1-96c4-account-create-update-zsh5t" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.441436 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w42dl\" (UniqueName: \"kubernetes.io/projected/485200a5-cd75-45ac-b93a-b003158132c4-kube-api-access-w42dl\") pod \"nova-cell1-96c4-account-create-update-zsh5t\" (UID: \"485200a5-cd75-45ac-b93a-b003158132c4\") " pod="openstack/nova-cell1-96c4-account-create-update-zsh5t" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.508068 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-zhx84" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.518293 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-b9c7-account-create-update-l6h97" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.543493 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/485200a5-cd75-45ac-b93a-b003158132c4-operator-scripts\") pod \"nova-cell1-96c4-account-create-update-zsh5t\" (UID: \"485200a5-cd75-45ac-b93a-b003158132c4\") " pod="openstack/nova-cell1-96c4-account-create-update-zsh5t" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.543625 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w42dl\" (UniqueName: \"kubernetes.io/projected/485200a5-cd75-45ac-b93a-b003158132c4-kube-api-access-w42dl\") pod \"nova-cell1-96c4-account-create-update-zsh5t\" (UID: \"485200a5-cd75-45ac-b93a-b003158132c4\") " pod="openstack/nova-cell1-96c4-account-create-update-zsh5t" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.548154 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/485200a5-cd75-45ac-b93a-b003158132c4-operator-scripts\") pod \"nova-cell1-96c4-account-create-update-zsh5t\" (UID: \"485200a5-cd75-45ac-b93a-b003158132c4\") " pod="openstack/nova-cell1-96c4-account-create-update-zsh5t" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.567997 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w42dl\" (UniqueName: \"kubernetes.io/projected/485200a5-cd75-45ac-b93a-b003158132c4-kube-api-access-w42dl\") pod \"nova-cell1-96c4-account-create-update-zsh5t\" (UID: \"485200a5-cd75-45ac-b93a-b003158132c4\") " pod="openstack/nova-cell1-96c4-account-create-update-zsh5t" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.832933 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-96c4-account-create-update-zsh5t" Mar 13 14:22:19 crc kubenswrapper[4898]: I0313 14:22:19.134000 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 14:22:19 crc kubenswrapper[4898]: I0313 14:22:19.134054 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 14:22:19 crc kubenswrapper[4898]: I0313 14:22:19.134096 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" Mar 13 14:22:19 crc kubenswrapper[4898]: I0313 14:22:19.134936 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"31ae8593afc62c01205d22940ebe7136407da3a6c051010917ba949bb52866cc"} pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 14:22:19 crc kubenswrapper[4898]: I0313 14:22:19.134982 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" containerID="cri-o://31ae8593afc62c01205d22940ebe7136407da3a6c051010917ba949bb52866cc" gracePeriod=600 Mar 13 14:22:19 crc kubenswrapper[4898]: E0313 14:22:19.276945 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:22:19 crc kubenswrapper[4898]: I0313 14:22:19.429656 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e251995e-609a-4f0e-83f3-7f856e58a598","Type":"ContainerStarted","Data":"003d77a3bae8b5d30451a8b2b210d256e8f645cab42759f96a2cca43d38b49a5"} Mar 13 14:22:19 crc kubenswrapper[4898]: I0313 14:22:19.445958 4898 generic.go:334] "Generic (PLEG): container finished" podID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerID="31ae8593afc62c01205d22940ebe7136407da3a6c051010917ba949bb52866cc" exitCode=0 Mar 13 14:22:19 crc kubenswrapper[4898]: I0313 14:22:19.446056 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerDied","Data":"31ae8593afc62c01205d22940ebe7136407da3a6c051010917ba949bb52866cc"} Mar 13 14:22:19 crc kubenswrapper[4898]: I0313 14:22:19.446102 4898 scope.go:117] "RemoveContainer" containerID="37bdbe6f1a65f1530746827b4e6d1dd1ce95edb9a913051fc8fca9a782787e56" Mar 13 14:22:19 crc kubenswrapper[4898]: I0313 14:22:19.447240 4898 scope.go:117] "RemoveContainer" containerID="31ae8593afc62c01205d22940ebe7136407da3a6c051010917ba949bb52866cc" Mar 13 14:22:19 crc kubenswrapper[4898]: E0313 14:22:19.447723 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:22:19 crc kubenswrapper[4898]: I0313 14:22:19.509500 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-sqtw8"] Mar 13 14:22:19 crc kubenswrapper[4898]: W0313 14:22:19.577482 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod068b0856_126d_487c_9c1d_50299bf90d3a.slice/crio-52a2cbda59b7d4003f96766e7ac8026a8135f15213a313db58524fcdf356ad84 WatchSource:0}: Error finding container 52a2cbda59b7d4003f96766e7ac8026a8135f15213a313db58524fcdf356ad84: Status 404 returned error can't find the container with id 52a2cbda59b7d4003f96766e7ac8026a8135f15213a313db58524fcdf356ad84 Mar 13 14:22:19 crc kubenswrapper[4898]: I0313 14:22:19.579561 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-qlgbl"] Mar 13 14:22:19 crc kubenswrapper[4898]: I0313 14:22:19.599196 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-978d-account-create-update-tkc22"] Mar 13 14:22:19 crc kubenswrapper[4898]: I0313 14:22:19.608250 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 13 14:22:19 crc kubenswrapper[4898]: I0313 14:22:19.621497 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-zhx84"] Mar 13 14:22:19 crc kubenswrapper[4898]: I0313 14:22:19.625781 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7f9cbdc5df-5tx5z" Mar 13 14:22:19 crc kubenswrapper[4898]: I0313 14:22:19.649347 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7f9cbdc5df-5tx5z" Mar 13 14:22:19 crc kubenswrapper[4898]: I0313 14:22:19.671970 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-cell0-b9c7-account-create-update-l6h97"] Mar 13 14:22:19 crc kubenswrapper[4898]: I0313 14:22:19.804289 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-96c4-account-create-update-zsh5t"] Mar 13 14:22:19 crc kubenswrapper[4898]: I0313 14:22:19.984022 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7756b9d78c-xntfr" Mar 13 14:22:20 crc kubenswrapper[4898]: I0313 14:22:20.093773 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-gtnnh"] Mar 13 14:22:20 crc kubenswrapper[4898]: I0313 14:22:20.094977 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9776ccc5-gtnnh" podUID="99ea68d3-f555-4779-90d0-d1f136ddadd2" containerName="dnsmasq-dns" containerID="cri-o://4a5c75faeafd5fd73d57b281c113bb58d89f329bfae70b866f45093f1de113f3" gracePeriod=10 Mar 13 14:22:20 crc kubenswrapper[4898]: I0313 14:22:20.211874 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 13 14:22:20 crc kubenswrapper[4898]: I0313 14:22:20.482145 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-sqtw8" event={"ID":"068b0856-126d-487c-9c1d-50299bf90d3a","Type":"ContainerStarted","Data":"52a2cbda59b7d4003f96766e7ac8026a8135f15213a313db58524fcdf356ad84"} Mar 13 14:22:20 crc kubenswrapper[4898]: I0313 14:22:20.543100 4898 generic.go:334] "Generic (PLEG): container finished" podID="99ea68d3-f555-4779-90d0-d1f136ddadd2" containerID="4a5c75faeafd5fd73d57b281c113bb58d89f329bfae70b866f45093f1de113f3" exitCode=0 Mar 13 14:22:20 crc kubenswrapper[4898]: I0313 14:22:20.543438 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-gtnnh" event={"ID":"99ea68d3-f555-4779-90d0-d1f136ddadd2","Type":"ContainerDied","Data":"4a5c75faeafd5fd73d57b281c113bb58d89f329bfae70b866f45093f1de113f3"} 
Mar 13 14:22:20 crc kubenswrapper[4898]: I0313 14:22:20.550683 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-qlgbl" event={"ID":"44f1f531-99d1-4b97-bd08-6bf94a7afd92","Type":"ContainerStarted","Data":"c3f324e654c4694796399f736ba82201f21f4c72e8656d1a41457de581a31d5e"} Mar 13 14:22:20 crc kubenswrapper[4898]: I0313 14:22:20.563970 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-96c4-account-create-update-zsh5t" event={"ID":"485200a5-cd75-45ac-b93a-b003158132c4","Type":"ContainerStarted","Data":"7e15455495730456e7a8bff672640fcee90a6ae705923706dc2aa854e8ab7ed5"} Mar 13 14:22:20 crc kubenswrapper[4898]: I0313 14:22:20.572651 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-zhx84" event={"ID":"1abedb18-bf27-42d9-b809-f7226b603a0d","Type":"ContainerStarted","Data":"50f9054e56407ba0a9fea287b30973d6add7ed954d891fc3616c3d5b3283065f"} Mar 13 14:22:20 crc kubenswrapper[4898]: I0313 14:22:20.584701 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-978d-account-create-update-tkc22" event={"ID":"29dbeb8a-611d-4513-a063-06d8f865ea93","Type":"ContainerStarted","Data":"26bae5e708003894bf328daf1be0e3e64680a6bf96fbf9b626133d828b1312af"} Mar 13 14:22:20 crc kubenswrapper[4898]: I0313 14:22:20.595492 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-b9c7-account-create-update-l6h97" event={"ID":"e516311e-fb5c-4901-aaf7-67793ffb5fa2","Type":"ContainerStarted","Data":"f9ad8ed01a8a4a46b06ba41d288cebad2344ae1d2a57232e2dd62e53b5ce8da8"} Mar 13 14:22:20 crc kubenswrapper[4898]: I0313 14:22:20.621905 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-978d-account-create-update-tkc22" podStartSLOduration=3.621850455 podStartE2EDuration="3.621850455s" podCreationTimestamp="2026-03-13 14:22:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:22:20.61816493 +0000 UTC m=+1575.619753169" watchObservedRunningTime="2026-03-13 14:22:20.621850455 +0000 UTC m=+1575.623438704" Mar 13 14:22:20 crc kubenswrapper[4898]: I0313 14:22:20.657069 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-z7ldc" podUID="b38f3681-6f2f-437f-9694-810d43921aa2" containerName="registry-server" probeResult="failure" output=< Mar 13 14:22:20 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 14:22:20 crc kubenswrapper[4898]: > Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.232218 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-gtnnh" Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.384231 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/99ea68d3-f555-4779-90d0-d1f136ddadd2-ovsdbserver-nb\") pod \"99ea68d3-f555-4779-90d0-d1f136ddadd2\" (UID: \"99ea68d3-f555-4779-90d0-d1f136ddadd2\") " Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.384329 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/99ea68d3-f555-4779-90d0-d1f136ddadd2-dns-swift-storage-0\") pod \"99ea68d3-f555-4779-90d0-d1f136ddadd2\" (UID: \"99ea68d3-f555-4779-90d0-d1f136ddadd2\") " Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.384356 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/99ea68d3-f555-4779-90d0-d1f136ddadd2-ovsdbserver-sb\") pod \"99ea68d3-f555-4779-90d0-d1f136ddadd2\" (UID: \"99ea68d3-f555-4779-90d0-d1f136ddadd2\") " Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.384427 4898 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99ea68d3-f555-4779-90d0-d1f136ddadd2-dns-svc\") pod \"99ea68d3-f555-4779-90d0-d1f136ddadd2\" (UID: \"99ea68d3-f555-4779-90d0-d1f136ddadd2\") " Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.384477 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99ea68d3-f555-4779-90d0-d1f136ddadd2-config\") pod \"99ea68d3-f555-4779-90d0-d1f136ddadd2\" (UID: \"99ea68d3-f555-4779-90d0-d1f136ddadd2\") " Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.384653 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktnr4\" (UniqueName: \"kubernetes.io/projected/99ea68d3-f555-4779-90d0-d1f136ddadd2-kube-api-access-ktnr4\") pod \"99ea68d3-f555-4779-90d0-d1f136ddadd2\" (UID: \"99ea68d3-f555-4779-90d0-d1f136ddadd2\") " Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.421751 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99ea68d3-f555-4779-90d0-d1f136ddadd2-kube-api-access-ktnr4" (OuterVolumeSpecName: "kube-api-access-ktnr4") pod "99ea68d3-f555-4779-90d0-d1f136ddadd2" (UID: "99ea68d3-f555-4779-90d0-d1f136ddadd2"). InnerVolumeSpecName "kube-api-access-ktnr4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.491691 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ktnr4\" (UniqueName: \"kubernetes.io/projected/99ea68d3-f555-4779-90d0-d1f136ddadd2-kube-api-access-ktnr4\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.519303 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99ea68d3-f555-4779-90d0-d1f136ddadd2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "99ea68d3-f555-4779-90d0-d1f136ddadd2" (UID: "99ea68d3-f555-4779-90d0-d1f136ddadd2"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.521866 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99ea68d3-f555-4779-90d0-d1f136ddadd2-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "99ea68d3-f555-4779-90d0-d1f136ddadd2" (UID: "99ea68d3-f555-4779-90d0-d1f136ddadd2"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.529209 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99ea68d3-f555-4779-90d0-d1f136ddadd2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "99ea68d3-f555-4779-90d0-d1f136ddadd2" (UID: "99ea68d3-f555-4779-90d0-d1f136ddadd2"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.575613 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99ea68d3-f555-4779-90d0-d1f136ddadd2-config" (OuterVolumeSpecName: "config") pod "99ea68d3-f555-4779-90d0-d1f136ddadd2" (UID: "99ea68d3-f555-4779-90d0-d1f136ddadd2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.594074 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/99ea68d3-f555-4779-90d0-d1f136ddadd2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.594113 4898 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/99ea68d3-f555-4779-90d0-d1f136ddadd2-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.594124 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/99ea68d3-f555-4779-90d0-d1f136ddadd2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.594133 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99ea68d3-f555-4779-90d0-d1f136ddadd2-config\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.605930 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99ea68d3-f555-4779-90d0-d1f136ddadd2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "99ea68d3-f555-4779-90d0-d1f136ddadd2" (UID: "99ea68d3-f555-4779-90d0-d1f136ddadd2"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.610760 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-gtnnh" event={"ID":"99ea68d3-f555-4779-90d0-d1f136ddadd2","Type":"ContainerDied","Data":"2f6cf6b2237006a47af92c80edb293fb5e39aa92cbe683d435727b4ad4952d2e"} Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.610829 4898 scope.go:117] "RemoveContainer" containerID="4a5c75faeafd5fd73d57b281c113bb58d89f329bfae70b866f45093f1de113f3" Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.611009 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-gtnnh" Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.632069 4898 generic.go:334] "Generic (PLEG): container finished" podID="44f1f531-99d1-4b97-bd08-6bf94a7afd92" containerID="8dc6d76d86edf4ca1fbe4589c8cd285e45e3e3a632208ca7817a0099916d678e" exitCode=0 Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.632141 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-qlgbl" event={"ID":"44f1f531-99d1-4b97-bd08-6bf94a7afd92","Type":"ContainerDied","Data":"8dc6d76d86edf4ca1fbe4589c8cd285e45e3e3a632208ca7817a0099916d678e"} Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.642709 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-96c4-account-create-update-zsh5t" event={"ID":"485200a5-cd75-45ac-b93a-b003158132c4","Type":"ContainerStarted","Data":"388808695f9ab04c5365cf0b1e925d316f5fa781ee2d5e7df5edc4ae9b9090f3"} Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.673578 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-96c4-account-create-update-zsh5t" podStartSLOduration=3.673562187 podStartE2EDuration="3.673562187s" podCreationTimestamp="2026-03-13 14:22:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:22:21.66600129 +0000 UTC m=+1576.667589529" watchObservedRunningTime="2026-03-13 14:22:21.673562187 +0000 UTC m=+1576.675150426" Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.676197 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-zhx84" event={"ID":"1abedb18-bf27-42d9-b809-f7226b603a0d","Type":"ContainerStarted","Data":"990c87abba6cb97ddcc9ff9fc6fc009eeddad1ca5f265916404c317707eccb6f"} Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.697229 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-zhx84" podStartSLOduration=4.69721144 podStartE2EDuration="4.69721144s" podCreationTimestamp="2026-03-13 14:22:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:22:21.6917927 +0000 UTC m=+1576.693380939" watchObservedRunningTime="2026-03-13 14:22:21.69721144 +0000 UTC m=+1576.698799679" Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.698155 4898 generic.go:334] "Generic (PLEG): container finished" podID="29dbeb8a-611d-4513-a063-06d8f865ea93" containerID="7b608b8ed7e8f0c428a761cc552755850cceca5a0b694b7beffed50aa397bd7c" exitCode=0 Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.698228 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-978d-account-create-update-tkc22" event={"ID":"29dbeb8a-611d-4513-a063-06d8f865ea93","Type":"ContainerDied","Data":"7b608b8ed7e8f0c428a761cc552755850cceca5a0b694b7beffed50aa397bd7c"} Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.702343 4898 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99ea68d3-f555-4779-90d0-d1f136ddadd2-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.719822 4898 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e251995e-609a-4f0e-83f3-7f856e58a598","Type":"ContainerStarted","Data":"f8d1b2554e005cc4ebc5dedc934e5b72d453f242bad5cf70707745dc2f5c1c07"} Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.719871 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.741663 4898 scope.go:117] "RemoveContainer" containerID="6c42ee9c0a17acfdf5d9f3b6de5ee36bb640854b185b6dd5e7f1e7441cc93008" Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.742343 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-b9c7-account-create-update-l6h97" event={"ID":"e516311e-fb5c-4901-aaf7-67793ffb5fa2","Type":"ContainerStarted","Data":"bb0c3bbe6081c4840c08e83355257eccaf898a1f20509cda6e941396919762d8"} Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.758410 4898 generic.go:334] "Generic (PLEG): container finished" podID="068b0856-126d-487c-9c1d-50299bf90d3a" containerID="bf99d1df7658057773b414b4c2a04b114eb5afc6efdb7e3669f974780f737076" exitCode=0 Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.764549 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-sqtw8" event={"ID":"068b0856-126d-487c-9c1d-50299bf90d3a","Type":"ContainerDied","Data":"bf99d1df7658057773b414b4c2a04b114eb5afc6efdb7e3669f974780f737076"} Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.785099 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-gtnnh"] Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.818594 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-gtnnh"] Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.847942 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=4.405013268 
podStartE2EDuration="9.847889222s" podCreationTimestamp="2026-03-13 14:22:12 +0000 UTC" firstStartedPulling="2026-03-13 14:22:15.335444402 +0000 UTC m=+1570.337032641" lastFinishedPulling="2026-03-13 14:22:20.778320356 +0000 UTC m=+1575.779908595" observedRunningTime="2026-03-13 14:22:21.759111107 +0000 UTC m=+1576.760699356" watchObservedRunningTime="2026-03-13 14:22:21.847889222 +0000 UTC m=+1576.849477461" Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.851057 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-api-5978dd6d84-pnknr" Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.851875 4898 scope.go:117] "RemoveContainer" containerID="d08fae9298b9b14b84a2c4d5726d6db0f8d8453e2834240ede2ff7d43ec4ebb3" Mar 13 14:22:21 crc kubenswrapper[4898]: E0313 14:22:21.852131 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-5978dd6d84-pnknr_openstack(6f42d66e-f331-4c05-a4fb-d6208b4493fb)\"" pod="openstack/heat-api-5978dd6d84-pnknr" podUID="6f42d66e-f331-4c05-a4fb-d6208b4493fb" Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.852579 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-5978dd6d84-pnknr" Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.867243 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-b9c7-account-create-update-l6h97" podStartSLOduration=3.8672173340000002 podStartE2EDuration="3.867217334s" podCreationTimestamp="2026-03-13 14:22:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:22:21.784147837 +0000 UTC m=+1576.785736076" watchObservedRunningTime="2026-03-13 14:22:21.867217334 +0000 UTC m=+1576.868805593" Mar 13 14:22:21 crc kubenswrapper[4898]: 
I0313 14:22:21.873164 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-cfnapi-76545f46cd-qk7nm" Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.873264 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-76545f46cd-qk7nm" Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.874229 4898 scope.go:117] "RemoveContainer" containerID="ee395bb97804b9c181c98691f0ac177fe6c008f2b77b6a0f5cb7a636bfe0789a" Mar 13 14:22:21 crc kubenswrapper[4898]: E0313 14:22:21.874568 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-76545f46cd-qk7nm_openstack(8c6a61ba-babd-4bc2-922a-99b00c2af057)\"" pod="openstack/heat-cfnapi-76545f46cd-qk7nm" podUID="8c6a61ba-babd-4bc2-922a-99b00c2af057" Mar 13 14:22:22 crc kubenswrapper[4898]: I0313 14:22:22.439875 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:22:22 crc kubenswrapper[4898]: I0313 14:22:22.626436 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-5f97b49ff6-67dbr" Mar 13 14:22:22 crc kubenswrapper[4898]: I0313 14:22:22.637731 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-9b6c99f6d-7zgm5" Mar 13 14:22:22 crc kubenswrapper[4898]: I0313 14:22:22.646650 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-759d64ffd4-kzp67" Mar 13 14:22:22 crc kubenswrapper[4898]: I0313 14:22:22.714890 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-5978dd6d84-pnknr"] Mar 13 14:22:22 crc kubenswrapper[4898]: I0313 14:22:22.719209 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-76b5758c54-vpp67" Mar 13 14:22:22 crc kubenswrapper[4898]: 
I0313 14:22:22.780602 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-76545f46cd-qk7nm"] Mar 13 14:22:22 crc kubenswrapper[4898]: I0313 14:22:22.783406 4898 generic.go:334] "Generic (PLEG): container finished" podID="485200a5-cd75-45ac-b93a-b003158132c4" containerID="388808695f9ab04c5365cf0b1e925d316f5fa781ee2d5e7df5edc4ae9b9090f3" exitCode=0 Mar 13 14:22:22 crc kubenswrapper[4898]: I0313 14:22:22.783481 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-96c4-account-create-update-zsh5t" event={"ID":"485200a5-cd75-45ac-b93a-b003158132c4","Type":"ContainerDied","Data":"388808695f9ab04c5365cf0b1e925d316f5fa781ee2d5e7df5edc4ae9b9090f3"} Mar 13 14:22:22 crc kubenswrapper[4898]: I0313 14:22:22.788074 4898 generic.go:334] "Generic (PLEG): container finished" podID="1abedb18-bf27-42d9-b809-f7226b603a0d" containerID="990c87abba6cb97ddcc9ff9fc6fc009eeddad1ca5f265916404c317707eccb6f" exitCode=0 Mar 13 14:22:22 crc kubenswrapper[4898]: I0313 14:22:22.788158 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-zhx84" event={"ID":"1abedb18-bf27-42d9-b809-f7226b603a0d","Type":"ContainerDied","Data":"990c87abba6cb97ddcc9ff9fc6fc009eeddad1ca5f265916404c317707eccb6f"} Mar 13 14:22:22 crc kubenswrapper[4898]: I0313 14:22:22.799180 4898 generic.go:334] "Generic (PLEG): container finished" podID="e516311e-fb5c-4901-aaf7-67793ffb5fa2" containerID="bb0c3bbe6081c4840c08e83355257eccaf898a1f20509cda6e941396919762d8" exitCode=0 Mar 13 14:22:22 crc kubenswrapper[4898]: I0313 14:22:22.799230 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-b9c7-account-create-update-l6h97" event={"ID":"e516311e-fb5c-4901-aaf7-67793ffb5fa2","Type":"ContainerDied","Data":"bb0c3bbe6081c4840c08e83355257eccaf898a1f20509cda6e941396919762d8"} Mar 13 14:22:22 crc kubenswrapper[4898]: I0313 14:22:22.800628 4898 scope.go:117] "RemoveContainer" 
containerID="ee395bb97804b9c181c98691f0ac177fe6c008f2b77b6a0f5cb7a636bfe0789a" Mar 13 14:22:22 crc kubenswrapper[4898]: E0313 14:22:22.800832 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-76545f46cd-qk7nm_openstack(8c6a61ba-babd-4bc2-922a-99b00c2af057)\"" pod="openstack/heat-cfnapi-76545f46cd-qk7nm" podUID="8c6a61ba-babd-4bc2-922a-99b00c2af057" Mar 13 14:22:22 crc kubenswrapper[4898]: I0313 14:22:22.804407 4898 scope.go:117] "RemoveContainer" containerID="d08fae9298b9b14b84a2c4d5726d6db0f8d8453e2834240ede2ff7d43ec4ebb3" Mar 13 14:22:22 crc kubenswrapper[4898]: E0313 14:22:22.804822 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-5978dd6d84-pnknr_openstack(6f42d66e-f331-4c05-a4fb-d6208b4493fb)\"" pod="openstack/heat-api-5978dd6d84-pnknr" podUID="6f42d66e-f331-4c05-a4fb-d6208b4493fb" Mar 13 14:22:23 crc kubenswrapper[4898]: I0313 14:22:23.360221 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-qlgbl" Mar 13 14:22:23 crc kubenswrapper[4898]: I0313 14:22:23.457835 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44f1f531-99d1-4b97-bd08-6bf94a7afd92-operator-scripts\") pod \"44f1f531-99d1-4b97-bd08-6bf94a7afd92\" (UID: \"44f1f531-99d1-4b97-bd08-6bf94a7afd92\") " Mar 13 14:22:23 crc kubenswrapper[4898]: I0313 14:22:23.463049 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44f1f531-99d1-4b97-bd08-6bf94a7afd92-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "44f1f531-99d1-4b97-bd08-6bf94a7afd92" (UID: "44f1f531-99d1-4b97-bd08-6bf94a7afd92"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:22:23 crc kubenswrapper[4898]: I0313 14:22:23.463285 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4bb2\" (UniqueName: \"kubernetes.io/projected/44f1f531-99d1-4b97-bd08-6bf94a7afd92-kube-api-access-m4bb2\") pod \"44f1f531-99d1-4b97-bd08-6bf94a7afd92\" (UID: \"44f1f531-99d1-4b97-bd08-6bf94a7afd92\") " Mar 13 14:22:23 crc kubenswrapper[4898]: I0313 14:22:23.474548 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44f1f531-99d1-4b97-bd08-6bf94a7afd92-kube-api-access-m4bb2" (OuterVolumeSpecName: "kube-api-access-m4bb2") pod "44f1f531-99d1-4b97-bd08-6bf94a7afd92" (UID: "44f1f531-99d1-4b97-bd08-6bf94a7afd92"). InnerVolumeSpecName "kube-api-access-m4bb2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:22:23 crc kubenswrapper[4898]: I0313 14:22:23.477221 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44f1f531-99d1-4b97-bd08-6bf94a7afd92-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:23 crc kubenswrapper[4898]: I0313 14:22:23.580882 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4bb2\" (UniqueName: \"kubernetes.io/projected/44f1f531-99d1-4b97-bd08-6bf94a7afd92-kube-api-access-m4bb2\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:23 crc kubenswrapper[4898]: I0313 14:22:23.607882 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-sqtw8" Mar 13 14:22:23 crc kubenswrapper[4898]: I0313 14:22:23.622524 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-978d-account-create-update-tkc22" Mar 13 14:22:23 crc kubenswrapper[4898]: I0313 14:22:23.682806 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nw6kd\" (UniqueName: \"kubernetes.io/projected/068b0856-126d-487c-9c1d-50299bf90d3a-kube-api-access-nw6kd\") pod \"068b0856-126d-487c-9c1d-50299bf90d3a\" (UID: \"068b0856-126d-487c-9c1d-50299bf90d3a\") " Mar 13 14:22:23 crc kubenswrapper[4898]: I0313 14:22:23.683076 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/068b0856-126d-487c-9c1d-50299bf90d3a-operator-scripts\") pod \"068b0856-126d-487c-9c1d-50299bf90d3a\" (UID: \"068b0856-126d-487c-9c1d-50299bf90d3a\") " Mar 13 14:22:23 crc kubenswrapper[4898]: I0313 14:22:23.684295 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/068b0856-126d-487c-9c1d-50299bf90d3a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "068b0856-126d-487c-9c1d-50299bf90d3a" (UID: "068b0856-126d-487c-9c1d-50299bf90d3a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:22:23 crc kubenswrapper[4898]: I0313 14:22:23.697715 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/068b0856-126d-487c-9c1d-50299bf90d3a-kube-api-access-nw6kd" (OuterVolumeSpecName: "kube-api-access-nw6kd") pod "068b0856-126d-487c-9c1d-50299bf90d3a" (UID: "068b0856-126d-487c-9c1d-50299bf90d3a"). InnerVolumeSpecName "kube-api-access-nw6kd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:22:23 crc kubenswrapper[4898]: I0313 14:22:23.754555 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99ea68d3-f555-4779-90d0-d1f136ddadd2" path="/var/lib/kubelet/pods/99ea68d3-f555-4779-90d0-d1f136ddadd2/volumes" Mar 13 14:22:23 crc kubenswrapper[4898]: I0313 14:22:23.785238 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bkf6x\" (UniqueName: \"kubernetes.io/projected/29dbeb8a-611d-4513-a063-06d8f865ea93-kube-api-access-bkf6x\") pod \"29dbeb8a-611d-4513-a063-06d8f865ea93\" (UID: \"29dbeb8a-611d-4513-a063-06d8f865ea93\") " Mar 13 14:22:23 crc kubenswrapper[4898]: I0313 14:22:23.785574 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29dbeb8a-611d-4513-a063-06d8f865ea93-operator-scripts\") pod \"29dbeb8a-611d-4513-a063-06d8f865ea93\" (UID: \"29dbeb8a-611d-4513-a063-06d8f865ea93\") " Mar 13 14:22:23 crc kubenswrapper[4898]: I0313 14:22:23.786158 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/068b0856-126d-487c-9c1d-50299bf90d3a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:23 crc kubenswrapper[4898]: I0313 14:22:23.786176 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nw6kd\" (UniqueName: \"kubernetes.io/projected/068b0856-126d-487c-9c1d-50299bf90d3a-kube-api-access-nw6kd\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:23 crc kubenswrapper[4898]: I0313 14:22:23.786439 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29dbeb8a-611d-4513-a063-06d8f865ea93-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "29dbeb8a-611d-4513-a063-06d8f865ea93" (UID: "29dbeb8a-611d-4513-a063-06d8f865ea93"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:22:23 crc kubenswrapper[4898]: I0313 14:22:23.798758 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29dbeb8a-611d-4513-a063-06d8f865ea93-kube-api-access-bkf6x" (OuterVolumeSpecName: "kube-api-access-bkf6x") pod "29dbeb8a-611d-4513-a063-06d8f865ea93" (UID: "29dbeb8a-611d-4513-a063-06d8f865ea93"). InnerVolumeSpecName "kube-api-access-bkf6x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:22:23 crc kubenswrapper[4898]: I0313 14:22:23.812447 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-qlgbl" event={"ID":"44f1f531-99d1-4b97-bd08-6bf94a7afd92","Type":"ContainerDied","Data":"c3f324e654c4694796399f736ba82201f21f4c72e8656d1a41457de581a31d5e"} Mar 13 14:22:23 crc kubenswrapper[4898]: I0313 14:22:23.812493 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3f324e654c4694796399f736ba82201f21f4c72e8656d1a41457de581a31d5e" Mar 13 14:22:23 crc kubenswrapper[4898]: I0313 14:22:23.812748 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-qlgbl" Mar 13 14:22:23 crc kubenswrapper[4898]: I0313 14:22:23.814682 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-978d-account-create-update-tkc22" event={"ID":"29dbeb8a-611d-4513-a063-06d8f865ea93","Type":"ContainerDied","Data":"26bae5e708003894bf328daf1be0e3e64680a6bf96fbf9b626133d828b1312af"} Mar 13 14:22:23 crc kubenswrapper[4898]: I0313 14:22:23.814713 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="26bae5e708003894bf328daf1be0e3e64680a6bf96fbf9b626133d828b1312af" Mar 13 14:22:23 crc kubenswrapper[4898]: I0313 14:22:23.814771 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-978d-account-create-update-tkc22" Mar 13 14:22:23 crc kubenswrapper[4898]: I0313 14:22:23.816850 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-sqtw8" event={"ID":"068b0856-126d-487c-9c1d-50299bf90d3a","Type":"ContainerDied","Data":"52a2cbda59b7d4003f96766e7ac8026a8135f15213a313db58524fcdf356ad84"} Mar 13 14:22:23 crc kubenswrapper[4898]: I0313 14:22:23.816935 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="52a2cbda59b7d4003f96766e7ac8026a8135f15213a313db58524fcdf356ad84" Mar 13 14:22:23 crc kubenswrapper[4898]: I0313 14:22:23.817025 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-sqtw8" Mar 13 14:22:23 crc kubenswrapper[4898]: I0313 14:22:23.820028 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e251995e-609a-4f0e-83f3-7f856e58a598" containerName="ceilometer-central-agent" containerID="cri-o://8aff6294a407bae6a2eb1e2dc4f0f935834538560ddb004e22ea984aea78200b" gracePeriod=30 Mar 13 14:22:23 crc kubenswrapper[4898]: I0313 14:22:23.820326 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e251995e-609a-4f0e-83f3-7f856e58a598" containerName="proxy-httpd" containerID="cri-o://f8d1b2554e005cc4ebc5dedc934e5b72d453f242bad5cf70707745dc2f5c1c07" gracePeriod=30 Mar 13 14:22:23 crc kubenswrapper[4898]: I0313 14:22:23.820447 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e251995e-609a-4f0e-83f3-7f856e58a598" containerName="sg-core" containerID="cri-o://003d77a3bae8b5d30451a8b2b210d256e8f645cab42759f96a2cca43d38b49a5" gracePeriod=30 Mar 13 14:22:23 crc kubenswrapper[4898]: I0313 14:22:23.820670 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="e251995e-609a-4f0e-83f3-7f856e58a598" containerName="ceilometer-notification-agent" containerID="cri-o://d8d1f76b83bf115de46d0b110c2cf03b3ebde454f4675c453b288cf8a3d3f58a" gracePeriod=30 Mar 13 14:22:23 crc kubenswrapper[4898]: I0313 14:22:23.888160 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29dbeb8a-611d-4513-a063-06d8f865ea93-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:23 crc kubenswrapper[4898]: I0313 14:22:23.890374 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bkf6x\" (UniqueName: \"kubernetes.io/projected/29dbeb8a-611d-4513-a063-06d8f865ea93-kube-api-access-bkf6x\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.159285 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-b9c7-account-create-update-l6h97" Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.303431 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzzw4\" (UniqueName: \"kubernetes.io/projected/e516311e-fb5c-4901-aaf7-67793ffb5fa2-kube-api-access-mzzw4\") pod \"e516311e-fb5c-4901-aaf7-67793ffb5fa2\" (UID: \"e516311e-fb5c-4901-aaf7-67793ffb5fa2\") " Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.303477 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e516311e-fb5c-4901-aaf7-67793ffb5fa2-operator-scripts\") pod \"e516311e-fb5c-4901-aaf7-67793ffb5fa2\" (UID: \"e516311e-fb5c-4901-aaf7-67793ffb5fa2\") " Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.304963 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e516311e-fb5c-4901-aaf7-67793ffb5fa2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e516311e-fb5c-4901-aaf7-67793ffb5fa2" (UID: 
"e516311e-fb5c-4901-aaf7-67793ffb5fa2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.311643 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e516311e-fb5c-4901-aaf7-67793ffb5fa2-kube-api-access-mzzw4" (OuterVolumeSpecName: "kube-api-access-mzzw4") pod "e516311e-fb5c-4901-aaf7-67793ffb5fa2" (UID: "e516311e-fb5c-4901-aaf7-67793ffb5fa2"). InnerVolumeSpecName "kube-api-access-mzzw4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.407621 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mzzw4\" (UniqueName: \"kubernetes.io/projected/e516311e-fb5c-4901-aaf7-67793ffb5fa2-kube-api-access-mzzw4\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.407653 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e516311e-fb5c-4901-aaf7-67793ffb5fa2-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.690718 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-76545f46cd-qk7nm" Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.699785 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5978dd6d84-pnknr" Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.724007 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-zhx84" Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.736055 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-96c4-account-create-update-zsh5t" Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.826351 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p272j\" (UniqueName: \"kubernetes.io/projected/8c6a61ba-babd-4bc2-922a-99b00c2af057-kube-api-access-p272j\") pod \"8c6a61ba-babd-4bc2-922a-99b00c2af057\" (UID: \"8c6a61ba-babd-4bc2-922a-99b00c2af057\") " Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.826446 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w42dl\" (UniqueName: \"kubernetes.io/projected/485200a5-cd75-45ac-b93a-b003158132c4-kube-api-access-w42dl\") pod \"485200a5-cd75-45ac-b93a-b003158132c4\" (UID: \"485200a5-cd75-45ac-b93a-b003158132c4\") " Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.826472 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8s4j\" (UniqueName: \"kubernetes.io/projected/1abedb18-bf27-42d9-b809-f7226b603a0d-kube-api-access-s8s4j\") pod \"1abedb18-bf27-42d9-b809-f7226b603a0d\" (UID: \"1abedb18-bf27-42d9-b809-f7226b603a0d\") " Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.826568 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8c6a61ba-babd-4bc2-922a-99b00c2af057-config-data-custom\") pod \"8c6a61ba-babd-4bc2-922a-99b00c2af057\" (UID: \"8c6a61ba-babd-4bc2-922a-99b00c2af057\") " Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.826721 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6a61ba-babd-4bc2-922a-99b00c2af057-combined-ca-bundle\") pod \"8c6a61ba-babd-4bc2-922a-99b00c2af057\" (UID: \"8c6a61ba-babd-4bc2-922a-99b00c2af057\") " Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.826817 4898 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f42d66e-f331-4c05-a4fb-d6208b4493fb-config-data\") pod \"6f42d66e-f331-4c05-a4fb-d6208b4493fb\" (UID: \"6f42d66e-f331-4c05-a4fb-d6208b4493fb\") " Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.826879 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/485200a5-cd75-45ac-b93a-b003158132c4-operator-scripts\") pod \"485200a5-cd75-45ac-b93a-b003158132c4\" (UID: \"485200a5-cd75-45ac-b93a-b003158132c4\") " Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.826948 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6f42d66e-f331-4c05-a4fb-d6208b4493fb-config-data-custom\") pod \"6f42d66e-f331-4c05-a4fb-d6208b4493fb\" (UID: \"6f42d66e-f331-4c05-a4fb-d6208b4493fb\") " Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.827030 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbntq\" (UniqueName: \"kubernetes.io/projected/6f42d66e-f331-4c05-a4fb-d6208b4493fb-kube-api-access-jbntq\") pod \"6f42d66e-f331-4c05-a4fb-d6208b4493fb\" (UID: \"6f42d66e-f331-4c05-a4fb-d6208b4493fb\") " Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.827153 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c6a61ba-babd-4bc2-922a-99b00c2af057-config-data\") pod \"8c6a61ba-babd-4bc2-922a-99b00c2af057\" (UID: \"8c6a61ba-babd-4bc2-922a-99b00c2af057\") " Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.827187 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f42d66e-f331-4c05-a4fb-d6208b4493fb-combined-ca-bundle\") pod 
\"6f42d66e-f331-4c05-a4fb-d6208b4493fb\" (UID: \"6f42d66e-f331-4c05-a4fb-d6208b4493fb\") " Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.827245 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1abedb18-bf27-42d9-b809-f7226b603a0d-operator-scripts\") pod \"1abedb18-bf27-42d9-b809-f7226b603a0d\" (UID: \"1abedb18-bf27-42d9-b809-f7226b603a0d\") " Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.828885 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/485200a5-cd75-45ac-b93a-b003158132c4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "485200a5-cd75-45ac-b93a-b003158132c4" (UID: "485200a5-cd75-45ac-b93a-b003158132c4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.839398 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f42d66e-f331-4c05-a4fb-d6208b4493fb-kube-api-access-jbntq" (OuterVolumeSpecName: "kube-api-access-jbntq") pod "6f42d66e-f331-4c05-a4fb-d6208b4493fb" (UID: "6f42d66e-f331-4c05-a4fb-d6208b4493fb"). InnerVolumeSpecName "kube-api-access-jbntq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.839476 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c6a61ba-babd-4bc2-922a-99b00c2af057-kube-api-access-p272j" (OuterVolumeSpecName: "kube-api-access-p272j") pod "8c6a61ba-babd-4bc2-922a-99b00c2af057" (UID: "8c6a61ba-babd-4bc2-922a-99b00c2af057"). InnerVolumeSpecName "kube-api-access-p272j". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.839495 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1abedb18-bf27-42d9-b809-f7226b603a0d-kube-api-access-s8s4j" (OuterVolumeSpecName: "kube-api-access-s8s4j") pod "1abedb18-bf27-42d9-b809-f7226b603a0d" (UID: "1abedb18-bf27-42d9-b809-f7226b603a0d"). InnerVolumeSpecName "kube-api-access-s8s4j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.840392 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1abedb18-bf27-42d9-b809-f7226b603a0d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1abedb18-bf27-42d9-b809-f7226b603a0d" (UID: "1abedb18-bf27-42d9-b809-f7226b603a0d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.841434 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c6a61ba-babd-4bc2-922a-99b00c2af057-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8c6a61ba-babd-4bc2-922a-99b00c2af057" (UID: "8c6a61ba-babd-4bc2-922a-99b00c2af057"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.842382 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/485200a5-cd75-45ac-b93a-b003158132c4-kube-api-access-w42dl" (OuterVolumeSpecName: "kube-api-access-w42dl") pod "485200a5-cd75-45ac-b93a-b003158132c4" (UID: "485200a5-cd75-45ac-b93a-b003158132c4"). InnerVolumeSpecName "kube-api-access-w42dl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.844505 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f42d66e-f331-4c05-a4fb-d6208b4493fb-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6f42d66e-f331-4c05-a4fb-d6208b4493fb" (UID: "6f42d66e-f331-4c05-a4fb-d6208b4493fb"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.850641 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-zhx84" event={"ID":"1abedb18-bf27-42d9-b809-f7226b603a0d","Type":"ContainerDied","Data":"50f9054e56407ba0a9fea287b30973d6add7ed954d891fc3616c3d5b3283065f"}
Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.850689 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50f9054e56407ba0a9fea287b30973d6add7ed954d891fc3616c3d5b3283065f"
Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.850749 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-zhx84"
Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.854067 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-76545f46cd-qk7nm" event={"ID":"8c6a61ba-babd-4bc2-922a-99b00c2af057","Type":"ContainerDied","Data":"703728a9002cd85f33faecdfc398f12cdcb1c38f0ab174f12467daf84a0e062a"}
Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.854327 4898 scope.go:117] "RemoveContainer" containerID="ee395bb97804b9c181c98691f0ac177fe6c008f2b77b6a0f5cb7a636bfe0789a"
Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.854512 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-76545f46cd-qk7nm"
Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.863406 4898 generic.go:334] "Generic (PLEG): container finished" podID="e251995e-609a-4f0e-83f3-7f856e58a598" containerID="f8d1b2554e005cc4ebc5dedc934e5b72d453f242bad5cf70707745dc2f5c1c07" exitCode=0
Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.863431 4898 generic.go:334] "Generic (PLEG): container finished" podID="e251995e-609a-4f0e-83f3-7f856e58a598" containerID="003d77a3bae8b5d30451a8b2b210d256e8f645cab42759f96a2cca43d38b49a5" exitCode=2
Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.863438 4898 generic.go:334] "Generic (PLEG): container finished" podID="e251995e-609a-4f0e-83f3-7f856e58a598" containerID="d8d1f76b83bf115de46d0b110c2cf03b3ebde454f4675c453b288cf8a3d3f58a" exitCode=0
Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.863494 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e251995e-609a-4f0e-83f3-7f856e58a598","Type":"ContainerDied","Data":"f8d1b2554e005cc4ebc5dedc934e5b72d453f242bad5cf70707745dc2f5c1c07"}
Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.863523 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e251995e-609a-4f0e-83f3-7f856e58a598","Type":"ContainerDied","Data":"003d77a3bae8b5d30451a8b2b210d256e8f645cab42759f96a2cca43d38b49a5"}
Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.863533 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e251995e-609a-4f0e-83f3-7f856e58a598","Type":"ContainerDied","Data":"d8d1f76b83bf115de46d0b110c2cf03b3ebde454f4675c453b288cf8a3d3f58a"}
Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.865652 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-b9c7-account-create-update-l6h97" event={"ID":"e516311e-fb5c-4901-aaf7-67793ffb5fa2","Type":"ContainerDied","Data":"f9ad8ed01a8a4a46b06ba41d288cebad2344ae1d2a57232e2dd62e53b5ce8da8"}
Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.865718 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f9ad8ed01a8a4a46b06ba41d288cebad2344ae1d2a57232e2dd62e53b5ce8da8"
Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.865762 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-b9c7-account-create-update-l6h97"
Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.880759 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5978dd6d84-pnknr" event={"ID":"6f42d66e-f331-4c05-a4fb-d6208b4493fb","Type":"ContainerDied","Data":"83eeb629576ceac1e4c3211f16c303bcac0592684b9a1724a45b87fae4f69938"}
Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.880878 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5978dd6d84-pnknr"
Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.884468 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f42d66e-f331-4c05-a4fb-d6208b4493fb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6f42d66e-f331-4c05-a4fb-d6208b4493fb" (UID: "6f42d66e-f331-4c05-a4fb-d6208b4493fb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.890485 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c6a61ba-babd-4bc2-922a-99b00c2af057-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8c6a61ba-babd-4bc2-922a-99b00c2af057" (UID: "8c6a61ba-babd-4bc2-922a-99b00c2af057"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.890812 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-96c4-account-create-update-zsh5t" event={"ID":"485200a5-cd75-45ac-b93a-b003158132c4","Type":"ContainerDied","Data":"7e15455495730456e7a8bff672640fcee90a6ae705923706dc2aa854e8ab7ed5"}
Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.890851 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e15455495730456e7a8bff672640fcee90a6ae705923706dc2aa854e8ab7ed5"
Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.891091 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-96c4-account-create-update-zsh5t"
Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.904117 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f42d66e-f331-4c05-a4fb-d6208b4493fb-config-data" (OuterVolumeSpecName: "config-data") pod "6f42d66e-f331-4c05-a4fb-d6208b4493fb" (UID: "6f42d66e-f331-4c05-a4fb-d6208b4493fb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.905557 4898 scope.go:117] "RemoveContainer" containerID="d08fae9298b9b14b84a2c4d5726d6db0f8d8453e2834240ede2ff7d43ec4ebb3"
Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.932408 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f42d66e-f331-4c05-a4fb-d6208b4493fb-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.932450 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1abedb18-bf27-42d9-b809-f7226b603a0d-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.932465 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p272j\" (UniqueName: \"kubernetes.io/projected/8c6a61ba-babd-4bc2-922a-99b00c2af057-kube-api-access-p272j\") on node \"crc\" DevicePath \"\""
Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.932484 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w42dl\" (UniqueName: \"kubernetes.io/projected/485200a5-cd75-45ac-b93a-b003158132c4-kube-api-access-w42dl\") on node \"crc\" DevicePath \"\""
Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.932504 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s8s4j\" (UniqueName: \"kubernetes.io/projected/1abedb18-bf27-42d9-b809-f7226b603a0d-kube-api-access-s8s4j\") on node \"crc\" DevicePath \"\""
Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.932515 4898 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8c6a61ba-babd-4bc2-922a-99b00c2af057-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.932527 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6a61ba-babd-4bc2-922a-99b00c2af057-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.932539 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f42d66e-f331-4c05-a4fb-d6208b4493fb-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.932557 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/485200a5-cd75-45ac-b93a-b003158132c4-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.932569 4898 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6f42d66e-f331-4c05-a4fb-d6208b4493fb-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.932582 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbntq\" (UniqueName: \"kubernetes.io/projected/6f42d66e-f331-4c05-a4fb-d6208b4493fb-kube-api-access-jbntq\") on node \"crc\" DevicePath \"\""
Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.958644 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c6a61ba-babd-4bc2-922a-99b00c2af057-config-data" (OuterVolumeSpecName: "config-data") pod "8c6a61ba-babd-4bc2-922a-99b00c2af057" (UID: "8c6a61ba-babd-4bc2-922a-99b00c2af057"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:22:25 crc kubenswrapper[4898]: I0313 14:22:25.035202 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c6a61ba-babd-4bc2-922a-99b00c2af057-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 14:22:25 crc kubenswrapper[4898]: I0313 14:22:25.208955 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-76545f46cd-qk7nm"]
Mar 13 14:22:25 crc kubenswrapper[4898]: I0313 14:22:25.229976 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-76545f46cd-qk7nm"]
Mar 13 14:22:25 crc kubenswrapper[4898]: I0313 14:22:25.279054 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-5978dd6d84-pnknr"]
Mar 13 14:22:25 crc kubenswrapper[4898]: I0313 14:22:25.306688 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-5978dd6d84-pnknr"]
Mar 13 14:22:25 crc kubenswrapper[4898]: I0313 14:22:25.753046 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f42d66e-f331-4c05-a4fb-d6208b4493fb" path="/var/lib/kubelet/pods/6f42d66e-f331-4c05-a4fb-d6208b4493fb/volumes"
Mar 13 14:22:25 crc kubenswrapper[4898]: I0313 14:22:25.753670 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c6a61ba-babd-4bc2-922a-99b00c2af057" path="/var/lib/kubelet/pods/8c6a61ba-babd-4bc2-922a-99b00c2af057/volumes"
Mar 13 14:22:26 crc kubenswrapper[4898]: I0313 14:22:26.859746 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-5b6c75676b-jx6kl"
Mar 13 14:22:26 crc kubenswrapper[4898]: I0313 14:22:26.913281 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-6b86699784-tf822"]
Mar 13 14:22:26 crc kubenswrapper[4898]: I0313 14:22:26.913497 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-engine-6b86699784-tf822" podUID="88ab3ad2-782a-4c21-8104-1b80468dbca0" containerName="heat-engine" containerID="cri-o://99ddc8edc7229de6f1b448d188e98b54d68182339989edb36f76125f198ea2d9" gracePeriod=60
Mar 13 14:22:26 crc kubenswrapper[4898]: I0313 14:22:26.956882 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-6b86699784-tf822"
Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.396303 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qps2v"]
Mar 13 14:22:28 crc kubenswrapper[4898]: E0313 14:22:28.397057 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="485200a5-cd75-45ac-b93a-b003158132c4" containerName="mariadb-account-create-update"
Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.397072 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="485200a5-cd75-45ac-b93a-b003158132c4" containerName="mariadb-account-create-update"
Mar 13 14:22:28 crc kubenswrapper[4898]: E0313 14:22:28.397094 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e516311e-fb5c-4901-aaf7-67793ffb5fa2" containerName="mariadb-account-create-update"
Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.397100 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="e516311e-fb5c-4901-aaf7-67793ffb5fa2" containerName="mariadb-account-create-update"
Mar 13 14:22:28 crc kubenswrapper[4898]: E0313 14:22:28.397112 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c6a61ba-babd-4bc2-922a-99b00c2af057" containerName="heat-cfnapi"
Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.397118 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c6a61ba-babd-4bc2-922a-99b00c2af057" containerName="heat-cfnapi"
Mar 13 14:22:28 crc kubenswrapper[4898]: E0313 14:22:28.397134 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="068b0856-126d-487c-9c1d-50299bf90d3a" containerName="mariadb-database-create"
Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.397140 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="068b0856-126d-487c-9c1d-50299bf90d3a" containerName="mariadb-database-create"
Mar 13 14:22:28 crc kubenswrapper[4898]: E0313 14:22:28.397156 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f42d66e-f331-4c05-a4fb-d6208b4493fb" containerName="heat-api"
Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.397162 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f42d66e-f331-4c05-a4fb-d6208b4493fb" containerName="heat-api"
Mar 13 14:22:28 crc kubenswrapper[4898]: E0313 14:22:28.397170 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44f1f531-99d1-4b97-bd08-6bf94a7afd92" containerName="mariadb-database-create"
Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.397176 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="44f1f531-99d1-4b97-bd08-6bf94a7afd92" containerName="mariadb-database-create"
Mar 13 14:22:28 crc kubenswrapper[4898]: E0313 14:22:28.397184 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99ea68d3-f555-4779-90d0-d1f136ddadd2" containerName="init"
Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.397190 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="99ea68d3-f555-4779-90d0-d1f136ddadd2" containerName="init"
Mar 13 14:22:28 crc kubenswrapper[4898]: E0313 14:22:28.397204 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1abedb18-bf27-42d9-b809-f7226b603a0d" containerName="mariadb-database-create"
Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.397209 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="1abedb18-bf27-42d9-b809-f7226b603a0d" containerName="mariadb-database-create"
Mar 13 14:22:28 crc kubenswrapper[4898]: E0313 14:22:28.397226 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29dbeb8a-611d-4513-a063-06d8f865ea93" containerName="mariadb-account-create-update"
Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.397232 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="29dbeb8a-611d-4513-a063-06d8f865ea93" containerName="mariadb-account-create-update"
Mar 13 14:22:28 crc kubenswrapper[4898]: E0313 14:22:28.397243 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99ea68d3-f555-4779-90d0-d1f136ddadd2" containerName="dnsmasq-dns"
Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.397249 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="99ea68d3-f555-4779-90d0-d1f136ddadd2" containerName="dnsmasq-dns"
Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.397440 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="485200a5-cd75-45ac-b93a-b003158132c4" containerName="mariadb-account-create-update"
Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.397454 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="99ea68d3-f555-4779-90d0-d1f136ddadd2" containerName="dnsmasq-dns"
Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.397465 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f42d66e-f331-4c05-a4fb-d6208b4493fb" containerName="heat-api"
Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.397475 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="44f1f531-99d1-4b97-bd08-6bf94a7afd92" containerName="mariadb-database-create"
Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.397486 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="068b0856-126d-487c-9c1d-50299bf90d3a" containerName="mariadb-database-create"
Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.397500 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c6a61ba-babd-4bc2-922a-99b00c2af057" containerName="heat-cfnapi"
Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.397510 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="29dbeb8a-611d-4513-a063-06d8f865ea93" containerName="mariadb-account-create-update"
Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.397521 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f42d66e-f331-4c05-a4fb-d6208b4493fb" containerName="heat-api"
Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.397534 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="1abedb18-bf27-42d9-b809-f7226b603a0d" containerName="mariadb-database-create"
Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.397543 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c6a61ba-babd-4bc2-922a-99b00c2af057" containerName="heat-cfnapi"
Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.397548 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="e516311e-fb5c-4901-aaf7-67793ffb5fa2" containerName="mariadb-account-create-update"
Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.398393 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-qps2v"
Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.404342 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.404407 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts"
Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.404483 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-42bds"
Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.437320 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qps2v"]
Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.508081 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7eba407c-68a5-45e9-ab51-e8cba05d8559-scripts\") pod \"nova-cell0-conductor-db-sync-qps2v\" (UID: \"7eba407c-68a5-45e9-ab51-e8cba05d8559\") " pod="openstack/nova-cell0-conductor-db-sync-qps2v"
Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.508220 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8c7h\" (UniqueName: \"kubernetes.io/projected/7eba407c-68a5-45e9-ab51-e8cba05d8559-kube-api-access-x8c7h\") pod \"nova-cell0-conductor-db-sync-qps2v\" (UID: \"7eba407c-68a5-45e9-ab51-e8cba05d8559\") " pod="openstack/nova-cell0-conductor-db-sync-qps2v"
Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.508393 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7eba407c-68a5-45e9-ab51-e8cba05d8559-config-data\") pod \"nova-cell0-conductor-db-sync-qps2v\" (UID: \"7eba407c-68a5-45e9-ab51-e8cba05d8559\") " pod="openstack/nova-cell0-conductor-db-sync-qps2v"
Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.508457 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7eba407c-68a5-45e9-ab51-e8cba05d8559-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-qps2v\" (UID: \"7eba407c-68a5-45e9-ab51-e8cba05d8559\") " pod="openstack/nova-cell0-conductor-db-sync-qps2v"
Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.610415 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7eba407c-68a5-45e9-ab51-e8cba05d8559-config-data\") pod \"nova-cell0-conductor-db-sync-qps2v\" (UID: \"7eba407c-68a5-45e9-ab51-e8cba05d8559\") " pod="openstack/nova-cell0-conductor-db-sync-qps2v"
Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.610526 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7eba407c-68a5-45e9-ab51-e8cba05d8559-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-qps2v\" (UID: \"7eba407c-68a5-45e9-ab51-e8cba05d8559\") " pod="openstack/nova-cell0-conductor-db-sync-qps2v"
Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.610563 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7eba407c-68a5-45e9-ab51-e8cba05d8559-scripts\") pod \"nova-cell0-conductor-db-sync-qps2v\" (UID: \"7eba407c-68a5-45e9-ab51-e8cba05d8559\") " pod="openstack/nova-cell0-conductor-db-sync-qps2v"
Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.610644 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8c7h\" (UniqueName: \"kubernetes.io/projected/7eba407c-68a5-45e9-ab51-e8cba05d8559-kube-api-access-x8c7h\") pod \"nova-cell0-conductor-db-sync-qps2v\" (UID: \"7eba407c-68a5-45e9-ab51-e8cba05d8559\") " pod="openstack/nova-cell0-conductor-db-sync-qps2v"
Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.617799 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7eba407c-68a5-45e9-ab51-e8cba05d8559-scripts\") pod \"nova-cell0-conductor-db-sync-qps2v\" (UID: \"7eba407c-68a5-45e9-ab51-e8cba05d8559\") " pod="openstack/nova-cell0-conductor-db-sync-qps2v"
Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.618295 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7eba407c-68a5-45e9-ab51-e8cba05d8559-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-qps2v\" (UID: \"7eba407c-68a5-45e9-ab51-e8cba05d8559\") " pod="openstack/nova-cell0-conductor-db-sync-qps2v"
Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.632038 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7eba407c-68a5-45e9-ab51-e8cba05d8559-config-data\") pod \"nova-cell0-conductor-db-sync-qps2v\" (UID: \"7eba407c-68a5-45e9-ab51-e8cba05d8559\") " pod="openstack/nova-cell0-conductor-db-sync-qps2v"
Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.641825 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8c7h\" (UniqueName: \"kubernetes.io/projected/7eba407c-68a5-45e9-ab51-e8cba05d8559-kube-api-access-x8c7h\") pod \"nova-cell0-conductor-db-sync-qps2v\" (UID: \"7eba407c-68a5-45e9-ab51-e8cba05d8559\") " pod="openstack/nova-cell0-conductor-db-sync-qps2v"
Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.717587 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-qps2v"
Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.963921 4898 generic.go:334] "Generic (PLEG): container finished" podID="e251995e-609a-4f0e-83f3-7f856e58a598" containerID="8aff6294a407bae6a2eb1e2dc4f0f935834538560ddb004e22ea984aea78200b" exitCode=0
Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.964230 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e251995e-609a-4f0e-83f3-7f856e58a598","Type":"ContainerDied","Data":"8aff6294a407bae6a2eb1e2dc4f0f935834538560ddb004e22ea984aea78200b"}
Mar 13 14:22:29 crc kubenswrapper[4898]: I0313 14:22:29.234938 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 13 14:22:29 crc kubenswrapper[4898]: I0313 14:22:29.329652 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjr2t\" (UniqueName: \"kubernetes.io/projected/e251995e-609a-4f0e-83f3-7f856e58a598-kube-api-access-hjr2t\") pod \"e251995e-609a-4f0e-83f3-7f856e58a598\" (UID: \"e251995e-609a-4f0e-83f3-7f856e58a598\") "
Mar 13 14:22:29 crc kubenswrapper[4898]: I0313 14:22:29.329869 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e251995e-609a-4f0e-83f3-7f856e58a598-sg-core-conf-yaml\") pod \"e251995e-609a-4f0e-83f3-7f856e58a598\" (UID: \"e251995e-609a-4f0e-83f3-7f856e58a598\") "
Mar 13 14:22:29 crc kubenswrapper[4898]: I0313 14:22:29.329892 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e251995e-609a-4f0e-83f3-7f856e58a598-run-httpd\") pod \"e251995e-609a-4f0e-83f3-7f856e58a598\" (UID: \"e251995e-609a-4f0e-83f3-7f856e58a598\") "
Mar 13 14:22:29 crc kubenswrapper[4898]: I0313 14:22:29.330038 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e251995e-609a-4f0e-83f3-7f856e58a598-config-data\") pod \"e251995e-609a-4f0e-83f3-7f856e58a598\" (UID: \"e251995e-609a-4f0e-83f3-7f856e58a598\") "
Mar 13 14:22:29 crc kubenswrapper[4898]: I0313 14:22:29.330272 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e251995e-609a-4f0e-83f3-7f856e58a598-scripts\") pod \"e251995e-609a-4f0e-83f3-7f856e58a598\" (UID: \"e251995e-609a-4f0e-83f3-7f856e58a598\") "
Mar 13 14:22:29 crc kubenswrapper[4898]: I0313 14:22:29.330326 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e251995e-609a-4f0e-83f3-7f856e58a598-log-httpd\") pod \"e251995e-609a-4f0e-83f3-7f856e58a598\" (UID: \"e251995e-609a-4f0e-83f3-7f856e58a598\") "
Mar 13 14:22:29 crc kubenswrapper[4898]: I0313 14:22:29.330478 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e251995e-609a-4f0e-83f3-7f856e58a598-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e251995e-609a-4f0e-83f3-7f856e58a598" (UID: "e251995e-609a-4f0e-83f3-7f856e58a598"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 14:22:29 crc kubenswrapper[4898]: I0313 14:22:29.330494 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e251995e-609a-4f0e-83f3-7f856e58a598-combined-ca-bundle\") pod \"e251995e-609a-4f0e-83f3-7f856e58a598\" (UID: \"e251995e-609a-4f0e-83f3-7f856e58a598\") "
Mar 13 14:22:29 crc kubenswrapper[4898]: I0313 14:22:29.331439 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e251995e-609a-4f0e-83f3-7f856e58a598-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e251995e-609a-4f0e-83f3-7f856e58a598" (UID: "e251995e-609a-4f0e-83f3-7f856e58a598"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 14:22:29 crc kubenswrapper[4898]: I0313 14:22:29.332299 4898 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e251995e-609a-4f0e-83f3-7f856e58a598-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 13 14:22:29 crc kubenswrapper[4898]: I0313 14:22:29.332326 4898 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e251995e-609a-4f0e-83f3-7f856e58a598-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 13 14:22:29 crc kubenswrapper[4898]: I0313 14:22:29.344564 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e251995e-609a-4f0e-83f3-7f856e58a598-scripts" (OuterVolumeSpecName: "scripts") pod "e251995e-609a-4f0e-83f3-7f856e58a598" (UID: "e251995e-609a-4f0e-83f3-7f856e58a598"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:22:29 crc kubenswrapper[4898]: I0313 14:22:29.344942 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e251995e-609a-4f0e-83f3-7f856e58a598-kube-api-access-hjr2t" (OuterVolumeSpecName: "kube-api-access-hjr2t") pod "e251995e-609a-4f0e-83f3-7f856e58a598" (UID: "e251995e-609a-4f0e-83f3-7f856e58a598"). InnerVolumeSpecName "kube-api-access-hjr2t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:22:29 crc kubenswrapper[4898]: I0313 14:22:29.357442 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qps2v"]
Mar 13 14:22:29 crc kubenswrapper[4898]: W0313 14:22:29.364069 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7eba407c_68a5_45e9_ab51_e8cba05d8559.slice/crio-ef0f6ea172f6bf26e4477b27da206d5a7254b564e5190d1ac27a9d0f8d70d00d WatchSource:0}: Error finding container ef0f6ea172f6bf26e4477b27da206d5a7254b564e5190d1ac27a9d0f8d70d00d: Status 404 returned error can't find the container with id ef0f6ea172f6bf26e4477b27da206d5a7254b564e5190d1ac27a9d0f8d70d00d
Mar 13 14:22:29 crc kubenswrapper[4898]: I0313 14:22:29.379362 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e251995e-609a-4f0e-83f3-7f856e58a598-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e251995e-609a-4f0e-83f3-7f856e58a598" (UID: "e251995e-609a-4f0e-83f3-7f856e58a598"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:22:29 crc kubenswrapper[4898]: I0313 14:22:29.435946 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hjr2t\" (UniqueName: \"kubernetes.io/projected/e251995e-609a-4f0e-83f3-7f856e58a598-kube-api-access-hjr2t\") on node \"crc\" DevicePath \"\""
Mar 13 14:22:29 crc kubenswrapper[4898]: I0313 14:22:29.435990 4898 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e251995e-609a-4f0e-83f3-7f856e58a598-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 13 14:22:29 crc kubenswrapper[4898]: I0313 14:22:29.436003 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e251995e-609a-4f0e-83f3-7f856e58a598-scripts\") on node \"crc\" DevicePath \"\""
Mar 13 14:22:29 crc kubenswrapper[4898]: I0313 14:22:29.450883 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e251995e-609a-4f0e-83f3-7f856e58a598-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e251995e-609a-4f0e-83f3-7f856e58a598" (UID: "e251995e-609a-4f0e-83f3-7f856e58a598"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:22:29 crc kubenswrapper[4898]: I0313 14:22:29.506589 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e251995e-609a-4f0e-83f3-7f856e58a598-config-data" (OuterVolumeSpecName: "config-data") pod "e251995e-609a-4f0e-83f3-7f856e58a598" (UID: "e251995e-609a-4f0e-83f3-7f856e58a598"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:22:29 crc kubenswrapper[4898]: I0313 14:22:29.538683 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e251995e-609a-4f0e-83f3-7f856e58a598-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 14:22:29 crc kubenswrapper[4898]: I0313 14:22:29.538728 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e251995e-609a-4f0e-83f3-7f856e58a598-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 14:22:29 crc kubenswrapper[4898]: E0313 14:22:29.711239 4898 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="99ddc8edc7229de6f1b448d188e98b54d68182339989edb36f76125f198ea2d9" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"]
Mar 13 14:22:29 crc kubenswrapper[4898]: E0313 14:22:29.713199 4898 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="99ddc8edc7229de6f1b448d188e98b54d68182339989edb36f76125f198ea2d9" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"]
Mar 13 14:22:29 crc kubenswrapper[4898]: E0313 14:22:29.714482 4898 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="99ddc8edc7229de6f1b448d188e98b54d68182339989edb36f76125f198ea2d9" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"]
Mar 13 14:22:29 crc kubenswrapper[4898]: E0313 14:22:29.714512 4898 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-6b86699784-tf822" podUID="88ab3ad2-782a-4c21-8104-1b80468dbca0" containerName="heat-engine"
Mar 13 14:22:29 crc kubenswrapper[4898]: I0313 14:22:29.976382 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e251995e-609a-4f0e-83f3-7f856e58a598","Type":"ContainerDied","Data":"eb628c158bccb3144900d410778ea134ed3ea0eddc185afc79f0a4381f9e188c"}
Mar 13 14:22:29 crc kubenswrapper[4898]: I0313 14:22:29.976435 4898 scope.go:117] "RemoveContainer" containerID="f8d1b2554e005cc4ebc5dedc934e5b72d453f242bad5cf70707745dc2f5c1c07"
Mar 13 14:22:29 crc kubenswrapper[4898]: I0313 14:22:29.976575 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 13 14:22:29 crc kubenswrapper[4898]: I0313 14:22:29.978344 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-qps2v" event={"ID":"7eba407c-68a5-45e9-ab51-e8cba05d8559","Type":"ContainerStarted","Data":"ef0f6ea172f6bf26e4477b27da206d5a7254b564e5190d1ac27a9d0f8d70d00d"}
Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.001424 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.011989 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.012677 4898 scope.go:117] "RemoveContainer" containerID="003d77a3bae8b5d30451a8b2b210d256e8f645cab42759f96a2cca43d38b49a5"
Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.030949 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 13 14:22:30 crc kubenswrapper[4898]: E0313 14:22:30.031469 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e251995e-609a-4f0e-83f3-7f856e58a598" containerName="sg-core"
Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.031490 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="e251995e-609a-4f0e-83f3-7f856e58a598" containerName="sg-core"
Mar 13 14:22:30 crc kubenswrapper[4898]: E0313 14:22:30.031510 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c6a61ba-babd-4bc2-922a-99b00c2af057" containerName="heat-cfnapi"
Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.031516 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c6a61ba-babd-4bc2-922a-99b00c2af057" containerName="heat-cfnapi"
Mar 13 14:22:30 crc kubenswrapper[4898]: E0313 14:22:30.031525 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e251995e-609a-4f0e-83f3-7f856e58a598" containerName="proxy-httpd"
Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.031531 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="e251995e-609a-4f0e-83f3-7f856e58a598" containerName="proxy-httpd"
Mar 13 14:22:30 crc kubenswrapper[4898]: E0313 14:22:30.031542 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e251995e-609a-4f0e-83f3-7f856e58a598" containerName="ceilometer-notification-agent"
Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.031547 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="e251995e-609a-4f0e-83f3-7f856e58a598" containerName="ceilometer-notification-agent"
Mar 13 14:22:30 crc kubenswrapper[4898]: E0313 14:22:30.031564 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f42d66e-f331-4c05-a4fb-d6208b4493fb" containerName="heat-api"
Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.031570 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f42d66e-f331-4c05-a4fb-d6208b4493fb" containerName="heat-api"
Mar 13 14:22:30 crc kubenswrapper[4898]: E0313 14:22:30.031588 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e251995e-609a-4f0e-83f3-7f856e58a598" containerName="ceilometer-central-agent"
Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.031594 4898
state_mem.go:107] "Deleted CPUSet assignment" podUID="e251995e-609a-4f0e-83f3-7f856e58a598" containerName="ceilometer-central-agent" Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.031791 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="e251995e-609a-4f0e-83f3-7f856e58a598" containerName="ceilometer-notification-agent" Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.031808 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="e251995e-609a-4f0e-83f3-7f856e58a598" containerName="proxy-httpd" Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.031826 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="e251995e-609a-4f0e-83f3-7f856e58a598" containerName="sg-core" Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.031844 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="e251995e-609a-4f0e-83f3-7f856e58a598" containerName="ceilometer-central-agent" Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.033918 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.037961 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.038171 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.041039 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.068938 4898 scope.go:117] "RemoveContainer" containerID="d8d1f76b83bf115de46d0b110c2cf03b3ebde454f4675c453b288cf8a3d3f58a" Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.109265 4898 scope.go:117] "RemoveContainer" containerID="8aff6294a407bae6a2eb1e2dc4f0f935834538560ddb004e22ea984aea78200b" Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.164085 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1458e3e5-908c-4abc-8b47-2b9d08b95100-run-httpd\") pod \"ceilometer-0\" (UID: \"1458e3e5-908c-4abc-8b47-2b9d08b95100\") " pod="openstack/ceilometer-0" Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.164376 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1458e3e5-908c-4abc-8b47-2b9d08b95100-scripts\") pod \"ceilometer-0\" (UID: \"1458e3e5-908c-4abc-8b47-2b9d08b95100\") " pod="openstack/ceilometer-0" Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.164506 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bx9nv\" (UniqueName: \"kubernetes.io/projected/1458e3e5-908c-4abc-8b47-2b9d08b95100-kube-api-access-bx9nv\") pod \"ceilometer-0\" (UID: \"1458e3e5-908c-4abc-8b47-2b9d08b95100\") " 
pod="openstack/ceilometer-0" Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.164552 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1458e3e5-908c-4abc-8b47-2b9d08b95100-log-httpd\") pod \"ceilometer-0\" (UID: \"1458e3e5-908c-4abc-8b47-2b9d08b95100\") " pod="openstack/ceilometer-0" Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.164961 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1458e3e5-908c-4abc-8b47-2b9d08b95100-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1458e3e5-908c-4abc-8b47-2b9d08b95100\") " pod="openstack/ceilometer-0" Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.165204 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1458e3e5-908c-4abc-8b47-2b9d08b95100-config-data\") pod \"ceilometer-0\" (UID: \"1458e3e5-908c-4abc-8b47-2b9d08b95100\") " pod="openstack/ceilometer-0" Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.165942 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1458e3e5-908c-4abc-8b47-2b9d08b95100-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1458e3e5-908c-4abc-8b47-2b9d08b95100\") " pod="openstack/ceilometer-0" Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.268261 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1458e3e5-908c-4abc-8b47-2b9d08b95100-config-data\") pod \"ceilometer-0\" (UID: \"1458e3e5-908c-4abc-8b47-2b9d08b95100\") " pod="openstack/ceilometer-0" Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.268603 4898 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1458e3e5-908c-4abc-8b47-2b9d08b95100-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1458e3e5-908c-4abc-8b47-2b9d08b95100\") " pod="openstack/ceilometer-0" Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.268629 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1458e3e5-908c-4abc-8b47-2b9d08b95100-run-httpd\") pod \"ceilometer-0\" (UID: \"1458e3e5-908c-4abc-8b47-2b9d08b95100\") " pod="openstack/ceilometer-0" Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.268656 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1458e3e5-908c-4abc-8b47-2b9d08b95100-scripts\") pod \"ceilometer-0\" (UID: \"1458e3e5-908c-4abc-8b47-2b9d08b95100\") " pod="openstack/ceilometer-0" Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.268681 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bx9nv\" (UniqueName: \"kubernetes.io/projected/1458e3e5-908c-4abc-8b47-2b9d08b95100-kube-api-access-bx9nv\") pod \"ceilometer-0\" (UID: \"1458e3e5-908c-4abc-8b47-2b9d08b95100\") " pod="openstack/ceilometer-0" Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.268703 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1458e3e5-908c-4abc-8b47-2b9d08b95100-log-httpd\") pod \"ceilometer-0\" (UID: \"1458e3e5-908c-4abc-8b47-2b9d08b95100\") " pod="openstack/ceilometer-0" Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.268750 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1458e3e5-908c-4abc-8b47-2b9d08b95100-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1458e3e5-908c-4abc-8b47-2b9d08b95100\") " 
pod="openstack/ceilometer-0" Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.269199 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1458e3e5-908c-4abc-8b47-2b9d08b95100-run-httpd\") pod \"ceilometer-0\" (UID: \"1458e3e5-908c-4abc-8b47-2b9d08b95100\") " pod="openstack/ceilometer-0" Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.269450 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1458e3e5-908c-4abc-8b47-2b9d08b95100-log-httpd\") pod \"ceilometer-0\" (UID: \"1458e3e5-908c-4abc-8b47-2b9d08b95100\") " pod="openstack/ceilometer-0" Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.274307 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1458e3e5-908c-4abc-8b47-2b9d08b95100-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1458e3e5-908c-4abc-8b47-2b9d08b95100\") " pod="openstack/ceilometer-0" Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.274357 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1458e3e5-908c-4abc-8b47-2b9d08b95100-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1458e3e5-908c-4abc-8b47-2b9d08b95100\") " pod="openstack/ceilometer-0" Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.276164 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1458e3e5-908c-4abc-8b47-2b9d08b95100-scripts\") pod \"ceilometer-0\" (UID: \"1458e3e5-908c-4abc-8b47-2b9d08b95100\") " pod="openstack/ceilometer-0" Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.291520 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bx9nv\" (UniqueName: 
\"kubernetes.io/projected/1458e3e5-908c-4abc-8b47-2b9d08b95100-kube-api-access-bx9nv\") pod \"ceilometer-0\" (UID: \"1458e3e5-908c-4abc-8b47-2b9d08b95100\") " pod="openstack/ceilometer-0" Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.305964 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1458e3e5-908c-4abc-8b47-2b9d08b95100-config-data\") pod \"ceilometer-0\" (UID: \"1458e3e5-908c-4abc-8b47-2b9d08b95100\") " pod="openstack/ceilometer-0" Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.357435 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.581602 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-z7ldc" podUID="b38f3681-6f2f-437f-9694-810d43921aa2" containerName="registry-server" probeResult="failure" output=< Mar 13 14:22:30 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 14:22:30 crc kubenswrapper[4898]: > Mar 13 14:22:31 crc kubenswrapper[4898]: I0313 14:22:31.061474 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:22:31 crc kubenswrapper[4898]: I0313 14:22:31.743953 4898 scope.go:117] "RemoveContainer" containerID="31ae8593afc62c01205d22940ebe7136407da3a6c051010917ba949bb52866cc" Mar 13 14:22:31 crc kubenswrapper[4898]: E0313 14:22:31.744600 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:22:31 crc kubenswrapper[4898]: I0313 14:22:31.774031 4898 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e251995e-609a-4f0e-83f3-7f856e58a598" path="/var/lib/kubelet/pods/e251995e-609a-4f0e-83f3-7f856e58a598/volumes" Mar 13 14:22:32 crc kubenswrapper[4898]: I0313 14:22:32.055256 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1458e3e5-908c-4abc-8b47-2b9d08b95100","Type":"ContainerStarted","Data":"1972eec74b79c5eba360234d9352547cca46cb7a3159697f368a323ef349b70f"} Mar 13 14:22:32 crc kubenswrapper[4898]: I0313 14:22:32.055305 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1458e3e5-908c-4abc-8b47-2b9d08b95100","Type":"ContainerStarted","Data":"cfca4a4c856812d6447387ed63f92e7c2d0804ab4a50cac7b00ec3d059ab8f3a"} Mar 13 14:22:33 crc kubenswrapper[4898]: I0313 14:22:33.073161 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1458e3e5-908c-4abc-8b47-2b9d08b95100","Type":"ContainerStarted","Data":"644027345fec83e47e3898f7dbe5b27fcbb0b059335c0f250370318d012d965d"} Mar 13 14:22:33 crc kubenswrapper[4898]: I0313 14:22:33.137785 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:22:33 crc kubenswrapper[4898]: I0313 14:22:33.463126 4898 scope.go:117] "RemoveContainer" containerID="309417dd12bdceaad7cc8574de946b3ecc5729e4fa9390a27c026042338454ac" Mar 13 14:22:34 crc kubenswrapper[4898]: I0313 14:22:34.101657 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1458e3e5-908c-4abc-8b47-2b9d08b95100","Type":"ContainerStarted","Data":"162b1332b5d9a0da2866acb2884fa7bbe465eb4a392a7cf47768395600f46a91"} Mar 13 14:22:35 crc kubenswrapper[4898]: I0313 14:22:35.136335 4898 generic.go:334] "Generic (PLEG): container finished" podID="88ab3ad2-782a-4c21-8104-1b80468dbca0" containerID="99ddc8edc7229de6f1b448d188e98b54d68182339989edb36f76125f198ea2d9" exitCode=0 Mar 13 14:22:35 crc 
kubenswrapper[4898]: I0313 14:22:35.136569 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6b86699784-tf822" event={"ID":"88ab3ad2-782a-4c21-8104-1b80468dbca0","Type":"ContainerDied","Data":"99ddc8edc7229de6f1b448d188e98b54d68182339989edb36f76125f198ea2d9"} Mar 13 14:22:35 crc kubenswrapper[4898]: I0313 14:22:35.466477 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-6b86699784-tf822" Mar 13 14:22:35 crc kubenswrapper[4898]: I0313 14:22:35.546849 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88ab3ad2-782a-4c21-8104-1b80468dbca0-combined-ca-bundle\") pod \"88ab3ad2-782a-4c21-8104-1b80468dbca0\" (UID: \"88ab3ad2-782a-4c21-8104-1b80468dbca0\") " Mar 13 14:22:35 crc kubenswrapper[4898]: I0313 14:22:35.547059 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snc8w\" (UniqueName: \"kubernetes.io/projected/88ab3ad2-782a-4c21-8104-1b80468dbca0-kube-api-access-snc8w\") pod \"88ab3ad2-782a-4c21-8104-1b80468dbca0\" (UID: \"88ab3ad2-782a-4c21-8104-1b80468dbca0\") " Mar 13 14:22:35 crc kubenswrapper[4898]: I0313 14:22:35.547095 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88ab3ad2-782a-4c21-8104-1b80468dbca0-config-data\") pod \"88ab3ad2-782a-4c21-8104-1b80468dbca0\" (UID: \"88ab3ad2-782a-4c21-8104-1b80468dbca0\") " Mar 13 14:22:35 crc kubenswrapper[4898]: I0313 14:22:35.547154 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/88ab3ad2-782a-4c21-8104-1b80468dbca0-config-data-custom\") pod \"88ab3ad2-782a-4c21-8104-1b80468dbca0\" (UID: \"88ab3ad2-782a-4c21-8104-1b80468dbca0\") " Mar 13 14:22:35 crc kubenswrapper[4898]: I0313 14:22:35.555674 4898 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88ab3ad2-782a-4c21-8104-1b80468dbca0-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "88ab3ad2-782a-4c21-8104-1b80468dbca0" (UID: "88ab3ad2-782a-4c21-8104-1b80468dbca0"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:22:35 crc kubenswrapper[4898]: I0313 14:22:35.573618 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88ab3ad2-782a-4c21-8104-1b80468dbca0-kube-api-access-snc8w" (OuterVolumeSpecName: "kube-api-access-snc8w") pod "88ab3ad2-782a-4c21-8104-1b80468dbca0" (UID: "88ab3ad2-782a-4c21-8104-1b80468dbca0"). InnerVolumeSpecName "kube-api-access-snc8w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:22:35 crc kubenswrapper[4898]: I0313 14:22:35.599323 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88ab3ad2-782a-4c21-8104-1b80468dbca0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "88ab3ad2-782a-4c21-8104-1b80468dbca0" (UID: "88ab3ad2-782a-4c21-8104-1b80468dbca0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:22:35 crc kubenswrapper[4898]: I0313 14:22:35.650225 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88ab3ad2-782a-4c21-8104-1b80468dbca0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:35 crc kubenswrapper[4898]: I0313 14:22:35.650262 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-snc8w\" (UniqueName: \"kubernetes.io/projected/88ab3ad2-782a-4c21-8104-1b80468dbca0-kube-api-access-snc8w\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:35 crc kubenswrapper[4898]: I0313 14:22:35.650278 4898 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/88ab3ad2-782a-4c21-8104-1b80468dbca0-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:35 crc kubenswrapper[4898]: I0313 14:22:35.665135 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88ab3ad2-782a-4c21-8104-1b80468dbca0-config-data" (OuterVolumeSpecName: "config-data") pod "88ab3ad2-782a-4c21-8104-1b80468dbca0" (UID: "88ab3ad2-782a-4c21-8104-1b80468dbca0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:22:35 crc kubenswrapper[4898]: I0313 14:22:35.752600 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88ab3ad2-782a-4c21-8104-1b80468dbca0-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:36 crc kubenswrapper[4898]: I0313 14:22:36.169440 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1458e3e5-908c-4abc-8b47-2b9d08b95100","Type":"ContainerStarted","Data":"30cd19706b9b763b0b2be58e832d11d7fea738624c0288b8cdeff1b0f4c4df2d"} Mar 13 14:22:36 crc kubenswrapper[4898]: I0313 14:22:36.169778 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 13 14:22:36 crc kubenswrapper[4898]: I0313 14:22:36.169597 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1458e3e5-908c-4abc-8b47-2b9d08b95100" containerName="sg-core" containerID="cri-o://162b1332b5d9a0da2866acb2884fa7bbe465eb4a392a7cf47768395600f46a91" gracePeriod=30 Mar 13 14:22:36 crc kubenswrapper[4898]: I0313 14:22:36.169516 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1458e3e5-908c-4abc-8b47-2b9d08b95100" containerName="ceilometer-central-agent" containerID="cri-o://1972eec74b79c5eba360234d9352547cca46cb7a3159697f368a323ef349b70f" gracePeriod=30 Mar 13 14:22:36 crc kubenswrapper[4898]: I0313 14:22:36.169628 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1458e3e5-908c-4abc-8b47-2b9d08b95100" containerName="ceilometer-notification-agent" containerID="cri-o://644027345fec83e47e3898f7dbe5b27fcbb0b059335c0f250370318d012d965d" gracePeriod=30 Mar 13 14:22:36 crc kubenswrapper[4898]: I0313 14:22:36.169614 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="1458e3e5-908c-4abc-8b47-2b9d08b95100" containerName="proxy-httpd" containerID="cri-o://30cd19706b9b763b0b2be58e832d11d7fea738624c0288b8cdeff1b0f4c4df2d" gracePeriod=30 Mar 13 14:22:36 crc kubenswrapper[4898]: I0313 14:22:36.182577 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6b86699784-tf822" event={"ID":"88ab3ad2-782a-4c21-8104-1b80468dbca0","Type":"ContainerDied","Data":"92091403be154312d0e01cc88ae4975c8ee84b62f142756e9f0eee0701f6969b"} Mar 13 14:22:36 crc kubenswrapper[4898]: I0313 14:22:36.182631 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-6b86699784-tf822" Mar 13 14:22:36 crc kubenswrapper[4898]: I0313 14:22:36.182679 4898 scope.go:117] "RemoveContainer" containerID="99ddc8edc7229de6f1b448d188e98b54d68182339989edb36f76125f198ea2d9" Mar 13 14:22:36 crc kubenswrapper[4898]: I0313 14:22:36.210312 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.532069948 podStartE2EDuration="6.210288113s" podCreationTimestamp="2026-03-13 14:22:30 +0000 UTC" firstStartedPulling="2026-03-13 14:22:31.094054539 +0000 UTC m=+1586.095642778" lastFinishedPulling="2026-03-13 14:22:34.772272704 +0000 UTC m=+1589.773860943" observedRunningTime="2026-03-13 14:22:36.195564161 +0000 UTC m=+1591.197152400" watchObservedRunningTime="2026-03-13 14:22:36.210288113 +0000 UTC m=+1591.211876352" Mar 13 14:22:36 crc kubenswrapper[4898]: I0313 14:22:36.241773 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-6b86699784-tf822"] Mar 13 14:22:36 crc kubenswrapper[4898]: I0313 14:22:36.258418 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-6b86699784-tf822"] Mar 13 14:22:37 crc kubenswrapper[4898]: I0313 14:22:37.200130 4898 generic.go:334] "Generic (PLEG): container finished" podID="1458e3e5-908c-4abc-8b47-2b9d08b95100" 
containerID="30cd19706b9b763b0b2be58e832d11d7fea738624c0288b8cdeff1b0f4c4df2d" exitCode=0 Mar 13 14:22:37 crc kubenswrapper[4898]: I0313 14:22:37.200369 4898 generic.go:334] "Generic (PLEG): container finished" podID="1458e3e5-908c-4abc-8b47-2b9d08b95100" containerID="162b1332b5d9a0da2866acb2884fa7bbe465eb4a392a7cf47768395600f46a91" exitCode=2 Mar 13 14:22:37 crc kubenswrapper[4898]: I0313 14:22:37.200377 4898 generic.go:334] "Generic (PLEG): container finished" podID="1458e3e5-908c-4abc-8b47-2b9d08b95100" containerID="644027345fec83e47e3898f7dbe5b27fcbb0b059335c0f250370318d012d965d" exitCode=0 Mar 13 14:22:37 crc kubenswrapper[4898]: I0313 14:22:37.200227 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1458e3e5-908c-4abc-8b47-2b9d08b95100","Type":"ContainerDied","Data":"30cd19706b9b763b0b2be58e832d11d7fea738624c0288b8cdeff1b0f4c4df2d"} Mar 13 14:22:37 crc kubenswrapper[4898]: I0313 14:22:37.200410 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1458e3e5-908c-4abc-8b47-2b9d08b95100","Type":"ContainerDied","Data":"162b1332b5d9a0da2866acb2884fa7bbe465eb4a392a7cf47768395600f46a91"} Mar 13 14:22:37 crc kubenswrapper[4898]: I0313 14:22:37.200424 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1458e3e5-908c-4abc-8b47-2b9d08b95100","Type":"ContainerDied","Data":"644027345fec83e47e3898f7dbe5b27fcbb0b059335c0f250370318d012d965d"} Mar 13 14:22:37 crc kubenswrapper[4898]: I0313 14:22:37.767809 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88ab3ad2-782a-4c21-8104-1b80468dbca0" path="/var/lib/kubelet/pods/88ab3ad2-782a-4c21-8104-1b80468dbca0/volumes" Mar 13 14:22:39 crc kubenswrapper[4898]: I0313 14:22:39.231444 4898 generic.go:334] "Generic (PLEG): container finished" podID="1458e3e5-908c-4abc-8b47-2b9d08b95100" containerID="1972eec74b79c5eba360234d9352547cca46cb7a3159697f368a323ef349b70f" exitCode=0 Mar 
13 14:22:39 crc kubenswrapper[4898]: I0313 14:22:39.231619 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1458e3e5-908c-4abc-8b47-2b9d08b95100","Type":"ContainerDied","Data":"1972eec74b79c5eba360234d9352547cca46cb7a3159697f368a323ef349b70f"} Mar 13 14:22:39 crc kubenswrapper[4898]: I0313 14:22:39.503660 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sgxj9"] Mar 13 14:22:39 crc kubenswrapper[4898]: E0313 14:22:39.504375 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88ab3ad2-782a-4c21-8104-1b80468dbca0" containerName="heat-engine" Mar 13 14:22:39 crc kubenswrapper[4898]: I0313 14:22:39.504397 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="88ab3ad2-782a-4c21-8104-1b80468dbca0" containerName="heat-engine" Mar 13 14:22:39 crc kubenswrapper[4898]: I0313 14:22:39.504698 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="88ab3ad2-782a-4c21-8104-1b80468dbca0" containerName="heat-engine" Mar 13 14:22:39 crc kubenswrapper[4898]: I0313 14:22:39.506935 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sgxj9"
Mar 13 14:22:39 crc kubenswrapper[4898]: I0313 14:22:39.521933 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sgxj9"]
Mar 13 14:22:39 crc kubenswrapper[4898]: I0313 14:22:39.591262 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8t8zq\" (UniqueName: \"kubernetes.io/projected/df5a6baa-ea65-4b79-b73b-2e1707695c41-kube-api-access-8t8zq\") pod \"redhat-marketplace-sgxj9\" (UID: \"df5a6baa-ea65-4b79-b73b-2e1707695c41\") " pod="openshift-marketplace/redhat-marketplace-sgxj9"
Mar 13 14:22:39 crc kubenswrapper[4898]: I0313 14:22:39.591332 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df5a6baa-ea65-4b79-b73b-2e1707695c41-utilities\") pod \"redhat-marketplace-sgxj9\" (UID: \"df5a6baa-ea65-4b79-b73b-2e1707695c41\") " pod="openshift-marketplace/redhat-marketplace-sgxj9"
Mar 13 14:22:39 crc kubenswrapper[4898]: I0313 14:22:39.591518 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df5a6baa-ea65-4b79-b73b-2e1707695c41-catalog-content\") pod \"redhat-marketplace-sgxj9\" (UID: \"df5a6baa-ea65-4b79-b73b-2e1707695c41\") " pod="openshift-marketplace/redhat-marketplace-sgxj9"
Mar 13 14:22:39 crc kubenswrapper[4898]: I0313 14:22:39.693284 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df5a6baa-ea65-4b79-b73b-2e1707695c41-catalog-content\") pod \"redhat-marketplace-sgxj9\" (UID: \"df5a6baa-ea65-4b79-b73b-2e1707695c41\") " pod="openshift-marketplace/redhat-marketplace-sgxj9"
Mar 13 14:22:39 crc kubenswrapper[4898]: I0313 14:22:39.693358 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8t8zq\" (UniqueName: \"kubernetes.io/projected/df5a6baa-ea65-4b79-b73b-2e1707695c41-kube-api-access-8t8zq\") pod \"redhat-marketplace-sgxj9\" (UID: \"df5a6baa-ea65-4b79-b73b-2e1707695c41\") " pod="openshift-marketplace/redhat-marketplace-sgxj9"
Mar 13 14:22:39 crc kubenswrapper[4898]: I0313 14:22:39.693421 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df5a6baa-ea65-4b79-b73b-2e1707695c41-utilities\") pod \"redhat-marketplace-sgxj9\" (UID: \"df5a6baa-ea65-4b79-b73b-2e1707695c41\") " pod="openshift-marketplace/redhat-marketplace-sgxj9"
Mar 13 14:22:39 crc kubenswrapper[4898]: I0313 14:22:39.693916 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df5a6baa-ea65-4b79-b73b-2e1707695c41-catalog-content\") pod \"redhat-marketplace-sgxj9\" (UID: \"df5a6baa-ea65-4b79-b73b-2e1707695c41\") " pod="openshift-marketplace/redhat-marketplace-sgxj9"
Mar 13 14:22:39 crc kubenswrapper[4898]: I0313 14:22:39.694070 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df5a6baa-ea65-4b79-b73b-2e1707695c41-utilities\") pod \"redhat-marketplace-sgxj9\" (UID: \"df5a6baa-ea65-4b79-b73b-2e1707695c41\") " pod="openshift-marketplace/redhat-marketplace-sgxj9"
Mar 13 14:22:39 crc kubenswrapper[4898]: I0313 14:22:39.718663 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8t8zq\" (UniqueName: \"kubernetes.io/projected/df5a6baa-ea65-4b79-b73b-2e1707695c41-kube-api-access-8t8zq\") pod \"redhat-marketplace-sgxj9\" (UID: \"df5a6baa-ea65-4b79-b73b-2e1707695c41\") " pod="openshift-marketplace/redhat-marketplace-sgxj9"
Mar 13 14:22:39 crc kubenswrapper[4898]: I0313 14:22:39.856783 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sgxj9"
Mar 13 14:22:40 crc kubenswrapper[4898]: I0313 14:22:40.056990 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 13 14:22:40 crc kubenswrapper[4898]: I0313 14:22:40.057210 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="f772c247-f65b-4185-9c75-25d5894ada70" containerName="glance-log" containerID="cri-o://3ed25c8fea8488646787dd34274ccdeef192849b7dfb335966843f92351f741e" gracePeriod=30
Mar 13 14:22:40 crc kubenswrapper[4898]: I0313 14:22:40.057926 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="f772c247-f65b-4185-9c75-25d5894ada70" containerName="glance-httpd" containerID="cri-o://53ad13c5a81b4a9991a57956cff297d785da3080bb5eafedd89860a32e28cd6a" gracePeriod=30
Mar 13 14:22:40 crc kubenswrapper[4898]: I0313 14:22:40.275303 4898 generic.go:334] "Generic (PLEG): container finished" podID="f772c247-f65b-4185-9c75-25d5894ada70" containerID="3ed25c8fea8488646787dd34274ccdeef192849b7dfb335966843f92351f741e" exitCode=143
Mar 13 14:22:40 crc kubenswrapper[4898]: I0313 14:22:40.275347 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f772c247-f65b-4185-9c75-25d5894ada70","Type":"ContainerDied","Data":"3ed25c8fea8488646787dd34274ccdeef192849b7dfb335966843f92351f741e"}
Mar 13 14:22:40 crc kubenswrapper[4898]: I0313 14:22:40.595608 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-z7ldc" podUID="b38f3681-6f2f-437f-9694-810d43921aa2" containerName="registry-server" probeResult="failure" output=<
Mar 13 14:22:40 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s
Mar 13 14:22:40 crc kubenswrapper[4898]: >
Mar 13 14:22:41 crc kubenswrapper[4898]: I0313 14:22:41.862556 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 13 14:22:41 crc kubenswrapper[4898]: I0313 14:22:41.863122 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="a8312dc9-a2b4-4ee6-b34f-cb984c14ad21" containerName="glance-log" containerID="cri-o://e53933d3c42586f8c8f9ea54060e1af27f64d65b3366e751904358fe342bcf4d" gracePeriod=30
Mar 13 14:22:41 crc kubenswrapper[4898]: I0313 14:22:41.863657 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="a8312dc9-a2b4-4ee6-b34f-cb984c14ad21" containerName="glance-httpd" containerID="cri-o://58217feb23785534a405127b1afdf5fb04709d2e86145617a9cb9b01bf24630f" gracePeriod=30
Mar 13 14:22:42 crc kubenswrapper[4898]: I0313 14:22:42.310761 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-qps2v" event={"ID":"7eba407c-68a5-45e9-ab51-e8cba05d8559","Type":"ContainerStarted","Data":"d1ff8d0ca102a074d68ee12cd37ffd04a070a172037a36f9afafa4bd84128371"}
Mar 13 14:22:42 crc kubenswrapper[4898]: I0313 14:22:42.321665 4898 generic.go:334] "Generic (PLEG): container finished" podID="a8312dc9-a2b4-4ee6-b34f-cb984c14ad21" containerID="e53933d3c42586f8c8f9ea54060e1af27f64d65b3366e751904358fe342bcf4d" exitCode=143
Mar 13 14:22:42 crc kubenswrapper[4898]: I0313 14:22:42.321708 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21","Type":"ContainerDied","Data":"e53933d3c42586f8c8f9ea54060e1af27f64d65b3366e751904358fe342bcf4d"}
Mar 13 14:22:42 crc kubenswrapper[4898]: I0313 14:22:42.343530 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-qps2v" podStartSLOduration=1.68697618 podStartE2EDuration="14.343514068s" podCreationTimestamp="2026-03-13 14:22:28 +0000 UTC" firstStartedPulling="2026-03-13 14:22:29.366408989 +0000 UTC m=+1584.367997228" lastFinishedPulling="2026-03-13 14:22:42.022946877 +0000 UTC m=+1597.024535116" observedRunningTime="2026-03-13 14:22:42.326293561 +0000 UTC m=+1597.327881810" watchObservedRunningTime="2026-03-13 14:22:42.343514068 +0000 UTC m=+1597.345102307"
Mar 13 14:22:42 crc kubenswrapper[4898]: I0313 14:22:42.351564 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 13 14:22:42 crc kubenswrapper[4898]: I0313 14:22:42.467849 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1458e3e5-908c-4abc-8b47-2b9d08b95100-config-data\") pod \"1458e3e5-908c-4abc-8b47-2b9d08b95100\" (UID: \"1458e3e5-908c-4abc-8b47-2b9d08b95100\") "
Mar 13 14:22:42 crc kubenswrapper[4898]: I0313 14:22:42.468958 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1458e3e5-908c-4abc-8b47-2b9d08b95100-log-httpd\") pod \"1458e3e5-908c-4abc-8b47-2b9d08b95100\" (UID: \"1458e3e5-908c-4abc-8b47-2b9d08b95100\") "
Mar 13 14:22:42 crc kubenswrapper[4898]: I0313 14:22:42.469024 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1458e3e5-908c-4abc-8b47-2b9d08b95100-scripts\") pod \"1458e3e5-908c-4abc-8b47-2b9d08b95100\" (UID: \"1458e3e5-908c-4abc-8b47-2b9d08b95100\") "
Mar 13 14:22:42 crc kubenswrapper[4898]: I0313 14:22:42.469053 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1458e3e5-908c-4abc-8b47-2b9d08b95100-combined-ca-bundle\") pod \"1458e3e5-908c-4abc-8b47-2b9d08b95100\" (UID: \"1458e3e5-908c-4abc-8b47-2b9d08b95100\") "
Mar 13 14:22:42 crc kubenswrapper[4898]: I0313 14:22:42.469208 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1458e3e5-908c-4abc-8b47-2b9d08b95100-run-httpd\") pod \"1458e3e5-908c-4abc-8b47-2b9d08b95100\" (UID: \"1458e3e5-908c-4abc-8b47-2b9d08b95100\") "
Mar 13 14:22:42 crc kubenswrapper[4898]: I0313 14:22:42.469306 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bx9nv\" (UniqueName: \"kubernetes.io/projected/1458e3e5-908c-4abc-8b47-2b9d08b95100-kube-api-access-bx9nv\") pod \"1458e3e5-908c-4abc-8b47-2b9d08b95100\" (UID: \"1458e3e5-908c-4abc-8b47-2b9d08b95100\") "
Mar 13 14:22:42 crc kubenswrapper[4898]: I0313 14:22:42.469376 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1458e3e5-908c-4abc-8b47-2b9d08b95100-sg-core-conf-yaml\") pod \"1458e3e5-908c-4abc-8b47-2b9d08b95100\" (UID: \"1458e3e5-908c-4abc-8b47-2b9d08b95100\") "
Mar 13 14:22:42 crc kubenswrapper[4898]: I0313 14:22:42.469831 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1458e3e5-908c-4abc-8b47-2b9d08b95100-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1458e3e5-908c-4abc-8b47-2b9d08b95100" (UID: "1458e3e5-908c-4abc-8b47-2b9d08b95100"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 14:22:42 crc kubenswrapper[4898]: I0313 14:22:42.470234 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1458e3e5-908c-4abc-8b47-2b9d08b95100-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1458e3e5-908c-4abc-8b47-2b9d08b95100" (UID: "1458e3e5-908c-4abc-8b47-2b9d08b95100"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 14:22:42 crc kubenswrapper[4898]: I0313 14:22:42.470522 4898 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1458e3e5-908c-4abc-8b47-2b9d08b95100-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 13 14:22:42 crc kubenswrapper[4898]: I0313 14:22:42.470539 4898 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1458e3e5-908c-4abc-8b47-2b9d08b95100-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 13 14:22:42 crc kubenswrapper[4898]: I0313 14:22:42.475466 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1458e3e5-908c-4abc-8b47-2b9d08b95100-kube-api-access-bx9nv" (OuterVolumeSpecName: "kube-api-access-bx9nv") pod "1458e3e5-908c-4abc-8b47-2b9d08b95100" (UID: "1458e3e5-908c-4abc-8b47-2b9d08b95100"). InnerVolumeSpecName "kube-api-access-bx9nv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:22:42 crc kubenswrapper[4898]: I0313 14:22:42.482045 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1458e3e5-908c-4abc-8b47-2b9d08b95100-scripts" (OuterVolumeSpecName: "scripts") pod "1458e3e5-908c-4abc-8b47-2b9d08b95100" (UID: "1458e3e5-908c-4abc-8b47-2b9d08b95100"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:22:42 crc kubenswrapper[4898]: I0313 14:22:42.549046 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1458e3e5-908c-4abc-8b47-2b9d08b95100-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1458e3e5-908c-4abc-8b47-2b9d08b95100" (UID: "1458e3e5-908c-4abc-8b47-2b9d08b95100"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:22:42 crc kubenswrapper[4898]: I0313 14:22:42.580293 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bx9nv\" (UniqueName: \"kubernetes.io/projected/1458e3e5-908c-4abc-8b47-2b9d08b95100-kube-api-access-bx9nv\") on node \"crc\" DevicePath \"\""
Mar 13 14:22:42 crc kubenswrapper[4898]: I0313 14:22:42.580353 4898 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1458e3e5-908c-4abc-8b47-2b9d08b95100-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 13 14:22:42 crc kubenswrapper[4898]: I0313 14:22:42.580368 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1458e3e5-908c-4abc-8b47-2b9d08b95100-scripts\") on node \"crc\" DevicePath \"\""
Mar 13 14:22:42 crc kubenswrapper[4898]: I0313 14:22:42.586007 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1458e3e5-908c-4abc-8b47-2b9d08b95100-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1458e3e5-908c-4abc-8b47-2b9d08b95100" (UID: "1458e3e5-908c-4abc-8b47-2b9d08b95100"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:22:42 crc kubenswrapper[4898]: I0313 14:22:42.676532 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sgxj9"]
Mar 13 14:22:42 crc kubenswrapper[4898]: I0313 14:22:42.683609 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1458e3e5-908c-4abc-8b47-2b9d08b95100-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 14:22:42 crc kubenswrapper[4898]: I0313 14:22:42.701161 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1458e3e5-908c-4abc-8b47-2b9d08b95100-config-data" (OuterVolumeSpecName: "config-data") pod "1458e3e5-908c-4abc-8b47-2b9d08b95100" (UID: "1458e3e5-908c-4abc-8b47-2b9d08b95100"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:22:42 crc kubenswrapper[4898]: I0313 14:22:42.740732 4898 scope.go:117] "RemoveContainer" containerID="31ae8593afc62c01205d22940ebe7136407da3a6c051010917ba949bb52866cc"
Mar 13 14:22:42 crc kubenswrapper[4898]: E0313 14:22:42.741037 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d"
Mar 13 14:22:42 crc kubenswrapper[4898]: I0313 14:22:42.785878 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1458e3e5-908c-4abc-8b47-2b9d08b95100-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.333336 4898 generic.go:334] "Generic (PLEG): container finished" podID="df5a6baa-ea65-4b79-b73b-2e1707695c41" containerID="d9479d77fd8402baf87c3755eb73b852dad59c425ece2529270d3dc4ba5a1606" exitCode=0
Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.333624 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sgxj9" event={"ID":"df5a6baa-ea65-4b79-b73b-2e1707695c41","Type":"ContainerDied","Data":"d9479d77fd8402baf87c3755eb73b852dad59c425ece2529270d3dc4ba5a1606"}
Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.333652 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sgxj9" event={"ID":"df5a6baa-ea65-4b79-b73b-2e1707695c41","Type":"ContainerStarted","Data":"c4f29c1c84db31109e4a65fbd939345ced5fa3e54cbf69234fc3c9374e91653f"}
Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.337400 4898 generic.go:334] "Generic (PLEG): container finished" podID="f772c247-f65b-4185-9c75-25d5894ada70" containerID="53ad13c5a81b4a9991a57956cff297d785da3080bb5eafedd89860a32e28cd6a" exitCode=0
Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.337478 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f772c247-f65b-4185-9c75-25d5894ada70","Type":"ContainerDied","Data":"53ad13c5a81b4a9991a57956cff297d785da3080bb5eafedd89860a32e28cd6a"}
Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.342377 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1458e3e5-908c-4abc-8b47-2b9d08b95100","Type":"ContainerDied","Data":"cfca4a4c856812d6447387ed63f92e7c2d0804ab4a50cac7b00ec3d059ab8f3a"}
Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.342457 4898 scope.go:117] "RemoveContainer" containerID="30cd19706b9b763b0b2be58e832d11d7fea738624c0288b8cdeff1b0f4c4df2d"
Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.342478 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.390504 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.391030 4898 scope.go:117] "RemoveContainer" containerID="162b1332b5d9a0da2866acb2884fa7bbe465eb4a392a7cf47768395600f46a91"
Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.404096 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.430028 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 13 14:22:43 crc kubenswrapper[4898]: E0313 14:22:43.430515 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1458e3e5-908c-4abc-8b47-2b9d08b95100" containerName="proxy-httpd"
Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.430538 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="1458e3e5-908c-4abc-8b47-2b9d08b95100" containerName="proxy-httpd"
Mar 13 14:22:43 crc kubenswrapper[4898]: E0313 14:22:43.430576 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1458e3e5-908c-4abc-8b47-2b9d08b95100" containerName="ceilometer-central-agent"
Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.430583 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="1458e3e5-908c-4abc-8b47-2b9d08b95100" containerName="ceilometer-central-agent"
Mar 13 14:22:43 crc kubenswrapper[4898]: E0313 14:22:43.430595 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1458e3e5-908c-4abc-8b47-2b9d08b95100" containerName="ceilometer-notification-agent"
Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.430601 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="1458e3e5-908c-4abc-8b47-2b9d08b95100" containerName="ceilometer-notification-agent"
Mar 13 14:22:43 crc kubenswrapper[4898]: E0313 14:22:43.430621 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1458e3e5-908c-4abc-8b47-2b9d08b95100" containerName="sg-core"
Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.430627 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="1458e3e5-908c-4abc-8b47-2b9d08b95100" containerName="sg-core"
Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.430821 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="1458e3e5-908c-4abc-8b47-2b9d08b95100" containerName="sg-core"
Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.430842 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="1458e3e5-908c-4abc-8b47-2b9d08b95100" containerName="ceilometer-notification-agent"
Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.430853 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="1458e3e5-908c-4abc-8b47-2b9d08b95100" containerName="proxy-httpd"
Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.430864 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="1458e3e5-908c-4abc-8b47-2b9d08b95100" containerName="ceilometer-central-agent"
Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.435381 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.437657 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.438064 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.451336 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.467015 4898 scope.go:117] "RemoveContainer" containerID="644027345fec83e47e3898f7dbe5b27fcbb0b059335c0f250370318d012d965d"
Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.500485 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-scripts\") pod \"ceilometer-0\" (UID: \"37ab1f60-9ee0-4d70-9730-f17c9feafaeb\") " pod="openstack/ceilometer-0"
Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.500522 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"37ab1f60-9ee0-4d70-9730-f17c9feafaeb\") " pod="openstack/ceilometer-0"
Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.500568 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"37ab1f60-9ee0-4d70-9730-f17c9feafaeb\") " pod="openstack/ceilometer-0"
Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.500624 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-674qw\" (UniqueName: \"kubernetes.io/projected/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-kube-api-access-674qw\") pod \"ceilometer-0\" (UID: \"37ab1f60-9ee0-4d70-9730-f17c9feafaeb\") " pod="openstack/ceilometer-0"
Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.500669 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-log-httpd\") pod \"ceilometer-0\" (UID: \"37ab1f60-9ee0-4d70-9730-f17c9feafaeb\") " pod="openstack/ceilometer-0"
Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.500755 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-config-data\") pod \"ceilometer-0\" (UID: \"37ab1f60-9ee0-4d70-9730-f17c9feafaeb\") " pod="openstack/ceilometer-0"
Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.500832 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-run-httpd\") pod \"ceilometer-0\" (UID: \"37ab1f60-9ee0-4d70-9730-f17c9feafaeb\") " pod="openstack/ceilometer-0"
Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.507635 4898 scope.go:117] "RemoveContainer" containerID="1972eec74b79c5eba360234d9352547cca46cb7a3159697f368a323ef349b70f"
Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.602958 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-run-httpd\") pod \"ceilometer-0\" (UID: \"37ab1f60-9ee0-4d70-9730-f17c9feafaeb\") " pod="openstack/ceilometer-0"
Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.603238 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"37ab1f60-9ee0-4d70-9730-f17c9feafaeb\") " pod="openstack/ceilometer-0"
Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.603260 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-scripts\") pod \"ceilometer-0\" (UID: \"37ab1f60-9ee0-4d70-9730-f17c9feafaeb\") " pod="openstack/ceilometer-0"
Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.603288 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"37ab1f60-9ee0-4d70-9730-f17c9feafaeb\") " pod="openstack/ceilometer-0"
Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.603374 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-674qw\" (UniqueName: \"kubernetes.io/projected/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-kube-api-access-674qw\") pod \"ceilometer-0\" (UID: \"37ab1f60-9ee0-4d70-9730-f17c9feafaeb\") " pod="openstack/ceilometer-0"
Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.603422 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-log-httpd\") pod \"ceilometer-0\" (UID: \"37ab1f60-9ee0-4d70-9730-f17c9feafaeb\") " pod="openstack/ceilometer-0"
Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.603577 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-config-data\") pod \"ceilometer-0\" (UID: \"37ab1f60-9ee0-4d70-9730-f17c9feafaeb\") " pod="openstack/ceilometer-0"
Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.604358 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-run-httpd\") pod \"ceilometer-0\" (UID: \"37ab1f60-9ee0-4d70-9730-f17c9feafaeb\") " pod="openstack/ceilometer-0"
Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.604673 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-log-httpd\") pod \"ceilometer-0\" (UID: \"37ab1f60-9ee0-4d70-9730-f17c9feafaeb\") " pod="openstack/ceilometer-0"
Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.611932 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-config-data\") pod \"ceilometer-0\" (UID: \"37ab1f60-9ee0-4d70-9730-f17c9feafaeb\") " pod="openstack/ceilometer-0"
Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.612959 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"37ab1f60-9ee0-4d70-9730-f17c9feafaeb\") " pod="openstack/ceilometer-0"
Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.621413 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"37ab1f60-9ee0-4d70-9730-f17c9feafaeb\") " pod="openstack/ceilometer-0"
Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.623815 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-674qw\" (UniqueName: \"kubernetes.io/projected/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-kube-api-access-674qw\") pod \"ceilometer-0\" (UID: \"37ab1f60-9ee0-4d70-9730-f17c9feafaeb\") " pod="openstack/ceilometer-0"
Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.635003 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-scripts\") pod \"ceilometer-0\" (UID: \"37ab1f60-9ee0-4d70-9730-f17c9feafaeb\") " pod="openstack/ceilometer-0"
Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.764815 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1458e3e5-908c-4abc-8b47-2b9d08b95100" path="/var/lib/kubelet/pods/1458e3e5-908c-4abc-8b47-2b9d08b95100/volumes"
Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.767526 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.892290 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.021132 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-17b3b094-1a55-406a-a787-e0abb588e5b7\") pod \"f772c247-f65b-4185-9c75-25d5894ada70\" (UID: \"f772c247-f65b-4185-9c75-25d5894ada70\") "
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.021282 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f772c247-f65b-4185-9c75-25d5894ada70-logs\") pod \"f772c247-f65b-4185-9c75-25d5894ada70\" (UID: \"f772c247-f65b-4185-9c75-25d5894ada70\") "
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.021346 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f772c247-f65b-4185-9c75-25d5894ada70-scripts\") pod \"f772c247-f65b-4185-9c75-25d5894ada70\" (UID: \"f772c247-f65b-4185-9c75-25d5894ada70\") "
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.021457 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f772c247-f65b-4185-9c75-25d5894ada70-config-data\") pod \"f772c247-f65b-4185-9c75-25d5894ada70\" (UID: \"f772c247-f65b-4185-9c75-25d5894ada70\") "
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.021499 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f772c247-f65b-4185-9c75-25d5894ada70-public-tls-certs\") pod \"f772c247-f65b-4185-9c75-25d5894ada70\" (UID: \"f772c247-f65b-4185-9c75-25d5894ada70\") "
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.021572 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wpcb\" (UniqueName: \"kubernetes.io/projected/f772c247-f65b-4185-9c75-25d5894ada70-kube-api-access-4wpcb\") pod \"f772c247-f65b-4185-9c75-25d5894ada70\" (UID: \"f772c247-f65b-4185-9c75-25d5894ada70\") "
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.021631 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f772c247-f65b-4185-9c75-25d5894ada70-httpd-run\") pod \"f772c247-f65b-4185-9c75-25d5894ada70\" (UID: \"f772c247-f65b-4185-9c75-25d5894ada70\") "
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.021665 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f772c247-f65b-4185-9c75-25d5894ada70-combined-ca-bundle\") pod \"f772c247-f65b-4185-9c75-25d5894ada70\" (UID: \"f772c247-f65b-4185-9c75-25d5894ada70\") "
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.023103 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f772c247-f65b-4185-9c75-25d5894ada70-logs" (OuterVolumeSpecName: "logs") pod "f772c247-f65b-4185-9c75-25d5894ada70" (UID: "f772c247-f65b-4185-9c75-25d5894ada70"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.023463 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f772c247-f65b-4185-9c75-25d5894ada70-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f772c247-f65b-4185-9c75-25d5894ada70" (UID: "f772c247-f65b-4185-9c75-25d5894ada70"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.031158 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f772c247-f65b-4185-9c75-25d5894ada70-scripts" (OuterVolumeSpecName: "scripts") pod "f772c247-f65b-4185-9c75-25d5894ada70" (UID: "f772c247-f65b-4185-9c75-25d5894ada70"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.045176 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f772c247-f65b-4185-9c75-25d5894ada70-kube-api-access-4wpcb" (OuterVolumeSpecName: "kube-api-access-4wpcb") pod "f772c247-f65b-4185-9c75-25d5894ada70" (UID: "f772c247-f65b-4185-9c75-25d5894ada70"). InnerVolumeSpecName "kube-api-access-4wpcb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.068317 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-17b3b094-1a55-406a-a787-e0abb588e5b7" (OuterVolumeSpecName: "glance") pod "f772c247-f65b-4185-9c75-25d5894ada70" (UID: "f772c247-f65b-4185-9c75-25d5894ada70"). InnerVolumeSpecName "pvc-17b3b094-1a55-406a-a787-e0abb588e5b7". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.081730 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f772c247-f65b-4185-9c75-25d5894ada70-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f772c247-f65b-4185-9c75-25d5894ada70" (UID: "f772c247-f65b-4185-9c75-25d5894ada70"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.122917 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f772c247-f65b-4185-9c75-25d5894ada70-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f772c247-f65b-4185-9c75-25d5894ada70" (UID: "f772c247-f65b-4185-9c75-25d5894ada70"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.124650 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wpcb\" (UniqueName: \"kubernetes.io/projected/f772c247-f65b-4185-9c75-25d5894ada70-kube-api-access-4wpcb\") on node \"crc\" DevicePath \"\""
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.124685 4898 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f772c247-f65b-4185-9c75-25d5894ada70-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.124697 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f772c247-f65b-4185-9c75-25d5894ada70-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.124729 4898 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-17b3b094-1a55-406a-a787-e0abb588e5b7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-17b3b094-1a55-406a-a787-e0abb588e5b7\") on node \"crc\" "
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.124741 4898 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f772c247-f65b-4185-9c75-25d5894ada70-logs\") on node \"crc\" DevicePath \"\""
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.124751 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f772c247-f65b-4185-9c75-25d5894ada70-scripts\") on node \"crc\" DevicePath \"\""
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.124760 4898 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f772c247-f65b-4185-9c75-25d5894ada70-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.146026 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f772c247-f65b-4185-9c75-25d5894ada70-config-data" (OuterVolumeSpecName: "config-data") pod "f772c247-f65b-4185-9c75-25d5894ada70" (UID: "f772c247-f65b-4185-9c75-25d5894ada70"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.165174 4898 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.165702 4898 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-17b3b094-1a55-406a-a787-e0abb588e5b7" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-17b3b094-1a55-406a-a787-e0abb588e5b7") on node "crc" Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.227091 4898 reconciler_common.go:293] "Volume detached for volume \"pvc-17b3b094-1a55-406a-a787-e0abb588e5b7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-17b3b094-1a55-406a-a787-e0abb588e5b7\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.227128 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f772c247-f65b-4185-9c75-25d5894ada70-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.340529 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.353661 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"37ab1f60-9ee0-4d70-9730-f17c9feafaeb","Type":"ContainerStarted","Data":"bdcbd848029858f4c387dbe27ce9e5d245b65833296202afd31da1583743aa4c"} Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.356542 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f772c247-f65b-4185-9c75-25d5894ada70","Type":"ContainerDied","Data":"ddd1664b14e1ff4c8657d63bc705f6e2cc8530fd54bcfec783c314238117e1e0"} Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.356590 4898 scope.go:117] "RemoveContainer" containerID="53ad13c5a81b4a9991a57956cff297d785da3080bb5eafedd89860a32e28cd6a" Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.356698 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.384199 4898 scope.go:117] "RemoveContainer" containerID="3ed25c8fea8488646787dd34274ccdeef192849b7dfb335966843f92351f741e" Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.405486 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.419820 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.442375 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 14:22:44 crc kubenswrapper[4898]: E0313 14:22:44.492693 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f772c247-f65b-4185-9c75-25d5894ada70" containerName="glance-httpd" Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.497194 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="f772c247-f65b-4185-9c75-25d5894ada70" containerName="glance-httpd" Mar 13 14:22:44 crc kubenswrapper[4898]: E0313 14:22:44.497467 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f772c247-f65b-4185-9c75-25d5894ada70" containerName="glance-log" Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.497521 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="f772c247-f65b-4185-9c75-25d5894ada70" containerName="glance-log" Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.498877 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="f772c247-f65b-4185-9c75-25d5894ada70" containerName="glance-log" Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.499040 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="f772c247-f65b-4185-9c75-25d5894ada70" containerName="glance-httpd" Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.503941 4898 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.504243 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.506170 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.507498 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.580364 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.667705 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7l2c5\" (UniqueName: \"kubernetes.io/projected/a7cdbc1c-79cc-441b-a08c-c61b717d82c9-kube-api-access-7l2c5\") pod \"glance-default-external-api-0\" (UID: \"a7cdbc1c-79cc-441b-a08c-c61b717d82c9\") " pod="openstack/glance-default-external-api-0" Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.667871 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a7cdbc1c-79cc-441b-a08c-c61b717d82c9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a7cdbc1c-79cc-441b-a08c-c61b717d82c9\") " pod="openstack/glance-default-external-api-0" Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.667944 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7cdbc1c-79cc-441b-a08c-c61b717d82c9-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a7cdbc1c-79cc-441b-a08c-c61b717d82c9\") " 
pod="openstack/glance-default-external-api-0" Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.668023 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7cdbc1c-79cc-441b-a08c-c61b717d82c9-scripts\") pod \"glance-default-external-api-0\" (UID: \"a7cdbc1c-79cc-441b-a08c-c61b717d82c9\") " pod="openstack/glance-default-external-api-0" Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.668089 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7cdbc1c-79cc-441b-a08c-c61b717d82c9-config-data\") pod \"glance-default-external-api-0\" (UID: \"a7cdbc1c-79cc-441b-a08c-c61b717d82c9\") " pod="openstack/glance-default-external-api-0" Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.668180 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-17b3b094-1a55-406a-a787-e0abb588e5b7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-17b3b094-1a55-406a-a787-e0abb588e5b7\") pod \"glance-default-external-api-0\" (UID: \"a7cdbc1c-79cc-441b-a08c-c61b717d82c9\") " pod="openstack/glance-default-external-api-0" Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.668254 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7cdbc1c-79cc-441b-a08c-c61b717d82c9-logs\") pod \"glance-default-external-api-0\" (UID: \"a7cdbc1c-79cc-441b-a08c-c61b717d82c9\") " pod="openstack/glance-default-external-api-0" Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.668282 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7cdbc1c-79cc-441b-a08c-c61b717d82c9-combined-ca-bundle\") pod 
\"glance-default-external-api-0\" (UID: \"a7cdbc1c-79cc-441b-a08c-c61b717d82c9\") " pod="openstack/glance-default-external-api-0" Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.770357 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-17b3b094-1a55-406a-a787-e0abb588e5b7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-17b3b094-1a55-406a-a787-e0abb588e5b7\") pod \"glance-default-external-api-0\" (UID: \"a7cdbc1c-79cc-441b-a08c-c61b717d82c9\") " pod="openstack/glance-default-external-api-0" Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.770448 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7cdbc1c-79cc-441b-a08c-c61b717d82c9-logs\") pod \"glance-default-external-api-0\" (UID: \"a7cdbc1c-79cc-441b-a08c-c61b717d82c9\") " pod="openstack/glance-default-external-api-0" Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.770473 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7cdbc1c-79cc-441b-a08c-c61b717d82c9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a7cdbc1c-79cc-441b-a08c-c61b717d82c9\") " pod="openstack/glance-default-external-api-0" Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.770515 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7l2c5\" (UniqueName: \"kubernetes.io/projected/a7cdbc1c-79cc-441b-a08c-c61b717d82c9-kube-api-access-7l2c5\") pod \"glance-default-external-api-0\" (UID: \"a7cdbc1c-79cc-441b-a08c-c61b717d82c9\") " pod="openstack/glance-default-external-api-0" Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.770548 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a7cdbc1c-79cc-441b-a08c-c61b717d82c9-httpd-run\") pod 
\"glance-default-external-api-0\" (UID: \"a7cdbc1c-79cc-441b-a08c-c61b717d82c9\") " pod="openstack/glance-default-external-api-0" Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.770580 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7cdbc1c-79cc-441b-a08c-c61b717d82c9-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a7cdbc1c-79cc-441b-a08c-c61b717d82c9\") " pod="openstack/glance-default-external-api-0" Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.770629 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7cdbc1c-79cc-441b-a08c-c61b717d82c9-scripts\") pod \"glance-default-external-api-0\" (UID: \"a7cdbc1c-79cc-441b-a08c-c61b717d82c9\") " pod="openstack/glance-default-external-api-0" Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.770674 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7cdbc1c-79cc-441b-a08c-c61b717d82c9-config-data\") pod \"glance-default-external-api-0\" (UID: \"a7cdbc1c-79cc-441b-a08c-c61b717d82c9\") " pod="openstack/glance-default-external-api-0" Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.771671 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7cdbc1c-79cc-441b-a08c-c61b717d82c9-logs\") pod \"glance-default-external-api-0\" (UID: \"a7cdbc1c-79cc-441b-a08c-c61b717d82c9\") " pod="openstack/glance-default-external-api-0" Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.772280 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a7cdbc1c-79cc-441b-a08c-c61b717d82c9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a7cdbc1c-79cc-441b-a08c-c61b717d82c9\") " 
pod="openstack/glance-default-external-api-0" Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.775435 4898 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.776156 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-17b3b094-1a55-406a-a787-e0abb588e5b7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-17b3b094-1a55-406a-a787-e0abb588e5b7\") pod \"glance-default-external-api-0\" (UID: \"a7cdbc1c-79cc-441b-a08c-c61b717d82c9\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1810628263decbcb8d9790a46f0a2a80fe37ecdd6e2a4c05137bd112c0de5f67/globalmount\"" pod="openstack/glance-default-external-api-0" Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.777262 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7cdbc1c-79cc-441b-a08c-c61b717d82c9-scripts\") pod \"glance-default-external-api-0\" (UID: \"a7cdbc1c-79cc-441b-a08c-c61b717d82c9\") " pod="openstack/glance-default-external-api-0" Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.782882 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7cdbc1c-79cc-441b-a08c-c61b717d82c9-config-data\") pod \"glance-default-external-api-0\" (UID: \"a7cdbc1c-79cc-441b-a08c-c61b717d82c9\") " pod="openstack/glance-default-external-api-0" Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.790658 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7l2c5\" (UniqueName: \"kubernetes.io/projected/a7cdbc1c-79cc-441b-a08c-c61b717d82c9-kube-api-access-7l2c5\") pod \"glance-default-external-api-0\" (UID: \"a7cdbc1c-79cc-441b-a08c-c61b717d82c9\") " pod="openstack/glance-default-external-api-0" Mar 13 14:22:44 
crc kubenswrapper[4898]: I0313 14:22:44.792847 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7cdbc1c-79cc-441b-a08c-c61b717d82c9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a7cdbc1c-79cc-441b-a08c-c61b717d82c9\") " pod="openstack/glance-default-external-api-0" Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.806675 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7cdbc1c-79cc-441b-a08c-c61b717d82c9-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a7cdbc1c-79cc-441b-a08c-c61b717d82c9\") " pod="openstack/glance-default-external-api-0" Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.856774 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-17b3b094-1a55-406a-a787-e0abb588e5b7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-17b3b094-1a55-406a-a787-e0abb588e5b7\") pod \"glance-default-external-api-0\" (UID: \"a7cdbc1c-79cc-441b-a08c-c61b717d82c9\") " pod="openstack/glance-default-external-api-0" Mar 13 14:22:45 crc kubenswrapper[4898]: I0313 14:22:45.137636 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 14:22:45 crc kubenswrapper[4898]: I0313 14:22:45.428192 4898 generic.go:334] "Generic (PLEG): container finished" podID="a8312dc9-a2b4-4ee6-b34f-cb984c14ad21" containerID="58217feb23785534a405127b1afdf5fb04709d2e86145617a9cb9b01bf24630f" exitCode=0 Mar 13 14:22:45 crc kubenswrapper[4898]: I0313 14:22:45.428542 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21","Type":"ContainerDied","Data":"58217feb23785534a405127b1afdf5fb04709d2e86145617a9cb9b01bf24630f"} Mar 13 14:22:45 crc kubenswrapper[4898]: I0313 14:22:45.444523 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"37ab1f60-9ee0-4d70-9730-f17c9feafaeb","Type":"ContainerStarted","Data":"fbf266f2bbef4b4dd4d9d82590e547f43077a49d8aea2c8e5166465ede160624"} Mar 13 14:22:45 crc kubenswrapper[4898]: I0313 14:22:45.447371 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sgxj9" event={"ID":"df5a6baa-ea65-4b79-b73b-2e1707695c41","Type":"ContainerStarted","Data":"dbe94dc617bb9f1ceb20f3a1398ada0e60107b89371dd59bf5d3b153a002dfad"} Mar 13 14:22:45 crc kubenswrapper[4898]: I0313 14:22:45.762462 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f772c247-f65b-4185-9c75-25d5894ada70" path="/var/lib/kubelet/pods/f772c247-f65b-4185-9c75-25d5894ada70/volumes" Mar 13 14:22:45 crc kubenswrapper[4898]: I0313 14:22:45.923662 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.013081 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-internal-tls-certs\") pod \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\" (UID: \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\") " Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.013417 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-httpd-run\") pod \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\" (UID: \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\") " Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.013449 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9fs8\" (UniqueName: \"kubernetes.io/projected/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-kube-api-access-n9fs8\") pod \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\" (UID: \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\") " Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.014623 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a8312dc9-a2b4-4ee6-b34f-cb984c14ad21" (UID: "a8312dc9-a2b4-4ee6-b34f-cb984c14ad21"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.024256 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-kube-api-access-n9fs8" (OuterVolumeSpecName: "kube-api-access-n9fs8") pod "a8312dc9-a2b4-4ee6-b34f-cb984c14ad21" (UID: "a8312dc9-a2b4-4ee6-b34f-cb984c14ad21"). InnerVolumeSpecName "kube-api-access-n9fs8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.042527 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3\") pod \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\" (UID: \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\") " Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.042594 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-config-data\") pod \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\" (UID: \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\") " Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.042648 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-scripts\") pod \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\" (UID: \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\") " Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.042697 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-logs\") pod \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\" (UID: \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\") " Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.042834 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-combined-ca-bundle\") pod \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\" (UID: \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\") " Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.044031 4898 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.044049 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n9fs8\" (UniqueName: \"kubernetes.io/projected/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-kube-api-access-n9fs8\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.044327 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-logs" (OuterVolumeSpecName: "logs") pod "a8312dc9-a2b4-4ee6-b34f-cb984c14ad21" (UID: "a8312dc9-a2b4-4ee6-b34f-cb984c14ad21"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.063217 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.082096 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-scripts" (OuterVolumeSpecName: "scripts") pod "a8312dc9-a2b4-4ee6-b34f-cb984c14ad21" (UID: "a8312dc9-a2b4-4ee6-b34f-cb984c14ad21"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.146263 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.146291 4898 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-logs\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.177943 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3" (OuterVolumeSpecName: "glance") pod "a8312dc9-a2b4-4ee6-b34f-cb984c14ad21" (UID: "a8312dc9-a2b4-4ee6-b34f-cb984c14ad21"). InnerVolumeSpecName "pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.180943 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a8312dc9-a2b4-4ee6-b34f-cb984c14ad21" (UID: "a8312dc9-a2b4-4ee6-b34f-cb984c14ad21"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.193009 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-config-data" (OuterVolumeSpecName: "config-data") pod "a8312dc9-a2b4-4ee6-b34f-cb984c14ad21" (UID: "a8312dc9-a2b4-4ee6-b34f-cb984c14ad21"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.195286 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a8312dc9-a2b4-4ee6-b34f-cb984c14ad21" (UID: "a8312dc9-a2b4-4ee6-b34f-cb984c14ad21"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.248497 4898 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3\") on node \"crc\" " Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.248536 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.248547 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.248558 4898 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.291920 4898 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.292069 4898 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3") on node "crc" Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.351156 4898 reconciler_common.go:293] "Volume detached for volume \"pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.461678 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21","Type":"ContainerDied","Data":"8cae8ab663d1286eb25519e52a961629328c7b2280f2d0209ef1752767b57b37"} Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.461718 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.461762 4898 scope.go:117] "RemoveContainer" containerID="58217feb23785534a405127b1afdf5fb04709d2e86145617a9cb9b01bf24630f" Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.464249 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a7cdbc1c-79cc-441b-a08c-c61b717d82c9","Type":"ContainerStarted","Data":"a3bce51beac47756082a84fa10e108e6da1b8f25f5395f721d74dd749bc49b65"} Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.468170 4898 generic.go:334] "Generic (PLEG): container finished" podID="df5a6baa-ea65-4b79-b73b-2e1707695c41" containerID="dbe94dc617bb9f1ceb20f3a1398ada0e60107b89371dd59bf5d3b153a002dfad" exitCode=0 Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.468221 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sgxj9" event={"ID":"df5a6baa-ea65-4b79-b73b-2e1707695c41","Type":"ContainerDied","Data":"dbe94dc617bb9f1ceb20f3a1398ada0e60107b89371dd59bf5d3b153a002dfad"} Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.520433 4898 scope.go:117] "RemoveContainer" containerID="e53933d3c42586f8c8f9ea54060e1af27f64d65b3366e751904358fe342bcf4d" Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.576474 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.576527 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.576542 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 14:22:46 crc kubenswrapper[4898]: E0313 14:22:46.577066 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8312dc9-a2b4-4ee6-b34f-cb984c14ad21" 
containerName="glance-httpd"
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.577080 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8312dc9-a2b4-4ee6-b34f-cb984c14ad21" containerName="glance-httpd"
Mar 13 14:22:46 crc kubenswrapper[4898]: E0313 14:22:46.577096 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8312dc9-a2b4-4ee6-b34f-cb984c14ad21" containerName="glance-log"
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.577104 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8312dc9-a2b4-4ee6-b34f-cb984c14ad21" containerName="glance-log"
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.577381 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8312dc9-a2b4-4ee6-b34f-cb984c14ad21" containerName="glance-log"
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.577412 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8312dc9-a2b4-4ee6-b34f-cb984c14ad21" containerName="glance-httpd"
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.578683 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.590201 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.599347 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.599539 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.701193 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f666d519-2c39-4e93-823d-e5a3fcfd0d5a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"f666d519-2c39-4e93-823d-e5a3fcfd0d5a\") " pod="openstack/glance-default-internal-api-0"
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.701270 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3\") pod \"glance-default-internal-api-0\" (UID: \"f666d519-2c39-4e93-823d-e5a3fcfd0d5a\") " pod="openstack/glance-default-internal-api-0"
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.701388 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f666d519-2c39-4e93-823d-e5a3fcfd0d5a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f666d519-2c39-4e93-823d-e5a3fcfd0d5a\") " pod="openstack/glance-default-internal-api-0"
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.701436 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f666d519-2c39-4e93-823d-e5a3fcfd0d5a-logs\") pod \"glance-default-internal-api-0\" (UID: \"f666d519-2c39-4e93-823d-e5a3fcfd0d5a\") " pod="openstack/glance-default-internal-api-0"
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.701525 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f666d519-2c39-4e93-823d-e5a3fcfd0d5a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f666d519-2c39-4e93-823d-e5a3fcfd0d5a\") " pod="openstack/glance-default-internal-api-0"
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.701620 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f666d519-2c39-4e93-823d-e5a3fcfd0d5a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f666d519-2c39-4e93-823d-e5a3fcfd0d5a\") " pod="openstack/glance-default-internal-api-0"
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.701683 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f666d519-2c39-4e93-823d-e5a3fcfd0d5a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f666d519-2c39-4e93-823d-e5a3fcfd0d5a\") " pod="openstack/glance-default-internal-api-0"
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.701730 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qz6d\" (UniqueName: \"kubernetes.io/projected/f666d519-2c39-4e93-823d-e5a3fcfd0d5a-kube-api-access-8qz6d\") pod \"glance-default-internal-api-0\" (UID: \"f666d519-2c39-4e93-823d-e5a3fcfd0d5a\") " pod="openstack/glance-default-internal-api-0"
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.803993 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f666d519-2c39-4e93-823d-e5a3fcfd0d5a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f666d519-2c39-4e93-823d-e5a3fcfd0d5a\") " pod="openstack/glance-default-internal-api-0"
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.804088 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f666d519-2c39-4e93-823d-e5a3fcfd0d5a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f666d519-2c39-4e93-823d-e5a3fcfd0d5a\") " pod="openstack/glance-default-internal-api-0"
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.804124 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qz6d\" (UniqueName: \"kubernetes.io/projected/f666d519-2c39-4e93-823d-e5a3fcfd0d5a-kube-api-access-8qz6d\") pod \"glance-default-internal-api-0\" (UID: \"f666d519-2c39-4e93-823d-e5a3fcfd0d5a\") " pod="openstack/glance-default-internal-api-0"
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.804213 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f666d519-2c39-4e93-823d-e5a3fcfd0d5a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"f666d519-2c39-4e93-823d-e5a3fcfd0d5a\") " pod="openstack/glance-default-internal-api-0"
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.804233 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3\") pod \"glance-default-internal-api-0\" (UID: \"f666d519-2c39-4e93-823d-e5a3fcfd0d5a\") " pod="openstack/glance-default-internal-api-0"
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.804315 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f666d519-2c39-4e93-823d-e5a3fcfd0d5a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f666d519-2c39-4e93-823d-e5a3fcfd0d5a\") " pod="openstack/glance-default-internal-api-0"
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.804371 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f666d519-2c39-4e93-823d-e5a3fcfd0d5a-logs\") pod \"glance-default-internal-api-0\" (UID: \"f666d519-2c39-4e93-823d-e5a3fcfd0d5a\") " pod="openstack/glance-default-internal-api-0"
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.804427 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f666d519-2c39-4e93-823d-e5a3fcfd0d5a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f666d519-2c39-4e93-823d-e5a3fcfd0d5a\") " pod="openstack/glance-default-internal-api-0"
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.806078 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f666d519-2c39-4e93-823d-e5a3fcfd0d5a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f666d519-2c39-4e93-823d-e5a3fcfd0d5a\") " pod="openstack/glance-default-internal-api-0"
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.811212 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f666d519-2c39-4e93-823d-e5a3fcfd0d5a-logs\") pod \"glance-default-internal-api-0\" (UID: \"f666d519-2c39-4e93-823d-e5a3fcfd0d5a\") " pod="openstack/glance-default-internal-api-0"
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.813773 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f666d519-2c39-4e93-823d-e5a3fcfd0d5a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f666d519-2c39-4e93-823d-e5a3fcfd0d5a\") " pod="openstack/glance-default-internal-api-0"
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.814983 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f666d519-2c39-4e93-823d-e5a3fcfd0d5a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f666d519-2c39-4e93-823d-e5a3fcfd0d5a\") " pod="openstack/glance-default-internal-api-0"
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.815567 4898 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.815590 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3\") pod \"glance-default-internal-api-0\" (UID: \"f666d519-2c39-4e93-823d-e5a3fcfd0d5a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/e32bfaa798492a506ddfd6dd81603c6b252f4a286c98ba8256226389647f45c3/globalmount\"" pod="openstack/glance-default-internal-api-0"
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.831642 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f666d519-2c39-4e93-823d-e5a3fcfd0d5a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f666d519-2c39-4e93-823d-e5a3fcfd0d5a\") " pod="openstack/glance-default-internal-api-0"
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.832186 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qz6d\" (UniqueName: \"kubernetes.io/projected/f666d519-2c39-4e93-823d-e5a3fcfd0d5a-kube-api-access-8qz6d\") pod \"glance-default-internal-api-0\" (UID: \"f666d519-2c39-4e93-823d-e5a3fcfd0d5a\") " pod="openstack/glance-default-internal-api-0"
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.833036 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f666d519-2c39-4e93-823d-e5a3fcfd0d5a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"f666d519-2c39-4e93-823d-e5a3fcfd0d5a\") " pod="openstack/glance-default-internal-api-0"
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.901450 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3\") pod \"glance-default-internal-api-0\" (UID: \"f666d519-2c39-4e93-823d-e5a3fcfd0d5a\") " pod="openstack/glance-default-internal-api-0"
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.923441 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 13 14:22:47 crc kubenswrapper[4898]: I0313 14:22:47.502862 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sgxj9" event={"ID":"df5a6baa-ea65-4b79-b73b-2e1707695c41","Type":"ContainerStarted","Data":"6ec8603c187d66f48cc10e4727d771d9b7fbf021f18024c4db05bf6b399eed38"}
Mar 13 14:22:47 crc kubenswrapper[4898]: I0313 14:22:47.515480 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"37ab1f60-9ee0-4d70-9730-f17c9feafaeb","Type":"ContainerStarted","Data":"3952d121d2f30ecb7494684374560553b6245f9837c68c02b65c84ff2b6971de"}
Mar 13 14:22:47 crc kubenswrapper[4898]: I0313 14:22:47.515825 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"37ab1f60-9ee0-4d70-9730-f17c9feafaeb","Type":"ContainerStarted","Data":"b8d3215be81e811287d32ec5a152d8a9105ea0bd86eb83c702bc5a2d79713e00"}
Mar 13 14:22:47 crc kubenswrapper[4898]: I0313 14:22:47.529113 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a7cdbc1c-79cc-441b-a08c-c61b717d82c9","Type":"ContainerStarted","Data":"4265dd0507c2704781116d72355561e591bf0123855e74ed893f542adfdb719b"}
Mar 13 14:22:47 crc kubenswrapper[4898]: I0313 14:22:47.535659 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sgxj9" podStartSLOduration=4.85974153 podStartE2EDuration="8.535641134s" podCreationTimestamp="2026-03-13 14:22:39 +0000 UTC" firstStartedPulling="2026-03-13 14:22:43.336176708 +0000 UTC m=+1598.337764947" lastFinishedPulling="2026-03-13 14:22:47.012076322 +0000 UTC m=+1602.013664551" observedRunningTime="2026-03-13 14:22:47.525132131 +0000 UTC m=+1602.526720370" watchObservedRunningTime="2026-03-13 14:22:47.535641134 +0000 UTC m=+1602.537229373"
Mar 13 14:22:47 crc kubenswrapper[4898]: I0313 14:22:47.607590 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 13 14:22:47 crc kubenswrapper[4898]: I0313 14:22:47.752546 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8312dc9-a2b4-4ee6-b34f-cb984c14ad21" path="/var/lib/kubelet/pods/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21/volumes"
Mar 13 14:22:48 crc kubenswrapper[4898]: I0313 14:22:48.548854 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a7cdbc1c-79cc-441b-a08c-c61b717d82c9","Type":"ContainerStarted","Data":"66c8a09078700facbed2482f14fad615c3281038c45bea626f2eb1f6fe913934"}
Mar 13 14:22:48 crc kubenswrapper[4898]: I0313 14:22:48.561599 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f666d519-2c39-4e93-823d-e5a3fcfd0d5a","Type":"ContainerStarted","Data":"c71998f7e85a12ae7fd0bfb2e9d6340902dbc3212be784e3b0a7b3bb1eb85daa"}
Mar 13 14:22:48 crc kubenswrapper[4898]: I0313 14:22:48.561650 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f666d519-2c39-4e93-823d-e5a3fcfd0d5a","Type":"ContainerStarted","Data":"d23acfbc3ba71b9594b68688692fbc4ddeaf3360f98c7f712571a80019e0d00c"}
Mar 13 14:22:49 crc kubenswrapper[4898]: I0313 14:22:49.572660 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-z7ldc"
Mar 13 14:22:49 crc kubenswrapper[4898]: I0313 14:22:49.573195 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"37ab1f60-9ee0-4d70-9730-f17c9feafaeb","Type":"ContainerStarted","Data":"322d94f85c75d789289c53ccffadace35b38b23785b40dc3303f2046c979a20d"}
Mar 13 14:22:49 crc kubenswrapper[4898]: I0313 14:22:49.573255 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 13 14:22:49 crc kubenswrapper[4898]: I0313 14:22:49.573255 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="37ab1f60-9ee0-4d70-9730-f17c9feafaeb" containerName="ceilometer-central-agent" containerID="cri-o://fbf266f2bbef4b4dd4d9d82590e547f43077a49d8aea2c8e5166465ede160624" gracePeriod=30
Mar 13 14:22:49 crc kubenswrapper[4898]: I0313 14:22:49.573294 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="37ab1f60-9ee0-4d70-9730-f17c9feafaeb" containerName="proxy-httpd" containerID="cri-o://322d94f85c75d789289c53ccffadace35b38b23785b40dc3303f2046c979a20d" gracePeriod=30
Mar 13 14:22:49 crc kubenswrapper[4898]: I0313 14:22:49.573270 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="37ab1f60-9ee0-4d70-9730-f17c9feafaeb" containerName="ceilometer-notification-agent" containerID="cri-o://b8d3215be81e811287d32ec5a152d8a9105ea0bd86eb83c702bc5a2d79713e00" gracePeriod=30
Mar 13 14:22:49 crc kubenswrapper[4898]: I0313 14:22:49.573415 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="37ab1f60-9ee0-4d70-9730-f17c9feafaeb" containerName="sg-core" containerID="cri-o://3952d121d2f30ecb7494684374560553b6245f9837c68c02b65c84ff2b6971de" gracePeriod=30
Mar 13 14:22:49 crc kubenswrapper[4898]: I0313 14:22:49.576039 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f666d519-2c39-4e93-823d-e5a3fcfd0d5a","Type":"ContainerStarted","Data":"edd7f8a46e26c03cdc0e5e2c01acb1a698e0cea2d01ef6aedd55274949e181d8"}
Mar 13 14:22:49 crc kubenswrapper[4898]: I0313 14:22:49.593332 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.593310449 podStartE2EDuration="5.593310449s" podCreationTimestamp="2026-03-13 14:22:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:22:48.574872012 +0000 UTC m=+1603.576460261" watchObservedRunningTime="2026-03-13 14:22:49.593310449 +0000 UTC m=+1604.594898688"
Mar 13 14:22:49 crc kubenswrapper[4898]: I0313 14:22:49.654390 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.654371325 podStartE2EDuration="3.654371325s" podCreationTimestamp="2026-03-13 14:22:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:22:49.625470734 +0000 UTC m=+1604.627058993" watchObservedRunningTime="2026-03-13 14:22:49.654371325 +0000 UTC m=+1604.655959564"
Mar 13 14:22:49 crc kubenswrapper[4898]: I0313 14:22:49.671442 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-z7ldc"
Mar 13 14:22:49 crc kubenswrapper[4898]: I0313 14:22:49.714264 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.042093541 podStartE2EDuration="6.714246509s" podCreationTimestamp="2026-03-13 14:22:43 +0000 UTC" firstStartedPulling="2026-03-13 14:22:44.335485089 +0000 UTC m=+1599.337073328" lastFinishedPulling="2026-03-13 14:22:49.007638057 +0000 UTC m=+1604.009226296" observedRunningTime="2026-03-13 14:22:49.64919448 +0000 UTC m=+1604.650782739" watchObservedRunningTime="2026-03-13 14:22:49.714246509 +0000 UTC m=+1604.715834748"
Mar 13 14:22:49 crc kubenswrapper[4898]: I0313 14:22:49.817017 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-z7ldc"]
Mar 13 14:22:49 crc kubenswrapper[4898]: I0313 14:22:49.860990 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sgxj9"
Mar 13 14:22:49 crc kubenswrapper[4898]: I0313 14:22:49.861240 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sgxj9"
Mar 13 14:22:49 crc kubenswrapper[4898]: I0313 14:22:49.924432 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sgxj9"
Mar 13 14:22:50 crc kubenswrapper[4898]: I0313 14:22:50.589922 4898 generic.go:334] "Generic (PLEG): container finished" podID="37ab1f60-9ee0-4d70-9730-f17c9feafaeb" containerID="322d94f85c75d789289c53ccffadace35b38b23785b40dc3303f2046c979a20d" exitCode=0
Mar 13 14:22:50 crc kubenswrapper[4898]: I0313 14:22:50.590223 4898 generic.go:334] "Generic (PLEG): container finished" podID="37ab1f60-9ee0-4d70-9730-f17c9feafaeb" containerID="3952d121d2f30ecb7494684374560553b6245f9837c68c02b65c84ff2b6971de" exitCode=2
Mar 13 14:22:50 crc kubenswrapper[4898]: I0313 14:22:50.590236 4898 generic.go:334] "Generic (PLEG): container finished" podID="37ab1f60-9ee0-4d70-9730-f17c9feafaeb" containerID="b8d3215be81e811287d32ec5a152d8a9105ea0bd86eb83c702bc5a2d79713e00" exitCode=0
Mar 13 14:22:50 crc kubenswrapper[4898]: I0313 14:22:50.591365 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"37ab1f60-9ee0-4d70-9730-f17c9feafaeb","Type":"ContainerDied","Data":"322d94f85c75d789289c53ccffadace35b38b23785b40dc3303f2046c979a20d"}
Mar 13 14:22:50 crc kubenswrapper[4898]: I0313 14:22:50.591410 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"37ab1f60-9ee0-4d70-9730-f17c9feafaeb","Type":"ContainerDied","Data":"3952d121d2f30ecb7494684374560553b6245f9837c68c02b65c84ff2b6971de"}
Mar 13 14:22:50 crc kubenswrapper[4898]: I0313 14:22:50.591427 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"37ab1f60-9ee0-4d70-9730-f17c9feafaeb","Type":"ContainerDied","Data":"b8d3215be81e811287d32ec5a152d8a9105ea0bd86eb83c702bc5a2d79713e00"}
Mar 13 14:22:51 crc kubenswrapper[4898]: I0313 14:22:51.603075 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-z7ldc" podUID="b38f3681-6f2f-437f-9694-810d43921aa2" containerName="registry-server" containerID="cri-o://3cb8ede7b2e1e9a6a6b4976b023c84903fe921e5d4e530d62aef69fe59b03a0a" gracePeriod=2
Mar 13 14:22:52 crc kubenswrapper[4898]: I0313 14:22:52.614668 4898 generic.go:334] "Generic (PLEG): container finished" podID="b38f3681-6f2f-437f-9694-810d43921aa2" containerID="3cb8ede7b2e1e9a6a6b4976b023c84903fe921e5d4e530d62aef69fe59b03a0a" exitCode=0
Mar 13 14:22:52 crc kubenswrapper[4898]: I0313 14:22:52.616029 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z7ldc" event={"ID":"b38f3681-6f2f-437f-9694-810d43921aa2","Type":"ContainerDied","Data":"3cb8ede7b2e1e9a6a6b4976b023c84903fe921e5d4e530d62aef69fe59b03a0a"}
Mar 13 14:22:52 crc kubenswrapper[4898]: I0313 14:22:52.782760 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z7ldc"
Mar 13 14:22:52 crc kubenswrapper[4898]: I0313 14:22:52.863734 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b38f3681-6f2f-437f-9694-810d43921aa2-catalog-content\") pod \"b38f3681-6f2f-437f-9694-810d43921aa2\" (UID: \"b38f3681-6f2f-437f-9694-810d43921aa2\") "
Mar 13 14:22:52 crc kubenswrapper[4898]: I0313 14:22:52.863812 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2469\" (UniqueName: \"kubernetes.io/projected/b38f3681-6f2f-437f-9694-810d43921aa2-kube-api-access-v2469\") pod \"b38f3681-6f2f-437f-9694-810d43921aa2\" (UID: \"b38f3681-6f2f-437f-9694-810d43921aa2\") "
Mar 13 14:22:52 crc kubenswrapper[4898]: I0313 14:22:52.864180 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b38f3681-6f2f-437f-9694-810d43921aa2-utilities\") pod \"b38f3681-6f2f-437f-9694-810d43921aa2\" (UID: \"b38f3681-6f2f-437f-9694-810d43921aa2\") "
Mar 13 14:22:52 crc kubenswrapper[4898]: I0313 14:22:52.864922 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b38f3681-6f2f-437f-9694-810d43921aa2-utilities" (OuterVolumeSpecName: "utilities") pod "b38f3681-6f2f-437f-9694-810d43921aa2" (UID: "b38f3681-6f2f-437f-9694-810d43921aa2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 14:22:52 crc kubenswrapper[4898]: I0313 14:22:52.881601 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b38f3681-6f2f-437f-9694-810d43921aa2-kube-api-access-v2469" (OuterVolumeSpecName: "kube-api-access-v2469") pod "b38f3681-6f2f-437f-9694-810d43921aa2" (UID: "b38f3681-6f2f-437f-9694-810d43921aa2"). InnerVolumeSpecName "kube-api-access-v2469". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:22:52 crc kubenswrapper[4898]: I0313 14:22:52.967367 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2469\" (UniqueName: \"kubernetes.io/projected/b38f3681-6f2f-437f-9694-810d43921aa2-kube-api-access-v2469\") on node \"crc\" DevicePath \"\""
Mar 13 14:22:52 crc kubenswrapper[4898]: I0313 14:22:52.967401 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b38f3681-6f2f-437f-9694-810d43921aa2-utilities\") on node \"crc\" DevicePath \"\""
Mar 13 14:22:52 crc kubenswrapper[4898]: I0313 14:22:52.989019 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b38f3681-6f2f-437f-9694-810d43921aa2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b38f3681-6f2f-437f-9694-810d43921aa2" (UID: "b38f3681-6f2f-437f-9694-810d43921aa2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 14:22:53 crc kubenswrapper[4898]: I0313 14:22:53.071756 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b38f3681-6f2f-437f-9694-810d43921aa2-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 13 14:22:53 crc kubenswrapper[4898]: I0313 14:22:53.628948 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z7ldc" event={"ID":"b38f3681-6f2f-437f-9694-810d43921aa2","Type":"ContainerDied","Data":"14f6fca3b948b0af4024656eeff03f1e8c8fe427562c97293c95f1bdd3284d24"}
Mar 13 14:22:53 crc kubenswrapper[4898]: I0313 14:22:53.629056 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z7ldc"
Mar 13 14:22:53 crc kubenswrapper[4898]: I0313 14:22:53.630417 4898 scope.go:117] "RemoveContainer" containerID="3cb8ede7b2e1e9a6a6b4976b023c84903fe921e5d4e530d62aef69fe59b03a0a"
Mar 13 14:22:53 crc kubenswrapper[4898]: I0313 14:22:53.663140 4898 scope.go:117] "RemoveContainer" containerID="c5557297cdf8c10622794d499e8dd04fb952d7400a53b9ebeaf103b83d901e50"
Mar 13 14:22:53 crc kubenswrapper[4898]: I0313 14:22:53.675614 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-z7ldc"]
Mar 13 14:22:53 crc kubenswrapper[4898]: I0313 14:22:53.686435 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-z7ldc"]
Mar 13 14:22:53 crc kubenswrapper[4898]: I0313 14:22:53.708824 4898 scope.go:117] "RemoveContainer" containerID="d6ed263f1fe660123646c8c6128f780dbe747c9b3a543fa08475d3acfc1517d5"
Mar 13 14:22:53 crc kubenswrapper[4898]: I0313 14:22:53.769102 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b38f3681-6f2f-437f-9694-810d43921aa2" path="/var/lib/kubelet/pods/b38f3681-6f2f-437f-9694-810d43921aa2/volumes"
Mar 13 14:22:55 crc kubenswrapper[4898]: I0313 14:22:55.137867 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Mar 13 14:22:55 crc kubenswrapper[4898]: I0313 14:22:55.138197 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Mar 13 14:22:55 crc kubenswrapper[4898]: I0313 14:22:55.185628 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Mar 13 14:22:55 crc kubenswrapper[4898]: I0313 14:22:55.224175 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Mar 13 14:22:55 crc kubenswrapper[4898]: I0313 14:22:55.652142 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Mar 13 14:22:55 crc kubenswrapper[4898]: I0313 14:22:55.652187 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Mar 13 14:22:55 crc kubenswrapper[4898]: I0313 14:22:55.752549 4898 scope.go:117] "RemoveContainer" containerID="31ae8593afc62c01205d22940ebe7136407da3a6c051010917ba949bb52866cc"
Mar 13 14:22:55 crc kubenswrapper[4898]: E0313 14:22:55.752836 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d"
Mar 13 14:22:56 crc kubenswrapper[4898]: I0313 14:22:56.668871 4898 generic.go:334] "Generic (PLEG): container finished" podID="7eba407c-68a5-45e9-ab51-e8cba05d8559" containerID="d1ff8d0ca102a074d68ee12cd37ffd04a070a172037a36f9afafa4bd84128371" exitCode=0
Mar 13 14:22:56 crc kubenswrapper[4898]: I0313 14:22:56.668935 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-qps2v" event={"ID":"7eba407c-68a5-45e9-ab51-e8cba05d8559","Type":"ContainerDied","Data":"d1ff8d0ca102a074d68ee12cd37ffd04a070a172037a36f9afafa4bd84128371"}
Mar 13 14:22:56 crc kubenswrapper[4898]: I0313 14:22:56.924530 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Mar 13 14:22:56 crc kubenswrapper[4898]: I0313 14:22:56.924598 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Mar 13 14:22:56 crc kubenswrapper[4898]: I0313 14:22:56.973396 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Mar 13 14:22:56 crc kubenswrapper[4898]: I0313 14:22:56.976562 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Mar 13 14:22:57 crc kubenswrapper[4898]: I0313 14:22:57.681719 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Mar 13 14:22:57 crc kubenswrapper[4898]: I0313 14:22:57.682046 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.214937 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-qps2v"
Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.311593 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7eba407c-68a5-45e9-ab51-e8cba05d8559-scripts\") pod \"7eba407c-68a5-45e9-ab51-e8cba05d8559\" (UID: \"7eba407c-68a5-45e9-ab51-e8cba05d8559\") "
Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.311912 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7eba407c-68a5-45e9-ab51-e8cba05d8559-combined-ca-bundle\") pod \"7eba407c-68a5-45e9-ab51-e8cba05d8559\" (UID: \"7eba407c-68a5-45e9-ab51-e8cba05d8559\") "
Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.311985 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8c7h\" (UniqueName: \"kubernetes.io/projected/7eba407c-68a5-45e9-ab51-e8cba05d8559-kube-api-access-x8c7h\") pod \"7eba407c-68a5-45e9-ab51-e8cba05d8559\" (UID: \"7eba407c-68a5-45e9-ab51-e8cba05d8559\") "
Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.312136 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7eba407c-68a5-45e9-ab51-e8cba05d8559-config-data\") pod \"7eba407c-68a5-45e9-ab51-e8cba05d8559\" (UID: \"7eba407c-68a5-45e9-ab51-e8cba05d8559\") "
Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.318078 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7eba407c-68a5-45e9-ab51-e8cba05d8559-scripts" (OuterVolumeSpecName: "scripts") pod "7eba407c-68a5-45e9-ab51-e8cba05d8559" (UID: "7eba407c-68a5-45e9-ab51-e8cba05d8559"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.381084 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7eba407c-68a5-45e9-ab51-e8cba05d8559-kube-api-access-x8c7h" (OuterVolumeSpecName: "kube-api-access-x8c7h") pod "7eba407c-68a5-45e9-ab51-e8cba05d8559" (UID: "7eba407c-68a5-45e9-ab51-e8cba05d8559"). InnerVolumeSpecName "kube-api-access-x8c7h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.402976 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7eba407c-68a5-45e9-ab51-e8cba05d8559-config-data" (OuterVolumeSpecName: "config-data") pod "7eba407c-68a5-45e9-ab51-e8cba05d8559" (UID: "7eba407c-68a5-45e9-ab51-e8cba05d8559"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.418280 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x8c7h\" (UniqueName: \"kubernetes.io/projected/7eba407c-68a5-45e9-ab51-e8cba05d8559-kube-api-access-x8c7h\") on node \"crc\" DevicePath \"\""
Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.418319 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7eba407c-68a5-45e9-ab51-e8cba05d8559-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.418332 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7eba407c-68a5-45e9-ab51-e8cba05d8559-scripts\") on node \"crc\" DevicePath \"\""
Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.439933 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7eba407c-68a5-45e9-ab51-e8cba05d8559-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7eba407c-68a5-45e9-ab51-e8cba05d8559" (UID: "7eba407c-68a5-45e9-ab51-e8cba05d8559"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.520756 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7eba407c-68a5-45e9-ab51-e8cba05d8559-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.579141 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.694576 4898 generic.go:334] "Generic (PLEG): container finished" podID="37ab1f60-9ee0-4d70-9730-f17c9feafaeb" containerID="fbf266f2bbef4b4dd4d9d82590e547f43077a49d8aea2c8e5166465ede160624" exitCode=0
Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.694638 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"37ab1f60-9ee0-4d70-9730-f17c9feafaeb","Type":"ContainerDied","Data":"fbf266f2bbef4b4dd4d9d82590e547f43077a49d8aea2c8e5166465ede160624"}
Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.694665 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"37ab1f60-9ee0-4d70-9730-f17c9feafaeb","Type":"ContainerDied","Data":"bdcbd848029858f4c387dbe27ce9e5d245b65833296202afd31da1583743aa4c"}
Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.694684 4898 scope.go:117] "RemoveContainer" containerID="322d94f85c75d789289c53ccffadace35b38b23785b40dc3303f2046c979a20d"
Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.694860 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.696582 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-qps2v" event={"ID":"7eba407c-68a5-45e9-ab51-e8cba05d8559","Type":"ContainerDied","Data":"ef0f6ea172f6bf26e4477b27da206d5a7254b564e5190d1ac27a9d0f8d70d00d"}
Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.696610 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef0f6ea172f6bf26e4477b27da206d5a7254b564e5190d1ac27a9d0f8d70d00d"
Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.696644 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-qps2v"
Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.722192 4898 scope.go:117] "RemoveContainer" containerID="3952d121d2f30ecb7494684374560553b6245f9837c68c02b65c84ff2b6971de"
Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.726110 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-sg-core-conf-yaml\") pod \"37ab1f60-9ee0-4d70-9730-f17c9feafaeb\" (UID: \"37ab1f60-9ee0-4d70-9730-f17c9feafaeb\") "
Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.726406 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-config-data\") pod \"37ab1f60-9ee0-4d70-9730-f17c9feafaeb\" (UID: \"37ab1f60-9ee0-4d70-9730-f17c9feafaeb\") "
Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.726529 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-run-httpd\") pod \"37ab1f60-9ee0-4d70-9730-f17c9feafaeb\" (UID: \"37ab1f60-9ee0-4d70-9730-f17c9feafaeb\") "
Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.726599 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-combined-ca-bundle\") pod \"37ab1f60-9ee0-4d70-9730-f17c9feafaeb\" (UID: \"37ab1f60-9ee0-4d70-9730-f17c9feafaeb\") "
Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.726715 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-674qw\" (UniqueName: \"kubernetes.io/projected/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-kube-api-access-674qw\") pod \"37ab1f60-9ee0-4d70-9730-f17c9feafaeb\" (UID: 
\"37ab1f60-9ee0-4d70-9730-f17c9feafaeb\") " Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.727696 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-scripts\") pod \"37ab1f60-9ee0-4d70-9730-f17c9feafaeb\" (UID: \"37ab1f60-9ee0-4d70-9730-f17c9feafaeb\") " Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.727798 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-log-httpd\") pod \"37ab1f60-9ee0-4d70-9730-f17c9feafaeb\" (UID: \"37ab1f60-9ee0-4d70-9730-f17c9feafaeb\") " Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.730021 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "37ab1f60-9ee0-4d70-9730-f17c9feafaeb" (UID: "37ab1f60-9ee0-4d70-9730-f17c9feafaeb"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.735753 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "37ab1f60-9ee0-4d70-9730-f17c9feafaeb" (UID: "37ab1f60-9ee0-4d70-9730-f17c9feafaeb"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.746156 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-scripts" (OuterVolumeSpecName: "scripts") pod "37ab1f60-9ee0-4d70-9730-f17c9feafaeb" (UID: "37ab1f60-9ee0-4d70-9730-f17c9feafaeb"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.752467 4898 scope.go:117] "RemoveContainer" containerID="b8d3215be81e811287d32ec5a152d8a9105ea0bd86eb83c702bc5a2d79713e00" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.759952 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-kube-api-access-674qw" (OuterVolumeSpecName: "kube-api-access-674qw") pod "37ab1f60-9ee0-4d70-9730-f17c9feafaeb" (UID: "37ab1f60-9ee0-4d70-9730-f17c9feafaeb"). InnerVolumeSpecName "kube-api-access-674qw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.782965 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "37ab1f60-9ee0-4d70-9730-f17c9feafaeb" (UID: "37ab1f60-9ee0-4d70-9730-f17c9feafaeb"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.795923 4898 scope.go:117] "RemoveContainer" containerID="fbf266f2bbef4b4dd4d9d82590e547f43077a49d8aea2c8e5166465ede160624" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.810012 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 13 14:22:58 crc kubenswrapper[4898]: E0313 14:22:58.810553 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b38f3681-6f2f-437f-9694-810d43921aa2" containerName="registry-server" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.810565 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="b38f3681-6f2f-437f-9694-810d43921aa2" containerName="registry-server" Mar 13 14:22:58 crc kubenswrapper[4898]: E0313 14:22:58.810587 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7eba407c-68a5-45e9-ab51-e8cba05d8559" containerName="nova-cell0-conductor-db-sync" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.810596 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="7eba407c-68a5-45e9-ab51-e8cba05d8559" containerName="nova-cell0-conductor-db-sync" Mar 13 14:22:58 crc kubenswrapper[4898]: E0313 14:22:58.810612 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b38f3681-6f2f-437f-9694-810d43921aa2" containerName="extract-utilities" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.810619 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="b38f3681-6f2f-437f-9694-810d43921aa2" containerName="extract-utilities" Mar 13 14:22:58 crc kubenswrapper[4898]: E0313 14:22:58.810635 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37ab1f60-9ee0-4d70-9730-f17c9feafaeb" containerName="ceilometer-central-agent" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.810641 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="37ab1f60-9ee0-4d70-9730-f17c9feafaeb" 
containerName="ceilometer-central-agent" Mar 13 14:22:58 crc kubenswrapper[4898]: E0313 14:22:58.810657 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37ab1f60-9ee0-4d70-9730-f17c9feafaeb" containerName="sg-core" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.810664 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="37ab1f60-9ee0-4d70-9730-f17c9feafaeb" containerName="sg-core" Mar 13 14:22:58 crc kubenswrapper[4898]: E0313 14:22:58.810677 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37ab1f60-9ee0-4d70-9730-f17c9feafaeb" containerName="ceilometer-notification-agent" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.810682 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="37ab1f60-9ee0-4d70-9730-f17c9feafaeb" containerName="ceilometer-notification-agent" Mar 13 14:22:58 crc kubenswrapper[4898]: E0313 14:22:58.810695 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b38f3681-6f2f-437f-9694-810d43921aa2" containerName="extract-content" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.810701 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="b38f3681-6f2f-437f-9694-810d43921aa2" containerName="extract-content" Mar 13 14:22:58 crc kubenswrapper[4898]: E0313 14:22:58.810722 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37ab1f60-9ee0-4d70-9730-f17c9feafaeb" containerName="proxy-httpd" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.810727 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="37ab1f60-9ee0-4d70-9730-f17c9feafaeb" containerName="proxy-httpd" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.810951 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="37ab1f60-9ee0-4d70-9730-f17c9feafaeb" containerName="proxy-httpd" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.810974 4898 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="37ab1f60-9ee0-4d70-9730-f17c9feafaeb" containerName="sg-core" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.810980 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="37ab1f60-9ee0-4d70-9730-f17c9feafaeb" containerName="ceilometer-notification-agent" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.810986 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="37ab1f60-9ee0-4d70-9730-f17c9feafaeb" containerName="ceilometer-central-agent" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.810997 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="b38f3681-6f2f-437f-9694-810d43921aa2" containerName="registry-server" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.811013 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="7eba407c-68a5-45e9-ab51-e8cba05d8559" containerName="nova-cell0-conductor-db-sync" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.811839 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.814782 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.815061 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-42bds" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.821920 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.832157 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.832181 4898 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.832191 4898 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.832199 4898 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.832207 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-674qw\" (UniqueName: \"kubernetes.io/projected/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-kube-api-access-674qw\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.842544 4898 scope.go:117] "RemoveContainer" 
containerID="322d94f85c75d789289c53ccffadace35b38b23785b40dc3303f2046c979a20d" Mar 13 14:22:58 crc kubenswrapper[4898]: E0313 14:22:58.843229 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"322d94f85c75d789289c53ccffadace35b38b23785b40dc3303f2046c979a20d\": container with ID starting with 322d94f85c75d789289c53ccffadace35b38b23785b40dc3303f2046c979a20d not found: ID does not exist" containerID="322d94f85c75d789289c53ccffadace35b38b23785b40dc3303f2046c979a20d" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.843275 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"322d94f85c75d789289c53ccffadace35b38b23785b40dc3303f2046c979a20d"} err="failed to get container status \"322d94f85c75d789289c53ccffadace35b38b23785b40dc3303f2046c979a20d\": rpc error: code = NotFound desc = could not find container \"322d94f85c75d789289c53ccffadace35b38b23785b40dc3303f2046c979a20d\": container with ID starting with 322d94f85c75d789289c53ccffadace35b38b23785b40dc3303f2046c979a20d not found: ID does not exist" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.843301 4898 scope.go:117] "RemoveContainer" containerID="3952d121d2f30ecb7494684374560553b6245f9837c68c02b65c84ff2b6971de" Mar 13 14:22:58 crc kubenswrapper[4898]: E0313 14:22:58.843525 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3952d121d2f30ecb7494684374560553b6245f9837c68c02b65c84ff2b6971de\": container with ID starting with 3952d121d2f30ecb7494684374560553b6245f9837c68c02b65c84ff2b6971de not found: ID does not exist" containerID="3952d121d2f30ecb7494684374560553b6245f9837c68c02b65c84ff2b6971de" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.843548 4898 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3952d121d2f30ecb7494684374560553b6245f9837c68c02b65c84ff2b6971de"} err="failed to get container status \"3952d121d2f30ecb7494684374560553b6245f9837c68c02b65c84ff2b6971de\": rpc error: code = NotFound desc = could not find container \"3952d121d2f30ecb7494684374560553b6245f9837c68c02b65c84ff2b6971de\": container with ID starting with 3952d121d2f30ecb7494684374560553b6245f9837c68c02b65c84ff2b6971de not found: ID does not exist" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.843561 4898 scope.go:117] "RemoveContainer" containerID="b8d3215be81e811287d32ec5a152d8a9105ea0bd86eb83c702bc5a2d79713e00" Mar 13 14:22:58 crc kubenswrapper[4898]: E0313 14:22:58.843735 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8d3215be81e811287d32ec5a152d8a9105ea0bd86eb83c702bc5a2d79713e00\": container with ID starting with b8d3215be81e811287d32ec5a152d8a9105ea0bd86eb83c702bc5a2d79713e00 not found: ID does not exist" containerID="b8d3215be81e811287d32ec5a152d8a9105ea0bd86eb83c702bc5a2d79713e00" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.843759 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8d3215be81e811287d32ec5a152d8a9105ea0bd86eb83c702bc5a2d79713e00"} err="failed to get container status \"b8d3215be81e811287d32ec5a152d8a9105ea0bd86eb83c702bc5a2d79713e00\": rpc error: code = NotFound desc = could not find container \"b8d3215be81e811287d32ec5a152d8a9105ea0bd86eb83c702bc5a2d79713e00\": container with ID starting with b8d3215be81e811287d32ec5a152d8a9105ea0bd86eb83c702bc5a2d79713e00 not found: ID does not exist" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.843777 4898 scope.go:117] "RemoveContainer" containerID="fbf266f2bbef4b4dd4d9d82590e547f43077a49d8aea2c8e5166465ede160624" Mar 13 14:22:58 crc kubenswrapper[4898]: E0313 14:22:58.844009 4898 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"fbf266f2bbef4b4dd4d9d82590e547f43077a49d8aea2c8e5166465ede160624\": container with ID starting with fbf266f2bbef4b4dd4d9d82590e547f43077a49d8aea2c8e5166465ede160624 not found: ID does not exist" containerID="fbf266f2bbef4b4dd4d9d82590e547f43077a49d8aea2c8e5166465ede160624" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.844034 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbf266f2bbef4b4dd4d9d82590e547f43077a49d8aea2c8e5166465ede160624"} err="failed to get container status \"fbf266f2bbef4b4dd4d9d82590e547f43077a49d8aea2c8e5166465ede160624\": rpc error: code = NotFound desc = could not find container \"fbf266f2bbef4b4dd4d9d82590e547f43077a49d8aea2c8e5166465ede160624\": container with ID starting with fbf266f2bbef4b4dd4d9d82590e547f43077a49d8aea2c8e5166465ede160624 not found: ID does not exist" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.848976 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "37ab1f60-9ee0-4d70-9730-f17c9feafaeb" (UID: "37ab1f60-9ee0-4d70-9730-f17c9feafaeb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.894640 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-config-data" (OuterVolumeSpecName: "config-data") pod "37ab1f60-9ee0-4d70-9730-f17c9feafaeb" (UID: "37ab1f60-9ee0-4d70-9730-f17c9feafaeb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.933622 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbnf9\" (UniqueName: \"kubernetes.io/projected/9796fb40-37f0-4d8a-929f-4bb6295388a4-kube-api-access-tbnf9\") pod \"nova-cell0-conductor-0\" (UID: \"9796fb40-37f0-4d8a-929f-4bb6295388a4\") " pod="openstack/nova-cell0-conductor-0" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.933823 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9796fb40-37f0-4d8a-929f-4bb6295388a4-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"9796fb40-37f0-4d8a-929f-4bb6295388a4\") " pod="openstack/nova-cell0-conductor-0" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.933859 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9796fb40-37f0-4d8a-929f-4bb6295388a4-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"9796fb40-37f0-4d8a-929f-4bb6295388a4\") " pod="openstack/nova-cell0-conductor-0" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.934439 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.934524 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.027913 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.037014 4898 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbnf9\" (UniqueName: \"kubernetes.io/projected/9796fb40-37f0-4d8a-929f-4bb6295388a4-kube-api-access-tbnf9\") pod \"nova-cell0-conductor-0\" (UID: \"9796fb40-37f0-4d8a-929f-4bb6295388a4\") " pod="openstack/nova-cell0-conductor-0" Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.037217 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9796fb40-37f0-4d8a-929f-4bb6295388a4-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"9796fb40-37f0-4d8a-929f-4bb6295388a4\") " pod="openstack/nova-cell0-conductor-0" Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.037248 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9796fb40-37f0-4d8a-929f-4bb6295388a4-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"9796fb40-37f0-4d8a-929f-4bb6295388a4\") " pod="openstack/nova-cell0-conductor-0" Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.038802 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.049719 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9796fb40-37f0-4d8a-929f-4bb6295388a4-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"9796fb40-37f0-4d8a-929f-4bb6295388a4\") " pod="openstack/nova-cell0-conductor-0" Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.049841 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9796fb40-37f0-4d8a-929f-4bb6295388a4-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"9796fb40-37f0-4d8a-929f-4bb6295388a4\") " pod="openstack/nova-cell0-conductor-0" Mar 13 14:22:59 crc kubenswrapper[4898]: 
I0313 14:22:59.059580 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.062617 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.064963 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.068448 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.070683 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbnf9\" (UniqueName: \"kubernetes.io/projected/9796fb40-37f0-4d8a-929f-4bb6295388a4-kube-api-access-tbnf9\") pod \"nova-cell0-conductor-0\" (UID: \"9796fb40-37f0-4d8a-929f-4bb6295388a4\") " pod="openstack/nova-cell0-conductor-0" Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.103195 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.103295 4898 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.106199 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.144340 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.158176 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.276356 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6\") " pod="openstack/ceilometer-0" Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.276706 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsfjl\" (UniqueName: \"kubernetes.io/projected/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-kube-api-access-bsfjl\") pod \"ceilometer-0\" (UID: \"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6\") " pod="openstack/ceilometer-0" Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.277017 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-scripts\") pod \"ceilometer-0\" (UID: \"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6\") " pod="openstack/ceilometer-0" Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.291139 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6\") " pod="openstack/ceilometer-0" Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.291264 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-run-httpd\") pod 
\"ceilometer-0\" (UID: \"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6\") " pod="openstack/ceilometer-0" Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.291310 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-config-data\") pod \"ceilometer-0\" (UID: \"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6\") " pod="openstack/ceilometer-0" Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.292873 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-log-httpd\") pod \"ceilometer-0\" (UID: \"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6\") " pod="openstack/ceilometer-0" Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.398829 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6\") " pod="openstack/ceilometer-0" Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.399252 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsfjl\" (UniqueName: \"kubernetes.io/projected/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-kube-api-access-bsfjl\") pod \"ceilometer-0\" (UID: \"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6\") " pod="openstack/ceilometer-0" Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.399291 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-scripts\") pod \"ceilometer-0\" (UID: \"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6\") " pod="openstack/ceilometer-0" Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.399348 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6\") " pod="openstack/ceilometer-0" Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.399380 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-run-httpd\") pod \"ceilometer-0\" (UID: \"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6\") " pod="openstack/ceilometer-0" Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.399400 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-config-data\") pod \"ceilometer-0\" (UID: \"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6\") " pod="openstack/ceilometer-0" Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.399426 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-log-httpd\") pod \"ceilometer-0\" (UID: \"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6\") " pod="openstack/ceilometer-0" Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.399959 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-log-httpd\") pod \"ceilometer-0\" (UID: \"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6\") " pod="openstack/ceilometer-0" Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.410240 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6\") " pod="openstack/ceilometer-0" 
Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.410761 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-run-httpd\") pod \"ceilometer-0\" (UID: \"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6\") " pod="openstack/ceilometer-0" Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.412910 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-scripts\") pod \"ceilometer-0\" (UID: \"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6\") " pod="openstack/ceilometer-0" Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.421795 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-config-data\") pod \"ceilometer-0\" (UID: \"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6\") " pod="openstack/ceilometer-0" Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.429098 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsfjl\" (UniqueName: \"kubernetes.io/projected/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-kube-api-access-bsfjl\") pod \"ceilometer-0\" (UID: \"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6\") " pod="openstack/ceilometer-0" Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.436563 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6\") " pod="openstack/ceilometer-0" Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.641292 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.699852 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 13 14:22:59 crc kubenswrapper[4898]: W0313 14:22:59.710656 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9796fb40_37f0_4d8a_929f_4bb6295388a4.slice/crio-ee07bf8d012fcf6aefa9d877ab61e51a9ab17008f8e280eddbbc403dcc107dba WatchSource:0}: Error finding container ee07bf8d012fcf6aefa9d877ab61e51a9ab17008f8e280eddbbc403dcc107dba: Status 404 returned error can't find the container with id ee07bf8d012fcf6aefa9d877ab61e51a9ab17008f8e280eddbbc403dcc107dba Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.715276 4898 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.715300 4898 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.758586 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37ab1f60-9ee0-4d70-9730-f17c9feafaeb" path="/var/lib/kubelet/pods/37ab1f60-9ee0-4d70-9730-f17c9feafaeb/volumes" Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.934246 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sgxj9" Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.996341 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sgxj9"] Mar 13 14:23:00 crc kubenswrapper[4898]: I0313 14:23:00.129928 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:23:00 crc kubenswrapper[4898]: W0313 14:23:00.138376 4898 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode179d9f8_0775_4bde_9ac9_f5a4f6919fd6.slice/crio-5815a6d36d69a2209375bea643fadce77792eface8bf7ea1c4a47338f6cf2f29 WatchSource:0}: Error finding container 5815a6d36d69a2209375bea643fadce77792eface8bf7ea1c4a47338f6cf2f29: Status 404 returned error can't find the container with id 5815a6d36d69a2209375bea643fadce77792eface8bf7ea1c4a47338f6cf2f29 Mar 13 14:23:00 crc kubenswrapper[4898]: I0313 14:23:00.195792 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 13 14:23:00 crc kubenswrapper[4898]: I0313 14:23:00.197763 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 13 14:23:00 crc kubenswrapper[4898]: I0313 14:23:00.727518 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6","Type":"ContainerStarted","Data":"5815a6d36d69a2209375bea643fadce77792eface8bf7ea1c4a47338f6cf2f29"} Mar 13 14:23:00 crc kubenswrapper[4898]: I0313 14:23:00.729524 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"9796fb40-37f0-4d8a-929f-4bb6295388a4","Type":"ContainerStarted","Data":"a9557891418a97a02368405ccaf040fbd8afdac6c0e21aa1dbdad6d66c32db14"} Mar 13 14:23:00 crc kubenswrapper[4898]: I0313 14:23:00.729600 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"9796fb40-37f0-4d8a-929f-4bb6295388a4","Type":"ContainerStarted","Data":"ee07bf8d012fcf6aefa9d877ab61e51a9ab17008f8e280eddbbc403dcc107dba"} Mar 13 14:23:00 crc kubenswrapper[4898]: I0313 14:23:00.730109 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-sgxj9" podUID="df5a6baa-ea65-4b79-b73b-2e1707695c41" containerName="registry-server" 
containerID="cri-o://6ec8603c187d66f48cc10e4727d771d9b7fbf021f18024c4db05bf6b399eed38" gracePeriod=2 Mar 13 14:23:00 crc kubenswrapper[4898]: I0313 14:23:00.771331 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.771314816 podStartE2EDuration="2.771314816s" podCreationTimestamp="2026-03-13 14:22:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:23:00.759741915 +0000 UTC m=+1615.761330164" watchObservedRunningTime="2026-03-13 14:23:00.771314816 +0000 UTC m=+1615.772903045" Mar 13 14:23:01 crc kubenswrapper[4898]: I0313 14:23:01.332225 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sgxj9" Mar 13 14:23:01 crc kubenswrapper[4898]: I0313 14:23:01.461287 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8t8zq\" (UniqueName: \"kubernetes.io/projected/df5a6baa-ea65-4b79-b73b-2e1707695c41-kube-api-access-8t8zq\") pod \"df5a6baa-ea65-4b79-b73b-2e1707695c41\" (UID: \"df5a6baa-ea65-4b79-b73b-2e1707695c41\") " Mar 13 14:23:01 crc kubenswrapper[4898]: I0313 14:23:01.461703 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df5a6baa-ea65-4b79-b73b-2e1707695c41-utilities\") pod \"df5a6baa-ea65-4b79-b73b-2e1707695c41\" (UID: \"df5a6baa-ea65-4b79-b73b-2e1707695c41\") " Mar 13 14:23:01 crc kubenswrapper[4898]: I0313 14:23:01.461753 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df5a6baa-ea65-4b79-b73b-2e1707695c41-catalog-content\") pod \"df5a6baa-ea65-4b79-b73b-2e1707695c41\" (UID: \"df5a6baa-ea65-4b79-b73b-2e1707695c41\") " Mar 13 14:23:01 crc kubenswrapper[4898]: I0313 14:23:01.463016 4898 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df5a6baa-ea65-4b79-b73b-2e1707695c41-utilities" (OuterVolumeSpecName: "utilities") pod "df5a6baa-ea65-4b79-b73b-2e1707695c41" (UID: "df5a6baa-ea65-4b79-b73b-2e1707695c41"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:23:01 crc kubenswrapper[4898]: I0313 14:23:01.491267 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df5a6baa-ea65-4b79-b73b-2e1707695c41-kube-api-access-8t8zq" (OuterVolumeSpecName: "kube-api-access-8t8zq") pod "df5a6baa-ea65-4b79-b73b-2e1707695c41" (UID: "df5a6baa-ea65-4b79-b73b-2e1707695c41"). InnerVolumeSpecName "kube-api-access-8t8zq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:23:01 crc kubenswrapper[4898]: I0313 14:23:01.513726 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df5a6baa-ea65-4b79-b73b-2e1707695c41-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "df5a6baa-ea65-4b79-b73b-2e1707695c41" (UID: "df5a6baa-ea65-4b79-b73b-2e1707695c41"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:23:01 crc kubenswrapper[4898]: I0313 14:23:01.564582 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8t8zq\" (UniqueName: \"kubernetes.io/projected/df5a6baa-ea65-4b79-b73b-2e1707695c41-kube-api-access-8t8zq\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:01 crc kubenswrapper[4898]: I0313 14:23:01.564628 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df5a6baa-ea65-4b79-b73b-2e1707695c41-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:01 crc kubenswrapper[4898]: I0313 14:23:01.564642 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df5a6baa-ea65-4b79-b73b-2e1707695c41-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:01 crc kubenswrapper[4898]: I0313 14:23:01.759595 4898 generic.go:334] "Generic (PLEG): container finished" podID="df5a6baa-ea65-4b79-b73b-2e1707695c41" containerID="6ec8603c187d66f48cc10e4727d771d9b7fbf021f18024c4db05bf6b399eed38" exitCode=0 Mar 13 14:23:01 crc kubenswrapper[4898]: I0313 14:23:01.760636 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sgxj9" Mar 13 14:23:01 crc kubenswrapper[4898]: I0313 14:23:01.773114 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 13 14:23:01 crc kubenswrapper[4898]: I0313 14:23:01.773145 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6","Type":"ContainerStarted","Data":"4bb36715efc02b13f9a2428d735fb39dbbe5a666eac1302af323de3923739e47"} Mar 13 14:23:01 crc kubenswrapper[4898]: I0313 14:23:01.773163 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sgxj9" event={"ID":"df5a6baa-ea65-4b79-b73b-2e1707695c41","Type":"ContainerDied","Data":"6ec8603c187d66f48cc10e4727d771d9b7fbf021f18024c4db05bf6b399eed38"} Mar 13 14:23:01 crc kubenswrapper[4898]: I0313 14:23:01.773182 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sgxj9" event={"ID":"df5a6baa-ea65-4b79-b73b-2e1707695c41","Type":"ContainerDied","Data":"c4f29c1c84db31109e4a65fbd939345ced5fa3e54cbf69234fc3c9374e91653f"} Mar 13 14:23:01 crc kubenswrapper[4898]: I0313 14:23:01.773199 4898 scope.go:117] "RemoveContainer" containerID="6ec8603c187d66f48cc10e4727d771d9b7fbf021f18024c4db05bf6b399eed38" Mar 13 14:23:01 crc kubenswrapper[4898]: I0313 14:23:01.835255 4898 scope.go:117] "RemoveContainer" containerID="dbe94dc617bb9f1ceb20f3a1398ada0e60107b89371dd59bf5d3b153a002dfad" Mar 13 14:23:01 crc kubenswrapper[4898]: I0313 14:23:01.841489 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sgxj9"] Mar 13 14:23:01 crc kubenswrapper[4898]: I0313 14:23:01.851841 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-sgxj9"] Mar 13 14:23:01 crc kubenswrapper[4898]: I0313 14:23:01.870512 4898 scope.go:117] "RemoveContainer" 
containerID="d9479d77fd8402baf87c3755eb73b852dad59c425ece2529270d3dc4ba5a1606" Mar 13 14:23:01 crc kubenswrapper[4898]: I0313 14:23:01.890365 4898 scope.go:117] "RemoveContainer" containerID="6ec8603c187d66f48cc10e4727d771d9b7fbf021f18024c4db05bf6b399eed38" Mar 13 14:23:01 crc kubenswrapper[4898]: E0313 14:23:01.891937 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ec8603c187d66f48cc10e4727d771d9b7fbf021f18024c4db05bf6b399eed38\": container with ID starting with 6ec8603c187d66f48cc10e4727d771d9b7fbf021f18024c4db05bf6b399eed38 not found: ID does not exist" containerID="6ec8603c187d66f48cc10e4727d771d9b7fbf021f18024c4db05bf6b399eed38" Mar 13 14:23:01 crc kubenswrapper[4898]: I0313 14:23:01.891972 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ec8603c187d66f48cc10e4727d771d9b7fbf021f18024c4db05bf6b399eed38"} err="failed to get container status \"6ec8603c187d66f48cc10e4727d771d9b7fbf021f18024c4db05bf6b399eed38\": rpc error: code = NotFound desc = could not find container \"6ec8603c187d66f48cc10e4727d771d9b7fbf021f18024c4db05bf6b399eed38\": container with ID starting with 6ec8603c187d66f48cc10e4727d771d9b7fbf021f18024c4db05bf6b399eed38 not found: ID does not exist" Mar 13 14:23:01 crc kubenswrapper[4898]: I0313 14:23:01.891997 4898 scope.go:117] "RemoveContainer" containerID="dbe94dc617bb9f1ceb20f3a1398ada0e60107b89371dd59bf5d3b153a002dfad" Mar 13 14:23:01 crc kubenswrapper[4898]: E0313 14:23:01.892335 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbe94dc617bb9f1ceb20f3a1398ada0e60107b89371dd59bf5d3b153a002dfad\": container with ID starting with dbe94dc617bb9f1ceb20f3a1398ada0e60107b89371dd59bf5d3b153a002dfad not found: ID does not exist" containerID="dbe94dc617bb9f1ceb20f3a1398ada0e60107b89371dd59bf5d3b153a002dfad" Mar 13 14:23:01 crc 
kubenswrapper[4898]: I0313 14:23:01.892357 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbe94dc617bb9f1ceb20f3a1398ada0e60107b89371dd59bf5d3b153a002dfad"} err="failed to get container status \"dbe94dc617bb9f1ceb20f3a1398ada0e60107b89371dd59bf5d3b153a002dfad\": rpc error: code = NotFound desc = could not find container \"dbe94dc617bb9f1ceb20f3a1398ada0e60107b89371dd59bf5d3b153a002dfad\": container with ID starting with dbe94dc617bb9f1ceb20f3a1398ada0e60107b89371dd59bf5d3b153a002dfad not found: ID does not exist" Mar 13 14:23:01 crc kubenswrapper[4898]: I0313 14:23:01.892370 4898 scope.go:117] "RemoveContainer" containerID="d9479d77fd8402baf87c3755eb73b852dad59c425ece2529270d3dc4ba5a1606" Mar 13 14:23:01 crc kubenswrapper[4898]: E0313 14:23:01.892635 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9479d77fd8402baf87c3755eb73b852dad59c425ece2529270d3dc4ba5a1606\": container with ID starting with d9479d77fd8402baf87c3755eb73b852dad59c425ece2529270d3dc4ba5a1606 not found: ID does not exist" containerID="d9479d77fd8402baf87c3755eb73b852dad59c425ece2529270d3dc4ba5a1606" Mar 13 14:23:01 crc kubenswrapper[4898]: I0313 14:23:01.892656 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9479d77fd8402baf87c3755eb73b852dad59c425ece2529270d3dc4ba5a1606"} err="failed to get container status \"d9479d77fd8402baf87c3755eb73b852dad59c425ece2529270d3dc4ba5a1606\": rpc error: code = NotFound desc = could not find container \"d9479d77fd8402baf87c3755eb73b852dad59c425ece2529270d3dc4ba5a1606\": container with ID starting with d9479d77fd8402baf87c3755eb73b852dad59c425ece2529270d3dc4ba5a1606 not found: ID does not exist" Mar 13 14:23:02 crc kubenswrapper[4898]: I0313 14:23:02.785989 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6","Type":"ContainerStarted","Data":"78de73e116b8c904046e3b2e47974a2196e68f801e439ea9db14e62605f5dfde"} Mar 13 14:23:02 crc kubenswrapper[4898]: I0313 14:23:02.786279 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6","Type":"ContainerStarted","Data":"e7eb04ad73d23fba07486739b6ad6f87229d6c54e9915ba0f9d02034ddc99c49"} Mar 13 14:23:03 crc kubenswrapper[4898]: I0313 14:23:03.775790 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df5a6baa-ea65-4b79-b73b-2e1707695c41" path="/var/lib/kubelet/pods/df5a6baa-ea65-4b79-b73b-2e1707695c41/volumes" Mar 13 14:23:04 crc kubenswrapper[4898]: I0313 14:23:04.184869 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 13 14:23:04 crc kubenswrapper[4898]: I0313 14:23:04.847699 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-76lrv"] Mar 13 14:23:04 crc kubenswrapper[4898]: E0313 14:23:04.849086 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df5a6baa-ea65-4b79-b73b-2e1707695c41" containerName="extract-utilities" Mar 13 14:23:04 crc kubenswrapper[4898]: I0313 14:23:04.849111 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="df5a6baa-ea65-4b79-b73b-2e1707695c41" containerName="extract-utilities" Mar 13 14:23:04 crc kubenswrapper[4898]: E0313 14:23:04.849165 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df5a6baa-ea65-4b79-b73b-2e1707695c41" containerName="extract-content" Mar 13 14:23:04 crc kubenswrapper[4898]: I0313 14:23:04.849174 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="df5a6baa-ea65-4b79-b73b-2e1707695c41" containerName="extract-content" Mar 13 14:23:04 crc kubenswrapper[4898]: E0313 14:23:04.849207 4898 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="df5a6baa-ea65-4b79-b73b-2e1707695c41" containerName="registry-server" Mar 13 14:23:04 crc kubenswrapper[4898]: I0313 14:23:04.849215 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="df5a6baa-ea65-4b79-b73b-2e1707695c41" containerName="registry-server" Mar 13 14:23:04 crc kubenswrapper[4898]: I0313 14:23:04.849512 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="df5a6baa-ea65-4b79-b73b-2e1707695c41" containerName="registry-server" Mar 13 14:23:04 crc kubenswrapper[4898]: I0313 14:23:04.850559 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-76lrv" Mar 13 14:23:04 crc kubenswrapper[4898]: I0313 14:23:04.853938 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 13 14:23:04 crc kubenswrapper[4898]: I0313 14:23:04.860801 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 13 14:23:04 crc kubenswrapper[4898]: I0313 14:23:04.866829 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-76lrv"] Mar 13 14:23:04 crc kubenswrapper[4898]: I0313 14:23:04.962958 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04183e35-79b0-4c76-b538-b5b71299cd92-config-data\") pod \"nova-cell0-cell-mapping-76lrv\" (UID: \"04183e35-79b0-4c76-b538-b5b71299cd92\") " pod="openstack/nova-cell0-cell-mapping-76lrv" Mar 13 14:23:04 crc kubenswrapper[4898]: I0313 14:23:04.963084 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04183e35-79b0-4c76-b538-b5b71299cd92-scripts\") pod \"nova-cell0-cell-mapping-76lrv\" (UID: \"04183e35-79b0-4c76-b538-b5b71299cd92\") " pod="openstack/nova-cell0-cell-mapping-76lrv" Mar 13 14:23:04 crc 
kubenswrapper[4898]: I0313 14:23:04.963130 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04183e35-79b0-4c76-b538-b5b71299cd92-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-76lrv\" (UID: \"04183e35-79b0-4c76-b538-b5b71299cd92\") " pod="openstack/nova-cell0-cell-mapping-76lrv" Mar 13 14:23:04 crc kubenswrapper[4898]: I0313 14:23:04.963257 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92r4m\" (UniqueName: \"kubernetes.io/projected/04183e35-79b0-4c76-b538-b5b71299cd92-kube-api-access-92r4m\") pod \"nova-cell0-cell-mapping-76lrv\" (UID: \"04183e35-79b0-4c76-b538-b5b71299cd92\") " pod="openstack/nova-cell0-cell-mapping-76lrv" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.029775 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.037176 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.046375 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.067139 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92r4m\" (UniqueName: \"kubernetes.io/projected/04183e35-79b0-4c76-b538-b5b71299cd92-kube-api-access-92r4m\") pod \"nova-cell0-cell-mapping-76lrv\" (UID: \"04183e35-79b0-4c76-b538-b5b71299cd92\") " pod="openstack/nova-cell0-cell-mapping-76lrv" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.069480 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04183e35-79b0-4c76-b538-b5b71299cd92-config-data\") pod \"nova-cell0-cell-mapping-76lrv\" (UID: \"04183e35-79b0-4c76-b538-b5b71299cd92\") " pod="openstack/nova-cell0-cell-mapping-76lrv" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.069778 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04183e35-79b0-4c76-b538-b5b71299cd92-scripts\") pod \"nova-cell0-cell-mapping-76lrv\" (UID: \"04183e35-79b0-4c76-b538-b5b71299cd92\") " pod="openstack/nova-cell0-cell-mapping-76lrv" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.075749 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04183e35-79b0-4c76-b538-b5b71299cd92-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-76lrv\" (UID: \"04183e35-79b0-4c76-b538-b5b71299cd92\") " pod="openstack/nova-cell0-cell-mapping-76lrv" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.081571 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/04183e35-79b0-4c76-b538-b5b71299cd92-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-76lrv\" (UID: \"04183e35-79b0-4c76-b538-b5b71299cd92\") " pod="openstack/nova-cell0-cell-mapping-76lrv" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.078752 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.127844 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92r4m\" (UniqueName: \"kubernetes.io/projected/04183e35-79b0-4c76-b538-b5b71299cd92-kube-api-access-92r4m\") pod \"nova-cell0-cell-mapping-76lrv\" (UID: \"04183e35-79b0-4c76-b538-b5b71299cd92\") " pod="openstack/nova-cell0-cell-mapping-76lrv" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.147639 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04183e35-79b0-4c76-b538-b5b71299cd92-config-data\") pod \"nova-cell0-cell-mapping-76lrv\" (UID: \"04183e35-79b0-4c76-b538-b5b71299cd92\") " pod="openstack/nova-cell0-cell-mapping-76lrv" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.161470 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04183e35-79b0-4c76-b538-b5b71299cd92-scripts\") pod \"nova-cell0-cell-mapping-76lrv\" (UID: \"04183e35-79b0-4c76-b538-b5b71299cd92\") " pod="openstack/nova-cell0-cell-mapping-76lrv" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.174518 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.184218 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.192102 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.200245 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5edbf12d-a655-4822-98da-9719c131fa14-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"5edbf12d-a655-4822-98da-9719c131fa14\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.201011 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4l9h\" (UniqueName: \"kubernetes.io/projected/5edbf12d-a655-4822-98da-9719c131fa14-kube-api-access-p4l9h\") pod \"nova-cell1-novncproxy-0\" (UID: \"5edbf12d-a655-4822-98da-9719c131fa14\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.201064 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5edbf12d-a655-4822-98da-9719c131fa14-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"5edbf12d-a655-4822-98da-9719c131fa14\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.211854 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-76lrv" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.414603 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fb29588-10df-4b49-a6a6-6a83ddada750-config-data\") pod \"nova-metadata-0\" (UID: \"3fb29588-10df-4b49-a6a6-6a83ddada750\") " pod="openstack/nova-metadata-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.414744 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fb29588-10df-4b49-a6a6-6a83ddada750-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3fb29588-10df-4b49-a6a6-6a83ddada750\") " pod="openstack/nova-metadata-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.415055 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4l9h\" (UniqueName: \"kubernetes.io/projected/5edbf12d-a655-4822-98da-9719c131fa14-kube-api-access-p4l9h\") pod \"nova-cell1-novncproxy-0\" (UID: \"5edbf12d-a655-4822-98da-9719c131fa14\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.415096 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3fb29588-10df-4b49-a6a6-6a83ddada750-logs\") pod \"nova-metadata-0\" (UID: \"3fb29588-10df-4b49-a6a6-6a83ddada750\") " pod="openstack/nova-metadata-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.415135 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5edbf12d-a655-4822-98da-9719c131fa14-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"5edbf12d-a655-4822-98da-9719c131fa14\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 14:23:05 crc 
kubenswrapper[4898]: I0313 14:23:05.415267 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dd5pp\" (UniqueName: \"kubernetes.io/projected/3fb29588-10df-4b49-a6a6-6a83ddada750-kube-api-access-dd5pp\") pod \"nova-metadata-0\" (UID: \"3fb29588-10df-4b49-a6a6-6a83ddada750\") " pod="openstack/nova-metadata-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.415750 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5edbf12d-a655-4822-98da-9719c131fa14-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"5edbf12d-a655-4822-98da-9719c131fa14\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.430667 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5edbf12d-a655-4822-98da-9719c131fa14-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"5edbf12d-a655-4822-98da-9719c131fa14\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.448815 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.450808 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.460839 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.480069 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5edbf12d-a655-4822-98da-9719c131fa14-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"5edbf12d-a655-4822-98da-9719c131fa14\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.488523 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4l9h\" (UniqueName: \"kubernetes.io/projected/5edbf12d-a655-4822-98da-9719c131fa14-kube-api-access-p4l9h\") pod \"nova-cell1-novncproxy-0\" (UID: \"5edbf12d-a655-4822-98da-9719c131fa14\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.526030 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fb29588-10df-4b49-a6a6-6a83ddada750-config-data\") pod \"nova-metadata-0\" (UID: \"3fb29588-10df-4b49-a6a6-6a83ddada750\") " pod="openstack/nova-metadata-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.526271 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0769b03d-29b4-4519-abc7-408431328276-config-data\") pod \"nova-scheduler-0\" (UID: \"0769b03d-29b4-4519-abc7-408431328276\") " pod="openstack/nova-scheduler-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.526358 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fb29588-10df-4b49-a6a6-6a83ddada750-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: 
\"3fb29588-10df-4b49-a6a6-6a83ddada750\") " pod="openstack/nova-metadata-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.526487 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3fb29588-10df-4b49-a6a6-6a83ddada750-logs\") pod \"nova-metadata-0\" (UID: \"3fb29588-10df-4b49-a6a6-6a83ddada750\") " pod="openstack/nova-metadata-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.526630 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2dlp\" (UniqueName: \"kubernetes.io/projected/0769b03d-29b4-4519-abc7-408431328276-kube-api-access-k2dlp\") pod \"nova-scheduler-0\" (UID: \"0769b03d-29b4-4519-abc7-408431328276\") " pod="openstack/nova-scheduler-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.526741 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dd5pp\" (UniqueName: \"kubernetes.io/projected/3fb29588-10df-4b49-a6a6-6a83ddada750-kube-api-access-dd5pp\") pod \"nova-metadata-0\" (UID: \"3fb29588-10df-4b49-a6a6-6a83ddada750\") " pod="openstack/nova-metadata-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.526874 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0769b03d-29b4-4519-abc7-408431328276-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0769b03d-29b4-4519-abc7-408431328276\") " pod="openstack/nova-scheduler-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.527020 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3fb29588-10df-4b49-a6a6-6a83ddada750-logs\") pod \"nova-metadata-0\" (UID: \"3fb29588-10df-4b49-a6a6-6a83ddada750\") " pod="openstack/nova-metadata-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.533729 4898 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fb29588-10df-4b49-a6a6-6a83ddada750-config-data\") pod \"nova-metadata-0\" (UID: \"3fb29588-10df-4b49-a6a6-6a83ddada750\") " pod="openstack/nova-metadata-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.540087 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fb29588-10df-4b49-a6a6-6a83ddada750-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3fb29588-10df-4b49-a6a6-6a83ddada750\") " pod="openstack/nova-metadata-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.545661 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.553249 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dd5pp\" (UniqueName: \"kubernetes.io/projected/3fb29588-10df-4b49-a6a6-6a83ddada750-kube-api-access-dd5pp\") pod \"nova-metadata-0\" (UID: \"3fb29588-10df-4b49-a6a6-6a83ddada750\") " pod="openstack/nova-metadata-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.579802 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.616329 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-4rc67"] Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.621792 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9b86998b5-4rc67" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.629974 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-4rc67"] Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.630759 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.631777 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2dlp\" (UniqueName: \"kubernetes.io/projected/0769b03d-29b4-4519-abc7-408431328276-kube-api-access-k2dlp\") pod \"nova-scheduler-0\" (UID: \"0769b03d-29b4-4519-abc7-408431328276\") " pod="openstack/nova-scheduler-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.632207 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0769b03d-29b4-4519-abc7-408431328276-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0769b03d-29b4-4519-abc7-408431328276\") " pod="openstack/nova-scheduler-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.632365 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0769b03d-29b4-4519-abc7-408431328276-config-data\") pod \"nova-scheduler-0\" (UID: \"0769b03d-29b4-4519-abc7-408431328276\") " pod="openstack/nova-scheduler-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.637238 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0769b03d-29b4-4519-abc7-408431328276-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0769b03d-29b4-4519-abc7-408431328276\") " pod="openstack/nova-scheduler-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.660791 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0769b03d-29b4-4519-abc7-408431328276-config-data\") pod \"nova-scheduler-0\" (UID: \"0769b03d-29b4-4519-abc7-408431328276\") " pod="openstack/nova-scheduler-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.663284 4898 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-k2dlp\" (UniqueName: \"kubernetes.io/projected/0769b03d-29b4-4519-abc7-408431328276-kube-api-access-k2dlp\") pod \"nova-scheduler-0\" (UID: \"0769b03d-29b4-4519-abc7-408431328276\") " pod="openstack/nova-scheduler-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.668144 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.670480 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.681124 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.689432 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.724622 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.737143 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee82d4ec-b565-40b8-b878-2574487d7e9d-dns-svc\") pod \"dnsmasq-dns-9b86998b5-4rc67\" (UID: \"ee82d4ec-b565-40b8-b878-2574487d7e9d\") " pod="openstack/dnsmasq-dns-9b86998b5-4rc67" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.740070 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ee82d4ec-b565-40b8-b878-2574487d7e9d-dns-swift-storage-0\") pod \"dnsmasq-dns-9b86998b5-4rc67\" (UID: \"ee82d4ec-b565-40b8-b878-2574487d7e9d\") " pod="openstack/dnsmasq-dns-9b86998b5-4rc67" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.740331 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee82d4ec-b565-40b8-b878-2574487d7e9d-ovsdbserver-nb\") pod \"dnsmasq-dns-9b86998b5-4rc67\" (UID: \"ee82d4ec-b565-40b8-b878-2574487d7e9d\") " pod="openstack/dnsmasq-dns-9b86998b5-4rc67" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.740406 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqwkf\" (UniqueName: \"kubernetes.io/projected/ee82d4ec-b565-40b8-b878-2574487d7e9d-kube-api-access-rqwkf\") pod \"dnsmasq-dns-9b86998b5-4rc67\" (UID: \"ee82d4ec-b565-40b8-b878-2574487d7e9d\") " pod="openstack/dnsmasq-dns-9b86998b5-4rc67" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.740592 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee82d4ec-b565-40b8-b878-2574487d7e9d-ovsdbserver-sb\") pod \"dnsmasq-dns-9b86998b5-4rc67\" (UID: \"ee82d4ec-b565-40b8-b878-2574487d7e9d\") " pod="openstack/dnsmasq-dns-9b86998b5-4rc67" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.740806 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee82d4ec-b565-40b8-b878-2574487d7e9d-config\") pod \"dnsmasq-dns-9b86998b5-4rc67\" (UID: \"ee82d4ec-b565-40b8-b878-2574487d7e9d\") " pod="openstack/dnsmasq-dns-9b86998b5-4rc67" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.801825 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.845916 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee82d4ec-b565-40b8-b878-2574487d7e9d-dns-svc\") pod \"dnsmasq-dns-9b86998b5-4rc67\" (UID: \"ee82d4ec-b565-40b8-b878-2574487d7e9d\") " pod="openstack/dnsmasq-dns-9b86998b5-4rc67" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.846421 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/401e6738-93d7-40d4-867e-8c68437cbad3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"401e6738-93d7-40d4-867e-8c68437cbad3\") " pod="openstack/nova-api-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.846501 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/401e6738-93d7-40d4-867e-8c68437cbad3-config-data\") pod \"nova-api-0\" (UID: \"401e6738-93d7-40d4-867e-8c68437cbad3\") " pod="openstack/nova-api-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.846528 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ee82d4ec-b565-40b8-b878-2574487d7e9d-dns-swift-storage-0\") pod \"dnsmasq-dns-9b86998b5-4rc67\" (UID: \"ee82d4ec-b565-40b8-b878-2574487d7e9d\") " pod="openstack/dnsmasq-dns-9b86998b5-4rc67" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.846567 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/401e6738-93d7-40d4-867e-8c68437cbad3-logs\") pod \"nova-api-0\" (UID: \"401e6738-93d7-40d4-867e-8c68437cbad3\") " pod="openstack/nova-api-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.846631 4898 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee82d4ec-b565-40b8-b878-2574487d7e9d-ovsdbserver-nb\") pod \"dnsmasq-dns-9b86998b5-4rc67\" (UID: \"ee82d4ec-b565-40b8-b878-2574487d7e9d\") " pod="openstack/dnsmasq-dns-9b86998b5-4rc67" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.846655 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqwkf\" (UniqueName: \"kubernetes.io/projected/ee82d4ec-b565-40b8-b878-2574487d7e9d-kube-api-access-rqwkf\") pod \"dnsmasq-dns-9b86998b5-4rc67\" (UID: \"ee82d4ec-b565-40b8-b878-2574487d7e9d\") " pod="openstack/dnsmasq-dns-9b86998b5-4rc67" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.846744 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee82d4ec-b565-40b8-b878-2574487d7e9d-ovsdbserver-sb\") pod \"dnsmasq-dns-9b86998b5-4rc67\" (UID: \"ee82d4ec-b565-40b8-b878-2574487d7e9d\") " pod="openstack/dnsmasq-dns-9b86998b5-4rc67" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.846809 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2z75l\" (UniqueName: \"kubernetes.io/projected/401e6738-93d7-40d4-867e-8c68437cbad3-kube-api-access-2z75l\") pod \"nova-api-0\" (UID: \"401e6738-93d7-40d4-867e-8c68437cbad3\") " pod="openstack/nova-api-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.846881 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee82d4ec-b565-40b8-b878-2574487d7e9d-config\") pod \"dnsmasq-dns-9b86998b5-4rc67\" (UID: \"ee82d4ec-b565-40b8-b878-2574487d7e9d\") " pod="openstack/dnsmasq-dns-9b86998b5-4rc67" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.853662 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee82d4ec-b565-40b8-b878-2574487d7e9d-dns-svc\") pod \"dnsmasq-dns-9b86998b5-4rc67\" (UID: \"ee82d4ec-b565-40b8-b878-2574487d7e9d\") " pod="openstack/dnsmasq-dns-9b86998b5-4rc67" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.854196 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ee82d4ec-b565-40b8-b878-2574487d7e9d-dns-swift-storage-0\") pod \"dnsmasq-dns-9b86998b5-4rc67\" (UID: \"ee82d4ec-b565-40b8-b878-2574487d7e9d\") " pod="openstack/dnsmasq-dns-9b86998b5-4rc67" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.856826 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee82d4ec-b565-40b8-b878-2574487d7e9d-ovsdbserver-sb\") pod \"dnsmasq-dns-9b86998b5-4rc67\" (UID: \"ee82d4ec-b565-40b8-b878-2574487d7e9d\") " pod="openstack/dnsmasq-dns-9b86998b5-4rc67" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.856859 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee82d4ec-b565-40b8-b878-2574487d7e9d-ovsdbserver-nb\") pod \"dnsmasq-dns-9b86998b5-4rc67\" (UID: \"ee82d4ec-b565-40b8-b878-2574487d7e9d\") " pod="openstack/dnsmasq-dns-9b86998b5-4rc67" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.857380 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee82d4ec-b565-40b8-b878-2574487d7e9d-config\") pod \"dnsmasq-dns-9b86998b5-4rc67\" (UID: \"ee82d4ec-b565-40b8-b878-2574487d7e9d\") " pod="openstack/dnsmasq-dns-9b86998b5-4rc67" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.897163 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqwkf\" (UniqueName: 
\"kubernetes.io/projected/ee82d4ec-b565-40b8-b878-2574487d7e9d-kube-api-access-rqwkf\") pod \"dnsmasq-dns-9b86998b5-4rc67\" (UID: \"ee82d4ec-b565-40b8-b878-2574487d7e9d\") " pod="openstack/dnsmasq-dns-9b86998b5-4rc67" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.905160 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6","Type":"ContainerStarted","Data":"e1d215889c17e15d0414497f1d056c6cbb6a9ad029cf99e79cd2f78d36843f9a"} Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.905773 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.963635 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2z75l\" (UniqueName: \"kubernetes.io/projected/401e6738-93d7-40d4-867e-8c68437cbad3-kube-api-access-2z75l\") pod \"nova-api-0\" (UID: \"401e6738-93d7-40d4-867e-8c68437cbad3\") " pod="openstack/nova-api-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.964742 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/401e6738-93d7-40d4-867e-8c68437cbad3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"401e6738-93d7-40d4-867e-8c68437cbad3\") " pod="openstack/nova-api-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.964961 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/401e6738-93d7-40d4-867e-8c68437cbad3-config-data\") pod \"nova-api-0\" (UID: \"401e6738-93d7-40d4-867e-8c68437cbad3\") " pod="openstack/nova-api-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.965855 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/401e6738-93d7-40d4-867e-8c68437cbad3-logs\") pod 
\"nova-api-0\" (UID: \"401e6738-93d7-40d4-867e-8c68437cbad3\") " pod="openstack/nova-api-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.966563 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/401e6738-93d7-40d4-867e-8c68437cbad3-logs\") pod \"nova-api-0\" (UID: \"401e6738-93d7-40d4-867e-8c68437cbad3\") " pod="openstack/nova-api-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.973832 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.195543427 podStartE2EDuration="6.97381093s" podCreationTimestamp="2026-03-13 14:22:59 +0000 UTC" firstStartedPulling="2026-03-13 14:23:00.141377112 +0000 UTC m=+1615.142965351" lastFinishedPulling="2026-03-13 14:23:04.919644615 +0000 UTC m=+1619.921232854" observedRunningTime="2026-03-13 14:23:05.958876653 +0000 UTC m=+1620.960464912" watchObservedRunningTime="2026-03-13 14:23:05.97381093 +0000 UTC m=+1620.975399169" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.977357 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/401e6738-93d7-40d4-867e-8c68437cbad3-config-data\") pod \"nova-api-0\" (UID: \"401e6738-93d7-40d4-867e-8c68437cbad3\") " pod="openstack/nova-api-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.980189 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/401e6738-93d7-40d4-867e-8c68437cbad3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"401e6738-93d7-40d4-867e-8c68437cbad3\") " pod="openstack/nova-api-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.982833 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9b86998b5-4rc67" Mar 13 14:23:06 crc kubenswrapper[4898]: I0313 14:23:06.000777 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2z75l\" (UniqueName: \"kubernetes.io/projected/401e6738-93d7-40d4-867e-8c68437cbad3-kube-api-access-2z75l\") pod \"nova-api-0\" (UID: \"401e6738-93d7-40d4-867e-8c68437cbad3\") " pod="openstack/nova-api-0" Mar 13 14:23:06 crc kubenswrapper[4898]: I0313 14:23:06.024853 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 13 14:23:06 crc kubenswrapper[4898]: I0313 14:23:06.153305 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-76lrv"] Mar 13 14:23:06 crc kubenswrapper[4898]: I0313 14:23:06.495331 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 14:23:06 crc kubenswrapper[4898]: I0313 14:23:06.743094 4898 scope.go:117] "RemoveContainer" containerID="31ae8593afc62c01205d22940ebe7136407da3a6c051010917ba949bb52866cc" Mar 13 14:23:06 crc kubenswrapper[4898]: I0313 14:23:06.752408 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 14:23:06 crc kubenswrapper[4898]: E0313 14:23:06.760437 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:23:06 crc kubenswrapper[4898]: I0313 14:23:06.782354 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 14:23:06 crc kubenswrapper[4898]: I0313 14:23:06.824504 4898 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/nova-cell1-conductor-db-sync-llbn5"] Mar 13 14:23:06 crc kubenswrapper[4898]: I0313 14:23:06.828435 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-llbn5" Mar 13 14:23:06 crc kubenswrapper[4898]: I0313 14:23:06.843405 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-llbn5"] Mar 13 14:23:06 crc kubenswrapper[4898]: I0313 14:23:06.843790 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 13 14:23:06 crc kubenswrapper[4898]: I0313 14:23:06.843936 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 13 14:23:06 crc kubenswrapper[4898]: I0313 14:23:06.928950 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3fb29588-10df-4b49-a6a6-6a83ddada750","Type":"ContainerStarted","Data":"fbba396a6b51b627bcb4364e72eecb72dbdb1d0ee7df91ec4ad791d87ce837cc"} Mar 13 14:23:06 crc kubenswrapper[4898]: I0313 14:23:06.933565 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-76lrv" event={"ID":"04183e35-79b0-4c76-b538-b5b71299cd92","Type":"ContainerStarted","Data":"74e75aa197a91664f89edd48f01ff5c813660e7743cf922315f4b6a18e19c506"} Mar 13 14:23:06 crc kubenswrapper[4898]: I0313 14:23:06.934937 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-76lrv" event={"ID":"04183e35-79b0-4c76-b538-b5b71299cd92","Type":"ContainerStarted","Data":"8a577eea2ae5147af3f287388f18da3286a949f40aa4ca2fbd11e7f3d483f618"} Mar 13 14:23:06 crc kubenswrapper[4898]: I0313 14:23:06.937864 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"0769b03d-29b4-4519-abc7-408431328276","Type":"ContainerStarted","Data":"8afa6be1221f1010388ebd2f7d552a490a868c166c3d8a6eee8fcc24b9095e1d"} Mar 13 14:23:06 crc kubenswrapper[4898]: I0313 14:23:06.940443 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"5edbf12d-a655-4822-98da-9719c131fa14","Type":"ContainerStarted","Data":"7a4153aaa3cfe90175a3e8e41cb9a73f5056ef4defe81a11d1b496b7bcdcc9b6"} Mar 13 14:23:06 crc kubenswrapper[4898]: I0313 14:23:06.978885 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-76lrv" podStartSLOduration=2.978854781 podStartE2EDuration="2.978854781s" podCreationTimestamp="2026-03-13 14:23:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:23:06.95953035 +0000 UTC m=+1621.961118619" watchObservedRunningTime="2026-03-13 14:23:06.978854781 +0000 UTC m=+1621.980443020" Mar 13 14:23:07 crc kubenswrapper[4898]: W0313 14:23:07.016688 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podee82d4ec_b565_40b8_b878_2574487d7e9d.slice/crio-7a6f36f785b386fbe1a7acf818da5c2bcd85dcd2bb3cbfac1d54dd77d4a14425 WatchSource:0}: Error finding container 7a6f36f785b386fbe1a7acf818da5c2bcd85dcd2bb3cbfac1d54dd77d4a14425: Status 404 returned error can't find the container with id 7a6f36f785b386fbe1a7acf818da5c2bcd85dcd2bb3cbfac1d54dd77d4a14425 Mar 13 14:23:07 crc kubenswrapper[4898]: I0313 14:23:07.025475 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfnmd\" (UniqueName: \"kubernetes.io/projected/bff908e4-09f4-490b-9b9c-ef65c6224eeb-kube-api-access-nfnmd\") pod \"nova-cell1-conductor-db-sync-llbn5\" (UID: \"bff908e4-09f4-490b-9b9c-ef65c6224eeb\") " 
pod="openstack/nova-cell1-conductor-db-sync-llbn5" Mar 13 14:23:07 crc kubenswrapper[4898]: I0313 14:23:07.025662 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bff908e4-09f4-490b-9b9c-ef65c6224eeb-config-data\") pod \"nova-cell1-conductor-db-sync-llbn5\" (UID: \"bff908e4-09f4-490b-9b9c-ef65c6224eeb\") " pod="openstack/nova-cell1-conductor-db-sync-llbn5" Mar 13 14:23:07 crc kubenswrapper[4898]: I0313 14:23:07.025725 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bff908e4-09f4-490b-9b9c-ef65c6224eeb-scripts\") pod \"nova-cell1-conductor-db-sync-llbn5\" (UID: \"bff908e4-09f4-490b-9b9c-ef65c6224eeb\") " pod="openstack/nova-cell1-conductor-db-sync-llbn5" Mar 13 14:23:07 crc kubenswrapper[4898]: I0313 14:23:07.026058 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bff908e4-09f4-490b-9b9c-ef65c6224eeb-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-llbn5\" (UID: \"bff908e4-09f4-490b-9b9c-ef65c6224eeb\") " pod="openstack/nova-cell1-conductor-db-sync-llbn5" Mar 13 14:23:07 crc kubenswrapper[4898]: W0313 14:23:07.060875 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod401e6738_93d7_40d4_867e_8c68437cbad3.slice/crio-33c80887acf46993b60950d3ec27771c17d7d31947de5c0902c67049c9927696 WatchSource:0}: Error finding container 33c80887acf46993b60950d3ec27771c17d7d31947de5c0902c67049c9927696: Status 404 returned error can't find the container with id 33c80887acf46993b60950d3ec27771c17d7d31947de5c0902c67049c9927696 Mar 13 14:23:07 crc kubenswrapper[4898]: I0313 14:23:07.074536 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-4rc67"] Mar 
13 14:23:07 crc kubenswrapper[4898]: I0313 14:23:07.092547 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 13 14:23:07 crc kubenswrapper[4898]: I0313 14:23:07.133416 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bff908e4-09f4-490b-9b9c-ef65c6224eeb-scripts\") pod \"nova-cell1-conductor-db-sync-llbn5\" (UID: \"bff908e4-09f4-490b-9b9c-ef65c6224eeb\") " pod="openstack/nova-cell1-conductor-db-sync-llbn5" Mar 13 14:23:07 crc kubenswrapper[4898]: I0313 14:23:07.136264 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bff908e4-09f4-490b-9b9c-ef65c6224eeb-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-llbn5\" (UID: \"bff908e4-09f4-490b-9b9c-ef65c6224eeb\") " pod="openstack/nova-cell1-conductor-db-sync-llbn5" Mar 13 14:23:07 crc kubenswrapper[4898]: I0313 14:23:07.139956 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bff908e4-09f4-490b-9b9c-ef65c6224eeb-scripts\") pod \"nova-cell1-conductor-db-sync-llbn5\" (UID: \"bff908e4-09f4-490b-9b9c-ef65c6224eeb\") " pod="openstack/nova-cell1-conductor-db-sync-llbn5" Mar 13 14:23:07 crc kubenswrapper[4898]: I0313 14:23:07.140225 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bff908e4-09f4-490b-9b9c-ef65c6224eeb-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-llbn5\" (UID: \"bff908e4-09f4-490b-9b9c-ef65c6224eeb\") " pod="openstack/nova-cell1-conductor-db-sync-llbn5" Mar 13 14:23:07 crc kubenswrapper[4898]: I0313 14:23:07.140687 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfnmd\" (UniqueName: \"kubernetes.io/projected/bff908e4-09f4-490b-9b9c-ef65c6224eeb-kube-api-access-nfnmd\") pod 
\"nova-cell1-conductor-db-sync-llbn5\" (UID: \"bff908e4-09f4-490b-9b9c-ef65c6224eeb\") " pod="openstack/nova-cell1-conductor-db-sync-llbn5" Mar 13 14:23:07 crc kubenswrapper[4898]: I0313 14:23:07.140948 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bff908e4-09f4-490b-9b9c-ef65c6224eeb-config-data\") pod \"nova-cell1-conductor-db-sync-llbn5\" (UID: \"bff908e4-09f4-490b-9b9c-ef65c6224eeb\") " pod="openstack/nova-cell1-conductor-db-sync-llbn5" Mar 13 14:23:07 crc kubenswrapper[4898]: I0313 14:23:07.149231 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bff908e4-09f4-490b-9b9c-ef65c6224eeb-config-data\") pod \"nova-cell1-conductor-db-sync-llbn5\" (UID: \"bff908e4-09f4-490b-9b9c-ef65c6224eeb\") " pod="openstack/nova-cell1-conductor-db-sync-llbn5" Mar 13 14:23:07 crc kubenswrapper[4898]: I0313 14:23:07.164474 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfnmd\" (UniqueName: \"kubernetes.io/projected/bff908e4-09f4-490b-9b9c-ef65c6224eeb-kube-api-access-nfnmd\") pod \"nova-cell1-conductor-db-sync-llbn5\" (UID: \"bff908e4-09f4-490b-9b9c-ef65c6224eeb\") " pod="openstack/nova-cell1-conductor-db-sync-llbn5" Mar 13 14:23:07 crc kubenswrapper[4898]: I0313 14:23:07.176345 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-llbn5" Mar 13 14:23:07 crc kubenswrapper[4898]: I0313 14:23:07.732264 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-llbn5"] Mar 13 14:23:07 crc kubenswrapper[4898]: I0313 14:23:07.970936 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-llbn5" event={"ID":"bff908e4-09f4-490b-9b9c-ef65c6224eeb","Type":"ContainerStarted","Data":"bf25202e858a5aee252a5d091c6efe401cd4b61d92f156b22eb7c6a08833446f"} Mar 13 14:23:07 crc kubenswrapper[4898]: I0313 14:23:07.972785 4898 generic.go:334] "Generic (PLEG): container finished" podID="ee82d4ec-b565-40b8-b878-2574487d7e9d" containerID="ceacbe8778fcc11e62876d98d598259e725fa8302adf85af1d1ddc9df4d62ff6" exitCode=0 Mar 13 14:23:07 crc kubenswrapper[4898]: I0313 14:23:07.972958 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-4rc67" event={"ID":"ee82d4ec-b565-40b8-b878-2574487d7e9d","Type":"ContainerDied","Data":"ceacbe8778fcc11e62876d98d598259e725fa8302adf85af1d1ddc9df4d62ff6"} Mar 13 14:23:07 crc kubenswrapper[4898]: I0313 14:23:07.973047 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-4rc67" event={"ID":"ee82d4ec-b565-40b8-b878-2574487d7e9d","Type":"ContainerStarted","Data":"7a6f36f785b386fbe1a7acf818da5c2bcd85dcd2bb3cbfac1d54dd77d4a14425"} Mar 13 14:23:07 crc kubenswrapper[4898]: I0313 14:23:07.982032 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"401e6738-93d7-40d4-867e-8c68437cbad3","Type":"ContainerStarted","Data":"33c80887acf46993b60950d3ec27771c17d7d31947de5c0902c67049c9927696"} Mar 13 14:23:09 crc kubenswrapper[4898]: I0313 14:23:09.010505 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-llbn5" 
event={"ID":"bff908e4-09f4-490b-9b9c-ef65c6224eeb","Type":"ContainerStarted","Data":"7e9f2307a91699c726a3f93d044663fb844450acd6da5dd38c51549451b97bc8"} Mar 13 14:23:09 crc kubenswrapper[4898]: I0313 14:23:09.014917 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-4rc67" event={"ID":"ee82d4ec-b565-40b8-b878-2574487d7e9d","Type":"ContainerStarted","Data":"7d83563b664dd524f060763bd5deadd7009b47fedbc88f53844517f4c00a64ea"} Mar 13 14:23:09 crc kubenswrapper[4898]: I0313 14:23:09.015351 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-9b86998b5-4rc67" Mar 13 14:23:09 crc kubenswrapper[4898]: I0313 14:23:09.050770 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-llbn5" podStartSLOduration=3.050742576 podStartE2EDuration="3.050742576s" podCreationTimestamp="2026-03-13 14:23:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:23:09.032934234 +0000 UTC m=+1624.034522483" watchObservedRunningTime="2026-03-13 14:23:09.050742576 +0000 UTC m=+1624.052330825" Mar 13 14:23:09 crc kubenswrapper[4898]: I0313 14:23:09.085233 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-9b86998b5-4rc67" podStartSLOduration=4.085208771 podStartE2EDuration="4.085208771s" podCreationTimestamp="2026-03-13 14:23:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:23:09.058865767 +0000 UTC m=+1624.060454026" watchObservedRunningTime="2026-03-13 14:23:09.085208771 +0000 UTC m=+1624.086797010" Mar 13 14:23:09 crc kubenswrapper[4898]: I0313 14:23:09.128549 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 14:23:09 crc kubenswrapper[4898]: I0313 
14:23:09.152731 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 14:23:13 crc kubenswrapper[4898]: I0313 14:23:13.111571 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3fb29588-10df-4b49-a6a6-6a83ddada750","Type":"ContainerStarted","Data":"b97169c17bbe4e26153e2ab8b910eb60fbafe00f2669ecd0f29ce4eb8dba08e0"} Mar 13 14:23:13 crc kubenswrapper[4898]: I0313 14:23:13.112293 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3fb29588-10df-4b49-a6a6-6a83ddada750","Type":"ContainerStarted","Data":"4d8bf97f6df1c8578abf9d8b2ff9a16d6f36d0a198628e241eb7ec672c4d77c5"} Mar 13 14:23:13 crc kubenswrapper[4898]: I0313 14:23:13.112452 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3fb29588-10df-4b49-a6a6-6a83ddada750" containerName="nova-metadata-log" containerID="cri-o://4d8bf97f6df1c8578abf9d8b2ff9a16d6f36d0a198628e241eb7ec672c4d77c5" gracePeriod=30 Mar 13 14:23:13 crc kubenswrapper[4898]: I0313 14:23:13.113027 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3fb29588-10df-4b49-a6a6-6a83ddada750" containerName="nova-metadata-metadata" containerID="cri-o://b97169c17bbe4e26153e2ab8b910eb60fbafe00f2669ecd0f29ce4eb8dba08e0" gracePeriod=30 Mar 13 14:23:13 crc kubenswrapper[4898]: I0313 14:23:13.121638 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0769b03d-29b4-4519-abc7-408431328276","Type":"ContainerStarted","Data":"a0be402bfe00c68e23ab47a73d2d201566aad9d451ecaff23e8fc2d99923064b"} Mar 13 14:23:13 crc kubenswrapper[4898]: I0313 14:23:13.127265 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"5edbf12d-a655-4822-98da-9719c131fa14","Type":"ContainerStarted","Data":"132b3a0f7e5ae56889214fa156373da26bc4a38bf3f78bbfa0992ef5f518c430"} Mar 13 14:23:13 crc kubenswrapper[4898]: I0313 14:23:13.127323 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="5edbf12d-a655-4822-98da-9719c131fa14" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://132b3a0f7e5ae56889214fa156373da26bc4a38bf3f78bbfa0992ef5f518c430" gracePeriod=30 Mar 13 14:23:13 crc kubenswrapper[4898]: I0313 14:23:13.129098 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"401e6738-93d7-40d4-867e-8c68437cbad3","Type":"ContainerStarted","Data":"6a3356dec1914d7176270b5cf94ad8c6d3be50dc70db5f963643ba9ceaa22838"} Mar 13 14:23:13 crc kubenswrapper[4898]: I0313 14:23:13.129353 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"401e6738-93d7-40d4-867e-8c68437cbad3","Type":"ContainerStarted","Data":"4172774e3d35f2447bfbdab8c2932d0b6b6936e1f5596cbc2f41a9105e3cedbe"} Mar 13 14:23:13 crc kubenswrapper[4898]: I0313 14:23:13.164216 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.972854795 podStartE2EDuration="8.164195859s" podCreationTimestamp="2026-03-13 14:23:05 +0000 UTC" firstStartedPulling="2026-03-13 14:23:06.769016824 +0000 UTC m=+1621.770605063" lastFinishedPulling="2026-03-13 14:23:11.960357888 +0000 UTC m=+1626.961946127" observedRunningTime="2026-03-13 14:23:13.142525517 +0000 UTC m=+1628.144113766" watchObservedRunningTime="2026-03-13 14:23:13.164195859 +0000 UTC m=+1628.165784098" Mar 13 14:23:13 crc kubenswrapper[4898]: I0313 14:23:13.175488 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.741658133 podStartE2EDuration="9.175464382s" 
podCreationTimestamp="2026-03-13 14:23:04 +0000 UTC" firstStartedPulling="2026-03-13 14:23:06.515592865 +0000 UTC m=+1621.517181104" lastFinishedPulling="2026-03-13 14:23:11.949399114 +0000 UTC m=+1626.950987353" observedRunningTime="2026-03-13 14:23:13.165838842 +0000 UTC m=+1628.167427091" watchObservedRunningTime="2026-03-13 14:23:13.175464382 +0000 UTC m=+1628.177052621" Mar 13 14:23:13 crc kubenswrapper[4898]: I0313 14:23:13.195353 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.003821889 podStartE2EDuration="8.195335258s" podCreationTimestamp="2026-03-13 14:23:05 +0000 UTC" firstStartedPulling="2026-03-13 14:23:06.757227278 +0000 UTC m=+1621.758815517" lastFinishedPulling="2026-03-13 14:23:11.948740647 +0000 UTC m=+1626.950328886" observedRunningTime="2026-03-13 14:23:13.192153365 +0000 UTC m=+1628.193741634" watchObservedRunningTime="2026-03-13 14:23:13.195335258 +0000 UTC m=+1628.196923497" Mar 13 14:23:13 crc kubenswrapper[4898]: I0313 14:23:13.248936 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.367682385 podStartE2EDuration="8.248912339s" podCreationTimestamp="2026-03-13 14:23:05 +0000 UTC" firstStartedPulling="2026-03-13 14:23:07.070136501 +0000 UTC m=+1622.071724740" lastFinishedPulling="2026-03-13 14:23:11.951366455 +0000 UTC m=+1626.952954694" observedRunningTime="2026-03-13 14:23:13.209110435 +0000 UTC m=+1628.210698684" watchObservedRunningTime="2026-03-13 14:23:13.248912339 +0000 UTC m=+1628.250500578" Mar 13 14:23:14 crc kubenswrapper[4898]: I0313 14:23:14.145066 4898 generic.go:334] "Generic (PLEG): container finished" podID="3fb29588-10df-4b49-a6a6-6a83ddada750" containerID="b97169c17bbe4e26153e2ab8b910eb60fbafe00f2669ecd0f29ce4eb8dba08e0" exitCode=0 Mar 13 14:23:14 crc kubenswrapper[4898]: I0313 14:23:14.145396 4898 generic.go:334] "Generic (PLEG): container finished" 
podID="3fb29588-10df-4b49-a6a6-6a83ddada750" containerID="4d8bf97f6df1c8578abf9d8b2ff9a16d6f36d0a198628e241eb7ec672c4d77c5" exitCode=143 Mar 13 14:23:14 crc kubenswrapper[4898]: I0313 14:23:14.145672 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3fb29588-10df-4b49-a6a6-6a83ddada750","Type":"ContainerDied","Data":"b97169c17bbe4e26153e2ab8b910eb60fbafe00f2669ecd0f29ce4eb8dba08e0"} Mar 13 14:23:14 crc kubenswrapper[4898]: I0313 14:23:14.145728 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3fb29588-10df-4b49-a6a6-6a83ddada750","Type":"ContainerDied","Data":"4d8bf97f6df1c8578abf9d8b2ff9a16d6f36d0a198628e241eb7ec672c4d77c5"} Mar 13 14:23:14 crc kubenswrapper[4898]: I0313 14:23:14.145766 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3fb29588-10df-4b49-a6a6-6a83ddada750","Type":"ContainerDied","Data":"fbba396a6b51b627bcb4364e72eecb72dbdb1d0ee7df91ec4ad791d87ce837cc"} Mar 13 14:23:14 crc kubenswrapper[4898]: I0313 14:23:14.145779 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fbba396a6b51b627bcb4364e72eecb72dbdb1d0ee7df91ec4ad791d87ce837cc" Mar 13 14:23:14 crc kubenswrapper[4898]: I0313 14:23:14.204835 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 14:23:14 crc kubenswrapper[4898]: I0313 14:23:14.243327 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dd5pp\" (UniqueName: \"kubernetes.io/projected/3fb29588-10df-4b49-a6a6-6a83ddada750-kube-api-access-dd5pp\") pod \"3fb29588-10df-4b49-a6a6-6a83ddada750\" (UID: \"3fb29588-10df-4b49-a6a6-6a83ddada750\") " Mar 13 14:23:14 crc kubenswrapper[4898]: I0313 14:23:14.243433 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fb29588-10df-4b49-a6a6-6a83ddada750-combined-ca-bundle\") pod \"3fb29588-10df-4b49-a6a6-6a83ddada750\" (UID: \"3fb29588-10df-4b49-a6a6-6a83ddada750\") " Mar 13 14:23:14 crc kubenswrapper[4898]: I0313 14:23:14.243461 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fb29588-10df-4b49-a6a6-6a83ddada750-config-data\") pod \"3fb29588-10df-4b49-a6a6-6a83ddada750\" (UID: \"3fb29588-10df-4b49-a6a6-6a83ddada750\") " Mar 13 14:23:14 crc kubenswrapper[4898]: I0313 14:23:14.243503 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3fb29588-10df-4b49-a6a6-6a83ddada750-logs\") pod \"3fb29588-10df-4b49-a6a6-6a83ddada750\" (UID: \"3fb29588-10df-4b49-a6a6-6a83ddada750\") " Mar 13 14:23:14 crc kubenswrapper[4898]: I0313 14:23:14.250973 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3fb29588-10df-4b49-a6a6-6a83ddada750-logs" (OuterVolumeSpecName: "logs") pod "3fb29588-10df-4b49-a6a6-6a83ddada750" (UID: "3fb29588-10df-4b49-a6a6-6a83ddada750"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:23:14 crc kubenswrapper[4898]: I0313 14:23:14.270141 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fb29588-10df-4b49-a6a6-6a83ddada750-kube-api-access-dd5pp" (OuterVolumeSpecName: "kube-api-access-dd5pp") pod "3fb29588-10df-4b49-a6a6-6a83ddada750" (UID: "3fb29588-10df-4b49-a6a6-6a83ddada750"). InnerVolumeSpecName "kube-api-access-dd5pp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:23:14 crc kubenswrapper[4898]: I0313 14:23:14.309422 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fb29588-10df-4b49-a6a6-6a83ddada750-config-data" (OuterVolumeSpecName: "config-data") pod "3fb29588-10df-4b49-a6a6-6a83ddada750" (UID: "3fb29588-10df-4b49-a6a6-6a83ddada750"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:23:14 crc kubenswrapper[4898]: I0313 14:23:14.309444 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fb29588-10df-4b49-a6a6-6a83ddada750-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3fb29588-10df-4b49-a6a6-6a83ddada750" (UID: "3fb29588-10df-4b49-a6a6-6a83ddada750"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:23:14 crc kubenswrapper[4898]: I0313 14:23:14.347315 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dd5pp\" (UniqueName: \"kubernetes.io/projected/3fb29588-10df-4b49-a6a6-6a83ddada750-kube-api-access-dd5pp\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:14 crc kubenswrapper[4898]: I0313 14:23:14.347363 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fb29588-10df-4b49-a6a6-6a83ddada750-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:14 crc kubenswrapper[4898]: I0313 14:23:14.347375 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fb29588-10df-4b49-a6a6-6a83ddada750-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:14 crc kubenswrapper[4898]: I0313 14:23:14.347388 4898 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3fb29588-10df-4b49-a6a6-6a83ddada750-logs\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.020916 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.021543 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e179d9f8-0775-4bde-9ac9-f5a4f6919fd6" containerName="ceilometer-central-agent" containerID="cri-o://4bb36715efc02b13f9a2428d735fb39dbbe5a666eac1302af323de3923739e47" gracePeriod=30 Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.021661 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e179d9f8-0775-4bde-9ac9-f5a4f6919fd6" containerName="sg-core" containerID="cri-o://78de73e116b8c904046e3b2e47974a2196e68f801e439ea9db14e62605f5dfde" gracePeriod=30 Mar 13 14:23:15 crc kubenswrapper[4898]: 
I0313 14:23:15.021700 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e179d9f8-0775-4bde-9ac9-f5a4f6919fd6" containerName="ceilometer-notification-agent" containerID="cri-o://e7eb04ad73d23fba07486739b6ad6f87229d6c54e9915ba0f9d02034ddc99c49" gracePeriod=30 Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.021661 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e179d9f8-0775-4bde-9ac9-f5a4f6919fd6" containerName="proxy-httpd" containerID="cri-o://e1d215889c17e15d0414497f1d056c6cbb6a9ad029cf99e79cd2f78d36843f9a" gracePeriod=30 Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.034749 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="e179d9f8-0775-4bde-9ac9-f5a4f6919fd6" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.164888 4898 generic.go:334] "Generic (PLEG): container finished" podID="e179d9f8-0775-4bde-9ac9-f5a4f6919fd6" containerID="78de73e116b8c904046e3b2e47974a2196e68f801e439ea9db14e62605f5dfde" exitCode=2 Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.164941 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6","Type":"ContainerDied","Data":"78de73e116b8c904046e3b2e47974a2196e68f801e439ea9db14e62605f5dfde"} Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.165008 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.212426 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.230055 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.310367 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 13 14:23:15 crc kubenswrapper[4898]: E0313 14:23:15.314720 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fb29588-10df-4b49-a6a6-6a83ddada750" containerName="nova-metadata-metadata" Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.314751 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fb29588-10df-4b49-a6a6-6a83ddada750" containerName="nova-metadata-metadata" Mar 13 14:23:15 crc kubenswrapper[4898]: E0313 14:23:15.314805 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fb29588-10df-4b49-a6a6-6a83ddada750" containerName="nova-metadata-log" Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.314812 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fb29588-10df-4b49-a6a6-6a83ddada750" containerName="nova-metadata-log" Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.320234 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fb29588-10df-4b49-a6a6-6a83ddada750" containerName="nova-metadata-log" Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.320295 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fb29588-10df-4b49-a6a6-6a83ddada750" containerName="nova-metadata-metadata" Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.333783 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.341372 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.342704 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.358908 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.477413 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f31df8cc-85c8-4626-ab54-1a93d291f02d-logs\") pod \"nova-metadata-0\" (UID: \"f31df8cc-85c8-4626-ab54-1a93d291f02d\") " pod="openstack/nova-metadata-0" Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.477814 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f31df8cc-85c8-4626-ab54-1a93d291f02d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f31df8cc-85c8-4626-ab54-1a93d291f02d\") " pod="openstack/nova-metadata-0" Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.477876 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvb8h\" (UniqueName: \"kubernetes.io/projected/f31df8cc-85c8-4626-ab54-1a93d291f02d-kube-api-access-pvb8h\") pod \"nova-metadata-0\" (UID: \"f31df8cc-85c8-4626-ab54-1a93d291f02d\") " pod="openstack/nova-metadata-0" Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.477927 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f31df8cc-85c8-4626-ab54-1a93d291f02d-config-data\") pod \"nova-metadata-0\" (UID: 
\"f31df8cc-85c8-4626-ab54-1a93d291f02d\") " pod="openstack/nova-metadata-0" Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.478090 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f31df8cc-85c8-4626-ab54-1a93d291f02d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f31df8cc-85c8-4626-ab54-1a93d291f02d\") " pod="openstack/nova-metadata-0" Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.581593 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f31df8cc-85c8-4626-ab54-1a93d291f02d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f31df8cc-85c8-4626-ab54-1a93d291f02d\") " pod="openstack/nova-metadata-0" Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.581914 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f31df8cc-85c8-4626-ab54-1a93d291f02d-logs\") pod \"nova-metadata-0\" (UID: \"f31df8cc-85c8-4626-ab54-1a93d291f02d\") " pod="openstack/nova-metadata-0" Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.581989 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f31df8cc-85c8-4626-ab54-1a93d291f02d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f31df8cc-85c8-4626-ab54-1a93d291f02d\") " pod="openstack/nova-metadata-0" Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.582055 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvb8h\" (UniqueName: \"kubernetes.io/projected/f31df8cc-85c8-4626-ab54-1a93d291f02d-kube-api-access-pvb8h\") pod \"nova-metadata-0\" (UID: \"f31df8cc-85c8-4626-ab54-1a93d291f02d\") " pod="openstack/nova-metadata-0" Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 
14:23:15.582087 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f31df8cc-85c8-4626-ab54-1a93d291f02d-config-data\") pod \"nova-metadata-0\" (UID: \"f31df8cc-85c8-4626-ab54-1a93d291f02d\") " pod="openstack/nova-metadata-0" Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.582307 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f31df8cc-85c8-4626-ab54-1a93d291f02d-logs\") pod \"nova-metadata-0\" (UID: \"f31df8cc-85c8-4626-ab54-1a93d291f02d\") " pod="openstack/nova-metadata-0" Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.589836 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f31df8cc-85c8-4626-ab54-1a93d291f02d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f31df8cc-85c8-4626-ab54-1a93d291f02d\") " pod="openstack/nova-metadata-0" Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.590800 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f31df8cc-85c8-4626-ab54-1a93d291f02d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f31df8cc-85c8-4626-ab54-1a93d291f02d\") " pod="openstack/nova-metadata-0" Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.591479 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f31df8cc-85c8-4626-ab54-1a93d291f02d-config-data\") pod \"nova-metadata-0\" (UID: \"f31df8cc-85c8-4626-ab54-1a93d291f02d\") " pod="openstack/nova-metadata-0" Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.603605 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvb8h\" (UniqueName: \"kubernetes.io/projected/f31df8cc-85c8-4626-ab54-1a93d291f02d-kube-api-access-pvb8h\") pod 
\"nova-metadata-0\" (UID: \"f31df8cc-85c8-4626-ab54-1a93d291f02d\") " pod="openstack/nova-metadata-0" Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.634975 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.672027 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="a8312dc9-a2b4-4ee6-b34f-cb984c14ad21" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.200:9292/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.672370 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="a8312dc9-a2b4-4ee6-b34f-cb984c14ad21" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.200:9292/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.694800 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.785341 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fb29588-10df-4b49-a6a6-6a83ddada750" path="/var/lib/kubelet/pods/3fb29588-10df-4b49-a6a6-6a83ddada750/volumes" Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.807681 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.808115 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.865111 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.988187 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-9b86998b5-4rc67" Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.026164 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.026200 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.090492 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-xntfr"] Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.090772 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7756b9d78c-xntfr" podUID="08964a7d-6cae-4d8d-8dc7-8828bb55c6b6" containerName="dnsmasq-dns" containerID="cri-o://12cbc4c8fa51ad14fc2097d92198ad03764ae0a7349dafd16f0ac9e66b1aa21a" gracePeriod=10 Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.256455 4898 generic.go:334] "Generic (PLEG): container finished" 
podID="e53d1b61-e0c8-4c10-85bf-1c0f67009a24" containerID="c6cbd243a4ab0ae3ee88d2b14b07e0b9c8fda594949edb73fc92248b8f25ddf9" exitCode=137 Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.256550 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-759d64ffd4-kzp67" event={"ID":"e53d1b61-e0c8-4c10-85bf-1c0f67009a24","Type":"ContainerDied","Data":"c6cbd243a4ab0ae3ee88d2b14b07e0b9c8fda594949edb73fc92248b8f25ddf9"} Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.256575 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-759d64ffd4-kzp67" event={"ID":"e53d1b61-e0c8-4c10-85bf-1c0f67009a24","Type":"ContainerDied","Data":"4e540b110f512e47246a7e1e8d332b0a3a04ba375a44ae46bbb913a91b51ab5a"} Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.256599 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e540b110f512e47246a7e1e8d332b0a3a04ba375a44ae46bbb913a91b51ab5a" Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.278167 4898 generic.go:334] "Generic (PLEG): container finished" podID="ad3d61d7-d777-4115-92c7-e4e3125c5260" containerID="954199ed87793fe013823cc99558f408a84d0a4c4745073ecc940b8eedc022cd" exitCode=137 Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.278247 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-9b6c99f6d-7zgm5" event={"ID":"ad3d61d7-d777-4115-92c7-e4e3125c5260","Type":"ContainerDied","Data":"954199ed87793fe013823cc99558f408a84d0a4c4745073ecc940b8eedc022cd"} Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.317399 4898 generic.go:334] "Generic (PLEG): container finished" podID="e179d9f8-0775-4bde-9ac9-f5a4f6919fd6" containerID="e1d215889c17e15d0414497f1d056c6cbb6a9ad029cf99e79cd2f78d36843f9a" exitCode=0 Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.317425 4898 generic.go:334] "Generic (PLEG): container finished" podID="e179d9f8-0775-4bde-9ac9-f5a4f6919fd6" 
containerID="e7eb04ad73d23fba07486739b6ad6f87229d6c54e9915ba0f9d02034ddc99c49" exitCode=0 Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.317433 4898 generic.go:334] "Generic (PLEG): container finished" podID="e179d9f8-0775-4bde-9ac9-f5a4f6919fd6" containerID="4bb36715efc02b13f9a2428d735fb39dbbe5a666eac1302af323de3923739e47" exitCode=0 Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.317824 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6","Type":"ContainerDied","Data":"e1d215889c17e15d0414497f1d056c6cbb6a9ad029cf99e79cd2f78d36843f9a"} Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.317849 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6","Type":"ContainerDied","Data":"e7eb04ad73d23fba07486739b6ad6f87229d6c54e9915ba0f9d02034ddc99c49"} Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.317859 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6","Type":"ContainerDied","Data":"4bb36715efc02b13f9a2428d735fb39dbbe5a666eac1302af323de3923739e47"} Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.374160 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.402267 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-759d64ffd4-kzp67" Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.539726 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e53d1b61-e0c8-4c10-85bf-1c0f67009a24-combined-ca-bundle\") pod \"e53d1b61-e0c8-4c10-85bf-1c0f67009a24\" (UID: \"e53d1b61-e0c8-4c10-85bf-1c0f67009a24\") " Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.539842 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e53d1b61-e0c8-4c10-85bf-1c0f67009a24-config-data-custom\") pod \"e53d1b61-e0c8-4c10-85bf-1c0f67009a24\" (UID: \"e53d1b61-e0c8-4c10-85bf-1c0f67009a24\") " Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.539862 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e53d1b61-e0c8-4c10-85bf-1c0f67009a24-config-data\") pod \"e53d1b61-e0c8-4c10-85bf-1c0f67009a24\" (UID: \"e53d1b61-e0c8-4c10-85bf-1c0f67009a24\") " Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.539981 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cg2l6\" (UniqueName: \"kubernetes.io/projected/e53d1b61-e0c8-4c10-85bf-1c0f67009a24-kube-api-access-cg2l6\") pod \"e53d1b61-e0c8-4c10-85bf-1c0f67009a24\" (UID: \"e53d1b61-e0c8-4c10-85bf-1c0f67009a24\") " Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.547082 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e53d1b61-e0c8-4c10-85bf-1c0f67009a24-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e53d1b61-e0c8-4c10-85bf-1c0f67009a24" (UID: "e53d1b61-e0c8-4c10-85bf-1c0f67009a24"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.547267 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e53d1b61-e0c8-4c10-85bf-1c0f67009a24-kube-api-access-cg2l6" (OuterVolumeSpecName: "kube-api-access-cg2l6") pod "e53d1b61-e0c8-4c10-85bf-1c0f67009a24" (UID: "e53d1b61-e0c8-4c10-85bf-1c0f67009a24"). InnerVolumeSpecName "kube-api-access-cg2l6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.612215 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e53d1b61-e0c8-4c10-85bf-1c0f67009a24-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e53d1b61-e0c8-4c10-85bf-1c0f67009a24" (UID: "e53d1b61-e0c8-4c10-85bf-1c0f67009a24"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.628645 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e53d1b61-e0c8-4c10-85bf-1c0f67009a24-config-data" (OuterVolumeSpecName: "config-data") pod "e53d1b61-e0c8-4c10-85bf-1c0f67009a24" (UID: "e53d1b61-e0c8-4c10-85bf-1c0f67009a24"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.643326 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e53d1b61-e0c8-4c10-85bf-1c0f67009a24-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.643360 4898 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e53d1b61-e0c8-4c10-85bf-1c0f67009a24-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.643372 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e53d1b61-e0c8-4c10-85bf-1c0f67009a24-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.643382 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cg2l6\" (UniqueName: \"kubernetes.io/projected/e53d1b61-e0c8-4c10-85bf-1c0f67009a24-kube-api-access-cg2l6\") on node \"crc\" DevicePath \"\""
Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.693857 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-9b6c99f6d-7zgm5"
Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.849806 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ad3d61d7-d777-4115-92c7-e4e3125c5260-config-data-custom\") pod \"ad3d61d7-d777-4115-92c7-e4e3125c5260\" (UID: \"ad3d61d7-d777-4115-92c7-e4e3125c5260\") "
Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.849994 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjv6n\" (UniqueName: \"kubernetes.io/projected/ad3d61d7-d777-4115-92c7-e4e3125c5260-kube-api-access-bjv6n\") pod \"ad3d61d7-d777-4115-92c7-e4e3125c5260\" (UID: \"ad3d61d7-d777-4115-92c7-e4e3125c5260\") "
Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.850054 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad3d61d7-d777-4115-92c7-e4e3125c5260-combined-ca-bundle\") pod \"ad3d61d7-d777-4115-92c7-e4e3125c5260\" (UID: \"ad3d61d7-d777-4115-92c7-e4e3125c5260\") "
Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.850113 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad3d61d7-d777-4115-92c7-e4e3125c5260-config-data\") pod \"ad3d61d7-d777-4115-92c7-e4e3125c5260\" (UID: \"ad3d61d7-d777-4115-92c7-e4e3125c5260\") "
Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.856783 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad3d61d7-d777-4115-92c7-e4e3125c5260-kube-api-access-bjv6n" (OuterVolumeSpecName: "kube-api-access-bjv6n") pod "ad3d61d7-d777-4115-92c7-e4e3125c5260" (UID: "ad3d61d7-d777-4115-92c7-e4e3125c5260"). InnerVolumeSpecName "kube-api-access-bjv6n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.857370 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad3d61d7-d777-4115-92c7-e4e3125c5260-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ad3d61d7-d777-4115-92c7-e4e3125c5260" (UID: "ad3d61d7-d777-4115-92c7-e4e3125c5260"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.928108 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad3d61d7-d777-4115-92c7-e4e3125c5260-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ad3d61d7-d777-4115-92c7-e4e3125c5260" (UID: "ad3d61d7-d777-4115-92c7-e4e3125c5260"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.952937 4898 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ad3d61d7-d777-4115-92c7-e4e3125c5260-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.952976 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjv6n\" (UniqueName: \"kubernetes.io/projected/ad3d61d7-d777-4115-92c7-e4e3125c5260-kube-api-access-bjv6n\") on node \"crc\" DevicePath \"\""
Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.952991 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad3d61d7-d777-4115-92c7-e4e3125c5260-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.007157 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.033914 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad3d61d7-d777-4115-92c7-e4e3125c5260-config-data" (OuterVolumeSpecName: "config-data") pod "ad3d61d7-d777-4115-92c7-e4e3125c5260" (UID: "ad3d61d7-d777-4115-92c7-e4e3125c5260"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.055570 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad3d61d7-d777-4115-92c7-e4e3125c5260-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.110051 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="401e6738-93d7-40d4-867e-8c68437cbad3" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.253:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.110288 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="401e6738-93d7-40d4-867e-8c68437cbad3" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.253:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.159297 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7756b9d78c-xntfr"
Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.271330 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zd2mk\" (UniqueName: \"kubernetes.io/projected/08964a7d-6cae-4d8d-8dc7-8828bb55c6b6-kube-api-access-zd2mk\") pod \"08964a7d-6cae-4d8d-8dc7-8828bb55c6b6\" (UID: \"08964a7d-6cae-4d8d-8dc7-8828bb55c6b6\") "
Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.271382 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08964a7d-6cae-4d8d-8dc7-8828bb55c6b6-config\") pod \"08964a7d-6cae-4d8d-8dc7-8828bb55c6b6\" (UID: \"08964a7d-6cae-4d8d-8dc7-8828bb55c6b6\") "
Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.271495 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/08964a7d-6cae-4d8d-8dc7-8828bb55c6b6-ovsdbserver-sb\") pod \"08964a7d-6cae-4d8d-8dc7-8828bb55c6b6\" (UID: \"08964a7d-6cae-4d8d-8dc7-8828bb55c6b6\") "
Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.271546 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/08964a7d-6cae-4d8d-8dc7-8828bb55c6b6-dns-svc\") pod \"08964a7d-6cae-4d8d-8dc7-8828bb55c6b6\" (UID: \"08964a7d-6cae-4d8d-8dc7-8828bb55c6b6\") "
Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.271589 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/08964a7d-6cae-4d8d-8dc7-8828bb55c6b6-ovsdbserver-nb\") pod \"08964a7d-6cae-4d8d-8dc7-8828bb55c6b6\" (UID: \"08964a7d-6cae-4d8d-8dc7-8828bb55c6b6\") "
Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.271624 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/08964a7d-6cae-4d8d-8dc7-8828bb55c6b6-dns-swift-storage-0\") pod \"08964a7d-6cae-4d8d-8dc7-8828bb55c6b6\" (UID: \"08964a7d-6cae-4d8d-8dc7-8828bb55c6b6\") "
Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.286105 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08964a7d-6cae-4d8d-8dc7-8828bb55c6b6-kube-api-access-zd2mk" (OuterVolumeSpecName: "kube-api-access-zd2mk") pod "08964a7d-6cae-4d8d-8dc7-8828bb55c6b6" (UID: "08964a7d-6cae-4d8d-8dc7-8828bb55c6b6"). InnerVolumeSpecName "kube-api-access-zd2mk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.292970 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.389611 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f31df8cc-85c8-4626-ab54-1a93d291f02d","Type":"ContainerStarted","Data":"113e1d3b19705c253801494357ca106e72aa1f2de77f8992627258135742aa53"}
Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.391688 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-9b6c99f6d-7zgm5" event={"ID":"ad3d61d7-d777-4115-92c7-e4e3125c5260","Type":"ContainerDied","Data":"9c7395afa41324a0f82874b3c28b6ce2289ed61f6812ec39a8b7176eb1dd6a99"}
Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.391754 4898 scope.go:117] "RemoveContainer" containerID="954199ed87793fe013823cc99558f408a84d0a4c4745073ecc940b8eedc022cd"
Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.391943 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-9b6c99f6d-7zgm5"
Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.404242 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zd2mk\" (UniqueName: \"kubernetes.io/projected/08964a7d-6cae-4d8d-8dc7-8828bb55c6b6-kube-api-access-zd2mk\") on node \"crc\" DevicePath \"\""
Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.471527 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6","Type":"ContainerDied","Data":"5815a6d36d69a2209375bea643fadce77792eface8bf7ea1c4a47338f6cf2f29"}
Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.471682 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.493468 4898 generic.go:334] "Generic (PLEG): container finished" podID="08964a7d-6cae-4d8d-8dc7-8828bb55c6b6" containerID="12cbc4c8fa51ad14fc2097d92198ad03764ae0a7349dafd16f0ac9e66b1aa21a" exitCode=0
Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.494617 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7756b9d78c-xntfr"
Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.495359 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-xntfr" event={"ID":"08964a7d-6cae-4d8d-8dc7-8828bb55c6b6","Type":"ContainerDied","Data":"12cbc4c8fa51ad14fc2097d92198ad03764ae0a7349dafd16f0ac9e66b1aa21a"}
Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.495391 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-xntfr" event={"ID":"08964a7d-6cae-4d8d-8dc7-8828bb55c6b6","Type":"ContainerDied","Data":"c39986f2a0c08da0dd84aef4031cbde75ea3fecd89eca753618403b386b96b49"}
Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.495457 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-759d64ffd4-kzp67"
Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.510957 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-combined-ca-bundle\") pod \"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6\" (UID: \"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6\") "
Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.511083 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-log-httpd\") pod \"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6\" (UID: \"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6\") "
Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.511144 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-sg-core-conf-yaml\") pod \"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6\" (UID: \"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6\") "
Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.511593 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e179d9f8-0775-4bde-9ac9-f5a4f6919fd6" (UID: "e179d9f8-0775-4bde-9ac9-f5a4f6919fd6"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.513128 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-run-httpd\") pod \"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6\" (UID: \"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6\") "
Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.513163 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-scripts\") pod \"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6\" (UID: \"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6\") "
Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.513187 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bsfjl\" (UniqueName: \"kubernetes.io/projected/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-kube-api-access-bsfjl\") pod \"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6\" (UID: \"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6\") "
Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.513262 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-config-data\") pod \"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6\" (UID: \"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6\") "
Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.513646 4898 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.514213 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e179d9f8-0775-4bde-9ac9-f5a4f6919fd6" (UID: "e179d9f8-0775-4bde-9ac9-f5a4f6919fd6"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.585555 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-scripts" (OuterVolumeSpecName: "scripts") pod "e179d9f8-0775-4bde-9ac9-f5a4f6919fd6" (UID: "e179d9f8-0775-4bde-9ac9-f5a4f6919fd6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.618639 4898 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.618683 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-scripts\") on node \"crc\" DevicePath \"\""
Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.619370 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-kube-api-access-bsfjl" (OuterVolumeSpecName: "kube-api-access-bsfjl") pod "e179d9f8-0775-4bde-9ac9-f5a4f6919fd6" (UID: "e179d9f8-0775-4bde-9ac9-f5a4f6919fd6"). InnerVolumeSpecName "kube-api-access-bsfjl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.726473 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bsfjl\" (UniqueName: \"kubernetes.io/projected/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-kube-api-access-bsfjl\") on node \"crc\" DevicePath \"\""
Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.774569 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08964a7d-6cae-4d8d-8dc7-8828bb55c6b6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "08964a7d-6cae-4d8d-8dc7-8828bb55c6b6" (UID: "08964a7d-6cae-4d8d-8dc7-8828bb55c6b6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.793104 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08964a7d-6cae-4d8d-8dc7-8828bb55c6b6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "08964a7d-6cae-4d8d-8dc7-8828bb55c6b6" (UID: "08964a7d-6cae-4d8d-8dc7-8828bb55c6b6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.818918 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08964a7d-6cae-4d8d-8dc7-8828bb55c6b6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "08964a7d-6cae-4d8d-8dc7-8828bb55c6b6" (UID: "08964a7d-6cae-4d8d-8dc7-8828bb55c6b6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.831858 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/08964a7d-6cae-4d8d-8dc7-8828bb55c6b6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.831884 4898 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/08964a7d-6cae-4d8d-8dc7-8828bb55c6b6-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.832035 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/08964a7d-6cae-4d8d-8dc7-8828bb55c6b6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.856090 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e179d9f8-0775-4bde-9ac9-f5a4f6919fd6" (UID: "e179d9f8-0775-4bde-9ac9-f5a4f6919fd6"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.875297 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08964a7d-6cae-4d8d-8dc7-8828bb55c6b6-config" (OuterVolumeSpecName: "config") pod "08964a7d-6cae-4d8d-8dc7-8828bb55c6b6" (UID: "08964a7d-6cae-4d8d-8dc7-8828bb55c6b6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.939187 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08964a7d-6cae-4d8d-8dc7-8828bb55c6b6-config\") on node \"crc\" DevicePath \"\""
Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.939229 4898 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.940559 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08964a7d-6cae-4d8d-8dc7-8828bb55c6b6-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "08964a7d-6cae-4d8d-8dc7-8828bb55c6b6" (UID: "08964a7d-6cae-4d8d-8dc7-8828bb55c6b6"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.001820 4898 scope.go:117] "RemoveContainer" containerID="e1d215889c17e15d0414497f1d056c6cbb6a9ad029cf99e79cd2f78d36843f9a"
Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.021052 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e179d9f8-0775-4bde-9ac9-f5a4f6919fd6" (UID: "e179d9f8-0775-4bde-9ac9-f5a4f6919fd6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.031999 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-9b6c99f6d-7zgm5"]
Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.046092 4898 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/08964a7d-6cae-4d8d-8dc7-8828bb55c6b6-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.046123 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.049090 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-9b6c99f6d-7zgm5"]
Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.058072 4898 scope.go:117] "RemoveContainer" containerID="78de73e116b8c904046e3b2e47974a2196e68f801e439ea9db14e62605f5dfde"
Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.071000 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-759d64ffd4-kzp67"]
Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.079834 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-759d64ffd4-kzp67"]
Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.088091 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-config-data" (OuterVolumeSpecName: "config-data") pod "e179d9f8-0775-4bde-9ac9-f5a4f6919fd6" (UID: "e179d9f8-0775-4bde-9ac9-f5a4f6919fd6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.124053 4898 scope.go:117] "RemoveContainer" containerID="e7eb04ad73d23fba07486739b6ad6f87229d6c54e9915ba0f9d02034ddc99c49"
Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.147881 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.155357 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-xntfr"]
Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.165878 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-xntfr"]
Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.175710 4898 scope.go:117] "RemoveContainer" containerID="4bb36715efc02b13f9a2428d735fb39dbbe5a666eac1302af323de3923739e47"
Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.266650 4898 scope.go:117] "RemoveContainer" containerID="12cbc4c8fa51ad14fc2097d92198ad03764ae0a7349dafd16f0ac9e66b1aa21a"
Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.298136 4898 scope.go:117] "RemoveContainer" containerID="1580c7298d78acec59a4cdc1605e81e7b25fff7c075f4562e2ffb1fcb708ba61"
Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.366196 4898 scope.go:117] "RemoveContainer" containerID="12cbc4c8fa51ad14fc2097d92198ad03764ae0a7349dafd16f0ac9e66b1aa21a"
Mar 13 14:23:18 crc kubenswrapper[4898]: E0313 14:23:18.366753 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12cbc4c8fa51ad14fc2097d92198ad03764ae0a7349dafd16f0ac9e66b1aa21a\": container with ID starting with 12cbc4c8fa51ad14fc2097d92198ad03764ae0a7349dafd16f0ac9e66b1aa21a not found: ID does not exist" containerID="12cbc4c8fa51ad14fc2097d92198ad03764ae0a7349dafd16f0ac9e66b1aa21a"
Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.366830 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12cbc4c8fa51ad14fc2097d92198ad03764ae0a7349dafd16f0ac9e66b1aa21a"} err="failed to get container status \"12cbc4c8fa51ad14fc2097d92198ad03764ae0a7349dafd16f0ac9e66b1aa21a\": rpc error: code = NotFound desc = could not find container \"12cbc4c8fa51ad14fc2097d92198ad03764ae0a7349dafd16f0ac9e66b1aa21a\": container with ID starting with 12cbc4c8fa51ad14fc2097d92198ad03764ae0a7349dafd16f0ac9e66b1aa21a not found: ID does not exist"
Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.366865 4898 scope.go:117] "RemoveContainer" containerID="1580c7298d78acec59a4cdc1605e81e7b25fff7c075f4562e2ffb1fcb708ba61"
Mar 13 14:23:18 crc kubenswrapper[4898]: E0313 14:23:18.367546 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1580c7298d78acec59a4cdc1605e81e7b25fff7c075f4562e2ffb1fcb708ba61\": container with ID starting with 1580c7298d78acec59a4cdc1605e81e7b25fff7c075f4562e2ffb1fcb708ba61 not found: ID does not exist" containerID="1580c7298d78acec59a4cdc1605e81e7b25fff7c075f4562e2ffb1fcb708ba61"
Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.367575 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1580c7298d78acec59a4cdc1605e81e7b25fff7c075f4562e2ffb1fcb708ba61"} err="failed to get container status \"1580c7298d78acec59a4cdc1605e81e7b25fff7c075f4562e2ffb1fcb708ba61\": rpc error: code = NotFound desc = could not find container \"1580c7298d78acec59a4cdc1605e81e7b25fff7c075f4562e2ffb1fcb708ba61\": container with ID starting with 1580c7298d78acec59a4cdc1605e81e7b25fff7c075f4562e2ffb1fcb708ba61 not found: ID does not exist"
Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.447826 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.466585 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.485866 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 13 14:23:18 crc kubenswrapper[4898]: E0313 14:23:18.486474 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e179d9f8-0775-4bde-9ac9-f5a4f6919fd6" containerName="ceilometer-central-agent"
Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.486490 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="e179d9f8-0775-4bde-9ac9-f5a4f6919fd6" containerName="ceilometer-central-agent"
Mar 13 14:23:18 crc kubenswrapper[4898]: E0313 14:23:18.486512 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e53d1b61-e0c8-4c10-85bf-1c0f67009a24" containerName="heat-cfnapi"
Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.486519 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="e53d1b61-e0c8-4c10-85bf-1c0f67009a24" containerName="heat-cfnapi"
Mar 13 14:23:18 crc kubenswrapper[4898]: E0313 14:23:18.486535 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08964a7d-6cae-4d8d-8dc7-8828bb55c6b6" containerName="dnsmasq-dns"
Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.486540 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="08964a7d-6cae-4d8d-8dc7-8828bb55c6b6" containerName="dnsmasq-dns"
Mar 13 14:23:18 crc kubenswrapper[4898]: E0313 14:23:18.486552 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08964a7d-6cae-4d8d-8dc7-8828bb55c6b6" containerName="init"
Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.486558 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="08964a7d-6cae-4d8d-8dc7-8828bb55c6b6" containerName="init"
Mar 13 14:23:18 crc kubenswrapper[4898]: E0313 14:23:18.486566 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e179d9f8-0775-4bde-9ac9-f5a4f6919fd6" containerName="proxy-httpd"
Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.486573 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="e179d9f8-0775-4bde-9ac9-f5a4f6919fd6" containerName="proxy-httpd"
Mar 13 14:23:18 crc kubenswrapper[4898]: E0313 14:23:18.486603 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e179d9f8-0775-4bde-9ac9-f5a4f6919fd6" containerName="ceilometer-notification-agent"
Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.486609 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="e179d9f8-0775-4bde-9ac9-f5a4f6919fd6" containerName="ceilometer-notification-agent"
Mar 13 14:23:18 crc kubenswrapper[4898]: E0313 14:23:18.486616 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad3d61d7-d777-4115-92c7-e4e3125c5260" containerName="heat-api"
Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.486622 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad3d61d7-d777-4115-92c7-e4e3125c5260" containerName="heat-api"
Mar 13 14:23:18 crc kubenswrapper[4898]: E0313 14:23:18.486631 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e179d9f8-0775-4bde-9ac9-f5a4f6919fd6" containerName="sg-core"
Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.486638 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="e179d9f8-0775-4bde-9ac9-f5a4f6919fd6" containerName="sg-core"
Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.486833 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="e179d9f8-0775-4bde-9ac9-f5a4f6919fd6" containerName="ceilometer-notification-agent"
Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.486856 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="e179d9f8-0775-4bde-9ac9-f5a4f6919fd6" containerName="proxy-httpd"
Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.486884 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="e53d1b61-e0c8-4c10-85bf-1c0f67009a24" containerName="heat-cfnapi"
Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.486916 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="e179d9f8-0775-4bde-9ac9-f5a4f6919fd6" containerName="ceilometer-central-agent"
Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.486929 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad3d61d7-d777-4115-92c7-e4e3125c5260" containerName="heat-api"
Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.486944 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="08964a7d-6cae-4d8d-8dc7-8828bb55c6b6" containerName="dnsmasq-dns"
Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.486967 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="e179d9f8-0775-4bde-9ac9-f5a4f6919fd6" containerName="sg-core"
Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.489285 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.493196 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.493451 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.501582 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.534499 4898 generic.go:334] "Generic (PLEG): container finished" podID="bff908e4-09f4-490b-9b9c-ef65c6224eeb" containerID="7e9f2307a91699c726a3f93d044663fb844450acd6da5dd38c51549451b97bc8" exitCode=0
Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.534606 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-llbn5" event={"ID":"bff908e4-09f4-490b-9b9c-ef65c6224eeb","Type":"ContainerDied","Data":"7e9f2307a91699c726a3f93d044663fb844450acd6da5dd38c51549451b97bc8"}
Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.554295 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f31df8cc-85c8-4626-ab54-1a93d291f02d","Type":"ContainerStarted","Data":"ab8a9704929696234d2b66d5a8c2c31485aa1055e802133753ca009e70702f0a"}
Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.554358 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f31df8cc-85c8-4626-ab54-1a93d291f02d","Type":"ContainerStarted","Data":"ebc50c2130d8c479a6a82fdc2a6a26b1d7c6228aebf42d75caeb8e90cffc4865"}
Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.557584 4898 generic.go:334] "Generic (PLEG): container finished" podID="04183e35-79b0-4c76-b538-b5b71299cd92" containerID="74e75aa197a91664f89edd48f01ff5c813660e7743cf922315f4b6a18e19c506" exitCode=0
Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.558141 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-76lrv" event={"ID":"04183e35-79b0-4c76-b538-b5b71299cd92","Type":"ContainerDied","Data":"74e75aa197a91664f89edd48f01ff5c813660e7743cf922315f4b6a18e19c506"}
Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.560640 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-log-httpd\") pod \"ceilometer-0\" (UID: \"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5\") " pod="openstack/ceilometer-0"
Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.560733 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pznzq\" (UniqueName: \"kubernetes.io/projected/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-kube-api-access-pznzq\") pod \"ceilometer-0\" (UID: \"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5\") " pod="openstack/ceilometer-0"
Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.560777 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5\") " pod="openstack/ceilometer-0"
Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.560927 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-run-httpd\") pod \"ceilometer-0\" (UID: \"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5\") " pod="openstack/ceilometer-0"
Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.560990 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-config-data\") pod \"ceilometer-0\" (UID: \"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5\") " pod="openstack/ceilometer-0"
Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.561050 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5\") " pod="openstack/ceilometer-0"
Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.561115 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-scripts\") pod \"ceilometer-0\" (UID: \"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5\") " pod="openstack/ceilometer-0"
Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.578036 4898
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.578013359 podStartE2EDuration="3.578013359s" podCreationTimestamp="2026-03-13 14:23:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:23:18.57384107 +0000 UTC m=+1633.575429309" watchObservedRunningTime="2026-03-13 14:23:18.578013359 +0000 UTC m=+1633.579601588" Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.664528 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-config-data\") pod \"ceilometer-0\" (UID: \"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5\") " pod="openstack/ceilometer-0" Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.667199 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5\") " pod="openstack/ceilometer-0" Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.667355 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-scripts\") pod \"ceilometer-0\" (UID: \"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5\") " pod="openstack/ceilometer-0" Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.667639 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-log-httpd\") pod \"ceilometer-0\" (UID: \"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5\") " pod="openstack/ceilometer-0" Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.667714 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-pznzq\" (UniqueName: \"kubernetes.io/projected/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-kube-api-access-pznzq\") pod \"ceilometer-0\" (UID: \"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5\") " pod="openstack/ceilometer-0" Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.667765 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5\") " pod="openstack/ceilometer-0" Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.668010 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-run-httpd\") pod \"ceilometer-0\" (UID: \"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5\") " pod="openstack/ceilometer-0" Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.668705 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-run-httpd\") pod \"ceilometer-0\" (UID: \"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5\") " pod="openstack/ceilometer-0" Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.671862 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-log-httpd\") pod \"ceilometer-0\" (UID: \"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5\") " pod="openstack/ceilometer-0" Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.672991 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-config-data\") pod \"ceilometer-0\" (UID: \"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5\") " 
pod="openstack/ceilometer-0" Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.674282 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-scripts\") pod \"ceilometer-0\" (UID: \"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5\") " pod="openstack/ceilometer-0" Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.675170 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5\") " pod="openstack/ceilometer-0" Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.688478 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5\") " pod="openstack/ceilometer-0" Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.688758 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pznzq\" (UniqueName: \"kubernetes.io/projected/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-kube-api-access-pznzq\") pod \"ceilometer-0\" (UID: \"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5\") " pod="openstack/ceilometer-0" Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.808723 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:23:19 crc kubenswrapper[4898]: I0313 14:23:19.424707 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:23:19 crc kubenswrapper[4898]: I0313 14:23:19.568609 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5","Type":"ContainerStarted","Data":"1e1a0e67a834a696d965eab6f5a0ade85145365bbc3c3bccceb0280541a0e5ed"} Mar 13 14:23:19 crc kubenswrapper[4898]: I0313 14:23:19.761555 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08964a7d-6cae-4d8d-8dc7-8828bb55c6b6" path="/var/lib/kubelet/pods/08964a7d-6cae-4d8d-8dc7-8828bb55c6b6/volumes" Mar 13 14:23:19 crc kubenswrapper[4898]: I0313 14:23:19.762611 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad3d61d7-d777-4115-92c7-e4e3125c5260" path="/var/lib/kubelet/pods/ad3d61d7-d777-4115-92c7-e4e3125c5260/volumes" Mar 13 14:23:19 crc kubenswrapper[4898]: I0313 14:23:19.763254 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e179d9f8-0775-4bde-9ac9-f5a4f6919fd6" path="/var/lib/kubelet/pods/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6/volumes" Mar 13 14:23:19 crc kubenswrapper[4898]: I0313 14:23:19.764526 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e53d1b61-e0c8-4c10-85bf-1c0f67009a24" path="/var/lib/kubelet/pods/e53d1b61-e0c8-4c10-85bf-1c0f67009a24/volumes" Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.159161 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-llbn5" Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.239399 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bff908e4-09f4-490b-9b9c-ef65c6224eeb-combined-ca-bundle\") pod \"bff908e4-09f4-490b-9b9c-ef65c6224eeb\" (UID: \"bff908e4-09f4-490b-9b9c-ef65c6224eeb\") " Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.239663 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfnmd\" (UniqueName: \"kubernetes.io/projected/bff908e4-09f4-490b-9b9c-ef65c6224eeb-kube-api-access-nfnmd\") pod \"bff908e4-09f4-490b-9b9c-ef65c6224eeb\" (UID: \"bff908e4-09f4-490b-9b9c-ef65c6224eeb\") " Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.239859 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bff908e4-09f4-490b-9b9c-ef65c6224eeb-scripts\") pod \"bff908e4-09f4-490b-9b9c-ef65c6224eeb\" (UID: \"bff908e4-09f4-490b-9b9c-ef65c6224eeb\") " Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.240078 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bff908e4-09f4-490b-9b9c-ef65c6224eeb-config-data\") pod \"bff908e4-09f4-490b-9b9c-ef65c6224eeb\" (UID: \"bff908e4-09f4-490b-9b9c-ef65c6224eeb\") " Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.269277 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bff908e4-09f4-490b-9b9c-ef65c6224eeb-kube-api-access-nfnmd" (OuterVolumeSpecName: "kube-api-access-nfnmd") pod "bff908e4-09f4-490b-9b9c-ef65c6224eeb" (UID: "bff908e4-09f4-490b-9b9c-ef65c6224eeb"). InnerVolumeSpecName "kube-api-access-nfnmd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.277362 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bff908e4-09f4-490b-9b9c-ef65c6224eeb-scripts" (OuterVolumeSpecName: "scripts") pod "bff908e4-09f4-490b-9b9c-ef65c6224eeb" (UID: "bff908e4-09f4-490b-9b9c-ef65c6224eeb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.343412 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nfnmd\" (UniqueName: \"kubernetes.io/projected/bff908e4-09f4-490b-9b9c-ef65c6224eeb-kube-api-access-nfnmd\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.343624 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bff908e4-09f4-490b-9b9c-ef65c6224eeb-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.395457 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bff908e4-09f4-490b-9b9c-ef65c6224eeb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bff908e4-09f4-490b-9b9c-ef65c6224eeb" (UID: "bff908e4-09f4-490b-9b9c-ef65c6224eeb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.396219 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bff908e4-09f4-490b-9b9c-ef65c6224eeb-config-data" (OuterVolumeSpecName: "config-data") pod "bff908e4-09f4-490b-9b9c-ef65c6224eeb" (UID: "bff908e4-09f4-490b-9b9c-ef65c6224eeb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.445765 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bff908e4-09f4-490b-9b9c-ef65c6224eeb-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.446778 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bff908e4-09f4-490b-9b9c-ef65c6224eeb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.517129 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-76lrv" Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.549275 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92r4m\" (UniqueName: \"kubernetes.io/projected/04183e35-79b0-4c76-b538-b5b71299cd92-kube-api-access-92r4m\") pod \"04183e35-79b0-4c76-b538-b5b71299cd92\" (UID: \"04183e35-79b0-4c76-b538-b5b71299cd92\") " Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.549536 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04183e35-79b0-4c76-b538-b5b71299cd92-config-data\") pod \"04183e35-79b0-4c76-b538-b5b71299cd92\" (UID: \"04183e35-79b0-4c76-b538-b5b71299cd92\") " Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.549712 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04183e35-79b0-4c76-b538-b5b71299cd92-combined-ca-bundle\") pod \"04183e35-79b0-4c76-b538-b5b71299cd92\" (UID: \"04183e35-79b0-4c76-b538-b5b71299cd92\") " Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.549967 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/04183e35-79b0-4c76-b538-b5b71299cd92-scripts\") pod \"04183e35-79b0-4c76-b538-b5b71299cd92\" (UID: \"04183e35-79b0-4c76-b538-b5b71299cd92\") " Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.554492 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04183e35-79b0-4c76-b538-b5b71299cd92-scripts" (OuterVolumeSpecName: "scripts") pod "04183e35-79b0-4c76-b538-b5b71299cd92" (UID: "04183e35-79b0-4c76-b538-b5b71299cd92"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.558190 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04183e35-79b0-4c76-b538-b5b71299cd92-kube-api-access-92r4m" (OuterVolumeSpecName: "kube-api-access-92r4m") pod "04183e35-79b0-4c76-b538-b5b71299cd92" (UID: "04183e35-79b0-4c76-b538-b5b71299cd92"). InnerVolumeSpecName "kube-api-access-92r4m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.614082 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04183e35-79b0-4c76-b538-b5b71299cd92-config-data" (OuterVolumeSpecName: "config-data") pod "04183e35-79b0-4c76-b538-b5b71299cd92" (UID: "04183e35-79b0-4c76-b538-b5b71299cd92"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.614673 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04183e35-79b0-4c76-b538-b5b71299cd92-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "04183e35-79b0-4c76-b538-b5b71299cd92" (UID: "04183e35-79b0-4c76-b538-b5b71299cd92"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.617724 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-76lrv" event={"ID":"04183e35-79b0-4c76-b538-b5b71299cd92","Type":"ContainerDied","Data":"8a577eea2ae5147af3f287388f18da3286a949f40aa4ca2fbd11e7f3d483f618"} Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.617960 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a577eea2ae5147af3f287388f18da3286a949f40aa4ca2fbd11e7f3d483f618" Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.618134 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-76lrv" Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.632970 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-llbn5" event={"ID":"bff908e4-09f4-490b-9b9c-ef65c6224eeb","Type":"ContainerDied","Data":"bf25202e858a5aee252a5d091c6efe401cd4b61d92f156b22eb7c6a08833446f"} Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.633043 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf25202e858a5aee252a5d091c6efe401cd4b61d92f156b22eb7c6a08833446f" Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.633153 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-llbn5" Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.644205 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 13 14:23:20 crc kubenswrapper[4898]: E0313 14:23:20.644948 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04183e35-79b0-4c76-b538-b5b71299cd92" containerName="nova-manage" Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.645039 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="04183e35-79b0-4c76-b538-b5b71299cd92" containerName="nova-manage" Mar 13 14:23:20 crc kubenswrapper[4898]: E0313 14:23:20.645177 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bff908e4-09f4-490b-9b9c-ef65c6224eeb" containerName="nova-cell1-conductor-db-sync" Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.645253 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="bff908e4-09f4-490b-9b9c-ef65c6224eeb" containerName="nova-cell1-conductor-db-sync" Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.645558 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="04183e35-79b0-4c76-b538-b5b71299cd92" containerName="nova-manage" Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.645640 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="bff908e4-09f4-490b-9b9c-ef65c6224eeb" containerName="nova-cell1-conductor-db-sync" Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.646545 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.647236 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5","Type":"ContainerStarted","Data":"a52bb77ac792d2e28bb8d969def76aa20c8d38d8dddae8c183a41016f4efeb7f"} Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.654501 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50cbae0e-4bf9-41b0-8c87-b551f782aecf-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"50cbae0e-4bf9-41b0-8c87-b551f782aecf\") " pod="openstack/nova-cell1-conductor-0" Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.654569 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7ppd\" (UniqueName: \"kubernetes.io/projected/50cbae0e-4bf9-41b0-8c87-b551f782aecf-kube-api-access-q7ppd\") pod \"nova-cell1-conductor-0\" (UID: \"50cbae0e-4bf9-41b0-8c87-b551f782aecf\") " pod="openstack/nova-cell1-conductor-0" Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.654592 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50cbae0e-4bf9-41b0-8c87-b551f782aecf-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"50cbae0e-4bf9-41b0-8c87-b551f782aecf\") " pod="openstack/nova-cell1-conductor-0" Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.654960 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04183e35-79b0-4c76-b538-b5b71299cd92-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.655041 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-92r4m\" (UniqueName: 
\"kubernetes.io/projected/04183e35-79b0-4c76-b538-b5b71299cd92-kube-api-access-92r4m\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.655131 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04183e35-79b0-4c76-b538-b5b71299cd92-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.655212 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04183e35-79b0-4c76-b538-b5b71299cd92-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.696005 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.760820 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50cbae0e-4bf9-41b0-8c87-b551f782aecf-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"50cbae0e-4bf9-41b0-8c87-b551f782aecf\") " pod="openstack/nova-cell1-conductor-0" Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.761180 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7ppd\" (UniqueName: \"kubernetes.io/projected/50cbae0e-4bf9-41b0-8c87-b551f782aecf-kube-api-access-q7ppd\") pod \"nova-cell1-conductor-0\" (UID: \"50cbae0e-4bf9-41b0-8c87-b551f782aecf\") " pod="openstack/nova-cell1-conductor-0" Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.762005 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50cbae0e-4bf9-41b0-8c87-b551f782aecf-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"50cbae0e-4bf9-41b0-8c87-b551f782aecf\") " pod="openstack/nova-cell1-conductor-0" Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 
14:23:20.766569 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50cbae0e-4bf9-41b0-8c87-b551f782aecf-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"50cbae0e-4bf9-41b0-8c87-b551f782aecf\") " pod="openstack/nova-cell1-conductor-0" Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.767709 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50cbae0e-4bf9-41b0-8c87-b551f782aecf-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"50cbae0e-4bf9-41b0-8c87-b551f782aecf\") " pod="openstack/nova-cell1-conductor-0" Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.777097 4898 scope.go:117] "RemoveContainer" containerID="31ae8593afc62c01205d22940ebe7136407da3a6c051010917ba949bb52866cc" Mar 13 14:23:20 crc kubenswrapper[4898]: E0313 14:23:20.777757 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.800293 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7ppd\" (UniqueName: \"kubernetes.io/projected/50cbae0e-4bf9-41b0-8c87-b551f782aecf-kube-api-access-q7ppd\") pod \"nova-cell1-conductor-0\" (UID: \"50cbae0e-4bf9-41b0-8c87-b551f782aecf\") " pod="openstack/nova-cell1-conductor-0" Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.862032 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.862474 4898 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/nova-api-0" podUID="401e6738-93d7-40d4-867e-8c68437cbad3" containerName="nova-api-log" containerID="cri-o://4172774e3d35f2447bfbdab8c2932d0b6b6936e1f5596cbc2f41a9105e3cedbe" gracePeriod=30 Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.862973 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="401e6738-93d7-40d4-867e-8c68437cbad3" containerName="nova-api-api" containerID="cri-o://6a3356dec1914d7176270b5cf94ad8c6d3be50dc70db5f963643ba9ceaa22838" gracePeriod=30 Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.882179 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.882370 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="0769b03d-29b4-4519-abc7-408431328276" containerName="nova-scheduler-scheduler" containerID="cri-o://a0be402bfe00c68e23ab47a73d2d201566aad9d451ecaff23e8fc2d99923064b" gracePeriod=30 Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.903226 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.903415 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="f31df8cc-85c8-4626-ab54-1a93d291f02d" containerName="nova-metadata-log" containerID="cri-o://ebc50c2130d8c479a6a82fdc2a6a26b1d7c6228aebf42d75caeb8e90cffc4865" gracePeriod=30 Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.903978 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="f31df8cc-85c8-4626-ab54-1a93d291f02d" containerName="nova-metadata-metadata" containerID="cri-o://ab8a9704929696234d2b66d5a8c2c31485aa1055e802133753ca009e70702f0a" gracePeriod=30 Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 
14:23:20.948825 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 13 14:23:21 crc kubenswrapper[4898]: W0313 14:23:21.513832 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50cbae0e_4bf9_41b0_8c87_b551f782aecf.slice/crio-cdecfdf75d36df2ca849b21d053f771e4d9b2a154e706d876c8e32ef8ad578be WatchSource:0}: Error finding container cdecfdf75d36df2ca849b21d053f771e4d9b2a154e706d876c8e32ef8ad578be: Status 404 returned error can't find the container with id cdecfdf75d36df2ca849b21d053f771e4d9b2a154e706d876c8e32ef8ad578be Mar 13 14:23:21 crc kubenswrapper[4898]: I0313 14:23:21.514798 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 13 14:23:21 crc kubenswrapper[4898]: I0313 14:23:21.664833 4898 generic.go:334] "Generic (PLEG): container finished" podID="401e6738-93d7-40d4-867e-8c68437cbad3" containerID="4172774e3d35f2447bfbdab8c2932d0b6b6936e1f5596cbc2f41a9105e3cedbe" exitCode=143 Mar 13 14:23:21 crc kubenswrapper[4898]: I0313 14:23:21.664912 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"401e6738-93d7-40d4-867e-8c68437cbad3","Type":"ContainerDied","Data":"4172774e3d35f2447bfbdab8c2932d0b6b6936e1f5596cbc2f41a9105e3cedbe"} Mar 13 14:23:21 crc kubenswrapper[4898]: I0313 14:23:21.666483 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5","Type":"ContainerStarted","Data":"74f0d5a974f39e0ea6ad714deeedb29ad931cd8927cfe42cce60108477a37a86"} Mar 13 14:23:21 crc kubenswrapper[4898]: I0313 14:23:21.668416 4898 generic.go:334] "Generic (PLEG): container finished" podID="f31df8cc-85c8-4626-ab54-1a93d291f02d" containerID="ab8a9704929696234d2b66d5a8c2c31485aa1055e802133753ca009e70702f0a" exitCode=0 Mar 13 14:23:21 crc kubenswrapper[4898]: I0313 14:23:21.668457 
4898 generic.go:334] "Generic (PLEG): container finished" podID="f31df8cc-85c8-4626-ab54-1a93d291f02d" containerID="ebc50c2130d8c479a6a82fdc2a6a26b1d7c6228aebf42d75caeb8e90cffc4865" exitCode=143 Mar 13 14:23:21 crc kubenswrapper[4898]: I0313 14:23:21.668492 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f31df8cc-85c8-4626-ab54-1a93d291f02d","Type":"ContainerDied","Data":"ab8a9704929696234d2b66d5a8c2c31485aa1055e802133753ca009e70702f0a"} Mar 13 14:23:21 crc kubenswrapper[4898]: I0313 14:23:21.668512 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f31df8cc-85c8-4626-ab54-1a93d291f02d","Type":"ContainerDied","Data":"ebc50c2130d8c479a6a82fdc2a6a26b1d7c6228aebf42d75caeb8e90cffc4865"} Mar 13 14:23:21 crc kubenswrapper[4898]: I0313 14:23:21.677868 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"50cbae0e-4bf9-41b0-8c87-b551f782aecf","Type":"ContainerStarted","Data":"cdecfdf75d36df2ca849b21d053f771e4d9b2a154e706d876c8e32ef8ad578be"} Mar 13 14:23:21 crc kubenswrapper[4898]: I0313 14:23:21.954243 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 14:23:22 crc kubenswrapper[4898]: I0313 14:23:22.005056 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f31df8cc-85c8-4626-ab54-1a93d291f02d-combined-ca-bundle\") pod \"f31df8cc-85c8-4626-ab54-1a93d291f02d\" (UID: \"f31df8cc-85c8-4626-ab54-1a93d291f02d\") " Mar 13 14:23:22 crc kubenswrapper[4898]: I0313 14:23:22.005165 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f31df8cc-85c8-4626-ab54-1a93d291f02d-logs\") pod \"f31df8cc-85c8-4626-ab54-1a93d291f02d\" (UID: \"f31df8cc-85c8-4626-ab54-1a93d291f02d\") " Mar 13 14:23:22 crc kubenswrapper[4898]: I0313 14:23:22.005359 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f31df8cc-85c8-4626-ab54-1a93d291f02d-nova-metadata-tls-certs\") pod \"f31df8cc-85c8-4626-ab54-1a93d291f02d\" (UID: \"f31df8cc-85c8-4626-ab54-1a93d291f02d\") " Mar 13 14:23:22 crc kubenswrapper[4898]: I0313 14:23:22.005423 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f31df8cc-85c8-4626-ab54-1a93d291f02d-config-data\") pod \"f31df8cc-85c8-4626-ab54-1a93d291f02d\" (UID: \"f31df8cc-85c8-4626-ab54-1a93d291f02d\") " Mar 13 14:23:22 crc kubenswrapper[4898]: I0313 14:23:22.005499 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvb8h\" (UniqueName: \"kubernetes.io/projected/f31df8cc-85c8-4626-ab54-1a93d291f02d-kube-api-access-pvb8h\") pod \"f31df8cc-85c8-4626-ab54-1a93d291f02d\" (UID: \"f31df8cc-85c8-4626-ab54-1a93d291f02d\") " Mar 13 14:23:22 crc kubenswrapper[4898]: I0313 14:23:22.005610 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/f31df8cc-85c8-4626-ab54-1a93d291f02d-logs" (OuterVolumeSpecName: "logs") pod "f31df8cc-85c8-4626-ab54-1a93d291f02d" (UID: "f31df8cc-85c8-4626-ab54-1a93d291f02d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:23:22 crc kubenswrapper[4898]: I0313 14:23:22.012581 4898 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f31df8cc-85c8-4626-ab54-1a93d291f02d-logs\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:22 crc kubenswrapper[4898]: I0313 14:23:22.017273 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f31df8cc-85c8-4626-ab54-1a93d291f02d-kube-api-access-pvb8h" (OuterVolumeSpecName: "kube-api-access-pvb8h") pod "f31df8cc-85c8-4626-ab54-1a93d291f02d" (UID: "f31df8cc-85c8-4626-ab54-1a93d291f02d"). InnerVolumeSpecName "kube-api-access-pvb8h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:23:22 crc kubenswrapper[4898]: I0313 14:23:22.079273 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f31df8cc-85c8-4626-ab54-1a93d291f02d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f31df8cc-85c8-4626-ab54-1a93d291f02d" (UID: "f31df8cc-85c8-4626-ab54-1a93d291f02d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:23:22 crc kubenswrapper[4898]: I0313 14:23:22.093501 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f31df8cc-85c8-4626-ab54-1a93d291f02d-config-data" (OuterVolumeSpecName: "config-data") pod "f31df8cc-85c8-4626-ab54-1a93d291f02d" (UID: "f31df8cc-85c8-4626-ab54-1a93d291f02d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:23:22 crc kubenswrapper[4898]: I0313 14:23:22.102010 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f31df8cc-85c8-4626-ab54-1a93d291f02d-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "f31df8cc-85c8-4626-ab54-1a93d291f02d" (UID: "f31df8cc-85c8-4626-ab54-1a93d291f02d"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:23:22 crc kubenswrapper[4898]: I0313 14:23:22.115145 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f31df8cc-85c8-4626-ab54-1a93d291f02d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:22 crc kubenswrapper[4898]: I0313 14:23:22.115191 4898 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f31df8cc-85c8-4626-ab54-1a93d291f02d-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:22 crc kubenswrapper[4898]: I0313 14:23:22.115204 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f31df8cc-85c8-4626-ab54-1a93d291f02d-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:22 crc kubenswrapper[4898]: I0313 14:23:22.115212 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pvb8h\" (UniqueName: \"kubernetes.io/projected/f31df8cc-85c8-4626-ab54-1a93d291f02d-kube-api-access-pvb8h\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:22 crc kubenswrapper[4898]: I0313 14:23:22.595328 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:23:22 crc kubenswrapper[4898]: I0313 14:23:22.692451 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"f31df8cc-85c8-4626-ab54-1a93d291f02d","Type":"ContainerDied","Data":"113e1d3b19705c253801494357ca106e72aa1f2de77f8992627258135742aa53"} Mar 13 14:23:22 crc kubenswrapper[4898]: I0313 14:23:22.692512 4898 scope.go:117] "RemoveContainer" containerID="ab8a9704929696234d2b66d5a8c2c31485aa1055e802133753ca009e70702f0a" Mar 13 14:23:22 crc kubenswrapper[4898]: I0313 14:23:22.692515 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 14:23:22 crc kubenswrapper[4898]: I0313 14:23:22.700356 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"50cbae0e-4bf9-41b0-8c87-b551f782aecf","Type":"ContainerStarted","Data":"1d14da9968bc6e97bcfbe55c9caf15ce609e48be0d9dbf281507a3e38bdfb77b"} Mar 13 14:23:22 crc kubenswrapper[4898]: I0313 14:23:22.703855 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5","Type":"ContainerStarted","Data":"3bed3d1e809327f9581d0f282bdba9c8f4826f7960e98f95fb41d7bf1c602dc9"} Mar 13 14:23:22 crc kubenswrapper[4898]: I0313 14:23:22.724325 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.724298754 podStartE2EDuration="2.724298754s" podCreationTimestamp="2026-03-13 14:23:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:23:22.714270184 +0000 UTC m=+1637.715858423" watchObservedRunningTime="2026-03-13 14:23:22.724298754 +0000 UTC m=+1637.725886993" Mar 13 14:23:22 crc kubenswrapper[4898]: I0313 14:23:22.751238 4898 scope.go:117] "RemoveContainer" containerID="ebc50c2130d8c479a6a82fdc2a6a26b1d7c6228aebf42d75caeb8e90cffc4865" Mar 13 14:23:22 crc kubenswrapper[4898]: I0313 14:23:22.774811 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-metadata-0"] Mar 13 14:23:22 crc kubenswrapper[4898]: I0313 14:23:22.800682 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 14:23:22 crc kubenswrapper[4898]: I0313 14:23:22.816114 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 13 14:23:22 crc kubenswrapper[4898]: E0313 14:23:22.817085 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f31df8cc-85c8-4626-ab54-1a93d291f02d" containerName="nova-metadata-log" Mar 13 14:23:22 crc kubenswrapper[4898]: I0313 14:23:22.817105 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="f31df8cc-85c8-4626-ab54-1a93d291f02d" containerName="nova-metadata-log" Mar 13 14:23:22 crc kubenswrapper[4898]: E0313 14:23:22.817128 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f31df8cc-85c8-4626-ab54-1a93d291f02d" containerName="nova-metadata-metadata" Mar 13 14:23:22 crc kubenswrapper[4898]: I0313 14:23:22.817138 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="f31df8cc-85c8-4626-ab54-1a93d291f02d" containerName="nova-metadata-metadata" Mar 13 14:23:22 crc kubenswrapper[4898]: I0313 14:23:22.817498 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="f31df8cc-85c8-4626-ab54-1a93d291f02d" containerName="nova-metadata-metadata" Mar 13 14:23:22 crc kubenswrapper[4898]: I0313 14:23:22.817528 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="f31df8cc-85c8-4626-ab54-1a93d291f02d" containerName="nova-metadata-log" Mar 13 14:23:22 crc kubenswrapper[4898]: I0313 14:23:22.822319 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 14:23:22 crc kubenswrapper[4898]: I0313 14:23:22.825840 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 13 14:23:22 crc kubenswrapper[4898]: I0313 14:23:22.826214 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 13 14:23:22 crc kubenswrapper[4898]: I0313 14:23:22.841881 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 14:23:22 crc kubenswrapper[4898]: I0313 14:23:22.935996 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8db736b-00b7-4251-a667-3b2138c6c928-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a8db736b-00b7-4251-a667-3b2138c6c928\") " pod="openstack/nova-metadata-0" Mar 13 14:23:22 crc kubenswrapper[4898]: I0313 14:23:22.936677 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8db736b-00b7-4251-a667-3b2138c6c928-logs\") pod \"nova-metadata-0\" (UID: \"a8db736b-00b7-4251-a667-3b2138c6c928\") " pod="openstack/nova-metadata-0" Mar 13 14:23:22 crc kubenswrapper[4898]: I0313 14:23:22.936818 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8db736b-00b7-4251-a667-3b2138c6c928-config-data\") pod \"nova-metadata-0\" (UID: \"a8db736b-00b7-4251-a667-3b2138c6c928\") " pod="openstack/nova-metadata-0" Mar 13 14:23:22 crc kubenswrapper[4898]: I0313 14:23:22.936875 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tg4w5\" (UniqueName: \"kubernetes.io/projected/a8db736b-00b7-4251-a667-3b2138c6c928-kube-api-access-tg4w5\") pod \"nova-metadata-0\" (UID: 
\"a8db736b-00b7-4251-a667-3b2138c6c928\") " pod="openstack/nova-metadata-0" Mar 13 14:23:22 crc kubenswrapper[4898]: I0313 14:23:22.936920 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8db736b-00b7-4251-a667-3b2138c6c928-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a8db736b-00b7-4251-a667-3b2138c6c928\") " pod="openstack/nova-metadata-0" Mar 13 14:23:23 crc kubenswrapper[4898]: I0313 14:23:23.039757 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tg4w5\" (UniqueName: \"kubernetes.io/projected/a8db736b-00b7-4251-a667-3b2138c6c928-kube-api-access-tg4w5\") pod \"nova-metadata-0\" (UID: \"a8db736b-00b7-4251-a667-3b2138c6c928\") " pod="openstack/nova-metadata-0" Mar 13 14:23:23 crc kubenswrapper[4898]: I0313 14:23:23.039804 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8db736b-00b7-4251-a667-3b2138c6c928-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a8db736b-00b7-4251-a667-3b2138c6c928\") " pod="openstack/nova-metadata-0" Mar 13 14:23:23 crc kubenswrapper[4898]: I0313 14:23:23.039993 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8db736b-00b7-4251-a667-3b2138c6c928-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a8db736b-00b7-4251-a667-3b2138c6c928\") " pod="openstack/nova-metadata-0" Mar 13 14:23:23 crc kubenswrapper[4898]: I0313 14:23:23.040041 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8db736b-00b7-4251-a667-3b2138c6c928-logs\") pod \"nova-metadata-0\" (UID: \"a8db736b-00b7-4251-a667-3b2138c6c928\") " pod="openstack/nova-metadata-0" Mar 13 14:23:23 crc kubenswrapper[4898]: I0313 
14:23:23.040074 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8db736b-00b7-4251-a667-3b2138c6c928-config-data\") pod \"nova-metadata-0\" (UID: \"a8db736b-00b7-4251-a667-3b2138c6c928\") " pod="openstack/nova-metadata-0" Mar 13 14:23:23 crc kubenswrapper[4898]: I0313 14:23:23.040944 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8db736b-00b7-4251-a667-3b2138c6c928-logs\") pod \"nova-metadata-0\" (UID: \"a8db736b-00b7-4251-a667-3b2138c6c928\") " pod="openstack/nova-metadata-0" Mar 13 14:23:23 crc kubenswrapper[4898]: I0313 14:23:23.060104 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8db736b-00b7-4251-a667-3b2138c6c928-config-data\") pod \"nova-metadata-0\" (UID: \"a8db736b-00b7-4251-a667-3b2138c6c928\") " pod="openstack/nova-metadata-0" Mar 13 14:23:23 crc kubenswrapper[4898]: I0313 14:23:23.060160 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8db736b-00b7-4251-a667-3b2138c6c928-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a8db736b-00b7-4251-a667-3b2138c6c928\") " pod="openstack/nova-metadata-0" Mar 13 14:23:23 crc kubenswrapper[4898]: I0313 14:23:23.060427 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8db736b-00b7-4251-a667-3b2138c6c928-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a8db736b-00b7-4251-a667-3b2138c6c928\") " pod="openstack/nova-metadata-0" Mar 13 14:23:23 crc kubenswrapper[4898]: I0313 14:23:23.062455 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tg4w5\" (UniqueName: \"kubernetes.io/projected/a8db736b-00b7-4251-a667-3b2138c6c928-kube-api-access-tg4w5\") pod 
\"nova-metadata-0\" (UID: \"a8db736b-00b7-4251-a667-3b2138c6c928\") " pod="openstack/nova-metadata-0" Mar 13 14:23:23 crc kubenswrapper[4898]: I0313 14:23:23.139999 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 14:23:23 crc kubenswrapper[4898]: I0313 14:23:23.682310 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 14:23:23 crc kubenswrapper[4898]: W0313 14:23:23.685207 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda8db736b_00b7_4251_a667_3b2138c6c928.slice/crio-2f2f835653ef4ac86c3f5f419dc3f2bdfd6ae25a7d99a1468342cb1b296536ca WatchSource:0}: Error finding container 2f2f835653ef4ac86c3f5f419dc3f2bdfd6ae25a7d99a1468342cb1b296536ca: Status 404 returned error can't find the container with id 2f2f835653ef4ac86c3f5f419dc3f2bdfd6ae25a7d99a1468342cb1b296536ca Mar 13 14:23:23 crc kubenswrapper[4898]: I0313 14:23:23.720408 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a8db736b-00b7-4251-a667-3b2138c6c928","Type":"ContainerStarted","Data":"2f2f835653ef4ac86c3f5f419dc3f2bdfd6ae25a7d99a1468342cb1b296536ca"} Mar 13 14:23:23 crc kubenswrapper[4898]: I0313 14:23:23.722021 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 13 14:23:23 crc kubenswrapper[4898]: I0313 14:23:23.758626 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f31df8cc-85c8-4626-ab54-1a93d291f02d" path="/var/lib/kubelet/pods/f31df8cc-85c8-4626-ab54-1a93d291f02d/volumes" Mar 13 14:23:24 crc kubenswrapper[4898]: I0313 14:23:24.025365 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 13 14:23:24 crc kubenswrapper[4898]: I0313 14:23:24.025414 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-api-0" Mar 13 14:23:24 crc kubenswrapper[4898]: I0313 14:23:24.745329 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a8db736b-00b7-4251-a667-3b2138c6c928","Type":"ContainerStarted","Data":"9bd9f3f02e15571b11f72778527812906d168be88196cc4314aa88f5c276ac6c"} Mar 13 14:23:24 crc kubenswrapper[4898]: I0313 14:23:24.745877 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a8db736b-00b7-4251-a667-3b2138c6c928","Type":"ContainerStarted","Data":"c0fcc6916c9c7951ac6f57b54e64b861fe8be03a65443f0a0008c4f458405d78"} Mar 13 14:23:24 crc kubenswrapper[4898]: I0313 14:23:24.751565 4898 generic.go:334] "Generic (PLEG): container finished" podID="401e6738-93d7-40d4-867e-8c68437cbad3" containerID="6a3356dec1914d7176270b5cf94ad8c6d3be50dc70db5f963643ba9ceaa22838" exitCode=0 Mar 13 14:23:24 crc kubenswrapper[4898]: I0313 14:23:24.751644 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"401e6738-93d7-40d4-867e-8c68437cbad3","Type":"ContainerDied","Data":"6a3356dec1914d7176270b5cf94ad8c6d3be50dc70db5f963643ba9ceaa22838"} Mar 13 14:23:24 crc kubenswrapper[4898]: I0313 14:23:24.755865 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f08c3de0-9b21-4e4a-8811-b40fdb3b63c5" containerName="ceilometer-central-agent" containerID="cri-o://a52bb77ac792d2e28bb8d969def76aa20c8d38d8dddae8c183a41016f4efeb7f" gracePeriod=30 Mar 13 14:23:24 crc kubenswrapper[4898]: I0313 14:23:24.756040 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f08c3de0-9b21-4e4a-8811-b40fdb3b63c5" containerName="proxy-httpd" containerID="cri-o://21fd8f0e04d5275c81dea1442286d2ffac0118f2bf21754e17a48e3332d618e6" gracePeriod=30 Mar 13 14:23:24 crc kubenswrapper[4898]: I0313 14:23:24.756098 4898 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openstack/ceilometer-0" podUID="f08c3de0-9b21-4e4a-8811-b40fdb3b63c5" containerName="sg-core" containerID="cri-o://3bed3d1e809327f9581d0f282bdba9c8f4826f7960e98f95fb41d7bf1c602dc9" gracePeriod=30 Mar 13 14:23:24 crc kubenswrapper[4898]: I0313 14:23:24.756146 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f08c3de0-9b21-4e4a-8811-b40fdb3b63c5" containerName="ceilometer-notification-agent" containerID="cri-o://74f0d5a974f39e0ea6ad714deeedb29ad931cd8927cfe42cce60108477a37a86" gracePeriod=30 Mar 13 14:23:24 crc kubenswrapper[4898]: I0313 14:23:24.756938 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5","Type":"ContainerStarted","Data":"21fd8f0e04d5275c81dea1442286d2ffac0118f2bf21754e17a48e3332d618e6"} Mar 13 14:23:24 crc kubenswrapper[4898]: I0313 14:23:24.756987 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 13 14:23:24 crc kubenswrapper[4898]: I0313 14:23:24.822398 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.82237305 podStartE2EDuration="2.82237305s" podCreationTimestamp="2026-03-13 14:23:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:23:24.781381816 +0000 UTC m=+1639.782970065" watchObservedRunningTime="2026-03-13 14:23:24.82237305 +0000 UTC m=+1639.823961289" Mar 13 14:23:24 crc kubenswrapper[4898]: I0313 14:23:24.850008 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.75603343 podStartE2EDuration="6.849988827s" podCreationTimestamp="2026-03-13 14:23:18 +0000 UTC" firstStartedPulling="2026-03-13 14:23:19.422283516 +0000 UTC m=+1634.423871755" lastFinishedPulling="2026-03-13 
14:23:23.516238913 +0000 UTC m=+1638.517827152" observedRunningTime="2026-03-13 14:23:24.801702803 +0000 UTC m=+1639.803291042" watchObservedRunningTime="2026-03-13 14:23:24.849988827 +0000 UTC m=+1639.851577056" Mar 13 14:23:25 crc kubenswrapper[4898]: I0313 14:23:25.007986 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 13 14:23:25 crc kubenswrapper[4898]: I0313 14:23:25.133540 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/401e6738-93d7-40d4-867e-8c68437cbad3-combined-ca-bundle\") pod \"401e6738-93d7-40d4-867e-8c68437cbad3\" (UID: \"401e6738-93d7-40d4-867e-8c68437cbad3\") " Mar 13 14:23:25 crc kubenswrapper[4898]: I0313 14:23:25.133602 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/401e6738-93d7-40d4-867e-8c68437cbad3-logs\") pod \"401e6738-93d7-40d4-867e-8c68437cbad3\" (UID: \"401e6738-93d7-40d4-867e-8c68437cbad3\") " Mar 13 14:23:25 crc kubenswrapper[4898]: I0313 14:23:25.133674 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2z75l\" (UniqueName: \"kubernetes.io/projected/401e6738-93d7-40d4-867e-8c68437cbad3-kube-api-access-2z75l\") pod \"401e6738-93d7-40d4-867e-8c68437cbad3\" (UID: \"401e6738-93d7-40d4-867e-8c68437cbad3\") " Mar 13 14:23:25 crc kubenswrapper[4898]: I0313 14:23:25.133806 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/401e6738-93d7-40d4-867e-8c68437cbad3-config-data\") pod \"401e6738-93d7-40d4-867e-8c68437cbad3\" (UID: \"401e6738-93d7-40d4-867e-8c68437cbad3\") " Mar 13 14:23:25 crc kubenswrapper[4898]: I0313 14:23:25.134612 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/401e6738-93d7-40d4-867e-8c68437cbad3-logs" 
(OuterVolumeSpecName: "logs") pod "401e6738-93d7-40d4-867e-8c68437cbad3" (UID: "401e6738-93d7-40d4-867e-8c68437cbad3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:23:25 crc kubenswrapper[4898]: I0313 14:23:25.148118 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/401e6738-93d7-40d4-867e-8c68437cbad3-kube-api-access-2z75l" (OuterVolumeSpecName: "kube-api-access-2z75l") pod "401e6738-93d7-40d4-867e-8c68437cbad3" (UID: "401e6738-93d7-40d4-867e-8c68437cbad3"). InnerVolumeSpecName "kube-api-access-2z75l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:23:25 crc kubenswrapper[4898]: I0313 14:23:25.168111 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/401e6738-93d7-40d4-867e-8c68437cbad3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "401e6738-93d7-40d4-867e-8c68437cbad3" (UID: "401e6738-93d7-40d4-867e-8c68437cbad3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:23:25 crc kubenswrapper[4898]: I0313 14:23:25.191940 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/401e6738-93d7-40d4-867e-8c68437cbad3-config-data" (OuterVolumeSpecName: "config-data") pod "401e6738-93d7-40d4-867e-8c68437cbad3" (UID: "401e6738-93d7-40d4-867e-8c68437cbad3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:23:25 crc kubenswrapper[4898]: I0313 14:23:25.238564 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/401e6738-93d7-40d4-867e-8c68437cbad3-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:25 crc kubenswrapper[4898]: I0313 14:23:25.238601 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/401e6738-93d7-40d4-867e-8c68437cbad3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:25 crc kubenswrapper[4898]: I0313 14:23:25.238614 4898 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/401e6738-93d7-40d4-867e-8c68437cbad3-logs\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:25 crc kubenswrapper[4898]: I0313 14:23:25.238623 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2z75l\" (UniqueName: \"kubernetes.io/projected/401e6738-93d7-40d4-867e-8c68437cbad3-kube-api-access-2z75l\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:25 crc kubenswrapper[4898]: I0313 14:23:25.775044 4898 generic.go:334] "Generic (PLEG): container finished" podID="f08c3de0-9b21-4e4a-8811-b40fdb3b63c5" containerID="21fd8f0e04d5275c81dea1442286d2ffac0118f2bf21754e17a48e3332d618e6" exitCode=0 Mar 13 14:23:25 crc kubenswrapper[4898]: I0313 14:23:25.775083 4898 generic.go:334] "Generic (PLEG): container finished" podID="f08c3de0-9b21-4e4a-8811-b40fdb3b63c5" containerID="3bed3d1e809327f9581d0f282bdba9c8f4826f7960e98f95fb41d7bf1c602dc9" exitCode=2 Mar 13 14:23:25 crc kubenswrapper[4898]: I0313 14:23:25.775092 4898 generic.go:334] "Generic (PLEG): container finished" podID="f08c3de0-9b21-4e4a-8811-b40fdb3b63c5" containerID="74f0d5a974f39e0ea6ad714deeedb29ad931cd8927cfe42cce60108477a37a86" exitCode=0 Mar 13 14:23:25 crc kubenswrapper[4898]: I0313 14:23:25.775144 4898 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/ceilometer-0" event={"ID":"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5","Type":"ContainerDied","Data":"21fd8f0e04d5275c81dea1442286d2ffac0118f2bf21754e17a48e3332d618e6"} Mar 13 14:23:25 crc kubenswrapper[4898]: I0313 14:23:25.775181 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5","Type":"ContainerDied","Data":"3bed3d1e809327f9581d0f282bdba9c8f4826f7960e98f95fb41d7bf1c602dc9"} Mar 13 14:23:25 crc kubenswrapper[4898]: I0313 14:23:25.775193 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5","Type":"ContainerDied","Data":"74f0d5a974f39e0ea6ad714deeedb29ad931cd8927cfe42cce60108477a37a86"} Mar 13 14:23:25 crc kubenswrapper[4898]: I0313 14:23:25.780042 4898 generic.go:334] "Generic (PLEG): container finished" podID="0769b03d-29b4-4519-abc7-408431328276" containerID="a0be402bfe00c68e23ab47a73d2d201566aad9d451ecaff23e8fc2d99923064b" exitCode=0 Mar 13 14:23:25 crc kubenswrapper[4898]: I0313 14:23:25.780115 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0769b03d-29b4-4519-abc7-408431328276","Type":"ContainerDied","Data":"a0be402bfe00c68e23ab47a73d2d201566aad9d451ecaff23e8fc2d99923064b"} Mar 13 14:23:25 crc kubenswrapper[4898]: I0313 14:23:25.780683 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0769b03d-29b4-4519-abc7-408431328276","Type":"ContainerDied","Data":"8afa6be1221f1010388ebd2f7d552a490a868c166c3d8a6eee8fcc24b9095e1d"} Mar 13 14:23:25 crc kubenswrapper[4898]: I0313 14:23:25.780778 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8afa6be1221f1010388ebd2f7d552a490a868c166c3d8a6eee8fcc24b9095e1d" Mar 13 14:23:25 crc kubenswrapper[4898]: I0313 14:23:25.784138 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 13 14:23:25 crc kubenswrapper[4898]: I0313 14:23:25.785746 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"401e6738-93d7-40d4-867e-8c68437cbad3","Type":"ContainerDied","Data":"33c80887acf46993b60950d3ec27771c17d7d31947de5c0902c67049c9927696"} Mar 13 14:23:25 crc kubenswrapper[4898]: I0313 14:23:25.785916 4898 scope.go:117] "RemoveContainer" containerID="6a3356dec1914d7176270b5cf94ad8c6d3be50dc70db5f963643ba9ceaa22838" Mar 13 14:23:25 crc kubenswrapper[4898]: E0313 14:23:25.802983 4898 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a0be402bfe00c68e23ab47a73d2d201566aad9d451ecaff23e8fc2d99923064b is running failed: container process not found" containerID="a0be402bfe00c68e23ab47a73d2d201566aad9d451ecaff23e8fc2d99923064b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 13 14:23:25 crc kubenswrapper[4898]: E0313 14:23:25.803371 4898 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a0be402bfe00c68e23ab47a73d2d201566aad9d451ecaff23e8fc2d99923064b is running failed: container process not found" containerID="a0be402bfe00c68e23ab47a73d2d201566aad9d451ecaff23e8fc2d99923064b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 13 14:23:25 crc kubenswrapper[4898]: E0313 14:23:25.803590 4898 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a0be402bfe00c68e23ab47a73d2d201566aad9d451ecaff23e8fc2d99923064b is running failed: container process not found" containerID="a0be402bfe00c68e23ab47a73d2d201566aad9d451ecaff23e8fc2d99923064b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 13 14:23:25 crc kubenswrapper[4898]: E0313 14:23:25.803618 4898 prober.go:104] "Probe 
errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a0be402bfe00c68e23ab47a73d2d201566aad9d451ecaff23e8fc2d99923064b is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="0769b03d-29b4-4519-abc7-408431328276" containerName="nova-scheduler-scheduler" Mar 13 14:23:25 crc kubenswrapper[4898]: I0313 14:23:25.882938 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 14:23:25 crc kubenswrapper[4898]: I0313 14:23:25.886506 4898 scope.go:117] "RemoveContainer" containerID="4172774e3d35f2447bfbdab8c2932d0b6b6936e1f5596cbc2f41a9105e3cedbe" Mar 13 14:23:25 crc kubenswrapper[4898]: I0313 14:23:25.909734 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 13 14:23:25 crc kubenswrapper[4898]: I0313 14:23:25.937839 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 13 14:23:25 crc kubenswrapper[4898]: I0313 14:23:25.956155 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 13 14:23:25 crc kubenswrapper[4898]: E0313 14:23:25.956865 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="401e6738-93d7-40d4-867e-8c68437cbad3" containerName="nova-api-log" Mar 13 14:23:25 crc kubenswrapper[4898]: I0313 14:23:25.956883 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="401e6738-93d7-40d4-867e-8c68437cbad3" containerName="nova-api-log" Mar 13 14:23:25 crc kubenswrapper[4898]: E0313 14:23:25.956944 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0769b03d-29b4-4519-abc7-408431328276" containerName="nova-scheduler-scheduler" Mar 13 14:23:25 crc kubenswrapper[4898]: I0313 14:23:25.956950 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="0769b03d-29b4-4519-abc7-408431328276" containerName="nova-scheduler-scheduler" Mar 13 14:23:25 crc kubenswrapper[4898]: 
E0313 14:23:25.956969 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="401e6738-93d7-40d4-867e-8c68437cbad3" containerName="nova-api-api" Mar 13 14:23:25 crc kubenswrapper[4898]: I0313 14:23:25.956975 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="401e6738-93d7-40d4-867e-8c68437cbad3" containerName="nova-api-api" Mar 13 14:23:25 crc kubenswrapper[4898]: I0313 14:23:25.957283 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="401e6738-93d7-40d4-867e-8c68437cbad3" containerName="nova-api-api" Mar 13 14:23:25 crc kubenswrapper[4898]: I0313 14:23:25.957310 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="401e6738-93d7-40d4-867e-8c68437cbad3" containerName="nova-api-log" Mar 13 14:23:25 crc kubenswrapper[4898]: I0313 14:23:25.957329 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="0769b03d-29b4-4519-abc7-408431328276" containerName="nova-scheduler-scheduler" Mar 13 14:23:25 crc kubenswrapper[4898]: I0313 14:23:25.959547 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 13 14:23:25 crc kubenswrapper[4898]: I0313 14:23:25.962312 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 13 14:23:26 crc kubenswrapper[4898]: I0313 14:23:26.003940 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 13 14:23:26 crc kubenswrapper[4898]: I0313 14:23:26.065527 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0769b03d-29b4-4519-abc7-408431328276-combined-ca-bundle\") pod \"0769b03d-29b4-4519-abc7-408431328276\" (UID: \"0769b03d-29b4-4519-abc7-408431328276\") " Mar 13 14:23:26 crc kubenswrapper[4898]: I0313 14:23:26.065665 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0769b03d-29b4-4519-abc7-408431328276-config-data\") pod \"0769b03d-29b4-4519-abc7-408431328276\" (UID: \"0769b03d-29b4-4519-abc7-408431328276\") " Mar 13 14:23:26 crc kubenswrapper[4898]: I0313 14:23:26.065878 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k2dlp\" (UniqueName: \"kubernetes.io/projected/0769b03d-29b4-4519-abc7-408431328276-kube-api-access-k2dlp\") pod \"0769b03d-29b4-4519-abc7-408431328276\" (UID: \"0769b03d-29b4-4519-abc7-408431328276\") " Mar 13 14:23:26 crc kubenswrapper[4898]: I0313 14:23:26.066195 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91d28474-f268-4ecf-96b7-5a5007e715c3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"91d28474-f268-4ecf-96b7-5a5007e715c3\") " pod="openstack/nova-api-0" Mar 13 14:23:26 crc kubenswrapper[4898]: I0313 14:23:26.066303 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/91d28474-f268-4ecf-96b7-5a5007e715c3-config-data\") pod \"nova-api-0\" (UID: \"91d28474-f268-4ecf-96b7-5a5007e715c3\") " pod="openstack/nova-api-0" Mar 13 14:23:26 crc kubenswrapper[4898]: I0313 14:23:26.066396 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91d28474-f268-4ecf-96b7-5a5007e715c3-logs\") pod \"nova-api-0\" (UID: \"91d28474-f268-4ecf-96b7-5a5007e715c3\") " pod="openstack/nova-api-0" Mar 13 14:23:26 crc kubenswrapper[4898]: I0313 14:23:26.066532 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chxlk\" (UniqueName: \"kubernetes.io/projected/91d28474-f268-4ecf-96b7-5a5007e715c3-kube-api-access-chxlk\") pod \"nova-api-0\" (UID: \"91d28474-f268-4ecf-96b7-5a5007e715c3\") " pod="openstack/nova-api-0" Mar 13 14:23:26 crc kubenswrapper[4898]: I0313 14:23:26.072614 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0769b03d-29b4-4519-abc7-408431328276-kube-api-access-k2dlp" (OuterVolumeSpecName: "kube-api-access-k2dlp") pod "0769b03d-29b4-4519-abc7-408431328276" (UID: "0769b03d-29b4-4519-abc7-408431328276"). InnerVolumeSpecName "kube-api-access-k2dlp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:23:26 crc kubenswrapper[4898]: I0313 14:23:26.108735 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0769b03d-29b4-4519-abc7-408431328276-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0769b03d-29b4-4519-abc7-408431328276" (UID: "0769b03d-29b4-4519-abc7-408431328276"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:23:26 crc kubenswrapper[4898]: I0313 14:23:26.146413 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0769b03d-29b4-4519-abc7-408431328276-config-data" (OuterVolumeSpecName: "config-data") pod "0769b03d-29b4-4519-abc7-408431328276" (UID: "0769b03d-29b4-4519-abc7-408431328276"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:23:26 crc kubenswrapper[4898]: I0313 14:23:26.168357 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91d28474-f268-4ecf-96b7-5a5007e715c3-logs\") pod \"nova-api-0\" (UID: \"91d28474-f268-4ecf-96b7-5a5007e715c3\") " pod="openstack/nova-api-0" Mar 13 14:23:26 crc kubenswrapper[4898]: I0313 14:23:26.168739 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chxlk\" (UniqueName: \"kubernetes.io/projected/91d28474-f268-4ecf-96b7-5a5007e715c3-kube-api-access-chxlk\") pod \"nova-api-0\" (UID: \"91d28474-f268-4ecf-96b7-5a5007e715c3\") " pod="openstack/nova-api-0" Mar 13 14:23:26 crc kubenswrapper[4898]: I0313 14:23:26.168765 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91d28474-f268-4ecf-96b7-5a5007e715c3-logs\") pod \"nova-api-0\" (UID: \"91d28474-f268-4ecf-96b7-5a5007e715c3\") " pod="openstack/nova-api-0" Mar 13 14:23:26 crc kubenswrapper[4898]: I0313 14:23:26.168804 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91d28474-f268-4ecf-96b7-5a5007e715c3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"91d28474-f268-4ecf-96b7-5a5007e715c3\") " pod="openstack/nova-api-0" Mar 13 14:23:26 crc kubenswrapper[4898]: I0313 14:23:26.168864 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91d28474-f268-4ecf-96b7-5a5007e715c3-config-data\") pod \"nova-api-0\" (UID: \"91d28474-f268-4ecf-96b7-5a5007e715c3\") " pod="openstack/nova-api-0" Mar 13 14:23:26 crc kubenswrapper[4898]: I0313 14:23:26.168978 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k2dlp\" (UniqueName: \"kubernetes.io/projected/0769b03d-29b4-4519-abc7-408431328276-kube-api-access-k2dlp\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:26 crc kubenswrapper[4898]: I0313 14:23:26.169009 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0769b03d-29b4-4519-abc7-408431328276-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:26 crc kubenswrapper[4898]: I0313 14:23:26.169019 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0769b03d-29b4-4519-abc7-408431328276-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:26 crc kubenswrapper[4898]: I0313 14:23:26.172360 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91d28474-f268-4ecf-96b7-5a5007e715c3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"91d28474-f268-4ecf-96b7-5a5007e715c3\") " pod="openstack/nova-api-0" Mar 13 14:23:26 crc kubenswrapper[4898]: I0313 14:23:26.174509 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91d28474-f268-4ecf-96b7-5a5007e715c3-config-data\") pod \"nova-api-0\" (UID: \"91d28474-f268-4ecf-96b7-5a5007e715c3\") " pod="openstack/nova-api-0" Mar 13 14:23:26 crc kubenswrapper[4898]: I0313 14:23:26.185604 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chxlk\" (UniqueName: \"kubernetes.io/projected/91d28474-f268-4ecf-96b7-5a5007e715c3-kube-api-access-chxlk\") pod \"nova-api-0\" (UID: 
\"91d28474-f268-4ecf-96b7-5a5007e715c3\") " pod="openstack/nova-api-0" Mar 13 14:23:26 crc kubenswrapper[4898]: I0313 14:23:26.285921 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 13 14:23:26 crc kubenswrapper[4898]: I0313 14:23:26.800708 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 13 14:23:26 crc kubenswrapper[4898]: I0313 14:23:26.806084 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 14:23:26 crc kubenswrapper[4898]: I0313 14:23:26.964989 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 14:23:26 crc kubenswrapper[4898]: I0313 14:23:26.983499 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 14:23:26 crc kubenswrapper[4898]: I0313 14:23:26.997985 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 14:23:26 crc kubenswrapper[4898]: I0313 14:23:26.999872 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 14:23:27 crc kubenswrapper[4898]: I0313 14:23:27.002470 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 13 14:23:27 crc kubenswrapper[4898]: I0313 14:23:27.019602 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 14:23:27 crc kubenswrapper[4898]: I0313 14:23:27.091710 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26jcq\" (UniqueName: \"kubernetes.io/projected/9cfb3db3-7d46-4ab5-aecc-00ddd738d359-kube-api-access-26jcq\") pod \"nova-scheduler-0\" (UID: \"9cfb3db3-7d46-4ab5-aecc-00ddd738d359\") " pod="openstack/nova-scheduler-0" Mar 13 14:23:27 crc kubenswrapper[4898]: I0313 14:23:27.091978 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cfb3db3-7d46-4ab5-aecc-00ddd738d359-config-data\") pod \"nova-scheduler-0\" (UID: \"9cfb3db3-7d46-4ab5-aecc-00ddd738d359\") " pod="openstack/nova-scheduler-0" Mar 13 14:23:27 crc kubenswrapper[4898]: I0313 14:23:27.092083 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cfb3db3-7d46-4ab5-aecc-00ddd738d359-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9cfb3db3-7d46-4ab5-aecc-00ddd738d359\") " pod="openstack/nova-scheduler-0" Mar 13 14:23:27 crc kubenswrapper[4898]: I0313 14:23:27.194404 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26jcq\" (UniqueName: \"kubernetes.io/projected/9cfb3db3-7d46-4ab5-aecc-00ddd738d359-kube-api-access-26jcq\") pod \"nova-scheduler-0\" (UID: \"9cfb3db3-7d46-4ab5-aecc-00ddd738d359\") " pod="openstack/nova-scheduler-0" Mar 13 14:23:27 crc kubenswrapper[4898]: I0313 14:23:27.194527 4898 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cfb3db3-7d46-4ab5-aecc-00ddd738d359-config-data\") pod \"nova-scheduler-0\" (UID: \"9cfb3db3-7d46-4ab5-aecc-00ddd738d359\") " pod="openstack/nova-scheduler-0" Mar 13 14:23:27 crc kubenswrapper[4898]: I0313 14:23:27.194567 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cfb3db3-7d46-4ab5-aecc-00ddd738d359-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9cfb3db3-7d46-4ab5-aecc-00ddd738d359\") " pod="openstack/nova-scheduler-0" Mar 13 14:23:27 crc kubenswrapper[4898]: I0313 14:23:27.198360 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cfb3db3-7d46-4ab5-aecc-00ddd738d359-config-data\") pod \"nova-scheduler-0\" (UID: \"9cfb3db3-7d46-4ab5-aecc-00ddd738d359\") " pod="openstack/nova-scheduler-0" Mar 13 14:23:27 crc kubenswrapper[4898]: I0313 14:23:27.198440 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cfb3db3-7d46-4ab5-aecc-00ddd738d359-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9cfb3db3-7d46-4ab5-aecc-00ddd738d359\") " pod="openstack/nova-scheduler-0" Mar 13 14:23:27 crc kubenswrapper[4898]: I0313 14:23:27.211668 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26jcq\" (UniqueName: \"kubernetes.io/projected/9cfb3db3-7d46-4ab5-aecc-00ddd738d359-kube-api-access-26jcq\") pod \"nova-scheduler-0\" (UID: \"9cfb3db3-7d46-4ab5-aecc-00ddd738d359\") " pod="openstack/nova-scheduler-0" Mar 13 14:23:27 crc kubenswrapper[4898]: I0313 14:23:27.331005 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 14:23:27 crc kubenswrapper[4898]: I0313 14:23:27.752858 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0769b03d-29b4-4519-abc7-408431328276" path="/var/lib/kubelet/pods/0769b03d-29b4-4519-abc7-408431328276/volumes" Mar 13 14:23:27 crc kubenswrapper[4898]: I0313 14:23:27.753853 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="401e6738-93d7-40d4-867e-8c68437cbad3" path="/var/lib/kubelet/pods/401e6738-93d7-40d4-867e-8c68437cbad3/volumes" Mar 13 14:23:27 crc kubenswrapper[4898]: I0313 14:23:27.821433 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"91d28474-f268-4ecf-96b7-5a5007e715c3","Type":"ContainerStarted","Data":"1ac26ed6febdb8a025711b7fb70ecf717c1b07973758b6080092505756042b5d"} Mar 13 14:23:27 crc kubenswrapper[4898]: I0313 14:23:27.821485 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"91d28474-f268-4ecf-96b7-5a5007e715c3","Type":"ContainerStarted","Data":"534def4acabc310f4b2d065b3a04508595fe368f3532f51136e25ae88c3877e4"} Mar 13 14:23:27 crc kubenswrapper[4898]: I0313 14:23:27.821499 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"91d28474-f268-4ecf-96b7-5a5007e715c3","Type":"ContainerStarted","Data":"d9d2e156d0dffe3d5895df3f9ced875368d41389028a4139a826d6dfa5cc7fb1"} Mar 13 14:23:27 crc kubenswrapper[4898]: I0313 14:23:27.845769 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.8457381550000003 podStartE2EDuration="2.845738155s" podCreationTimestamp="2026-03-13 14:23:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:23:27.845233112 +0000 UTC m=+1642.846821381" watchObservedRunningTime="2026-03-13 14:23:27.845738155 +0000 UTC 
m=+1642.847326394" Mar 13 14:23:27 crc kubenswrapper[4898]: I0313 14:23:27.889682 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 14:23:28 crc kubenswrapper[4898]: I0313 14:23:28.836146 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9cfb3db3-7d46-4ab5-aecc-00ddd738d359","Type":"ContainerStarted","Data":"996fc46edbed8602f9eda3a09dc63ac36038779496a63c3ff3ba77a3b3a9e5b0"} Mar 13 14:23:28 crc kubenswrapper[4898]: I0313 14:23:28.836484 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9cfb3db3-7d46-4ab5-aecc-00ddd738d359","Type":"ContainerStarted","Data":"4fb43fc1071513c1a034ec9b0dda28c7b02d6ddc884f93f9664159fa9c0ff74c"} Mar 13 14:23:28 crc kubenswrapper[4898]: I0313 14:23:28.860442 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.860420735 podStartE2EDuration="2.860420735s" podCreationTimestamp="2026-03-13 14:23:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:23:28.856542494 +0000 UTC m=+1643.858130753" watchObservedRunningTime="2026-03-13 14:23:28.860420735 +0000 UTC m=+1643.862008984" Mar 13 14:23:30 crc kubenswrapper[4898]: I0313 14:23:30.855780 4898 generic.go:334] "Generic (PLEG): container finished" podID="f08c3de0-9b21-4e4a-8811-b40fdb3b63c5" containerID="a52bb77ac792d2e28bb8d969def76aa20c8d38d8dddae8c183a41016f4efeb7f" exitCode=0 Mar 13 14:23:30 crc kubenswrapper[4898]: I0313 14:23:30.855859 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5","Type":"ContainerDied","Data":"a52bb77ac792d2e28bb8d969def76aa20c8d38d8dddae8c183a41016f4efeb7f"} Mar 13 14:23:30 crc kubenswrapper[4898]: I0313 14:23:30.986529 4898 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.470989 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.522021 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-run-httpd\") pod \"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5\" (UID: \"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5\") " Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.522141 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pznzq\" (UniqueName: \"kubernetes.io/projected/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-kube-api-access-pznzq\") pod \"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5\" (UID: \"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5\") " Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.522209 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-log-httpd\") pod \"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5\" (UID: \"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5\") " Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.522258 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-combined-ca-bundle\") pod \"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5\" (UID: \"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5\") " Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.522402 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-scripts\") pod \"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5\" (UID: \"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5\") " 
Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.522566 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-config-data\") pod \"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5\" (UID: \"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5\") " Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.522628 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-sg-core-conf-yaml\") pod \"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5\" (UID: \"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5\") " Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.525331 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f08c3de0-9b21-4e4a-8811-b40fdb3b63c5" (UID: "f08c3de0-9b21-4e4a-8811-b40fdb3b63c5"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.525667 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f08c3de0-9b21-4e4a-8811-b40fdb3b63c5" (UID: "f08c3de0-9b21-4e4a-8811-b40fdb3b63c5"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.530137 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-kube-api-access-pznzq" (OuterVolumeSpecName: "kube-api-access-pznzq") pod "f08c3de0-9b21-4e4a-8811-b40fdb3b63c5" (UID: "f08c3de0-9b21-4e4a-8811-b40fdb3b63c5"). InnerVolumeSpecName "kube-api-access-pznzq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.532189 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-scripts" (OuterVolumeSpecName: "scripts") pod "f08c3de0-9b21-4e4a-8811-b40fdb3b63c5" (UID: "f08c3de0-9b21-4e4a-8811-b40fdb3b63c5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.565437 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f08c3de0-9b21-4e4a-8811-b40fdb3b63c5" (UID: "f08c3de0-9b21-4e4a-8811-b40fdb3b63c5"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.626708 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.626750 4898 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.626819 4898 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.626832 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pznzq\" (UniqueName: \"kubernetes.io/projected/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-kube-api-access-pznzq\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:31 crc 
kubenswrapper[4898]: I0313 14:23:31.626843 4898 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.641144 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f08c3de0-9b21-4e4a-8811-b40fdb3b63c5" (UID: "f08c3de0-9b21-4e4a-8811-b40fdb3b63c5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.657472 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-config-data" (OuterVolumeSpecName: "config-data") pod "f08c3de0-9b21-4e4a-8811-b40fdb3b63c5" (UID: "f08c3de0-9b21-4e4a-8811-b40fdb3b63c5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.729176 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.729364 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.870124 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5","Type":"ContainerDied","Data":"1e1a0e67a834a696d965eab6f5a0ade85145365bbc3c3bccceb0280541a0e5ed"} Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.870243 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.870259 4898 scope.go:117] "RemoveContainer" containerID="21fd8f0e04d5275c81dea1442286d2ffac0118f2bf21754e17a48e3332d618e6" Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.899313 4898 scope.go:117] "RemoveContainer" containerID="3bed3d1e809327f9581d0f282bdba9c8f4826f7960e98f95fb41d7bf1c602dc9" Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.909101 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.924276 4898 scope.go:117] "RemoveContainer" containerID="74f0d5a974f39e0ea6ad714deeedb29ad931cd8927cfe42cce60108477a37a86" Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.932876 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.952296 4898 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/ceilometer-0"] Mar 13 14:23:31 crc kubenswrapper[4898]: E0313 14:23:31.953736 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f08c3de0-9b21-4e4a-8811-b40fdb3b63c5" containerName="ceilometer-notification-agent" Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.953770 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="f08c3de0-9b21-4e4a-8811-b40fdb3b63c5" containerName="ceilometer-notification-agent" Mar 13 14:23:31 crc kubenswrapper[4898]: E0313 14:23:31.953796 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f08c3de0-9b21-4e4a-8811-b40fdb3b63c5" containerName="proxy-httpd" Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.953804 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="f08c3de0-9b21-4e4a-8811-b40fdb3b63c5" containerName="proxy-httpd" Mar 13 14:23:31 crc kubenswrapper[4898]: E0313 14:23:31.953821 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f08c3de0-9b21-4e4a-8811-b40fdb3b63c5" containerName="ceilometer-central-agent" Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.953828 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="f08c3de0-9b21-4e4a-8811-b40fdb3b63c5" containerName="ceilometer-central-agent" Mar 13 14:23:31 crc kubenswrapper[4898]: E0313 14:23:31.953843 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f08c3de0-9b21-4e4a-8811-b40fdb3b63c5" containerName="sg-core" Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.953851 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="f08c3de0-9b21-4e4a-8811-b40fdb3b63c5" containerName="sg-core" Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.954216 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="f08c3de0-9b21-4e4a-8811-b40fdb3b63c5" containerName="proxy-httpd" Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.954259 4898 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f08c3de0-9b21-4e4a-8811-b40fdb3b63c5" containerName="ceilometer-central-agent" Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.954275 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="f08c3de0-9b21-4e4a-8811-b40fdb3b63c5" containerName="sg-core" Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.954295 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="f08c3de0-9b21-4e4a-8811-b40fdb3b63c5" containerName="ceilometer-notification-agent" Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.958342 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.961204 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.963287 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.969991 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.971738 4898 scope.go:117] "RemoveContainer" containerID="a52bb77ac792d2e28bb8d969def76aa20c8d38d8dddae8c183a41016f4efeb7f" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.038466 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9lvm\" (UniqueName: \"kubernetes.io/projected/fee0030e-ceb6-41ff-97b2-6302e2bed961-kube-api-access-k9lvm\") pod \"ceilometer-0\" (UID: \"fee0030e-ceb6-41ff-97b2-6302e2bed961\") " pod="openstack/ceilometer-0" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.039023 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fee0030e-ceb6-41ff-97b2-6302e2bed961-log-httpd\") pod 
\"ceilometer-0\" (UID: \"fee0030e-ceb6-41ff-97b2-6302e2bed961\") " pod="openstack/ceilometer-0" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.039149 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fee0030e-ceb6-41ff-97b2-6302e2bed961-config-data\") pod \"ceilometer-0\" (UID: \"fee0030e-ceb6-41ff-97b2-6302e2bed961\") " pod="openstack/ceilometer-0" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.039757 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fee0030e-ceb6-41ff-97b2-6302e2bed961-run-httpd\") pod \"ceilometer-0\" (UID: \"fee0030e-ceb6-41ff-97b2-6302e2bed961\") " pod="openstack/ceilometer-0" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.039884 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fee0030e-ceb6-41ff-97b2-6302e2bed961-scripts\") pod \"ceilometer-0\" (UID: \"fee0030e-ceb6-41ff-97b2-6302e2bed961\") " pod="openstack/ceilometer-0" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.039963 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fee0030e-ceb6-41ff-97b2-6302e2bed961-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fee0030e-ceb6-41ff-97b2-6302e2bed961\") " pod="openstack/ceilometer-0" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.039996 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fee0030e-ceb6-41ff-97b2-6302e2bed961-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fee0030e-ceb6-41ff-97b2-6302e2bed961\") " pod="openstack/ceilometer-0" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 
14:23:32.142553 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fee0030e-ceb6-41ff-97b2-6302e2bed961-run-httpd\") pod \"ceilometer-0\" (UID: \"fee0030e-ceb6-41ff-97b2-6302e2bed961\") " pod="openstack/ceilometer-0" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.142632 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fee0030e-ceb6-41ff-97b2-6302e2bed961-scripts\") pod \"ceilometer-0\" (UID: \"fee0030e-ceb6-41ff-97b2-6302e2bed961\") " pod="openstack/ceilometer-0" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.142662 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fee0030e-ceb6-41ff-97b2-6302e2bed961-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fee0030e-ceb6-41ff-97b2-6302e2bed961\") " pod="openstack/ceilometer-0" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.142685 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fee0030e-ceb6-41ff-97b2-6302e2bed961-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fee0030e-ceb6-41ff-97b2-6302e2bed961\") " pod="openstack/ceilometer-0" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.142719 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9lvm\" (UniqueName: \"kubernetes.io/projected/fee0030e-ceb6-41ff-97b2-6302e2bed961-kube-api-access-k9lvm\") pod \"ceilometer-0\" (UID: \"fee0030e-ceb6-41ff-97b2-6302e2bed961\") " pod="openstack/ceilometer-0" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.142838 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fee0030e-ceb6-41ff-97b2-6302e2bed961-log-httpd\") pod 
\"ceilometer-0\" (UID: \"fee0030e-ceb6-41ff-97b2-6302e2bed961\") " pod="openstack/ceilometer-0" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.142866 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fee0030e-ceb6-41ff-97b2-6302e2bed961-config-data\") pod \"ceilometer-0\" (UID: \"fee0030e-ceb6-41ff-97b2-6302e2bed961\") " pod="openstack/ceilometer-0" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.143294 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fee0030e-ceb6-41ff-97b2-6302e2bed961-run-httpd\") pod \"ceilometer-0\" (UID: \"fee0030e-ceb6-41ff-97b2-6302e2bed961\") " pod="openstack/ceilometer-0" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.144565 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fee0030e-ceb6-41ff-97b2-6302e2bed961-log-httpd\") pod \"ceilometer-0\" (UID: \"fee0030e-ceb6-41ff-97b2-6302e2bed961\") " pod="openstack/ceilometer-0" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.150427 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fee0030e-ceb6-41ff-97b2-6302e2bed961-scripts\") pod \"ceilometer-0\" (UID: \"fee0030e-ceb6-41ff-97b2-6302e2bed961\") " pod="openstack/ceilometer-0" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.151495 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fee0030e-ceb6-41ff-97b2-6302e2bed961-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fee0030e-ceb6-41ff-97b2-6302e2bed961\") " pod="openstack/ceilometer-0" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.152242 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/fee0030e-ceb6-41ff-97b2-6302e2bed961-config-data\") pod \"ceilometer-0\" (UID: \"fee0030e-ceb6-41ff-97b2-6302e2bed961\") " pod="openstack/ceilometer-0" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.152598 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fee0030e-ceb6-41ff-97b2-6302e2bed961-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fee0030e-ceb6-41ff-97b2-6302e2bed961\") " pod="openstack/ceilometer-0" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.163750 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9lvm\" (UniqueName: \"kubernetes.io/projected/fee0030e-ceb6-41ff-97b2-6302e2bed961-kube-api-access-k9lvm\") pod \"ceilometer-0\" (UID: \"fee0030e-ceb6-41ff-97b2-6302e2bed961\") " pod="openstack/ceilometer-0" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.288638 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.310081 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-crnr4"] Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.311867 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-crnr4" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.321626 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-fc6a-account-create-update-cfl2q"] Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.323142 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-fc6a-account-create-update-cfl2q" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.331306 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.331567 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.344377 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-fc6a-account-create-update-cfl2q"] Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.348464 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d29378e-424d-4831-baf4-b59a75072097-operator-scripts\") pod \"aodh-fc6a-account-create-update-cfl2q\" (UID: \"4d29378e-424d-4831-baf4-b59a75072097\") " pod="openstack/aodh-fc6a-account-create-update-cfl2q" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.348592 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e50eec10-99ce-4611-8cf4-8f4999146339-operator-scripts\") pod \"aodh-db-create-crnr4\" (UID: \"e50eec10-99ce-4611-8cf4-8f4999146339\") " pod="openstack/aodh-db-create-crnr4" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.348672 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2srt\" (UniqueName: \"kubernetes.io/projected/e50eec10-99ce-4611-8cf4-8f4999146339-kube-api-access-r2srt\") pod \"aodh-db-create-crnr4\" (UID: \"e50eec10-99ce-4611-8cf4-8f4999146339\") " pod="openstack/aodh-db-create-crnr4" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.348730 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5n26\" 
(UniqueName: \"kubernetes.io/projected/4d29378e-424d-4831-baf4-b59a75072097-kube-api-access-x5n26\") pod \"aodh-fc6a-account-create-update-cfl2q\" (UID: \"4d29378e-424d-4831-baf4-b59a75072097\") " pod="openstack/aodh-fc6a-account-create-update-cfl2q" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.363068 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-crnr4"] Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.452140 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d29378e-424d-4831-baf4-b59a75072097-operator-scripts\") pod \"aodh-fc6a-account-create-update-cfl2q\" (UID: \"4d29378e-424d-4831-baf4-b59a75072097\") " pod="openstack/aodh-fc6a-account-create-update-cfl2q" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.452564 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e50eec10-99ce-4611-8cf4-8f4999146339-operator-scripts\") pod \"aodh-db-create-crnr4\" (UID: \"e50eec10-99ce-4611-8cf4-8f4999146339\") " pod="openstack/aodh-db-create-crnr4" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.453828 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2srt\" (UniqueName: \"kubernetes.io/projected/e50eec10-99ce-4611-8cf4-8f4999146339-kube-api-access-r2srt\") pod \"aodh-db-create-crnr4\" (UID: \"e50eec10-99ce-4611-8cf4-8f4999146339\") " pod="openstack/aodh-db-create-crnr4" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.453883 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5n26\" (UniqueName: \"kubernetes.io/projected/4d29378e-424d-4831-baf4-b59a75072097-kube-api-access-x5n26\") pod \"aodh-fc6a-account-create-update-cfl2q\" (UID: \"4d29378e-424d-4831-baf4-b59a75072097\") " 
pod="openstack/aodh-fc6a-account-create-update-cfl2q" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.455433 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d29378e-424d-4831-baf4-b59a75072097-operator-scripts\") pod \"aodh-fc6a-account-create-update-cfl2q\" (UID: \"4d29378e-424d-4831-baf4-b59a75072097\") " pod="openstack/aodh-fc6a-account-create-update-cfl2q" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.455875 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e50eec10-99ce-4611-8cf4-8f4999146339-operator-scripts\") pod \"aodh-db-create-crnr4\" (UID: \"e50eec10-99ce-4611-8cf4-8f4999146339\") " pod="openstack/aodh-db-create-crnr4" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.481430 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2srt\" (UniqueName: \"kubernetes.io/projected/e50eec10-99ce-4611-8cf4-8f4999146339-kube-api-access-r2srt\") pod \"aodh-db-create-crnr4\" (UID: \"e50eec10-99ce-4611-8cf4-8f4999146339\") " pod="openstack/aodh-db-create-crnr4" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.489866 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5n26\" (UniqueName: \"kubernetes.io/projected/4d29378e-424d-4831-baf4-b59a75072097-kube-api-access-x5n26\") pod \"aodh-fc6a-account-create-update-cfl2q\" (UID: \"4d29378e-424d-4831-baf4-b59a75072097\") " pod="openstack/aodh-fc6a-account-create-update-cfl2q" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.648488 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-crnr4" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.662115 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-fc6a-account-create-update-cfl2q" Mar 13 14:23:32 crc kubenswrapper[4898]: W0313 14:23:32.862244 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfee0030e_ceb6_41ff_97b2_6302e2bed961.slice/crio-7938adbffcaeda1984e7ee26b38983054f8dc341e6b57108f491823714875f79 WatchSource:0}: Error finding container 7938adbffcaeda1984e7ee26b38983054f8dc341e6b57108f491823714875f79: Status 404 returned error can't find the container with id 7938adbffcaeda1984e7ee26b38983054f8dc341e6b57108f491823714875f79 Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.862506 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.911180 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fee0030e-ceb6-41ff-97b2-6302e2bed961","Type":"ContainerStarted","Data":"7938adbffcaeda1984e7ee26b38983054f8dc341e6b57108f491823714875f79"} Mar 13 14:23:33 crc kubenswrapper[4898]: I0313 14:23:33.140954 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 13 14:23:33 crc kubenswrapper[4898]: I0313 14:23:33.141005 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 13 14:23:33 crc kubenswrapper[4898]: I0313 14:23:33.181712 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-crnr4"] Mar 13 14:23:33 crc kubenswrapper[4898]: I0313 14:23:33.360841 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-fc6a-account-create-update-cfl2q"] Mar 13 14:23:33 crc kubenswrapper[4898]: W0313 14:23:33.379151 4898 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d29378e_424d_4831_baf4_b59a75072097.slice/crio-986f1ec2eac42d15ffcd477dd0aebc1a00f6cb5abbf86cdc061cc766a9aaa72a WatchSource:0}: Error finding container 986f1ec2eac42d15ffcd477dd0aebc1a00f6cb5abbf86cdc061cc766a9aaa72a: Status 404 returned error can't find the container with id 986f1ec2eac42d15ffcd477dd0aebc1a00f6cb5abbf86cdc061cc766a9aaa72a Mar 13 14:23:33 crc kubenswrapper[4898]: I0313 14:23:33.739882 4898 scope.go:117] "RemoveContainer" containerID="31ae8593afc62c01205d22940ebe7136407da3a6c051010917ba949bb52866cc" Mar 13 14:23:33 crc kubenswrapper[4898]: E0313 14:23:33.740360 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:23:33 crc kubenswrapper[4898]: I0313 14:23:33.752606 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f08c3de0-9b21-4e4a-8811-b40fdb3b63c5" path="/var/lib/kubelet/pods/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5/volumes" Mar 13 14:23:33 crc kubenswrapper[4898]: I0313 14:23:33.923313 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-fc6a-account-create-update-cfl2q" event={"ID":"4d29378e-424d-4831-baf4-b59a75072097","Type":"ContainerStarted","Data":"4c3017aeb6114b905e5e159773ee55011a6eea4d13e889167f127e1524ada66d"} Mar 13 14:23:33 crc kubenswrapper[4898]: I0313 14:23:33.924334 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-fc6a-account-create-update-cfl2q" event={"ID":"4d29378e-424d-4831-baf4-b59a75072097","Type":"ContainerStarted","Data":"986f1ec2eac42d15ffcd477dd0aebc1a00f6cb5abbf86cdc061cc766a9aaa72a"} Mar 13 14:23:33 
crc kubenswrapper[4898]: I0313 14:23:33.927123 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fee0030e-ceb6-41ff-97b2-6302e2bed961","Type":"ContainerStarted","Data":"046f62ad233bcc6087ca907e01b7e7dad28f8073971dd32bda95b8c2d2d58d3f"} Mar 13 14:23:33 crc kubenswrapper[4898]: I0313 14:23:33.929545 4898 generic.go:334] "Generic (PLEG): container finished" podID="e50eec10-99ce-4611-8cf4-8f4999146339" containerID="903eebacfbe4709488e6b56c6ba47deec7c9d806d35c4763b770e46f79ef165a" exitCode=0 Mar 13 14:23:33 crc kubenswrapper[4898]: I0313 14:23:33.929642 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-crnr4" event={"ID":"e50eec10-99ce-4611-8cf4-8f4999146339","Type":"ContainerDied","Data":"903eebacfbe4709488e6b56c6ba47deec7c9d806d35c4763b770e46f79ef165a"} Mar 13 14:23:33 crc kubenswrapper[4898]: I0313 14:23:33.929676 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-crnr4" event={"ID":"e50eec10-99ce-4611-8cf4-8f4999146339","Type":"ContainerStarted","Data":"bee3105929f09bbd68f86f9a82c8813e95620fc80d9be672ee7f18dca5bd32d2"} Mar 13 14:23:33 crc kubenswrapper[4898]: I0313 14:23:33.944010 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-fc6a-account-create-update-cfl2q" podStartSLOduration=1.943992083 podStartE2EDuration="1.943992083s" podCreationTimestamp="2026-03-13 14:23:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:23:33.939738942 +0000 UTC m=+1648.941327211" watchObservedRunningTime="2026-03-13 14:23:33.943992083 +0000 UTC m=+1648.945580332" Mar 13 14:23:34 crc kubenswrapper[4898]: I0313 14:23:34.165191 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="a8db736b-00b7-4251-a667-3b2138c6c928" containerName="nova-metadata-metadata" probeResult="failure" output="Get 
\"https://10.217.1.2:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 14:23:34 crc kubenswrapper[4898]: I0313 14:23:34.165494 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="a8db736b-00b7-4251-a667-3b2138c6c928" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.2:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 14:23:34 crc kubenswrapper[4898]: I0313 14:23:34.944411 4898 generic.go:334] "Generic (PLEG): container finished" podID="4d29378e-424d-4831-baf4-b59a75072097" containerID="4c3017aeb6114b905e5e159773ee55011a6eea4d13e889167f127e1524ada66d" exitCode=0 Mar 13 14:23:34 crc kubenswrapper[4898]: I0313 14:23:34.944550 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-fc6a-account-create-update-cfl2q" event={"ID":"4d29378e-424d-4831-baf4-b59a75072097","Type":"ContainerDied","Data":"4c3017aeb6114b905e5e159773ee55011a6eea4d13e889167f127e1524ada66d"} Mar 13 14:23:35 crc kubenswrapper[4898]: I0313 14:23:35.481150 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-crnr4" Mar 13 14:23:35 crc kubenswrapper[4898]: I0313 14:23:35.594767 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e50eec10-99ce-4611-8cf4-8f4999146339-operator-scripts\") pod \"e50eec10-99ce-4611-8cf4-8f4999146339\" (UID: \"e50eec10-99ce-4611-8cf4-8f4999146339\") " Mar 13 14:23:35 crc kubenswrapper[4898]: I0313 14:23:35.595121 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2srt\" (UniqueName: \"kubernetes.io/projected/e50eec10-99ce-4611-8cf4-8f4999146339-kube-api-access-r2srt\") pod \"e50eec10-99ce-4611-8cf4-8f4999146339\" (UID: \"e50eec10-99ce-4611-8cf4-8f4999146339\") " Mar 13 14:23:35 crc kubenswrapper[4898]: I0313 14:23:35.596793 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e50eec10-99ce-4611-8cf4-8f4999146339-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e50eec10-99ce-4611-8cf4-8f4999146339" (UID: "e50eec10-99ce-4611-8cf4-8f4999146339"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:23:35 crc kubenswrapper[4898]: I0313 14:23:35.611110 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e50eec10-99ce-4611-8cf4-8f4999146339-kube-api-access-r2srt" (OuterVolumeSpecName: "kube-api-access-r2srt") pod "e50eec10-99ce-4611-8cf4-8f4999146339" (UID: "e50eec10-99ce-4611-8cf4-8f4999146339"). InnerVolumeSpecName "kube-api-access-r2srt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:23:35 crc kubenswrapper[4898]: I0313 14:23:35.698707 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e50eec10-99ce-4611-8cf4-8f4999146339-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:35 crc kubenswrapper[4898]: I0313 14:23:35.699032 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r2srt\" (UniqueName: \"kubernetes.io/projected/e50eec10-99ce-4611-8cf4-8f4999146339-kube-api-access-r2srt\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:35 crc kubenswrapper[4898]: I0313 14:23:35.963195 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fee0030e-ceb6-41ff-97b2-6302e2bed961","Type":"ContainerStarted","Data":"f1024bead8638fdc2067a700f701c8aec937fe95f344f02a58a6153377872c8e"} Mar 13 14:23:35 crc kubenswrapper[4898]: I0313 14:23:35.967391 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-crnr4" Mar 13 14:23:35 crc kubenswrapper[4898]: I0313 14:23:35.967781 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-crnr4" event={"ID":"e50eec10-99ce-4611-8cf4-8f4999146339","Type":"ContainerDied","Data":"bee3105929f09bbd68f86f9a82c8813e95620fc80d9be672ee7f18dca5bd32d2"} Mar 13 14:23:35 crc kubenswrapper[4898]: I0313 14:23:35.967819 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bee3105929f09bbd68f86f9a82c8813e95620fc80d9be672ee7f18dca5bd32d2" Mar 13 14:23:36 crc kubenswrapper[4898]: I0313 14:23:36.286617 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 13 14:23:36 crc kubenswrapper[4898]: I0313 14:23:36.287045 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 13 14:23:36 crc kubenswrapper[4898]: I0313 14:23:36.538424 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-fc6a-account-create-update-cfl2q" Mar 13 14:23:36 crc kubenswrapper[4898]: I0313 14:23:36.628116 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d29378e-424d-4831-baf4-b59a75072097-operator-scripts\") pod \"4d29378e-424d-4831-baf4-b59a75072097\" (UID: \"4d29378e-424d-4831-baf4-b59a75072097\") " Mar 13 14:23:36 crc kubenswrapper[4898]: I0313 14:23:36.628216 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x5n26\" (UniqueName: \"kubernetes.io/projected/4d29378e-424d-4831-baf4-b59a75072097-kube-api-access-x5n26\") pod \"4d29378e-424d-4831-baf4-b59a75072097\" (UID: \"4d29378e-424d-4831-baf4-b59a75072097\") " Mar 13 14:23:36 crc kubenswrapper[4898]: I0313 14:23:36.628753 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d29378e-424d-4831-baf4-b59a75072097-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4d29378e-424d-4831-baf4-b59a75072097" (UID: "4d29378e-424d-4831-baf4-b59a75072097"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:23:36 crc kubenswrapper[4898]: I0313 14:23:36.641673 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d29378e-424d-4831-baf4-b59a75072097-kube-api-access-x5n26" (OuterVolumeSpecName: "kube-api-access-x5n26") pod "4d29378e-424d-4831-baf4-b59a75072097" (UID: "4d29378e-424d-4831-baf4-b59a75072097"). InnerVolumeSpecName "kube-api-access-x5n26". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:23:36 crc kubenswrapper[4898]: I0313 14:23:36.730543 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d29378e-424d-4831-baf4-b59a75072097-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:36 crc kubenswrapper[4898]: I0313 14:23:36.730577 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x5n26\" (UniqueName: \"kubernetes.io/projected/4d29378e-424d-4831-baf4-b59a75072097-kube-api-access-x5n26\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:36 crc kubenswrapper[4898]: I0313 14:23:36.983413 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fee0030e-ceb6-41ff-97b2-6302e2bed961","Type":"ContainerStarted","Data":"ff10f66943efcd1011007ca068c3ebdda08f08d6f697d108469f59c034484772"} Mar 13 14:23:36 crc kubenswrapper[4898]: I0313 14:23:36.986732 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-fc6a-account-create-update-cfl2q" event={"ID":"4d29378e-424d-4831-baf4-b59a75072097","Type":"ContainerDied","Data":"986f1ec2eac42d15ffcd477dd0aebc1a00f6cb5abbf86cdc061cc766a9aaa72a"} Mar 13 14:23:36 crc kubenswrapper[4898]: I0313 14:23:36.987229 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="986f1ec2eac42d15ffcd477dd0aebc1a00f6cb5abbf86cdc061cc766a9aaa72a" Mar 13 14:23:36 crc kubenswrapper[4898]: I0313 14:23:36.987553 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-fc6a-account-create-update-cfl2q" Mar 13 14:23:37 crc kubenswrapper[4898]: I0313 14:23:37.331664 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 13 14:23:37 crc kubenswrapper[4898]: I0313 14:23:37.369266 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="91d28474-f268-4ecf-96b7-5a5007e715c3" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.3:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 14:23:37 crc kubenswrapper[4898]: I0313 14:23:37.369267 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="91d28474-f268-4ecf-96b7-5a5007e715c3" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.3:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 14:23:37 crc kubenswrapper[4898]: I0313 14:23:37.379589 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 13 14:23:38 crc kubenswrapper[4898]: I0313 14:23:38.041138 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 13 14:23:39 crc kubenswrapper[4898]: I0313 14:23:39.011164 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fee0030e-ceb6-41ff-97b2-6302e2bed961","Type":"ContainerStarted","Data":"861e63df6a1b41636fdd186717e471c70d712870d053d2698de97221e567ca61"} Mar 13 14:23:39 crc kubenswrapper[4898]: I0313 14:23:39.011776 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 13 14:23:39 crc kubenswrapper[4898]: I0313 14:23:39.042363 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.997324747 
podStartE2EDuration="8.042329424s" podCreationTimestamp="2026-03-13 14:23:31 +0000 UTC" firstStartedPulling="2026-03-13 14:23:32.869195151 +0000 UTC m=+1647.870783400" lastFinishedPulling="2026-03-13 14:23:37.914199818 +0000 UTC m=+1652.915788077" observedRunningTime="2026-03-13 14:23:39.036531143 +0000 UTC m=+1654.038119382" watchObservedRunningTime="2026-03-13 14:23:39.042329424 +0000 UTC m=+1654.043917683" Mar 13 14:23:41 crc kubenswrapper[4898]: I0313 14:23:41.140153 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 13 14:23:41 crc kubenswrapper[4898]: I0313 14:23:41.140477 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 13 14:23:42 crc kubenswrapper[4898]: I0313 14:23:42.643088 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-rr9kw"] Mar 13 14:23:42 crc kubenswrapper[4898]: E0313 14:23:42.644732 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d29378e-424d-4831-baf4-b59a75072097" containerName="mariadb-account-create-update" Mar 13 14:23:42 crc kubenswrapper[4898]: I0313 14:23:42.644840 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d29378e-424d-4831-baf4-b59a75072097" containerName="mariadb-account-create-update" Mar 13 14:23:42 crc kubenswrapper[4898]: E0313 14:23:42.648472 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e50eec10-99ce-4611-8cf4-8f4999146339" containerName="mariadb-database-create" Mar 13 14:23:42 crc kubenswrapper[4898]: I0313 14:23:42.648672 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="e50eec10-99ce-4611-8cf4-8f4999146339" containerName="mariadb-database-create" Mar 13 14:23:42 crc kubenswrapper[4898]: I0313 14:23:42.649261 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="e50eec10-99ce-4611-8cf4-8f4999146339" containerName="mariadb-database-create" Mar 13 14:23:42 crc kubenswrapper[4898]: I0313 
14:23:42.649359 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d29378e-424d-4831-baf4-b59a75072097" containerName="mariadb-account-create-update" Mar 13 14:23:42 crc kubenswrapper[4898]: I0313 14:23:42.650523 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-rr9kw" Mar 13 14:23:42 crc kubenswrapper[4898]: I0313 14:23:42.655962 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Mar 13 14:23:42 crc kubenswrapper[4898]: I0313 14:23:42.657095 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-tnpwg" Mar 13 14:23:42 crc kubenswrapper[4898]: I0313 14:23:42.657596 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 13 14:23:42 crc kubenswrapper[4898]: I0313 14:23:42.657667 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Mar 13 14:23:42 crc kubenswrapper[4898]: I0313 14:23:42.662166 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-rr9kw"] Mar 13 14:23:42 crc kubenswrapper[4898]: I0313 14:23:42.795248 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tdv8\" (UniqueName: \"kubernetes.io/projected/6ed6478d-e1a3-4587-813f-222e6c4e54d7-kube-api-access-9tdv8\") pod \"aodh-db-sync-rr9kw\" (UID: \"6ed6478d-e1a3-4587-813f-222e6c4e54d7\") " pod="openstack/aodh-db-sync-rr9kw" Mar 13 14:23:42 crc kubenswrapper[4898]: I0313 14:23:42.795322 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ed6478d-e1a3-4587-813f-222e6c4e54d7-combined-ca-bundle\") pod \"aodh-db-sync-rr9kw\" (UID: \"6ed6478d-e1a3-4587-813f-222e6c4e54d7\") " pod="openstack/aodh-db-sync-rr9kw" Mar 13 14:23:42 crc kubenswrapper[4898]: I0313 
14:23:42.795377 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ed6478d-e1a3-4587-813f-222e6c4e54d7-scripts\") pod \"aodh-db-sync-rr9kw\" (UID: \"6ed6478d-e1a3-4587-813f-222e6c4e54d7\") " pod="openstack/aodh-db-sync-rr9kw" Mar 13 14:23:42 crc kubenswrapper[4898]: I0313 14:23:42.796317 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ed6478d-e1a3-4587-813f-222e6c4e54d7-config-data\") pod \"aodh-db-sync-rr9kw\" (UID: \"6ed6478d-e1a3-4587-813f-222e6c4e54d7\") " pod="openstack/aodh-db-sync-rr9kw" Mar 13 14:23:42 crc kubenswrapper[4898]: I0313 14:23:42.898682 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ed6478d-e1a3-4587-813f-222e6c4e54d7-config-data\") pod \"aodh-db-sync-rr9kw\" (UID: \"6ed6478d-e1a3-4587-813f-222e6c4e54d7\") " pod="openstack/aodh-db-sync-rr9kw" Mar 13 14:23:42 crc kubenswrapper[4898]: I0313 14:23:42.898752 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tdv8\" (UniqueName: \"kubernetes.io/projected/6ed6478d-e1a3-4587-813f-222e6c4e54d7-kube-api-access-9tdv8\") pod \"aodh-db-sync-rr9kw\" (UID: \"6ed6478d-e1a3-4587-813f-222e6c4e54d7\") " pod="openstack/aodh-db-sync-rr9kw" Mar 13 14:23:42 crc kubenswrapper[4898]: I0313 14:23:42.898805 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ed6478d-e1a3-4587-813f-222e6c4e54d7-combined-ca-bundle\") pod \"aodh-db-sync-rr9kw\" (UID: \"6ed6478d-e1a3-4587-813f-222e6c4e54d7\") " pod="openstack/aodh-db-sync-rr9kw" Mar 13 14:23:42 crc kubenswrapper[4898]: I0313 14:23:42.898874 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/6ed6478d-e1a3-4587-813f-222e6c4e54d7-scripts\") pod \"aodh-db-sync-rr9kw\" (UID: \"6ed6478d-e1a3-4587-813f-222e6c4e54d7\") " pod="openstack/aodh-db-sync-rr9kw" Mar 13 14:23:42 crc kubenswrapper[4898]: I0313 14:23:42.905434 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ed6478d-e1a3-4587-813f-222e6c4e54d7-scripts\") pod \"aodh-db-sync-rr9kw\" (UID: \"6ed6478d-e1a3-4587-813f-222e6c4e54d7\") " pod="openstack/aodh-db-sync-rr9kw" Mar 13 14:23:42 crc kubenswrapper[4898]: I0313 14:23:42.905963 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ed6478d-e1a3-4587-813f-222e6c4e54d7-config-data\") pod \"aodh-db-sync-rr9kw\" (UID: \"6ed6478d-e1a3-4587-813f-222e6c4e54d7\") " pod="openstack/aodh-db-sync-rr9kw" Mar 13 14:23:42 crc kubenswrapper[4898]: I0313 14:23:42.910674 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ed6478d-e1a3-4587-813f-222e6c4e54d7-combined-ca-bundle\") pod \"aodh-db-sync-rr9kw\" (UID: \"6ed6478d-e1a3-4587-813f-222e6c4e54d7\") " pod="openstack/aodh-db-sync-rr9kw" Mar 13 14:23:42 crc kubenswrapper[4898]: I0313 14:23:42.918846 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tdv8\" (UniqueName: \"kubernetes.io/projected/6ed6478d-e1a3-4587-813f-222e6c4e54d7-kube-api-access-9tdv8\") pod \"aodh-db-sync-rr9kw\" (UID: \"6ed6478d-e1a3-4587-813f-222e6c4e54d7\") " pod="openstack/aodh-db-sync-rr9kw" Mar 13 14:23:42 crc kubenswrapper[4898]: I0313 14:23:42.975987 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-rr9kw" Mar 13 14:23:43 crc kubenswrapper[4898]: I0313 14:23:43.146276 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 13 14:23:43 crc kubenswrapper[4898]: I0313 14:23:43.146966 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 13 14:23:43 crc kubenswrapper[4898]: I0313 14:23:43.155618 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 13 14:23:43 crc kubenswrapper[4898]: I0313 14:23:43.516055 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-rr9kw"] Mar 13 14:23:43 crc kubenswrapper[4898]: I0313 14:23:43.691323 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 13 14:23:43 crc kubenswrapper[4898]: I0313 14:23:43.821383 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5edbf12d-a655-4822-98da-9719c131fa14-config-data\") pod \"5edbf12d-a655-4822-98da-9719c131fa14\" (UID: \"5edbf12d-a655-4822-98da-9719c131fa14\") " Mar 13 14:23:43 crc kubenswrapper[4898]: I0313 14:23:43.821491 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4l9h\" (UniqueName: \"kubernetes.io/projected/5edbf12d-a655-4822-98da-9719c131fa14-kube-api-access-p4l9h\") pod \"5edbf12d-a655-4822-98da-9719c131fa14\" (UID: \"5edbf12d-a655-4822-98da-9719c131fa14\") " Mar 13 14:23:43 crc kubenswrapper[4898]: I0313 14:23:43.821612 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5edbf12d-a655-4822-98da-9719c131fa14-combined-ca-bundle\") pod \"5edbf12d-a655-4822-98da-9719c131fa14\" (UID: \"5edbf12d-a655-4822-98da-9719c131fa14\") " Mar 13 14:23:43 crc 
kubenswrapper[4898]: I0313 14:23:43.826745 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5edbf12d-a655-4822-98da-9719c131fa14-kube-api-access-p4l9h" (OuterVolumeSpecName: "kube-api-access-p4l9h") pod "5edbf12d-a655-4822-98da-9719c131fa14" (UID: "5edbf12d-a655-4822-98da-9719c131fa14"). InnerVolumeSpecName "kube-api-access-p4l9h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:23:43 crc kubenswrapper[4898]: I0313 14:23:43.851946 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5edbf12d-a655-4822-98da-9719c131fa14-config-data" (OuterVolumeSpecName: "config-data") pod "5edbf12d-a655-4822-98da-9719c131fa14" (UID: "5edbf12d-a655-4822-98da-9719c131fa14"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:23:43 crc kubenswrapper[4898]: I0313 14:23:43.853648 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5edbf12d-a655-4822-98da-9719c131fa14-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5edbf12d-a655-4822-98da-9719c131fa14" (UID: "5edbf12d-a655-4822-98da-9719c131fa14"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:23:43 crc kubenswrapper[4898]: I0313 14:23:43.924716 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5edbf12d-a655-4822-98da-9719c131fa14-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:43 crc kubenswrapper[4898]: I0313 14:23:43.924789 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4l9h\" (UniqueName: \"kubernetes.io/projected/5edbf12d-a655-4822-98da-9719c131fa14-kube-api-access-p4l9h\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:43 crc kubenswrapper[4898]: I0313 14:23:43.924804 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5edbf12d-a655-4822-98da-9719c131fa14-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:44 crc kubenswrapper[4898]: I0313 14:23:44.083453 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-rr9kw" event={"ID":"6ed6478d-e1a3-4587-813f-222e6c4e54d7","Type":"ContainerStarted","Data":"28400347fb0bff37d2a3d6816574065a1398407aac258da24130e69501fc45b8"} Mar 13 14:23:44 crc kubenswrapper[4898]: I0313 14:23:44.085490 4898 generic.go:334] "Generic (PLEG): container finished" podID="5edbf12d-a655-4822-98da-9719c131fa14" containerID="132b3a0f7e5ae56889214fa156373da26bc4a38bf3f78bbfa0992ef5f518c430" exitCode=137 Mar 13 14:23:44 crc kubenswrapper[4898]: I0313 14:23:44.085535 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"5edbf12d-a655-4822-98da-9719c131fa14","Type":"ContainerDied","Data":"132b3a0f7e5ae56889214fa156373da26bc4a38bf3f78bbfa0992ef5f518c430"} Mar 13 14:23:44 crc kubenswrapper[4898]: I0313 14:23:44.085578 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 13 14:23:44 crc kubenswrapper[4898]: I0313 14:23:44.085602 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"5edbf12d-a655-4822-98da-9719c131fa14","Type":"ContainerDied","Data":"7a4153aaa3cfe90175a3e8e41cb9a73f5056ef4defe81a11d1b496b7bcdcc9b6"} Mar 13 14:23:44 crc kubenswrapper[4898]: I0313 14:23:44.085625 4898 scope.go:117] "RemoveContainer" containerID="132b3a0f7e5ae56889214fa156373da26bc4a38bf3f78bbfa0992ef5f518c430" Mar 13 14:23:44 crc kubenswrapper[4898]: I0313 14:23:44.114084 4898 scope.go:117] "RemoveContainer" containerID="132b3a0f7e5ae56889214fa156373da26bc4a38bf3f78bbfa0992ef5f518c430" Mar 13 14:23:44 crc kubenswrapper[4898]: E0313 14:23:44.118091 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"132b3a0f7e5ae56889214fa156373da26bc4a38bf3f78bbfa0992ef5f518c430\": container with ID starting with 132b3a0f7e5ae56889214fa156373da26bc4a38bf3f78bbfa0992ef5f518c430 not found: ID does not exist" containerID="132b3a0f7e5ae56889214fa156373da26bc4a38bf3f78bbfa0992ef5f518c430" Mar 13 14:23:44 crc kubenswrapper[4898]: I0313 14:23:44.118146 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"132b3a0f7e5ae56889214fa156373da26bc4a38bf3f78bbfa0992ef5f518c430"} err="failed to get container status \"132b3a0f7e5ae56889214fa156373da26bc4a38bf3f78bbfa0992ef5f518c430\": rpc error: code = NotFound desc = could not find container \"132b3a0f7e5ae56889214fa156373da26bc4a38bf3f78bbfa0992ef5f518c430\": container with ID starting with 132b3a0f7e5ae56889214fa156373da26bc4a38bf3f78bbfa0992ef5f518c430 not found: ID does not exist" Mar 13 14:23:44 crc kubenswrapper[4898]: I0313 14:23:44.144962 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 14:23:44 crc kubenswrapper[4898]: I0313 
14:23:44.162922 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 14:23:44 crc kubenswrapper[4898]: I0313 14:23:44.190110 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 14:23:44 crc kubenswrapper[4898]: E0313 14:23:44.190828 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5edbf12d-a655-4822-98da-9719c131fa14" containerName="nova-cell1-novncproxy-novncproxy" Mar 13 14:23:44 crc kubenswrapper[4898]: I0313 14:23:44.190846 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="5edbf12d-a655-4822-98da-9719c131fa14" containerName="nova-cell1-novncproxy-novncproxy" Mar 13 14:23:44 crc kubenswrapper[4898]: I0313 14:23:44.191197 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="5edbf12d-a655-4822-98da-9719c131fa14" containerName="nova-cell1-novncproxy-novncproxy" Mar 13 14:23:44 crc kubenswrapper[4898]: I0313 14:23:44.192311 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 13 14:23:44 crc kubenswrapper[4898]: I0313 14:23:44.197452 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Mar 13 14:23:44 crc kubenswrapper[4898]: I0313 14:23:44.197615 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Mar 13 14:23:44 crc kubenswrapper[4898]: I0313 14:23:44.198219 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 13 14:23:44 crc kubenswrapper[4898]: I0313 14:23:44.212137 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 14:23:44 crc kubenswrapper[4898]: I0313 14:23:44.232129 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/041221f0-b346-4310-ab8e-a8f2440c6034-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"041221f0-b346-4310-ab8e-a8f2440c6034\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 14:23:44 crc kubenswrapper[4898]: I0313 14:23:44.232430 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pq2h\" (UniqueName: \"kubernetes.io/projected/041221f0-b346-4310-ab8e-a8f2440c6034-kube-api-access-5pq2h\") pod \"nova-cell1-novncproxy-0\" (UID: \"041221f0-b346-4310-ab8e-a8f2440c6034\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 14:23:44 crc kubenswrapper[4898]: I0313 14:23:44.232529 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/041221f0-b346-4310-ab8e-a8f2440c6034-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"041221f0-b346-4310-ab8e-a8f2440c6034\") " pod="openstack/nova-cell1-novncproxy-0" Mar 
13 14:23:44 crc kubenswrapper[4898]: I0313 14:23:44.232681 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/041221f0-b346-4310-ab8e-a8f2440c6034-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"041221f0-b346-4310-ab8e-a8f2440c6034\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 14:23:44 crc kubenswrapper[4898]: I0313 14:23:44.232844 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/041221f0-b346-4310-ab8e-a8f2440c6034-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"041221f0-b346-4310-ab8e-a8f2440c6034\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 14:23:44 crc kubenswrapper[4898]: I0313 14:23:44.286301 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 13 14:23:44 crc kubenswrapper[4898]: I0313 14:23:44.286355 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 13 14:23:44 crc kubenswrapper[4898]: I0313 14:23:44.335941 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/041221f0-b346-4310-ab8e-a8f2440c6034-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"041221f0-b346-4310-ab8e-a8f2440c6034\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 14:23:44 crc kubenswrapper[4898]: I0313 14:23:44.336016 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/041221f0-b346-4310-ab8e-a8f2440c6034-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"041221f0-b346-4310-ab8e-a8f2440c6034\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 14:23:44 crc kubenswrapper[4898]: I0313 14:23:44.336193 4898 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/041221f0-b346-4310-ab8e-a8f2440c6034-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"041221f0-b346-4310-ab8e-a8f2440c6034\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 14:23:44 crc kubenswrapper[4898]: I0313 14:23:44.336280 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/041221f0-b346-4310-ab8e-a8f2440c6034-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"041221f0-b346-4310-ab8e-a8f2440c6034\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 14:23:44 crc kubenswrapper[4898]: I0313 14:23:44.336400 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pq2h\" (UniqueName: \"kubernetes.io/projected/041221f0-b346-4310-ab8e-a8f2440c6034-kube-api-access-5pq2h\") pod \"nova-cell1-novncproxy-0\" (UID: \"041221f0-b346-4310-ab8e-a8f2440c6034\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 14:23:44 crc kubenswrapper[4898]: I0313 14:23:44.341114 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/041221f0-b346-4310-ab8e-a8f2440c6034-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"041221f0-b346-4310-ab8e-a8f2440c6034\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 14:23:44 crc kubenswrapper[4898]: I0313 14:23:44.341221 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/041221f0-b346-4310-ab8e-a8f2440c6034-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"041221f0-b346-4310-ab8e-a8f2440c6034\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 14:23:44 crc kubenswrapper[4898]: I0313 14:23:44.341710 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/041221f0-b346-4310-ab8e-a8f2440c6034-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"041221f0-b346-4310-ab8e-a8f2440c6034\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 14:23:44 crc kubenswrapper[4898]: I0313 14:23:44.344746 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/041221f0-b346-4310-ab8e-a8f2440c6034-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"041221f0-b346-4310-ab8e-a8f2440c6034\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 14:23:44 crc kubenswrapper[4898]: I0313 14:23:44.355491 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pq2h\" (UniqueName: \"kubernetes.io/projected/041221f0-b346-4310-ab8e-a8f2440c6034-kube-api-access-5pq2h\") pod \"nova-cell1-novncproxy-0\" (UID: \"041221f0-b346-4310-ab8e-a8f2440c6034\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 14:23:44 crc kubenswrapper[4898]: I0313 14:23:44.513408 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 13 14:23:44 crc kubenswrapper[4898]: I0313 14:23:44.985049 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 14:23:44 crc kubenswrapper[4898]: W0313 14:23:44.989075 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod041221f0_b346_4310_ab8e_a8f2440c6034.slice/crio-b8e405b492c9d7fd596598cf904fae808606c285c335d07a03fb9f84f06a5476 WatchSource:0}: Error finding container b8e405b492c9d7fd596598cf904fae808606c285c335d07a03fb9f84f06a5476: Status 404 returned error can't find the container with id b8e405b492c9d7fd596598cf904fae808606c285c335d07a03fb9f84f06a5476 Mar 13 14:23:45 crc kubenswrapper[4898]: I0313 14:23:45.102158 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"041221f0-b346-4310-ab8e-a8f2440c6034","Type":"ContainerStarted","Data":"b8e405b492c9d7fd596598cf904fae808606c285c335d07a03fb9f84f06a5476"} Mar 13 14:23:45 crc kubenswrapper[4898]: I0313 14:23:45.106804 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 13 14:23:45 crc kubenswrapper[4898]: I0313 14:23:45.753299 4898 scope.go:117] "RemoveContainer" containerID="31ae8593afc62c01205d22940ebe7136407da3a6c051010917ba949bb52866cc" Mar 13 14:23:45 crc kubenswrapper[4898]: E0313 14:23:45.753889 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:23:45 crc kubenswrapper[4898]: I0313 14:23:45.755368 4898 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5edbf12d-a655-4822-98da-9719c131fa14" path="/var/lib/kubelet/pods/5edbf12d-a655-4822-98da-9719c131fa14/volumes" Mar 13 14:23:46 crc kubenswrapper[4898]: I0313 14:23:46.116898 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"041221f0-b346-4310-ab8e-a8f2440c6034","Type":"ContainerStarted","Data":"52d82d263f9909335f5d4fa501a1e3625477391dab284f73ecfc7437d9706252"} Mar 13 14:23:46 crc kubenswrapper[4898]: I0313 14:23:46.138984 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.138954519 podStartE2EDuration="2.138954519s" podCreationTimestamp="2026-03-13 14:23:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:23:46.136573267 +0000 UTC m=+1661.138161506" watchObservedRunningTime="2026-03-13 14:23:46.138954519 +0000 UTC m=+1661.140542778" Mar 13 14:23:46 crc kubenswrapper[4898]: I0313 14:23:46.289924 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 13 14:23:46 crc kubenswrapper[4898]: I0313 14:23:46.290854 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 13 14:23:46 crc kubenswrapper[4898]: I0313 14:23:46.312322 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 13 14:23:47 crc kubenswrapper[4898]: I0313 14:23:47.131083 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 13 14:23:47 crc kubenswrapper[4898]: I0313 14:23:47.301805 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-bsswg"] Mar 13 14:23:47 crc kubenswrapper[4898]: I0313 14:23:47.304348 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7bbf7cf9-bsswg" Mar 13 14:23:47 crc kubenswrapper[4898]: I0313 14:23:47.316217 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-bsswg"] Mar 13 14:23:47 crc kubenswrapper[4898]: I0313 14:23:47.417422 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1d23b78-5402-47e0-8af6-851fcc71be6b-config\") pod \"dnsmasq-dns-6b7bbf7cf9-bsswg\" (UID: \"c1d23b78-5402-47e0-8af6-851fcc71be6b\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-bsswg" Mar 13 14:23:47 crc kubenswrapper[4898]: I0313 14:23:47.417487 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c1d23b78-5402-47e0-8af6-851fcc71be6b-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7bbf7cf9-bsswg\" (UID: \"c1d23b78-5402-47e0-8af6-851fcc71be6b\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-bsswg" Mar 13 14:23:47 crc kubenswrapper[4898]: I0313 14:23:47.417560 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c1d23b78-5402-47e0-8af6-851fcc71be6b-dns-svc\") pod \"dnsmasq-dns-6b7bbf7cf9-bsswg\" (UID: \"c1d23b78-5402-47e0-8af6-851fcc71be6b\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-bsswg" Mar 13 14:23:47 crc kubenswrapper[4898]: I0313 14:23:47.417636 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c1d23b78-5402-47e0-8af6-851fcc71be6b-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7bbf7cf9-bsswg\" (UID: \"c1d23b78-5402-47e0-8af6-851fcc71be6b\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-bsswg" Mar 13 14:23:47 crc kubenswrapper[4898]: I0313 14:23:47.417677 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-zg4fs\" (UniqueName: \"kubernetes.io/projected/c1d23b78-5402-47e0-8af6-851fcc71be6b-kube-api-access-zg4fs\") pod \"dnsmasq-dns-6b7bbf7cf9-bsswg\" (UID: \"c1d23b78-5402-47e0-8af6-851fcc71be6b\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-bsswg" Mar 13 14:23:47 crc kubenswrapper[4898]: I0313 14:23:47.417749 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c1d23b78-5402-47e0-8af6-851fcc71be6b-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7bbf7cf9-bsswg\" (UID: \"c1d23b78-5402-47e0-8af6-851fcc71be6b\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-bsswg" Mar 13 14:23:47 crc kubenswrapper[4898]: I0313 14:23:47.519621 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c1d23b78-5402-47e0-8af6-851fcc71be6b-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7bbf7cf9-bsswg\" (UID: \"c1d23b78-5402-47e0-8af6-851fcc71be6b\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-bsswg" Mar 13 14:23:47 crc kubenswrapper[4898]: I0313 14:23:47.519743 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c1d23b78-5402-47e0-8af6-851fcc71be6b-dns-svc\") pod \"dnsmasq-dns-6b7bbf7cf9-bsswg\" (UID: \"c1d23b78-5402-47e0-8af6-851fcc71be6b\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-bsswg" Mar 13 14:23:47 crc kubenswrapper[4898]: I0313 14:23:47.519826 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c1d23b78-5402-47e0-8af6-851fcc71be6b-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7bbf7cf9-bsswg\" (UID: \"c1d23b78-5402-47e0-8af6-851fcc71be6b\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-bsswg" Mar 13 14:23:47 crc kubenswrapper[4898]: I0313 14:23:47.519874 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zg4fs\" 
(UniqueName: \"kubernetes.io/projected/c1d23b78-5402-47e0-8af6-851fcc71be6b-kube-api-access-zg4fs\") pod \"dnsmasq-dns-6b7bbf7cf9-bsswg\" (UID: \"c1d23b78-5402-47e0-8af6-851fcc71be6b\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-bsswg" Mar 13 14:23:47 crc kubenswrapper[4898]: I0313 14:23:47.519927 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c1d23b78-5402-47e0-8af6-851fcc71be6b-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7bbf7cf9-bsswg\" (UID: \"c1d23b78-5402-47e0-8af6-851fcc71be6b\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-bsswg" Mar 13 14:23:47 crc kubenswrapper[4898]: I0313 14:23:47.522790 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c1d23b78-5402-47e0-8af6-851fcc71be6b-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7bbf7cf9-bsswg\" (UID: \"c1d23b78-5402-47e0-8af6-851fcc71be6b\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-bsswg" Mar 13 14:23:47 crc kubenswrapper[4898]: I0313 14:23:47.523356 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c1d23b78-5402-47e0-8af6-851fcc71be6b-dns-svc\") pod \"dnsmasq-dns-6b7bbf7cf9-bsswg\" (UID: \"c1d23b78-5402-47e0-8af6-851fcc71be6b\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-bsswg" Mar 13 14:23:47 crc kubenswrapper[4898]: I0313 14:23:47.524309 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c1d23b78-5402-47e0-8af6-851fcc71be6b-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7bbf7cf9-bsswg\" (UID: \"c1d23b78-5402-47e0-8af6-851fcc71be6b\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-bsswg" Mar 13 14:23:47 crc kubenswrapper[4898]: I0313 14:23:47.525334 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/c1d23b78-5402-47e0-8af6-851fcc71be6b-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7bbf7cf9-bsswg\" (UID: \"c1d23b78-5402-47e0-8af6-851fcc71be6b\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-bsswg" Mar 13 14:23:47 crc kubenswrapper[4898]: I0313 14:23:47.525516 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1d23b78-5402-47e0-8af6-851fcc71be6b-config\") pod \"dnsmasq-dns-6b7bbf7cf9-bsswg\" (UID: \"c1d23b78-5402-47e0-8af6-851fcc71be6b\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-bsswg" Mar 13 14:23:47 crc kubenswrapper[4898]: I0313 14:23:47.526476 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1d23b78-5402-47e0-8af6-851fcc71be6b-config\") pod \"dnsmasq-dns-6b7bbf7cf9-bsswg\" (UID: \"c1d23b78-5402-47e0-8af6-851fcc71be6b\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-bsswg" Mar 13 14:23:47 crc kubenswrapper[4898]: I0313 14:23:47.549380 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zg4fs\" (UniqueName: \"kubernetes.io/projected/c1d23b78-5402-47e0-8af6-851fcc71be6b-kube-api-access-zg4fs\") pod \"dnsmasq-dns-6b7bbf7cf9-bsswg\" (UID: \"c1d23b78-5402-47e0-8af6-851fcc71be6b\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-bsswg" Mar 13 14:23:47 crc kubenswrapper[4898]: I0313 14:23:47.632137 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7bbf7cf9-bsswg" Mar 13 14:23:48 crc kubenswrapper[4898]: W0313 14:23:48.854288 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1d23b78_5402_47e0_8af6_851fcc71be6b.slice/crio-0c1b795a0a1b5e26820a1d2e36ef2d08e255211a4eb9dbe6c50123bc0c988977 WatchSource:0}: Error finding container 0c1b795a0a1b5e26820a1d2e36ef2d08e255211a4eb9dbe6c50123bc0c988977: Status 404 returned error can't find the container with id 0c1b795a0a1b5e26820a1d2e36ef2d08e255211a4eb9dbe6c50123bc0c988977 Mar 13 14:23:48 crc kubenswrapper[4898]: I0313 14:23:48.855522 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-bsswg"] Mar 13 14:23:49 crc kubenswrapper[4898]: I0313 14:23:49.176039 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7bbf7cf9-bsswg" event={"ID":"c1d23b78-5402-47e0-8af6-851fcc71be6b","Type":"ContainerStarted","Data":"0c1b795a0a1b5e26820a1d2e36ef2d08e255211a4eb9dbe6c50123bc0c988977"} Mar 13 14:23:49 crc kubenswrapper[4898]: I0313 14:23:49.186975 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-rr9kw" event={"ID":"6ed6478d-e1a3-4587-813f-222e6c4e54d7","Type":"ContainerStarted","Data":"80f19b54532d031429e005ad3c0cf3c32cf378c8ddab795769c6087368772b42"} Mar 13 14:23:49 crc kubenswrapper[4898]: I0313 14:23:49.225826 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-rr9kw" podStartSLOduration=2.357548524 podStartE2EDuration="7.225776961s" podCreationTimestamp="2026-03-13 14:23:42 +0000 UTC" firstStartedPulling="2026-03-13 14:23:43.489451238 +0000 UTC m=+1658.491039477" lastFinishedPulling="2026-03-13 14:23:48.357679675 +0000 UTC m=+1663.359267914" observedRunningTime="2026-03-13 14:23:49.215443143 +0000 UTC m=+1664.217031382" watchObservedRunningTime="2026-03-13 14:23:49.225776961 +0000 UTC 
m=+1664.227365220" Mar 13 14:23:49 crc kubenswrapper[4898]: I0313 14:23:49.515994 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 13 14:23:49 crc kubenswrapper[4898]: I0313 14:23:49.703551 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 13 14:23:49 crc kubenswrapper[4898]: I0313 14:23:49.899228 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:23:49 crc kubenswrapper[4898]: I0313 14:23:49.899490 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fee0030e-ceb6-41ff-97b2-6302e2bed961" containerName="ceilometer-central-agent" containerID="cri-o://046f62ad233bcc6087ca907e01b7e7dad28f8073971dd32bda95b8c2d2d58d3f" gracePeriod=30 Mar 13 14:23:49 crc kubenswrapper[4898]: I0313 14:23:49.900049 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fee0030e-ceb6-41ff-97b2-6302e2bed961" containerName="proxy-httpd" containerID="cri-o://861e63df6a1b41636fdd186717e471c70d712870d053d2698de97221e567ca61" gracePeriod=30 Mar 13 14:23:49 crc kubenswrapper[4898]: I0313 14:23:49.900309 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fee0030e-ceb6-41ff-97b2-6302e2bed961" containerName="sg-core" containerID="cri-o://ff10f66943efcd1011007ca068c3ebdda08f08d6f697d108469f59c034484772" gracePeriod=30 Mar 13 14:23:49 crc kubenswrapper[4898]: I0313 14:23:49.900417 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fee0030e-ceb6-41ff-97b2-6302e2bed961" containerName="ceilometer-notification-agent" containerID="cri-o://f1024bead8638fdc2067a700f701c8aec937fe95f344f02a58a6153377872c8e" gracePeriod=30 Mar 13 14:23:49 crc kubenswrapper[4898]: I0313 14:23:49.908478 4898 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openstack/ceilometer-0" podUID="fee0030e-ceb6-41ff-97b2-6302e2bed961" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.1.5:3000/\": EOF" Mar 13 14:23:50 crc kubenswrapper[4898]: I0313 14:23:50.206472 4898 generic.go:334] "Generic (PLEG): container finished" podID="fee0030e-ceb6-41ff-97b2-6302e2bed961" containerID="861e63df6a1b41636fdd186717e471c70d712870d053d2698de97221e567ca61" exitCode=0 Mar 13 14:23:50 crc kubenswrapper[4898]: I0313 14:23:50.206736 4898 generic.go:334] "Generic (PLEG): container finished" podID="fee0030e-ceb6-41ff-97b2-6302e2bed961" containerID="ff10f66943efcd1011007ca068c3ebdda08f08d6f697d108469f59c034484772" exitCode=2 Mar 13 14:23:50 crc kubenswrapper[4898]: I0313 14:23:50.206561 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fee0030e-ceb6-41ff-97b2-6302e2bed961","Type":"ContainerDied","Data":"861e63df6a1b41636fdd186717e471c70d712870d053d2698de97221e567ca61"} Mar 13 14:23:50 crc kubenswrapper[4898]: I0313 14:23:50.206784 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fee0030e-ceb6-41ff-97b2-6302e2bed961","Type":"ContainerDied","Data":"ff10f66943efcd1011007ca068c3ebdda08f08d6f697d108469f59c034484772"} Mar 13 14:23:50 crc kubenswrapper[4898]: I0313 14:23:50.210650 4898 generic.go:334] "Generic (PLEG): container finished" podID="c1d23b78-5402-47e0-8af6-851fcc71be6b" containerID="7bda3f3e696b51ddd844a86367f80dfd5aee6675d49dc2996a124d70519d4d20" exitCode=0 Mar 13 14:23:50 crc kubenswrapper[4898]: I0313 14:23:50.210702 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7bbf7cf9-bsswg" event={"ID":"c1d23b78-5402-47e0-8af6-851fcc71be6b","Type":"ContainerDied","Data":"7bda3f3e696b51ddd844a86367f80dfd5aee6675d49dc2996a124d70519d4d20"} Mar 13 14:23:50 crc kubenswrapper[4898]: I0313 14:23:50.211014 4898 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/nova-api-0" podUID="91d28474-f268-4ecf-96b7-5a5007e715c3" containerName="nova-api-log" containerID="cri-o://534def4acabc310f4b2d065b3a04508595fe368f3532f51136e25ae88c3877e4" gracePeriod=30 Mar 13 14:23:50 crc kubenswrapper[4898]: I0313 14:23:50.211069 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="91d28474-f268-4ecf-96b7-5a5007e715c3" containerName="nova-api-api" containerID="cri-o://1ac26ed6febdb8a025711b7fb70ecf717c1b07973758b6080092505756042b5d" gracePeriod=30 Mar 13 14:23:50 crc kubenswrapper[4898]: I0313 14:23:50.884077 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:23:50 crc kubenswrapper[4898]: I0313 14:23:50.930408 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fee0030e-ceb6-41ff-97b2-6302e2bed961-run-httpd\") pod \"fee0030e-ceb6-41ff-97b2-6302e2bed961\" (UID: \"fee0030e-ceb6-41ff-97b2-6302e2bed961\") " Mar 13 14:23:50 crc kubenswrapper[4898]: I0313 14:23:50.930451 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fee0030e-ceb6-41ff-97b2-6302e2bed961-config-data\") pod \"fee0030e-ceb6-41ff-97b2-6302e2bed961\" (UID: \"fee0030e-ceb6-41ff-97b2-6302e2bed961\") " Mar 13 14:23:50 crc kubenswrapper[4898]: I0313 14:23:50.930561 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fee0030e-ceb6-41ff-97b2-6302e2bed961-sg-core-conf-yaml\") pod \"fee0030e-ceb6-41ff-97b2-6302e2bed961\" (UID: \"fee0030e-ceb6-41ff-97b2-6302e2bed961\") " Mar 13 14:23:50 crc kubenswrapper[4898]: I0313 14:23:50.930670 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9lvm\" (UniqueName: 
\"kubernetes.io/projected/fee0030e-ceb6-41ff-97b2-6302e2bed961-kube-api-access-k9lvm\") pod \"fee0030e-ceb6-41ff-97b2-6302e2bed961\" (UID: \"fee0030e-ceb6-41ff-97b2-6302e2bed961\") " Mar 13 14:23:50 crc kubenswrapper[4898]: I0313 14:23:50.930728 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fee0030e-ceb6-41ff-97b2-6302e2bed961-log-httpd\") pod \"fee0030e-ceb6-41ff-97b2-6302e2bed961\" (UID: \"fee0030e-ceb6-41ff-97b2-6302e2bed961\") " Mar 13 14:23:50 crc kubenswrapper[4898]: I0313 14:23:50.930774 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fee0030e-ceb6-41ff-97b2-6302e2bed961-scripts\") pod \"fee0030e-ceb6-41ff-97b2-6302e2bed961\" (UID: \"fee0030e-ceb6-41ff-97b2-6302e2bed961\") " Mar 13 14:23:50 crc kubenswrapper[4898]: I0313 14:23:50.930798 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fee0030e-ceb6-41ff-97b2-6302e2bed961-combined-ca-bundle\") pod \"fee0030e-ceb6-41ff-97b2-6302e2bed961\" (UID: \"fee0030e-ceb6-41ff-97b2-6302e2bed961\") " Mar 13 14:23:50 crc kubenswrapper[4898]: I0313 14:23:50.936510 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fee0030e-ceb6-41ff-97b2-6302e2bed961-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "fee0030e-ceb6-41ff-97b2-6302e2bed961" (UID: "fee0030e-ceb6-41ff-97b2-6302e2bed961"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:23:50 crc kubenswrapper[4898]: I0313 14:23:50.938150 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fee0030e-ceb6-41ff-97b2-6302e2bed961-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "fee0030e-ceb6-41ff-97b2-6302e2bed961" (UID: "fee0030e-ceb6-41ff-97b2-6302e2bed961"). 
InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:23:50 crc kubenswrapper[4898]: I0313 14:23:50.941384 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fee0030e-ceb6-41ff-97b2-6302e2bed961-kube-api-access-k9lvm" (OuterVolumeSpecName: "kube-api-access-k9lvm") pod "fee0030e-ceb6-41ff-97b2-6302e2bed961" (UID: "fee0030e-ceb6-41ff-97b2-6302e2bed961"). InnerVolumeSpecName "kube-api-access-k9lvm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:23:50 crc kubenswrapper[4898]: I0313 14:23:50.947090 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fee0030e-ceb6-41ff-97b2-6302e2bed961-scripts" (OuterVolumeSpecName: "scripts") pod "fee0030e-ceb6-41ff-97b2-6302e2bed961" (UID: "fee0030e-ceb6-41ff-97b2-6302e2bed961"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:23:50 crc kubenswrapper[4898]: I0313 14:23:50.980524 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fee0030e-ceb6-41ff-97b2-6302e2bed961-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "fee0030e-ceb6-41ff-97b2-6302e2bed961" (UID: "fee0030e-ceb6-41ff-97b2-6302e2bed961"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.033826 4898 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fee0030e-ceb6-41ff-97b2-6302e2bed961-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.034100 4898 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fee0030e-ceb6-41ff-97b2-6302e2bed961-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.034170 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9lvm\" (UniqueName: \"kubernetes.io/projected/fee0030e-ceb6-41ff-97b2-6302e2bed961-kube-api-access-k9lvm\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.034228 4898 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fee0030e-ceb6-41ff-97b2-6302e2bed961-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.034619 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fee0030e-ceb6-41ff-97b2-6302e2bed961-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.070839 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fee0030e-ceb6-41ff-97b2-6302e2bed961-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fee0030e-ceb6-41ff-97b2-6302e2bed961" (UID: "fee0030e-ceb6-41ff-97b2-6302e2bed961"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.124602 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fee0030e-ceb6-41ff-97b2-6302e2bed961-config-data" (OuterVolumeSpecName: "config-data") pod "fee0030e-ceb6-41ff-97b2-6302e2bed961" (UID: "fee0030e-ceb6-41ff-97b2-6302e2bed961"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.136975 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fee0030e-ceb6-41ff-97b2-6302e2bed961-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.137016 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fee0030e-ceb6-41ff-97b2-6302e2bed961-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.224332 4898 generic.go:334] "Generic (PLEG): container finished" podID="6ed6478d-e1a3-4587-813f-222e6c4e54d7" containerID="80f19b54532d031429e005ad3c0cf3c32cf378c8ddab795769c6087368772b42" exitCode=0 Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.224403 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-rr9kw" event={"ID":"6ed6478d-e1a3-4587-813f-222e6c4e54d7","Type":"ContainerDied","Data":"80f19b54532d031429e005ad3c0cf3c32cf378c8ddab795769c6087368772b42"} Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.226977 4898 generic.go:334] "Generic (PLEG): container finished" podID="91d28474-f268-4ecf-96b7-5a5007e715c3" containerID="534def4acabc310f4b2d065b3a04508595fe368f3532f51136e25ae88c3877e4" exitCode=143 Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.227030 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"91d28474-f268-4ecf-96b7-5a5007e715c3","Type":"ContainerDied","Data":"534def4acabc310f4b2d065b3a04508595fe368f3532f51136e25ae88c3877e4"} Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.230257 4898 generic.go:334] "Generic (PLEG): container finished" podID="fee0030e-ceb6-41ff-97b2-6302e2bed961" containerID="f1024bead8638fdc2067a700f701c8aec937fe95f344f02a58a6153377872c8e" exitCode=0 Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.230472 4898 generic.go:334] "Generic (PLEG): container finished" podID="fee0030e-ceb6-41ff-97b2-6302e2bed961" containerID="046f62ad233bcc6087ca907e01b7e7dad28f8073971dd32bda95b8c2d2d58d3f" exitCode=0 Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.230349 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.230377 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fee0030e-ceb6-41ff-97b2-6302e2bed961","Type":"ContainerDied","Data":"f1024bead8638fdc2067a700f701c8aec937fe95f344f02a58a6153377872c8e"} Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.230666 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fee0030e-ceb6-41ff-97b2-6302e2bed961","Type":"ContainerDied","Data":"046f62ad233bcc6087ca907e01b7e7dad28f8073971dd32bda95b8c2d2d58d3f"} Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.230685 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fee0030e-ceb6-41ff-97b2-6302e2bed961","Type":"ContainerDied","Data":"7938adbffcaeda1984e7ee26b38983054f8dc341e6b57108f491823714875f79"} Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.230705 4898 scope.go:117] "RemoveContainer" containerID="861e63df6a1b41636fdd186717e471c70d712870d053d2698de97221e567ca61" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.233815 4898 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7bbf7cf9-bsswg" event={"ID":"c1d23b78-5402-47e0-8af6-851fcc71be6b","Type":"ContainerStarted","Data":"4e87cc2a9eb5ac3a94e1731b9ead6b0d6cdba2ef55c6b43916f80ca58fe1a32b"} Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.234052 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b7bbf7cf9-bsswg" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.282802 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b7bbf7cf9-bsswg" podStartSLOduration=4.282784049 podStartE2EDuration="4.282784049s" podCreationTimestamp="2026-03-13 14:23:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:23:51.271503277 +0000 UTC m=+1666.273091536" watchObservedRunningTime="2026-03-13 14:23:51.282784049 +0000 UTC m=+1666.284372288" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.332532 4898 scope.go:117] "RemoveContainer" containerID="ff10f66943efcd1011007ca068c3ebdda08f08d6f697d108469f59c034484772" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.342336 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.360284 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.369286 4898 scope.go:117] "RemoveContainer" containerID="f1024bead8638fdc2067a700f701c8aec937fe95f344f02a58a6153377872c8e" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.378619 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:23:51 crc kubenswrapper[4898]: E0313 14:23:51.379236 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fee0030e-ceb6-41ff-97b2-6302e2bed961" containerName="proxy-httpd" Mar 13 14:23:51 crc 
kubenswrapper[4898]: I0313 14:23:51.379253 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="fee0030e-ceb6-41ff-97b2-6302e2bed961" containerName="proxy-httpd" Mar 13 14:23:51 crc kubenswrapper[4898]: E0313 14:23:51.379261 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fee0030e-ceb6-41ff-97b2-6302e2bed961" containerName="sg-core" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.379267 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="fee0030e-ceb6-41ff-97b2-6302e2bed961" containerName="sg-core" Mar 13 14:23:51 crc kubenswrapper[4898]: E0313 14:23:51.379281 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fee0030e-ceb6-41ff-97b2-6302e2bed961" containerName="ceilometer-central-agent" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.379286 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="fee0030e-ceb6-41ff-97b2-6302e2bed961" containerName="ceilometer-central-agent" Mar 13 14:23:51 crc kubenswrapper[4898]: E0313 14:23:51.379323 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fee0030e-ceb6-41ff-97b2-6302e2bed961" containerName="ceilometer-notification-agent" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.379330 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="fee0030e-ceb6-41ff-97b2-6302e2bed961" containerName="ceilometer-notification-agent" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.379538 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="fee0030e-ceb6-41ff-97b2-6302e2bed961" containerName="proxy-httpd" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.379568 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="fee0030e-ceb6-41ff-97b2-6302e2bed961" containerName="ceilometer-central-agent" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.379577 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="fee0030e-ceb6-41ff-97b2-6302e2bed961" containerName="sg-core" Mar 
13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.379591 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="fee0030e-ceb6-41ff-97b2-6302e2bed961" containerName="ceilometer-notification-agent" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.381753 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.385105 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.391395 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.404626 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.410608 4898 scope.go:117] "RemoveContainer" containerID="046f62ad233bcc6087ca907e01b7e7dad28f8073971dd32bda95b8c2d2d58d3f" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.437636 4898 scope.go:117] "RemoveContainer" containerID="861e63df6a1b41636fdd186717e471c70d712870d053d2698de97221e567ca61" Mar 13 14:23:51 crc kubenswrapper[4898]: E0313 14:23:51.438144 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"861e63df6a1b41636fdd186717e471c70d712870d053d2698de97221e567ca61\": container with ID starting with 861e63df6a1b41636fdd186717e471c70d712870d053d2698de97221e567ca61 not found: ID does not exist" containerID="861e63df6a1b41636fdd186717e471c70d712870d053d2698de97221e567ca61" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.438174 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"861e63df6a1b41636fdd186717e471c70d712870d053d2698de97221e567ca61"} err="failed to get container status 
\"861e63df6a1b41636fdd186717e471c70d712870d053d2698de97221e567ca61\": rpc error: code = NotFound desc = could not find container \"861e63df6a1b41636fdd186717e471c70d712870d053d2698de97221e567ca61\": container with ID starting with 861e63df6a1b41636fdd186717e471c70d712870d053d2698de97221e567ca61 not found: ID does not exist" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.438193 4898 scope.go:117] "RemoveContainer" containerID="ff10f66943efcd1011007ca068c3ebdda08f08d6f697d108469f59c034484772" Mar 13 14:23:51 crc kubenswrapper[4898]: E0313 14:23:51.438487 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff10f66943efcd1011007ca068c3ebdda08f08d6f697d108469f59c034484772\": container with ID starting with ff10f66943efcd1011007ca068c3ebdda08f08d6f697d108469f59c034484772 not found: ID does not exist" containerID="ff10f66943efcd1011007ca068c3ebdda08f08d6f697d108469f59c034484772" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.438614 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff10f66943efcd1011007ca068c3ebdda08f08d6f697d108469f59c034484772"} err="failed to get container status \"ff10f66943efcd1011007ca068c3ebdda08f08d6f697d108469f59c034484772\": rpc error: code = NotFound desc = could not find container \"ff10f66943efcd1011007ca068c3ebdda08f08d6f697d108469f59c034484772\": container with ID starting with ff10f66943efcd1011007ca068c3ebdda08f08d6f697d108469f59c034484772 not found: ID does not exist" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.438726 4898 scope.go:117] "RemoveContainer" containerID="f1024bead8638fdc2067a700f701c8aec937fe95f344f02a58a6153377872c8e" Mar 13 14:23:51 crc kubenswrapper[4898]: E0313 14:23:51.439096 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"f1024bead8638fdc2067a700f701c8aec937fe95f344f02a58a6153377872c8e\": container with ID starting with f1024bead8638fdc2067a700f701c8aec937fe95f344f02a58a6153377872c8e not found: ID does not exist" containerID="f1024bead8638fdc2067a700f701c8aec937fe95f344f02a58a6153377872c8e" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.439120 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1024bead8638fdc2067a700f701c8aec937fe95f344f02a58a6153377872c8e"} err="failed to get container status \"f1024bead8638fdc2067a700f701c8aec937fe95f344f02a58a6153377872c8e\": rpc error: code = NotFound desc = could not find container \"f1024bead8638fdc2067a700f701c8aec937fe95f344f02a58a6153377872c8e\": container with ID starting with f1024bead8638fdc2067a700f701c8aec937fe95f344f02a58a6153377872c8e not found: ID does not exist" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.439133 4898 scope.go:117] "RemoveContainer" containerID="046f62ad233bcc6087ca907e01b7e7dad28f8073971dd32bda95b8c2d2d58d3f" Mar 13 14:23:51 crc kubenswrapper[4898]: E0313 14:23:51.439385 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"046f62ad233bcc6087ca907e01b7e7dad28f8073971dd32bda95b8c2d2d58d3f\": container with ID starting with 046f62ad233bcc6087ca907e01b7e7dad28f8073971dd32bda95b8c2d2d58d3f not found: ID does not exist" containerID="046f62ad233bcc6087ca907e01b7e7dad28f8073971dd32bda95b8c2d2d58d3f" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.439408 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"046f62ad233bcc6087ca907e01b7e7dad28f8073971dd32bda95b8c2d2d58d3f"} err="failed to get container status \"046f62ad233bcc6087ca907e01b7e7dad28f8073971dd32bda95b8c2d2d58d3f\": rpc error: code = NotFound desc = could not find container \"046f62ad233bcc6087ca907e01b7e7dad28f8073971dd32bda95b8c2d2d58d3f\": container with ID 
starting with 046f62ad233bcc6087ca907e01b7e7dad28f8073971dd32bda95b8c2d2d58d3f not found: ID does not exist" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.439423 4898 scope.go:117] "RemoveContainer" containerID="861e63df6a1b41636fdd186717e471c70d712870d053d2698de97221e567ca61" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.439581 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"861e63df6a1b41636fdd186717e471c70d712870d053d2698de97221e567ca61"} err="failed to get container status \"861e63df6a1b41636fdd186717e471c70d712870d053d2698de97221e567ca61\": rpc error: code = NotFound desc = could not find container \"861e63df6a1b41636fdd186717e471c70d712870d053d2698de97221e567ca61\": container with ID starting with 861e63df6a1b41636fdd186717e471c70d712870d053d2698de97221e567ca61 not found: ID does not exist" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.439598 4898 scope.go:117] "RemoveContainer" containerID="ff10f66943efcd1011007ca068c3ebdda08f08d6f697d108469f59c034484772" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.439739 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff10f66943efcd1011007ca068c3ebdda08f08d6f697d108469f59c034484772"} err="failed to get container status \"ff10f66943efcd1011007ca068c3ebdda08f08d6f697d108469f59c034484772\": rpc error: code = NotFound desc = could not find container \"ff10f66943efcd1011007ca068c3ebdda08f08d6f697d108469f59c034484772\": container with ID starting with ff10f66943efcd1011007ca068c3ebdda08f08d6f697d108469f59c034484772 not found: ID does not exist" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.439753 4898 scope.go:117] "RemoveContainer" containerID="f1024bead8638fdc2067a700f701c8aec937fe95f344f02a58a6153377872c8e" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.439889 4898 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f1024bead8638fdc2067a700f701c8aec937fe95f344f02a58a6153377872c8e"} err="failed to get container status \"f1024bead8638fdc2067a700f701c8aec937fe95f344f02a58a6153377872c8e\": rpc error: code = NotFound desc = could not find container \"f1024bead8638fdc2067a700f701c8aec937fe95f344f02a58a6153377872c8e\": container with ID starting with f1024bead8638fdc2067a700f701c8aec937fe95f344f02a58a6153377872c8e not found: ID does not exist" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.439920 4898 scope.go:117] "RemoveContainer" containerID="046f62ad233bcc6087ca907e01b7e7dad28f8073971dd32bda95b8c2d2d58d3f" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.440047 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"046f62ad233bcc6087ca907e01b7e7dad28f8073971dd32bda95b8c2d2d58d3f"} err="failed to get container status \"046f62ad233bcc6087ca907e01b7e7dad28f8073971dd32bda95b8c2d2d58d3f\": rpc error: code = NotFound desc = could not find container \"046f62ad233bcc6087ca907e01b7e7dad28f8073971dd32bda95b8c2d2d58d3f\": container with ID starting with 046f62ad233bcc6087ca907e01b7e7dad28f8073971dd32bda95b8c2d2d58d3f not found: ID does not exist" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.442431 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15a748d6-879c-48aa-99d0-b4a02dcfb640-run-httpd\") pod \"ceilometer-0\" (UID: \"15a748d6-879c-48aa-99d0-b4a02dcfb640\") " pod="openstack/ceilometer-0" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.442473 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15a748d6-879c-48aa-99d0-b4a02dcfb640-scripts\") pod \"ceilometer-0\" (UID: \"15a748d6-879c-48aa-99d0-b4a02dcfb640\") " pod="openstack/ceilometer-0" Mar 13 14:23:51 crc 
kubenswrapper[4898]: I0313 14:23:51.442513 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15a748d6-879c-48aa-99d0-b4a02dcfb640-log-httpd\") pod \"ceilometer-0\" (UID: \"15a748d6-879c-48aa-99d0-b4a02dcfb640\") " pod="openstack/ceilometer-0" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.442572 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15a748d6-879c-48aa-99d0-b4a02dcfb640-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"15a748d6-879c-48aa-99d0-b4a02dcfb640\") " pod="openstack/ceilometer-0" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.442592 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzkqk\" (UniqueName: \"kubernetes.io/projected/15a748d6-879c-48aa-99d0-b4a02dcfb640-kube-api-access-xzkqk\") pod \"ceilometer-0\" (UID: \"15a748d6-879c-48aa-99d0-b4a02dcfb640\") " pod="openstack/ceilometer-0" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.442679 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/15a748d6-879c-48aa-99d0-b4a02dcfb640-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"15a748d6-879c-48aa-99d0-b4a02dcfb640\") " pod="openstack/ceilometer-0" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.442710 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15a748d6-879c-48aa-99d0-b4a02dcfb640-config-data\") pod \"ceilometer-0\" (UID: \"15a748d6-879c-48aa-99d0-b4a02dcfb640\") " pod="openstack/ceilometer-0" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.545410 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-xzkqk\" (UniqueName: \"kubernetes.io/projected/15a748d6-879c-48aa-99d0-b4a02dcfb640-kube-api-access-xzkqk\") pod \"ceilometer-0\" (UID: \"15a748d6-879c-48aa-99d0-b4a02dcfb640\") " pod="openstack/ceilometer-0" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.545926 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/15a748d6-879c-48aa-99d0-b4a02dcfb640-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"15a748d6-879c-48aa-99d0-b4a02dcfb640\") " pod="openstack/ceilometer-0" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.546094 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15a748d6-879c-48aa-99d0-b4a02dcfb640-config-data\") pod \"ceilometer-0\" (UID: \"15a748d6-879c-48aa-99d0-b4a02dcfb640\") " pod="openstack/ceilometer-0" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.546263 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15a748d6-879c-48aa-99d0-b4a02dcfb640-run-httpd\") pod \"ceilometer-0\" (UID: \"15a748d6-879c-48aa-99d0-b4a02dcfb640\") " pod="openstack/ceilometer-0" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.546369 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15a748d6-879c-48aa-99d0-b4a02dcfb640-scripts\") pod \"ceilometer-0\" (UID: \"15a748d6-879c-48aa-99d0-b4a02dcfb640\") " pod="openstack/ceilometer-0" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.546510 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15a748d6-879c-48aa-99d0-b4a02dcfb640-log-httpd\") pod \"ceilometer-0\" (UID: \"15a748d6-879c-48aa-99d0-b4a02dcfb640\") " pod="openstack/ceilometer-0" Mar 13 14:23:51 crc 
kubenswrapper[4898]: I0313 14:23:51.546683 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15a748d6-879c-48aa-99d0-b4a02dcfb640-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"15a748d6-879c-48aa-99d0-b4a02dcfb640\") " pod="openstack/ceilometer-0" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.547500 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15a748d6-879c-48aa-99d0-b4a02dcfb640-run-httpd\") pod \"ceilometer-0\" (UID: \"15a748d6-879c-48aa-99d0-b4a02dcfb640\") " pod="openstack/ceilometer-0" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.548518 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15a748d6-879c-48aa-99d0-b4a02dcfb640-log-httpd\") pod \"ceilometer-0\" (UID: \"15a748d6-879c-48aa-99d0-b4a02dcfb640\") " pod="openstack/ceilometer-0" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.551935 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15a748d6-879c-48aa-99d0-b4a02dcfb640-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"15a748d6-879c-48aa-99d0-b4a02dcfb640\") " pod="openstack/ceilometer-0" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.552518 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15a748d6-879c-48aa-99d0-b4a02dcfb640-scripts\") pod \"ceilometer-0\" (UID: \"15a748d6-879c-48aa-99d0-b4a02dcfb640\") " pod="openstack/ceilometer-0" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.553005 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15a748d6-879c-48aa-99d0-b4a02dcfb640-config-data\") pod \"ceilometer-0\" (UID: 
\"15a748d6-879c-48aa-99d0-b4a02dcfb640\") " pod="openstack/ceilometer-0" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.553281 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/15a748d6-879c-48aa-99d0-b4a02dcfb640-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"15a748d6-879c-48aa-99d0-b4a02dcfb640\") " pod="openstack/ceilometer-0" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.570518 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzkqk\" (UniqueName: \"kubernetes.io/projected/15a748d6-879c-48aa-99d0-b4a02dcfb640-kube-api-access-xzkqk\") pod \"ceilometer-0\" (UID: \"15a748d6-879c-48aa-99d0-b4a02dcfb640\") " pod="openstack/ceilometer-0" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.713119 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.753122 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fee0030e-ceb6-41ff-97b2-6302e2bed961" path="/var/lib/kubelet/pods/fee0030e-ceb6-41ff-97b2-6302e2bed961/volumes" Mar 13 14:23:52 crc kubenswrapper[4898]: I0313 14:23:52.232819 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:23:52 crc kubenswrapper[4898]: I0313 14:23:52.267149 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"15a748d6-879c-48aa-99d0-b4a02dcfb640","Type":"ContainerStarted","Data":"3c338bdcaa4d57b4416d9ffc2b822eddf6cb4b810f647a605061d6a63e5367e7"} Mar 13 14:23:52 crc kubenswrapper[4898]: I0313 14:23:52.334623 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:23:52 crc kubenswrapper[4898]: I0313 14:23:52.791337 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-rr9kw" Mar 13 14:23:52 crc kubenswrapper[4898]: I0313 14:23:52.986138 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ed6478d-e1a3-4587-813f-222e6c4e54d7-scripts\") pod \"6ed6478d-e1a3-4587-813f-222e6c4e54d7\" (UID: \"6ed6478d-e1a3-4587-813f-222e6c4e54d7\") " Mar 13 14:23:52 crc kubenswrapper[4898]: I0313 14:23:52.986891 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ed6478d-e1a3-4587-813f-222e6c4e54d7-config-data\") pod \"6ed6478d-e1a3-4587-813f-222e6c4e54d7\" (UID: \"6ed6478d-e1a3-4587-813f-222e6c4e54d7\") " Mar 13 14:23:52 crc kubenswrapper[4898]: I0313 14:23:52.987160 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ed6478d-e1a3-4587-813f-222e6c4e54d7-combined-ca-bundle\") pod \"6ed6478d-e1a3-4587-813f-222e6c4e54d7\" (UID: \"6ed6478d-e1a3-4587-813f-222e6c4e54d7\") " Mar 13 14:23:52 crc kubenswrapper[4898]: I0313 14:23:52.987226 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9tdv8\" (UniqueName: \"kubernetes.io/projected/6ed6478d-e1a3-4587-813f-222e6c4e54d7-kube-api-access-9tdv8\") pod \"6ed6478d-e1a3-4587-813f-222e6c4e54d7\" (UID: \"6ed6478d-e1a3-4587-813f-222e6c4e54d7\") " Mar 13 14:23:52 crc kubenswrapper[4898]: I0313 14:23:52.991772 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ed6478d-e1a3-4587-813f-222e6c4e54d7-scripts" (OuterVolumeSpecName: "scripts") pod "6ed6478d-e1a3-4587-813f-222e6c4e54d7" (UID: "6ed6478d-e1a3-4587-813f-222e6c4e54d7"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:23:52 crc kubenswrapper[4898]: I0313 14:23:52.992084 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ed6478d-e1a3-4587-813f-222e6c4e54d7-kube-api-access-9tdv8" (OuterVolumeSpecName: "kube-api-access-9tdv8") pod "6ed6478d-e1a3-4587-813f-222e6c4e54d7" (UID: "6ed6478d-e1a3-4587-813f-222e6c4e54d7"). InnerVolumeSpecName "kube-api-access-9tdv8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:23:53 crc kubenswrapper[4898]: I0313 14:23:53.020252 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ed6478d-e1a3-4587-813f-222e6c4e54d7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6ed6478d-e1a3-4587-813f-222e6c4e54d7" (UID: "6ed6478d-e1a3-4587-813f-222e6c4e54d7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:23:53 crc kubenswrapper[4898]: I0313 14:23:53.036673 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ed6478d-e1a3-4587-813f-222e6c4e54d7-config-data" (OuterVolumeSpecName: "config-data") pod "6ed6478d-e1a3-4587-813f-222e6c4e54d7" (UID: "6ed6478d-e1a3-4587-813f-222e6c4e54d7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:23:53 crc kubenswrapper[4898]: I0313 14:23:53.090423 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ed6478d-e1a3-4587-813f-222e6c4e54d7-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:53 crc kubenswrapper[4898]: I0313 14:23:53.090460 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ed6478d-e1a3-4587-813f-222e6c4e54d7-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:53 crc kubenswrapper[4898]: I0313 14:23:53.090471 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ed6478d-e1a3-4587-813f-222e6c4e54d7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:53 crc kubenswrapper[4898]: I0313 14:23:53.090482 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9tdv8\" (UniqueName: \"kubernetes.io/projected/6ed6478d-e1a3-4587-813f-222e6c4e54d7-kube-api-access-9tdv8\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:53 crc kubenswrapper[4898]: I0313 14:23:53.281247 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"15a748d6-879c-48aa-99d0-b4a02dcfb640","Type":"ContainerStarted","Data":"8580340e4db9b90d9c91bc7b25a0fe1542d88b2764af7feeb78e17278f5ad813"} Mar 13 14:23:53 crc kubenswrapper[4898]: I0313 14:23:53.283118 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-rr9kw" event={"ID":"6ed6478d-e1a3-4587-813f-222e6c4e54d7","Type":"ContainerDied","Data":"28400347fb0bff37d2a3d6816574065a1398407aac258da24130e69501fc45b8"} Mar 13 14:23:53 crc kubenswrapper[4898]: I0313 14:23:53.283161 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="28400347fb0bff37d2a3d6816574065a1398407aac258da24130e69501fc45b8" Mar 13 14:23:53 crc kubenswrapper[4898]: I0313 14:23:53.283166 
4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-rr9kw" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.212060 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.296133 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"15a748d6-879c-48aa-99d0-b4a02dcfb640","Type":"ContainerStarted","Data":"d3f13a9a6e980be00fa109701dade3412bbacdebaa6ebaabd90e68e043d3f2de"} Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.298260 4898 generic.go:334] "Generic (PLEG): container finished" podID="91d28474-f268-4ecf-96b7-5a5007e715c3" containerID="1ac26ed6febdb8a025711b7fb70ecf717c1b07973758b6080092505756042b5d" exitCode=0 Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.298304 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"91d28474-f268-4ecf-96b7-5a5007e715c3","Type":"ContainerDied","Data":"1ac26ed6febdb8a025711b7fb70ecf717c1b07973758b6080092505756042b5d"} Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.298332 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"91d28474-f268-4ecf-96b7-5a5007e715c3","Type":"ContainerDied","Data":"d9d2e156d0dffe3d5895df3f9ced875368d41389028a4139a826d6dfa5cc7fb1"} Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.298348 4898 scope.go:117] "RemoveContainer" containerID="1ac26ed6febdb8a025711b7fb70ecf717c1b07973758b6080092505756042b5d" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.298506 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.318709 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91d28474-f268-4ecf-96b7-5a5007e715c3-config-data\") pod \"91d28474-f268-4ecf-96b7-5a5007e715c3\" (UID: \"91d28474-f268-4ecf-96b7-5a5007e715c3\") " Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.319027 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-chxlk\" (UniqueName: \"kubernetes.io/projected/91d28474-f268-4ecf-96b7-5a5007e715c3-kube-api-access-chxlk\") pod \"91d28474-f268-4ecf-96b7-5a5007e715c3\" (UID: \"91d28474-f268-4ecf-96b7-5a5007e715c3\") " Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.319068 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91d28474-f268-4ecf-96b7-5a5007e715c3-logs\") pod \"91d28474-f268-4ecf-96b7-5a5007e715c3\" (UID: \"91d28474-f268-4ecf-96b7-5a5007e715c3\") " Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.319188 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91d28474-f268-4ecf-96b7-5a5007e715c3-combined-ca-bundle\") pod \"91d28474-f268-4ecf-96b7-5a5007e715c3\" (UID: \"91d28474-f268-4ecf-96b7-5a5007e715c3\") " Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.320290 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91d28474-f268-4ecf-96b7-5a5007e715c3-logs" (OuterVolumeSpecName: "logs") pod "91d28474-f268-4ecf-96b7-5a5007e715c3" (UID: "91d28474-f268-4ecf-96b7-5a5007e715c3"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.325933 4898 scope.go:117] "RemoveContainer" containerID="534def4acabc310f4b2d065b3a04508595fe368f3532f51136e25ae88c3877e4" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.336797 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91d28474-f268-4ecf-96b7-5a5007e715c3-kube-api-access-chxlk" (OuterVolumeSpecName: "kube-api-access-chxlk") pod "91d28474-f268-4ecf-96b7-5a5007e715c3" (UID: "91d28474-f268-4ecf-96b7-5a5007e715c3"). InnerVolumeSpecName "kube-api-access-chxlk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.356090 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91d28474-f268-4ecf-96b7-5a5007e715c3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "91d28474-f268-4ecf-96b7-5a5007e715c3" (UID: "91d28474-f268-4ecf-96b7-5a5007e715c3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.371874 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91d28474-f268-4ecf-96b7-5a5007e715c3-config-data" (OuterVolumeSpecName: "config-data") pod "91d28474-f268-4ecf-96b7-5a5007e715c3" (UID: "91d28474-f268-4ecf-96b7-5a5007e715c3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.423832 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91d28474-f268-4ecf-96b7-5a5007e715c3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.423866 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91d28474-f268-4ecf-96b7-5a5007e715c3-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.423879 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-chxlk\" (UniqueName: \"kubernetes.io/projected/91d28474-f268-4ecf-96b7-5a5007e715c3-kube-api-access-chxlk\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.423891 4898 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91d28474-f268-4ecf-96b7-5a5007e715c3-logs\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.466937 4898 scope.go:117] "RemoveContainer" containerID="1ac26ed6febdb8a025711b7fb70ecf717c1b07973758b6080092505756042b5d" Mar 13 14:23:54 crc kubenswrapper[4898]: E0313 14:23:54.467411 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ac26ed6febdb8a025711b7fb70ecf717c1b07973758b6080092505756042b5d\": container with ID starting with 1ac26ed6febdb8a025711b7fb70ecf717c1b07973758b6080092505756042b5d not found: ID does not exist" containerID="1ac26ed6febdb8a025711b7fb70ecf717c1b07973758b6080092505756042b5d" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.467449 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ac26ed6febdb8a025711b7fb70ecf717c1b07973758b6080092505756042b5d"} 
err="failed to get container status \"1ac26ed6febdb8a025711b7fb70ecf717c1b07973758b6080092505756042b5d\": rpc error: code = NotFound desc = could not find container \"1ac26ed6febdb8a025711b7fb70ecf717c1b07973758b6080092505756042b5d\": container with ID starting with 1ac26ed6febdb8a025711b7fb70ecf717c1b07973758b6080092505756042b5d not found: ID does not exist" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.467472 4898 scope.go:117] "RemoveContainer" containerID="534def4acabc310f4b2d065b3a04508595fe368f3532f51136e25ae88c3877e4" Mar 13 14:23:54 crc kubenswrapper[4898]: E0313 14:23:54.467825 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"534def4acabc310f4b2d065b3a04508595fe368f3532f51136e25ae88c3877e4\": container with ID starting with 534def4acabc310f4b2d065b3a04508595fe368f3532f51136e25ae88c3877e4 not found: ID does not exist" containerID="534def4acabc310f4b2d065b3a04508595fe368f3532f51136e25ae88c3877e4" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.467874 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"534def4acabc310f4b2d065b3a04508595fe368f3532f51136e25ae88c3877e4"} err="failed to get container status \"534def4acabc310f4b2d065b3a04508595fe368f3532f51136e25ae88c3877e4\": rpc error: code = NotFound desc = could not find container \"534def4acabc310f4b2d065b3a04508595fe368f3532f51136e25ae88c3877e4\": container with ID starting with 534def4acabc310f4b2d065b3a04508595fe368f3532f51136e25ae88c3877e4 not found: ID does not exist" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.518272 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.558127 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 13 14:23:54 crc kubenswrapper[4898]: 
I0313 14:23:54.649100 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.678331 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.687027 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 13 14:23:54 crc kubenswrapper[4898]: E0313 14:23:54.687576 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91d28474-f268-4ecf-96b7-5a5007e715c3" containerName="nova-api-api" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.687613 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="91d28474-f268-4ecf-96b7-5a5007e715c3" containerName="nova-api-api" Mar 13 14:23:54 crc kubenswrapper[4898]: E0313 14:23:54.687645 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ed6478d-e1a3-4587-813f-222e6c4e54d7" containerName="aodh-db-sync" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.687651 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ed6478d-e1a3-4587-813f-222e6c4e54d7" containerName="aodh-db-sync" Mar 13 14:23:54 crc kubenswrapper[4898]: E0313 14:23:54.687661 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91d28474-f268-4ecf-96b7-5a5007e715c3" containerName="nova-api-log" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.687669 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="91d28474-f268-4ecf-96b7-5a5007e715c3" containerName="nova-api-log" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.688042 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="91d28474-f268-4ecf-96b7-5a5007e715c3" containerName="nova-api-log" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.688078 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="91d28474-f268-4ecf-96b7-5a5007e715c3" containerName="nova-api-api" Mar 13 14:23:54 crc 
kubenswrapper[4898]: I0313 14:23:54.688092 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ed6478d-e1a3-4587-813f-222e6c4e54d7" containerName="aodh-db-sync" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.689452 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.694066 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.694220 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.697409 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.697621 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.737608 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee-internal-tls-certs\") pod \"nova-api-0\" (UID: \"9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee\") " pod="openstack/nova-api-0" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.737707 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee\") " pod="openstack/nova-api-0" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.737734 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thsgw\" (UniqueName: 
\"kubernetes.io/projected/9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee-kube-api-access-thsgw\") pod \"nova-api-0\" (UID: \"9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee\") " pod="openstack/nova-api-0" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.737784 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee-logs\") pod \"nova-api-0\" (UID: \"9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee\") " pod="openstack/nova-api-0" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.737822 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee-config-data\") pod \"nova-api-0\" (UID: \"9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee\") " pod="openstack/nova-api-0" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.738114 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee-public-tls-certs\") pod \"nova-api-0\" (UID: \"9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee\") " pod="openstack/nova-api-0" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.840723 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee-internal-tls-certs\") pod \"nova-api-0\" (UID: \"9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee\") " pod="openstack/nova-api-0" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.840831 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee\") " pod="openstack/nova-api-0" Mar 13 
14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.840854 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thsgw\" (UniqueName: \"kubernetes.io/projected/9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee-kube-api-access-thsgw\") pod \"nova-api-0\" (UID: \"9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee\") " pod="openstack/nova-api-0" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.840965 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee-logs\") pod \"nova-api-0\" (UID: \"9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee\") " pod="openstack/nova-api-0" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.841004 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee-config-data\") pod \"nova-api-0\" (UID: \"9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee\") " pod="openstack/nova-api-0" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.841069 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee-public-tls-certs\") pod \"nova-api-0\" (UID: \"9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee\") " pod="openstack/nova-api-0" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.841751 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee-logs\") pod \"nova-api-0\" (UID: \"9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee\") " pod="openstack/nova-api-0" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.846835 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee-config-data\") pod \"nova-api-0\" (UID: 
\"9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee\") " pod="openstack/nova-api-0" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.847664 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee-internal-tls-certs\") pod \"nova-api-0\" (UID: \"9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee\") " pod="openstack/nova-api-0" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.851133 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee\") " pod="openstack/nova-api-0" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.853351 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee-public-tls-certs\") pod \"nova-api-0\" (UID: \"9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee\") " pod="openstack/nova-api-0" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.864115 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thsgw\" (UniqueName: \"kubernetes.io/projected/9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee-kube-api-access-thsgw\") pod \"nova-api-0\" (UID: \"9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee\") " pod="openstack/nova-api-0" Mar 13 14:23:55 crc kubenswrapper[4898]: I0313 14:23:55.014404 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 13 14:23:55 crc kubenswrapper[4898]: I0313 14:23:55.312054 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"15a748d6-879c-48aa-99d0-b4a02dcfb640","Type":"ContainerStarted","Data":"ebded2a2ffeeed33539cde9ae30e2e99c622582f6a03504795768afaee07448b"} Mar 13 14:23:55 crc kubenswrapper[4898]: I0313 14:23:55.331235 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 13 14:23:55 crc kubenswrapper[4898]: I0313 14:23:55.572028 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 13 14:23:55 crc kubenswrapper[4898]: I0313 14:23:55.615857 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-tqzkv"] Mar 13 14:23:55 crc kubenswrapper[4898]: I0313 14:23:55.627540 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-tqzkv" Mar 13 14:23:55 crc kubenswrapper[4898]: I0313 14:23:55.629779 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 13 14:23:55 crc kubenswrapper[4898]: I0313 14:23:55.630618 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 13 14:23:55 crc kubenswrapper[4898]: I0313 14:23:55.656198 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-tqzkv"] Mar 13 14:23:55 crc kubenswrapper[4898]: I0313 14:23:55.668022 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phtst\" (UniqueName: \"kubernetes.io/projected/bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53-kube-api-access-phtst\") pod \"nova-cell1-cell-mapping-tqzkv\" (UID: \"bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53\") " pod="openstack/nova-cell1-cell-mapping-tqzkv" Mar 13 14:23:55 crc kubenswrapper[4898]: 
I0313 14:23:55.668391 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53-config-data\") pod \"nova-cell1-cell-mapping-tqzkv\" (UID: \"bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53\") " pod="openstack/nova-cell1-cell-mapping-tqzkv" Mar 13 14:23:55 crc kubenswrapper[4898]: I0313 14:23:55.668558 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53-scripts\") pod \"nova-cell1-cell-mapping-tqzkv\" (UID: \"bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53\") " pod="openstack/nova-cell1-cell-mapping-tqzkv" Mar 13 14:23:55 crc kubenswrapper[4898]: I0313 14:23:55.668655 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-tqzkv\" (UID: \"bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53\") " pod="openstack/nova-cell1-cell-mapping-tqzkv" Mar 13 14:23:55 crc kubenswrapper[4898]: I0313 14:23:55.761189 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91d28474-f268-4ecf-96b7-5a5007e715c3" path="/var/lib/kubelet/pods/91d28474-f268-4ecf-96b7-5a5007e715c3/volumes" Mar 13 14:23:55 crc kubenswrapper[4898]: I0313 14:23:55.770648 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53-config-data\") pod \"nova-cell1-cell-mapping-tqzkv\" (UID: \"bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53\") " pod="openstack/nova-cell1-cell-mapping-tqzkv" Mar 13 14:23:55 crc kubenswrapper[4898]: I0313 14:23:55.770758 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53-scripts\") pod \"nova-cell1-cell-mapping-tqzkv\" (UID: \"bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53\") " pod="openstack/nova-cell1-cell-mapping-tqzkv" Mar 13 14:23:55 crc kubenswrapper[4898]: I0313 14:23:55.770791 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-tqzkv\" (UID: \"bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53\") " pod="openstack/nova-cell1-cell-mapping-tqzkv" Mar 13 14:23:55 crc kubenswrapper[4898]: I0313 14:23:55.770979 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phtst\" (UniqueName: \"kubernetes.io/projected/bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53-kube-api-access-phtst\") pod \"nova-cell1-cell-mapping-tqzkv\" (UID: \"bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53\") " pod="openstack/nova-cell1-cell-mapping-tqzkv" Mar 13 14:23:55 crc kubenswrapper[4898]: I0313 14:23:55.780015 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-tqzkv\" (UID: \"bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53\") " pod="openstack/nova-cell1-cell-mapping-tqzkv" Mar 13 14:23:55 crc kubenswrapper[4898]: I0313 14:23:55.783662 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53-config-data\") pod \"nova-cell1-cell-mapping-tqzkv\" (UID: \"bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53\") " pod="openstack/nova-cell1-cell-mapping-tqzkv" Mar 13 14:23:55 crc kubenswrapper[4898]: I0313 14:23:55.783748 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53-scripts\") 
pod \"nova-cell1-cell-mapping-tqzkv\" (UID: \"bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53\") " pod="openstack/nova-cell1-cell-mapping-tqzkv" Mar 13 14:23:55 crc kubenswrapper[4898]: I0313 14:23:55.793540 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phtst\" (UniqueName: \"kubernetes.io/projected/bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53-kube-api-access-phtst\") pod \"nova-cell1-cell-mapping-tqzkv\" (UID: \"bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53\") " pod="openstack/nova-cell1-cell-mapping-tqzkv" Mar 13 14:23:55 crc kubenswrapper[4898]: I0313 14:23:55.953516 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-tqzkv" Mar 13 14:23:56 crc kubenswrapper[4898]: I0313 14:23:56.328469 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee","Type":"ContainerStarted","Data":"7f380ed1e03850f9a5390c93de71ece12783543d722324b0cebc7392d71d1cf4"} Mar 13 14:23:56 crc kubenswrapper[4898]: I0313 14:23:56.328701 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee","Type":"ContainerStarted","Data":"61b5d14b4c28a00305c933cc1ba3f69720704610d85741eff661575a32229c94"} Mar 13 14:23:56 crc kubenswrapper[4898]: I0313 14:23:56.328712 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee","Type":"ContainerStarted","Data":"e928948e8aff5ca3e9c4f8ba788b16341b26756ed4665c742a244420ba53b0dc"} Mar 13 14:23:56 crc kubenswrapper[4898]: I0313 14:23:56.359107 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.359089339 podStartE2EDuration="2.359089339s" podCreationTimestamp="2026-03-13 14:23:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-13 14:23:56.353550155 +0000 UTC m=+1671.355138394" watchObservedRunningTime="2026-03-13 14:23:56.359089339 +0000 UTC m=+1671.360677578" Mar 13 14:23:56 crc kubenswrapper[4898]: W0313 14:23:56.482338 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbbb3eb7a_2c0d_42d3_9d61_b3ae21863f53.slice/crio-0634c5ce6c0d169237c2c0e4d4768a8bd12dada932f968be1cb2685e61f49409 WatchSource:0}: Error finding container 0634c5ce6c0d169237c2c0e4d4768a8bd12dada932f968be1cb2685e61f49409: Status 404 returned error can't find the container with id 0634c5ce6c0d169237c2c0e4d4768a8bd12dada932f968be1cb2685e61f49409 Mar 13 14:23:56 crc kubenswrapper[4898]: I0313 14:23:56.489130 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-tqzkv"] Mar 13 14:23:57 crc kubenswrapper[4898]: I0313 14:23:57.362934 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"15a748d6-879c-48aa-99d0-b4a02dcfb640","Type":"ContainerStarted","Data":"b50ed3e3e658824fabe90b9ab722bb9a4f5508339b4f447ce2f6f6c4d2c6d70b"} Mar 13 14:23:57 crc kubenswrapper[4898]: I0313 14:23:57.363482 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="15a748d6-879c-48aa-99d0-b4a02dcfb640" containerName="ceilometer-central-agent" containerID="cri-o://8580340e4db9b90d9c91bc7b25a0fe1542d88b2764af7feeb78e17278f5ad813" gracePeriod=30 Mar 13 14:23:57 crc kubenswrapper[4898]: I0313 14:23:57.363647 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 13 14:23:57 crc kubenswrapper[4898]: I0313 14:23:57.363782 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="15a748d6-879c-48aa-99d0-b4a02dcfb640" containerName="proxy-httpd" 
containerID="cri-o://b50ed3e3e658824fabe90b9ab722bb9a4f5508339b4f447ce2f6f6c4d2c6d70b" gracePeriod=30 Mar 13 14:23:57 crc kubenswrapper[4898]: I0313 14:23:57.363853 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="15a748d6-879c-48aa-99d0-b4a02dcfb640" containerName="sg-core" containerID="cri-o://ebded2a2ffeeed33539cde9ae30e2e99c622582f6a03504795768afaee07448b" gracePeriod=30 Mar 13 14:23:57 crc kubenswrapper[4898]: I0313 14:23:57.363921 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="15a748d6-879c-48aa-99d0-b4a02dcfb640" containerName="ceilometer-notification-agent" containerID="cri-o://d3f13a9a6e980be00fa109701dade3412bbacdebaa6ebaabd90e68e043d3f2de" gracePeriod=30 Mar 13 14:23:57 crc kubenswrapper[4898]: I0313 14:23:57.367198 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Mar 13 14:23:57 crc kubenswrapper[4898]: I0313 14:23:57.370871 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Mar 13 14:23:57 crc kubenswrapper[4898]: I0313 14:23:57.374514 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-tnpwg" Mar 13 14:23:57 crc kubenswrapper[4898]: I0313 14:23:57.374849 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Mar 13 14:23:57 crc kubenswrapper[4898]: I0313 14:23:57.378309 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Mar 13 14:23:57 crc kubenswrapper[4898]: I0313 14:23:57.383453 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-tqzkv" event={"ID":"bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53","Type":"ContainerStarted","Data":"390b3ed23857cd84f527a8c8b365a228b6c0b2caebb2767f64baba810ca56690"} Mar 13 14:23:57 crc kubenswrapper[4898]: I0313 14:23:57.384676 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-tqzkv" event={"ID":"bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53","Type":"ContainerStarted","Data":"0634c5ce6c0d169237c2c0e4d4768a8bd12dada932f968be1cb2685e61f49409"} Mar 13 14:23:57 crc kubenswrapper[4898]: I0313 14:23:57.413751 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 13 14:23:57 crc kubenswrapper[4898]: I0313 14:23:57.415742 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a036241-2013-494e-8c1f-7584e9af2bf4-scripts\") pod \"aodh-0\" (UID: \"3a036241-2013-494e-8c1f-7584e9af2bf4\") " pod="openstack/aodh-0" Mar 13 14:23:57 crc kubenswrapper[4898]: I0313 14:23:57.415838 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a036241-2013-494e-8c1f-7584e9af2bf4-combined-ca-bundle\") pod \"aodh-0\" (UID: 
\"3a036241-2013-494e-8c1f-7584e9af2bf4\") " pod="openstack/aodh-0" Mar 13 14:23:57 crc kubenswrapper[4898]: I0313 14:23:57.415873 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a036241-2013-494e-8c1f-7584e9af2bf4-config-data\") pod \"aodh-0\" (UID: \"3a036241-2013-494e-8c1f-7584e9af2bf4\") " pod="openstack/aodh-0" Mar 13 14:23:57 crc kubenswrapper[4898]: I0313 14:23:57.416213 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sw6dd\" (UniqueName: \"kubernetes.io/projected/3a036241-2013-494e-8c1f-7584e9af2bf4-kube-api-access-sw6dd\") pod \"aodh-0\" (UID: \"3a036241-2013-494e-8c1f-7584e9af2bf4\") " pod="openstack/aodh-0" Mar 13 14:23:57 crc kubenswrapper[4898]: I0313 14:23:57.432209 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.603656377 podStartE2EDuration="6.432184025s" podCreationTimestamp="2026-03-13 14:23:51 +0000 UTC" firstStartedPulling="2026-03-13 14:23:52.251113657 +0000 UTC m=+1667.252701916" lastFinishedPulling="2026-03-13 14:23:56.079641325 +0000 UTC m=+1671.081229564" observedRunningTime="2026-03-13 14:23:57.407281759 +0000 UTC m=+1672.408869998" watchObservedRunningTime="2026-03-13 14:23:57.432184025 +0000 UTC m=+1672.433772264" Mar 13 14:23:57 crc kubenswrapper[4898]: I0313 14:23:57.507290 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-tqzkv" podStartSLOduration=2.507248014 podStartE2EDuration="2.507248014s" podCreationTimestamp="2026-03-13 14:23:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:23:57.482110511 +0000 UTC m=+1672.483698760" watchObservedRunningTime="2026-03-13 14:23:57.507248014 +0000 UTC m=+1672.508836263" Mar 13 14:23:57 
crc kubenswrapper[4898]: I0313 14:23:57.518674 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sw6dd\" (UniqueName: \"kubernetes.io/projected/3a036241-2013-494e-8c1f-7584e9af2bf4-kube-api-access-sw6dd\") pod \"aodh-0\" (UID: \"3a036241-2013-494e-8c1f-7584e9af2bf4\") " pod="openstack/aodh-0" Mar 13 14:23:57 crc kubenswrapper[4898]: I0313 14:23:57.520454 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a036241-2013-494e-8c1f-7584e9af2bf4-scripts\") pod \"aodh-0\" (UID: \"3a036241-2013-494e-8c1f-7584e9af2bf4\") " pod="openstack/aodh-0" Mar 13 14:23:57 crc kubenswrapper[4898]: I0313 14:23:57.520586 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a036241-2013-494e-8c1f-7584e9af2bf4-combined-ca-bundle\") pod \"aodh-0\" (UID: \"3a036241-2013-494e-8c1f-7584e9af2bf4\") " pod="openstack/aodh-0" Mar 13 14:23:57 crc kubenswrapper[4898]: I0313 14:23:57.520669 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a036241-2013-494e-8c1f-7584e9af2bf4-config-data\") pod \"aodh-0\" (UID: \"3a036241-2013-494e-8c1f-7584e9af2bf4\") " pod="openstack/aodh-0" Mar 13 14:23:57 crc kubenswrapper[4898]: I0313 14:23:57.527937 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a036241-2013-494e-8c1f-7584e9af2bf4-config-data\") pod \"aodh-0\" (UID: \"3a036241-2013-494e-8c1f-7584e9af2bf4\") " pod="openstack/aodh-0" Mar 13 14:23:57 crc kubenswrapper[4898]: I0313 14:23:57.531404 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a036241-2013-494e-8c1f-7584e9af2bf4-scripts\") pod \"aodh-0\" (UID: \"3a036241-2013-494e-8c1f-7584e9af2bf4\") " 
pod="openstack/aodh-0" Mar 13 14:23:57 crc kubenswrapper[4898]: I0313 14:23:57.532341 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a036241-2013-494e-8c1f-7584e9af2bf4-combined-ca-bundle\") pod \"aodh-0\" (UID: \"3a036241-2013-494e-8c1f-7584e9af2bf4\") " pod="openstack/aodh-0" Mar 13 14:23:57 crc kubenswrapper[4898]: I0313 14:23:57.541947 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sw6dd\" (UniqueName: \"kubernetes.io/projected/3a036241-2013-494e-8c1f-7584e9af2bf4-kube-api-access-sw6dd\") pod \"aodh-0\" (UID: \"3a036241-2013-494e-8c1f-7584e9af2bf4\") " pod="openstack/aodh-0" Mar 13 14:23:57 crc kubenswrapper[4898]: I0313 14:23:57.647089 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b7bbf7cf9-bsswg" Mar 13 14:23:57 crc kubenswrapper[4898]: I0313 14:23:57.720951 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Mar 13 14:23:57 crc kubenswrapper[4898]: I0313 14:23:57.832719 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-4rc67"] Mar 13 14:23:57 crc kubenswrapper[4898]: I0313 14:23:57.832977 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-9b86998b5-4rc67" podUID="ee82d4ec-b565-40b8-b878-2574487d7e9d" containerName="dnsmasq-dns" containerID="cri-o://7d83563b664dd524f060763bd5deadd7009b47fedbc88f53844517f4c00a64ea" gracePeriod=10 Mar 13 14:23:58 crc kubenswrapper[4898]: I0313 14:23:58.398218 4898 generic.go:334] "Generic (PLEG): container finished" podID="15a748d6-879c-48aa-99d0-b4a02dcfb640" containerID="b50ed3e3e658824fabe90b9ab722bb9a4f5508339b4f447ce2f6f6c4d2c6d70b" exitCode=0 Mar 13 14:23:58 crc kubenswrapper[4898]: I0313 14:23:58.398698 4898 generic.go:334] "Generic (PLEG): container finished" podID="15a748d6-879c-48aa-99d0-b4a02dcfb640" containerID="ebded2a2ffeeed33539cde9ae30e2e99c622582f6a03504795768afaee07448b" exitCode=2 Mar 13 14:23:58 crc kubenswrapper[4898]: I0313 14:23:58.398708 4898 generic.go:334] "Generic (PLEG): container finished" podID="15a748d6-879c-48aa-99d0-b4a02dcfb640" containerID="d3f13a9a6e980be00fa109701dade3412bbacdebaa6ebaabd90e68e043d3f2de" exitCode=0 Mar 13 14:23:58 crc kubenswrapper[4898]: I0313 14:23:58.398710 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"15a748d6-879c-48aa-99d0-b4a02dcfb640","Type":"ContainerDied","Data":"b50ed3e3e658824fabe90b9ab722bb9a4f5508339b4f447ce2f6f6c4d2c6d70b"} Mar 13 14:23:58 crc kubenswrapper[4898]: I0313 14:23:58.398764 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"15a748d6-879c-48aa-99d0-b4a02dcfb640","Type":"ContainerDied","Data":"ebded2a2ffeeed33539cde9ae30e2e99c622582f6a03504795768afaee07448b"} Mar 13 14:23:58 crc kubenswrapper[4898]: I0313 14:23:58.398779 4898 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"15a748d6-879c-48aa-99d0-b4a02dcfb640","Type":"ContainerDied","Data":"d3f13a9a6e980be00fa109701dade3412bbacdebaa6ebaabd90e68e043d3f2de"} Mar 13 14:23:58 crc kubenswrapper[4898]: I0313 14:23:58.401422 4898 generic.go:334] "Generic (PLEG): container finished" podID="ee82d4ec-b565-40b8-b878-2574487d7e9d" containerID="7d83563b664dd524f060763bd5deadd7009b47fedbc88f53844517f4c00a64ea" exitCode=0 Mar 13 14:23:58 crc kubenswrapper[4898]: I0313 14:23:58.401969 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-4rc67" event={"ID":"ee82d4ec-b565-40b8-b878-2574487d7e9d","Type":"ContainerDied","Data":"7d83563b664dd524f060763bd5deadd7009b47fedbc88f53844517f4c00a64ea"} Mar 13 14:23:58 crc kubenswrapper[4898]: I0313 14:23:58.401997 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-4rc67" event={"ID":"ee82d4ec-b565-40b8-b878-2574487d7e9d","Type":"ContainerDied","Data":"7a6f36f785b386fbe1a7acf818da5c2bcd85dcd2bb3cbfac1d54dd77d4a14425"} Mar 13 14:23:58 crc kubenswrapper[4898]: I0313 14:23:58.402011 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a6f36f785b386fbe1a7acf818da5c2bcd85dcd2bb3cbfac1d54dd77d4a14425" Mar 13 14:23:58 crc kubenswrapper[4898]: I0313 14:23:58.503948 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 13 14:23:58 crc kubenswrapper[4898]: I0313 14:23:58.570165 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9b86998b5-4rc67" Mar 13 14:23:58 crc kubenswrapper[4898]: I0313 14:23:58.666704 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee82d4ec-b565-40b8-b878-2574487d7e9d-config\") pod \"ee82d4ec-b565-40b8-b878-2574487d7e9d\" (UID: \"ee82d4ec-b565-40b8-b878-2574487d7e9d\") " Mar 13 14:23:58 crc kubenswrapper[4898]: I0313 14:23:58.666801 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqwkf\" (UniqueName: \"kubernetes.io/projected/ee82d4ec-b565-40b8-b878-2574487d7e9d-kube-api-access-rqwkf\") pod \"ee82d4ec-b565-40b8-b878-2574487d7e9d\" (UID: \"ee82d4ec-b565-40b8-b878-2574487d7e9d\") " Mar 13 14:23:58 crc kubenswrapper[4898]: I0313 14:23:58.667029 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ee82d4ec-b565-40b8-b878-2574487d7e9d-dns-swift-storage-0\") pod \"ee82d4ec-b565-40b8-b878-2574487d7e9d\" (UID: \"ee82d4ec-b565-40b8-b878-2574487d7e9d\") " Mar 13 14:23:58 crc kubenswrapper[4898]: I0313 14:23:58.667058 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee82d4ec-b565-40b8-b878-2574487d7e9d-ovsdbserver-nb\") pod \"ee82d4ec-b565-40b8-b878-2574487d7e9d\" (UID: \"ee82d4ec-b565-40b8-b878-2574487d7e9d\") " Mar 13 14:23:58 crc kubenswrapper[4898]: I0313 14:23:58.667219 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee82d4ec-b565-40b8-b878-2574487d7e9d-dns-svc\") pod \"ee82d4ec-b565-40b8-b878-2574487d7e9d\" (UID: \"ee82d4ec-b565-40b8-b878-2574487d7e9d\") " Mar 13 14:23:58 crc kubenswrapper[4898]: I0313 14:23:58.667256 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/ee82d4ec-b565-40b8-b878-2574487d7e9d-ovsdbserver-sb\") pod \"ee82d4ec-b565-40b8-b878-2574487d7e9d\" (UID: \"ee82d4ec-b565-40b8-b878-2574487d7e9d\") " Mar 13 14:23:58 crc kubenswrapper[4898]: I0313 14:23:58.678942 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee82d4ec-b565-40b8-b878-2574487d7e9d-kube-api-access-rqwkf" (OuterVolumeSpecName: "kube-api-access-rqwkf") pod "ee82d4ec-b565-40b8-b878-2574487d7e9d" (UID: "ee82d4ec-b565-40b8-b878-2574487d7e9d"). InnerVolumeSpecName "kube-api-access-rqwkf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:23:58 crc kubenswrapper[4898]: I0313 14:23:58.741747 4898 scope.go:117] "RemoveContainer" containerID="31ae8593afc62c01205d22940ebe7136407da3a6c051010917ba949bb52866cc" Mar 13 14:23:58 crc kubenswrapper[4898]: E0313 14:23:58.742065 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:23:58 crc kubenswrapper[4898]: I0313 14:23:58.748003 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee82d4ec-b565-40b8-b878-2574487d7e9d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ee82d4ec-b565-40b8-b878-2574487d7e9d" (UID: "ee82d4ec-b565-40b8-b878-2574487d7e9d"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:23:58 crc kubenswrapper[4898]: I0313 14:23:58.756196 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee82d4ec-b565-40b8-b878-2574487d7e9d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ee82d4ec-b565-40b8-b878-2574487d7e9d" (UID: "ee82d4ec-b565-40b8-b878-2574487d7e9d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:23:58 crc kubenswrapper[4898]: I0313 14:23:58.770358 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee82d4ec-b565-40b8-b878-2574487d7e9d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ee82d4ec-b565-40b8-b878-2574487d7e9d" (UID: "ee82d4ec-b565-40b8-b878-2574487d7e9d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:23:58 crc kubenswrapper[4898]: I0313 14:23:58.775216 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqwkf\" (UniqueName: \"kubernetes.io/projected/ee82d4ec-b565-40b8-b878-2574487d7e9d-kube-api-access-rqwkf\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:58 crc kubenswrapper[4898]: I0313 14:23:58.775247 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee82d4ec-b565-40b8-b878-2574487d7e9d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:58 crc kubenswrapper[4898]: I0313 14:23:58.775257 4898 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee82d4ec-b565-40b8-b878-2574487d7e9d-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:58 crc kubenswrapper[4898]: I0313 14:23:58.775266 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee82d4ec-b565-40b8-b878-2574487d7e9d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 
14:23:58 crc kubenswrapper[4898]: I0313 14:23:58.792940 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee82d4ec-b565-40b8-b878-2574487d7e9d-config" (OuterVolumeSpecName: "config") pod "ee82d4ec-b565-40b8-b878-2574487d7e9d" (UID: "ee82d4ec-b565-40b8-b878-2574487d7e9d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:23:58 crc kubenswrapper[4898]: I0313 14:23:58.835475 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee82d4ec-b565-40b8-b878-2574487d7e9d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ee82d4ec-b565-40b8-b878-2574487d7e9d" (UID: "ee82d4ec-b565-40b8-b878-2574487d7e9d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:23:58 crc kubenswrapper[4898]: I0313 14:23:58.878513 4898 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ee82d4ec-b565-40b8-b878-2574487d7e9d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:58 crc kubenswrapper[4898]: I0313 14:23:58.878560 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee82d4ec-b565-40b8-b878-2574487d7e9d-config\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:59 crc kubenswrapper[4898]: I0313 14:23:59.413028 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9b86998b5-4rc67" Mar 13 14:23:59 crc kubenswrapper[4898]: I0313 14:23:59.413035 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"3a036241-2013-494e-8c1f-7584e9af2bf4","Type":"ContainerStarted","Data":"1e2c142eba973e7412047a391872c9d25e5735c5b05576796ed64cb74c786bb5"} Mar 13 14:23:59 crc kubenswrapper[4898]: I0313 14:23:59.413482 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"3a036241-2013-494e-8c1f-7584e9af2bf4","Type":"ContainerStarted","Data":"ded3b65c989e6a6e858ee713ff395a11604658cb153b63189d68172abd0b0293"} Mar 13 14:23:59 crc kubenswrapper[4898]: I0313 14:23:59.459149 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-4rc67"] Mar 13 14:23:59 crc kubenswrapper[4898]: I0313 14:23:59.478531 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-4rc67"] Mar 13 14:23:59 crc kubenswrapper[4898]: I0313 14:23:59.754390 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee82d4ec-b565-40b8-b878-2574487d7e9d" path="/var/lib/kubelet/pods/ee82d4ec-b565-40b8-b878-2574487d7e9d/volumes" Mar 13 14:24:00 crc kubenswrapper[4898]: I0313 14:24:00.134987 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556864-bvpnd"] Mar 13 14:24:00 crc kubenswrapper[4898]: E0313 14:24:00.136454 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee82d4ec-b565-40b8-b878-2574487d7e9d" containerName="init" Mar 13 14:24:00 crc kubenswrapper[4898]: I0313 14:24:00.136482 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee82d4ec-b565-40b8-b878-2574487d7e9d" containerName="init" Mar 13 14:24:00 crc kubenswrapper[4898]: E0313 14:24:00.136514 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee82d4ec-b565-40b8-b878-2574487d7e9d" containerName="dnsmasq-dns" Mar 13 14:24:00 crc 
kubenswrapper[4898]: I0313 14:24:00.136522 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee82d4ec-b565-40b8-b878-2574487d7e9d" containerName="dnsmasq-dns" Mar 13 14:24:00 crc kubenswrapper[4898]: I0313 14:24:00.136811 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee82d4ec-b565-40b8-b878-2574487d7e9d" containerName="dnsmasq-dns" Mar 13 14:24:00 crc kubenswrapper[4898]: I0313 14:24:00.137724 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556864-bvpnd" Mar 13 14:24:00 crc kubenswrapper[4898]: I0313 14:24:00.142161 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 14:24:00 crc kubenswrapper[4898]: I0313 14:24:00.142195 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps" Mar 13 14:24:00 crc kubenswrapper[4898]: I0313 14:24:00.142259 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 14:24:00 crc kubenswrapper[4898]: I0313 14:24:00.154691 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556864-bvpnd"] Mar 13 14:24:00 crc kubenswrapper[4898]: I0313 14:24:00.208311 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59tc9\" (UniqueName: \"kubernetes.io/projected/4e5f381c-bbd8-40d9-8c76-efee5fb7023a-kube-api-access-59tc9\") pod \"auto-csr-approver-29556864-bvpnd\" (UID: \"4e5f381c-bbd8-40d9-8c76-efee5fb7023a\") " pod="openshift-infra/auto-csr-approver-29556864-bvpnd" Mar 13 14:24:00 crc kubenswrapper[4898]: I0313 14:24:00.309879 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59tc9\" (UniqueName: \"kubernetes.io/projected/4e5f381c-bbd8-40d9-8c76-efee5fb7023a-kube-api-access-59tc9\") pod 
\"auto-csr-approver-29556864-bvpnd\" (UID: \"4e5f381c-bbd8-40d9-8c76-efee5fb7023a\") " pod="openshift-infra/auto-csr-approver-29556864-bvpnd" Mar 13 14:24:00 crc kubenswrapper[4898]: I0313 14:24:00.345203 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59tc9\" (UniqueName: \"kubernetes.io/projected/4e5f381c-bbd8-40d9-8c76-efee5fb7023a-kube-api-access-59tc9\") pod \"auto-csr-approver-29556864-bvpnd\" (UID: \"4e5f381c-bbd8-40d9-8c76-efee5fb7023a\") " pod="openshift-infra/auto-csr-approver-29556864-bvpnd" Mar 13 14:24:00 crc kubenswrapper[4898]: I0313 14:24:00.426276 4898 generic.go:334] "Generic (PLEG): container finished" podID="15a748d6-879c-48aa-99d0-b4a02dcfb640" containerID="8580340e4db9b90d9c91bc7b25a0fe1542d88b2764af7feeb78e17278f5ad813" exitCode=0 Mar 13 14:24:00 crc kubenswrapper[4898]: I0313 14:24:00.426312 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"15a748d6-879c-48aa-99d0-b4a02dcfb640","Type":"ContainerDied","Data":"8580340e4db9b90d9c91bc7b25a0fe1542d88b2764af7feeb78e17278f5ad813"} Mar 13 14:24:00 crc kubenswrapper[4898]: I0313 14:24:00.473641 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556864-bvpnd" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.093986 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.230971 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15a748d6-879c-48aa-99d0-b4a02dcfb640-combined-ca-bundle\") pod \"15a748d6-879c-48aa-99d0-b4a02dcfb640\" (UID: \"15a748d6-879c-48aa-99d0-b4a02dcfb640\") " Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.231063 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzkqk\" (UniqueName: \"kubernetes.io/projected/15a748d6-879c-48aa-99d0-b4a02dcfb640-kube-api-access-xzkqk\") pod \"15a748d6-879c-48aa-99d0-b4a02dcfb640\" (UID: \"15a748d6-879c-48aa-99d0-b4a02dcfb640\") " Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.231103 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/15a748d6-879c-48aa-99d0-b4a02dcfb640-sg-core-conf-yaml\") pod \"15a748d6-879c-48aa-99d0-b4a02dcfb640\" (UID: \"15a748d6-879c-48aa-99d0-b4a02dcfb640\") " Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.231844 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15a748d6-879c-48aa-99d0-b4a02dcfb640-config-data\") pod \"15a748d6-879c-48aa-99d0-b4a02dcfb640\" (UID: \"15a748d6-879c-48aa-99d0-b4a02dcfb640\") " Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.232016 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15a748d6-879c-48aa-99d0-b4a02dcfb640-scripts\") pod \"15a748d6-879c-48aa-99d0-b4a02dcfb640\" (UID: \"15a748d6-879c-48aa-99d0-b4a02dcfb640\") " Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.232253 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/15a748d6-879c-48aa-99d0-b4a02dcfb640-run-httpd\") pod \"15a748d6-879c-48aa-99d0-b4a02dcfb640\" (UID: \"15a748d6-879c-48aa-99d0-b4a02dcfb640\") " Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.232421 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15a748d6-879c-48aa-99d0-b4a02dcfb640-log-httpd\") pod \"15a748d6-879c-48aa-99d0-b4a02dcfb640\" (UID: \"15a748d6-879c-48aa-99d0-b4a02dcfb640\") " Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.233700 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15a748d6-879c-48aa-99d0-b4a02dcfb640-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "15a748d6-879c-48aa-99d0-b4a02dcfb640" (UID: "15a748d6-879c-48aa-99d0-b4a02dcfb640"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.235138 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15a748d6-879c-48aa-99d0-b4a02dcfb640-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "15a748d6-879c-48aa-99d0-b4a02dcfb640" (UID: "15a748d6-879c-48aa-99d0-b4a02dcfb640"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.235292 4898 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15a748d6-879c-48aa-99d0-b4a02dcfb640-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.235313 4898 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15a748d6-879c-48aa-99d0-b4a02dcfb640-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.245557 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15a748d6-879c-48aa-99d0-b4a02dcfb640-scripts" (OuterVolumeSpecName: "scripts") pod "15a748d6-879c-48aa-99d0-b4a02dcfb640" (UID: "15a748d6-879c-48aa-99d0-b4a02dcfb640"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.245753 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15a748d6-879c-48aa-99d0-b4a02dcfb640-kube-api-access-xzkqk" (OuterVolumeSpecName: "kube-api-access-xzkqk") pod "15a748d6-879c-48aa-99d0-b4a02dcfb640" (UID: "15a748d6-879c-48aa-99d0-b4a02dcfb640"). InnerVolumeSpecName "kube-api-access-xzkqk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.294132 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15a748d6-879c-48aa-99d0-b4a02dcfb640-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "15a748d6-879c-48aa-99d0-b4a02dcfb640" (UID: "15a748d6-879c-48aa-99d0-b4a02dcfb640"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.303603 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556864-bvpnd"] Mar 13 14:24:01 crc kubenswrapper[4898]: W0313 14:24:01.304361 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4e5f381c_bbd8_40d9_8c76_efee5fb7023a.slice/crio-568aace14eba72f5295e81bc2141ff4ab6bf2d47b3914a6e46161ad7ea2b9751 WatchSource:0}: Error finding container 568aace14eba72f5295e81bc2141ff4ab6bf2d47b3914a6e46161ad7ea2b9751: Status 404 returned error can't find the container with id 568aace14eba72f5295e81bc2141ff4ab6bf2d47b3914a6e46161ad7ea2b9751 Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.337920 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xzkqk\" (UniqueName: \"kubernetes.io/projected/15a748d6-879c-48aa-99d0-b4a02dcfb640-kube-api-access-xzkqk\") on node \"crc\" DevicePath \"\"" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.337952 4898 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/15a748d6-879c-48aa-99d0-b4a02dcfb640-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.337960 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15a748d6-879c-48aa-99d0-b4a02dcfb640-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.349999 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15a748d6-879c-48aa-99d0-b4a02dcfb640-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "15a748d6-879c-48aa-99d0-b4a02dcfb640" (UID: "15a748d6-879c-48aa-99d0-b4a02dcfb640"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.409412 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15a748d6-879c-48aa-99d0-b4a02dcfb640-config-data" (OuterVolumeSpecName: "config-data") pod "15a748d6-879c-48aa-99d0-b4a02dcfb640" (UID: "15a748d6-879c-48aa-99d0-b4a02dcfb640"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.411115 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-plhgx"] Mar 13 14:24:01 crc kubenswrapper[4898]: E0313 14:24:01.411719 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15a748d6-879c-48aa-99d0-b4a02dcfb640" containerName="sg-core" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.411742 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="15a748d6-879c-48aa-99d0-b4a02dcfb640" containerName="sg-core" Mar 13 14:24:01 crc kubenswrapper[4898]: E0313 14:24:01.411761 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15a748d6-879c-48aa-99d0-b4a02dcfb640" containerName="proxy-httpd" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.411768 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="15a748d6-879c-48aa-99d0-b4a02dcfb640" containerName="proxy-httpd" Mar 13 14:24:01 crc kubenswrapper[4898]: E0313 14:24:01.411804 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15a748d6-879c-48aa-99d0-b4a02dcfb640" containerName="ceilometer-notification-agent" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.411810 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="15a748d6-879c-48aa-99d0-b4a02dcfb640" containerName="ceilometer-notification-agent" Mar 13 14:24:01 crc kubenswrapper[4898]: E0313 14:24:01.411822 4898 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="15a748d6-879c-48aa-99d0-b4a02dcfb640" containerName="ceilometer-central-agent" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.411828 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="15a748d6-879c-48aa-99d0-b4a02dcfb640" containerName="ceilometer-central-agent" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.417420 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="15a748d6-879c-48aa-99d0-b4a02dcfb640" containerName="proxy-httpd" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.417454 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="15a748d6-879c-48aa-99d0-b4a02dcfb640" containerName="ceilometer-central-agent" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.417468 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="15a748d6-879c-48aa-99d0-b4a02dcfb640" containerName="sg-core" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.417493 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="15a748d6-879c-48aa-99d0-b4a02dcfb640" containerName="ceilometer-notification-agent" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.419592 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-plhgx" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.423765 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-plhgx"] Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.444310 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15a748d6-879c-48aa-99d0-b4a02dcfb640-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.444347 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15a748d6-879c-48aa-99d0-b4a02dcfb640-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.478882 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"3a036241-2013-494e-8c1f-7584e9af2bf4","Type":"ContainerStarted","Data":"9263a833a783d5b9000e728c8a69913160176a9e9c946c16d7fa08425ffcc556"} Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.495552 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"15a748d6-879c-48aa-99d0-b4a02dcfb640","Type":"ContainerDied","Data":"3c338bdcaa4d57b4416d9ffc2b822eddf6cb4b810f647a605061d6a63e5367e7"} Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.495616 4898 scope.go:117] "RemoveContainer" containerID="b50ed3e3e658824fabe90b9ab722bb9a4f5508339b4f447ce2f6f6c4d2c6d70b" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.495820 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.506423 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556864-bvpnd" event={"ID":"4e5f381c-bbd8-40d9-8c76-efee5fb7023a","Type":"ContainerStarted","Data":"568aace14eba72f5295e81bc2141ff4ab6bf2d47b3914a6e46161ad7ea2b9751"} Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.528165 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.543091 4898 scope.go:117] "RemoveContainer" containerID="ebded2a2ffeeed33539cde9ae30e2e99c622582f6a03504795768afaee07448b" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.546449 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6945l\" (UniqueName: \"kubernetes.io/projected/c75348dc-b6ff-43ff-bd9a-d84c91f23ea8-kube-api-access-6945l\") pod \"certified-operators-plhgx\" (UID: \"c75348dc-b6ff-43ff-bd9a-d84c91f23ea8\") " pod="openshift-marketplace/certified-operators-plhgx" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.546668 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c75348dc-b6ff-43ff-bd9a-d84c91f23ea8-utilities\") pod \"certified-operators-plhgx\" (UID: \"c75348dc-b6ff-43ff-bd9a-d84c91f23ea8\") " pod="openshift-marketplace/certified-operators-plhgx" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.546723 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c75348dc-b6ff-43ff-bd9a-d84c91f23ea8-catalog-content\") pod \"certified-operators-plhgx\" (UID: \"c75348dc-b6ff-43ff-bd9a-d84c91f23ea8\") " pod="openshift-marketplace/certified-operators-plhgx" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 
14:24:01.569131 4898 scope.go:117] "RemoveContainer" containerID="d3f13a9a6e980be00fa109701dade3412bbacdebaa6ebaabd90e68e043d3f2de" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.579965 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.591954 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.604165 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.606518 4898 scope.go:117] "RemoveContainer" containerID="8580340e4db9b90d9c91bc7b25a0fe1542d88b2764af7feeb78e17278f5ad813" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.608117 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.611726 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.612117 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.616324 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.648393 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c75348dc-b6ff-43ff-bd9a-d84c91f23ea8-utilities\") pod \"certified-operators-plhgx\" (UID: \"c75348dc-b6ff-43ff-bd9a-d84c91f23ea8\") " pod="openshift-marketplace/certified-operators-plhgx" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.648463 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/c75348dc-b6ff-43ff-bd9a-d84c91f23ea8-catalog-content\") pod \"certified-operators-plhgx\" (UID: \"c75348dc-b6ff-43ff-bd9a-d84c91f23ea8\") " pod="openshift-marketplace/certified-operators-plhgx" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.648551 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6945l\" (UniqueName: \"kubernetes.io/projected/c75348dc-b6ff-43ff-bd9a-d84c91f23ea8-kube-api-access-6945l\") pod \"certified-operators-plhgx\" (UID: \"c75348dc-b6ff-43ff-bd9a-d84c91f23ea8\") " pod="openshift-marketplace/certified-operators-plhgx" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.649425 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c75348dc-b6ff-43ff-bd9a-d84c91f23ea8-utilities\") pod \"certified-operators-plhgx\" (UID: \"c75348dc-b6ff-43ff-bd9a-d84c91f23ea8\") " pod="openshift-marketplace/certified-operators-plhgx" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.649488 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c75348dc-b6ff-43ff-bd9a-d84c91f23ea8-catalog-content\") pod \"certified-operators-plhgx\" (UID: \"c75348dc-b6ff-43ff-bd9a-d84c91f23ea8\") " pod="openshift-marketplace/certified-operators-plhgx" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.704767 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6945l\" (UniqueName: \"kubernetes.io/projected/c75348dc-b6ff-43ff-bd9a-d84c91f23ea8-kube-api-access-6945l\") pod \"certified-operators-plhgx\" (UID: \"c75348dc-b6ff-43ff-bd9a-d84c91f23ea8\") " pod="openshift-marketplace/certified-operators-plhgx" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.750176 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8cbv\" (UniqueName: 
\"kubernetes.io/projected/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-kube-api-access-r8cbv\") pod \"ceilometer-0\" (UID: \"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7\") " pod="openstack/ceilometer-0" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.750369 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-config-data\") pod \"ceilometer-0\" (UID: \"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7\") " pod="openstack/ceilometer-0" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.750439 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-scripts\") pod \"ceilometer-0\" (UID: \"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7\") " pod="openstack/ceilometer-0" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.750483 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-log-httpd\") pod \"ceilometer-0\" (UID: \"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7\") " pod="openstack/ceilometer-0" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.750646 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-run-httpd\") pod \"ceilometer-0\" (UID: \"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7\") " pod="openstack/ceilometer-0" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.751102 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7\") " 
pod="openstack/ceilometer-0" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.751285 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7\") " pod="openstack/ceilometer-0" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.762978 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15a748d6-879c-48aa-99d0-b4a02dcfb640" path="/var/lib/kubelet/pods/15a748d6-879c-48aa-99d0-b4a02dcfb640/volumes" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.798916 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-plhgx" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.852879 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7\") " pod="openstack/ceilometer-0" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.852953 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7\") " pod="openstack/ceilometer-0" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.853016 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8cbv\" (UniqueName: \"kubernetes.io/projected/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-kube-api-access-r8cbv\") pod \"ceilometer-0\" (UID: \"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7\") " pod="openstack/ceilometer-0" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 
14:24:01.853078 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-config-data\") pod \"ceilometer-0\" (UID: \"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7\") " pod="openstack/ceilometer-0" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.853112 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-scripts\") pod \"ceilometer-0\" (UID: \"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7\") " pod="openstack/ceilometer-0" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.853138 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-log-httpd\") pod \"ceilometer-0\" (UID: \"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7\") " pod="openstack/ceilometer-0" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.853188 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-run-httpd\") pod \"ceilometer-0\" (UID: \"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7\") " pod="openstack/ceilometer-0" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.854155 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-run-httpd\") pod \"ceilometer-0\" (UID: \"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7\") " pod="openstack/ceilometer-0" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.854173 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-log-httpd\") pod \"ceilometer-0\" (UID: \"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7\") " 
pod="openstack/ceilometer-0" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.857616 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7\") " pod="openstack/ceilometer-0" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.859686 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-config-data\") pod \"ceilometer-0\" (UID: \"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7\") " pod="openstack/ceilometer-0" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.860790 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7\") " pod="openstack/ceilometer-0" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.865100 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-scripts\") pod \"ceilometer-0\" (UID: \"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7\") " pod="openstack/ceilometer-0" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.874205 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8cbv\" (UniqueName: \"kubernetes.io/projected/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-kube-api-access-r8cbv\") pod \"ceilometer-0\" (UID: \"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7\") " pod="openstack/ceilometer-0" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.929604 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:24:02 crc kubenswrapper[4898]: I0313 14:24:02.381306 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-plhgx"] Mar 13 14:24:02 crc kubenswrapper[4898]: I0313 14:24:02.484873 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:24:02 crc kubenswrapper[4898]: W0313 14:24:02.642035 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ef06426_d2da_4ad2_8168_1ca91c9ca2a7.slice/crio-14d2db4ac83fec75f96ff83117a0a7f835b4698b6e0a2c106a080e228485900f WatchSource:0}: Error finding container 14d2db4ac83fec75f96ff83117a0a7f835b4698b6e0a2c106a080e228485900f: Status 404 returned error can't find the container with id 14d2db4ac83fec75f96ff83117a0a7f835b4698b6e0a2c106a080e228485900f Mar 13 14:24:02 crc kubenswrapper[4898]: W0313 14:24:02.642574 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc75348dc_b6ff_43ff_bd9a_d84c91f23ea8.slice/crio-357efb5d307770e6ac4560f42dea2e3bc32a6f8f90350563a876139972493b91 WatchSource:0}: Error finding container 357efb5d307770e6ac4560f42dea2e3bc32a6f8f90350563a876139972493b91: Status 404 returned error can't find the container with id 357efb5d307770e6ac4560f42dea2e3bc32a6f8f90350563a876139972493b91 Mar 13 14:24:03 crc kubenswrapper[4898]: I0313 14:24:03.536757 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"3a036241-2013-494e-8c1f-7584e9af2bf4","Type":"ContainerStarted","Data":"e1faed09b83d1750f543cd522c4a6bcbb14737623c5d254b367266c190bdbe2c"} Mar 13 14:24:03 crc kubenswrapper[4898]: I0313 14:24:03.539866 4898 generic.go:334] "Generic (PLEG): container finished" podID="bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53" containerID="390b3ed23857cd84f527a8c8b365a228b6c0b2caebb2767f64baba810ca56690" 
exitCode=0 Mar 13 14:24:03 crc kubenswrapper[4898]: I0313 14:24:03.539934 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-tqzkv" event={"ID":"bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53","Type":"ContainerDied","Data":"390b3ed23857cd84f527a8c8b365a228b6c0b2caebb2767f64baba810ca56690"} Mar 13 14:24:03 crc kubenswrapper[4898]: I0313 14:24:03.546168 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7","Type":"ContainerStarted","Data":"66438918ebccde6ac554af32fd8660905c8d96c674c33aa3ef4ebeb984883811"} Mar 13 14:24:03 crc kubenswrapper[4898]: I0313 14:24:03.546217 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7","Type":"ContainerStarted","Data":"14d2db4ac83fec75f96ff83117a0a7f835b4698b6e0a2c106a080e228485900f"} Mar 13 14:24:03 crc kubenswrapper[4898]: I0313 14:24:03.551107 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556864-bvpnd" event={"ID":"4e5f381c-bbd8-40d9-8c76-efee5fb7023a","Type":"ContainerStarted","Data":"4839c26bbb3360becfb71db51ead56738498d1508d530ccce036f032e975f9b4"} Mar 13 14:24:03 crc kubenswrapper[4898]: I0313 14:24:03.556929 4898 generic.go:334] "Generic (PLEG): container finished" podID="c75348dc-b6ff-43ff-bd9a-d84c91f23ea8" containerID="327e55d62108c057fe7c12b2ee047d6a3b7e188b390c1e80ec12227125ce316c" exitCode=0 Mar 13 14:24:03 crc kubenswrapper[4898]: I0313 14:24:03.556972 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-plhgx" event={"ID":"c75348dc-b6ff-43ff-bd9a-d84c91f23ea8","Type":"ContainerDied","Data":"327e55d62108c057fe7c12b2ee047d6a3b7e188b390c1e80ec12227125ce316c"} Mar 13 14:24:03 crc kubenswrapper[4898]: I0313 14:24:03.557018 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-plhgx" 
event={"ID":"c75348dc-b6ff-43ff-bd9a-d84c91f23ea8","Type":"ContainerStarted","Data":"357efb5d307770e6ac4560f42dea2e3bc32a6f8f90350563a876139972493b91"} Mar 13 14:24:03 crc kubenswrapper[4898]: I0313 14:24:03.613674 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556864-bvpnd" podStartSLOduration=2.157100702 podStartE2EDuration="3.613651934s" podCreationTimestamp="2026-03-13 14:24:00 +0000 UTC" firstStartedPulling="2026-03-13 14:24:01.307448645 +0000 UTC m=+1676.309036874" lastFinishedPulling="2026-03-13 14:24:02.763999877 +0000 UTC m=+1677.765588106" observedRunningTime="2026-03-13 14:24:03.605000489 +0000 UTC m=+1678.606588748" watchObservedRunningTime="2026-03-13 14:24:03.613651934 +0000 UTC m=+1678.615240173" Mar 13 14:24:03 crc kubenswrapper[4898]: I0313 14:24:03.702641 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:24:04 crc kubenswrapper[4898]: I0313 14:24:04.579074 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7","Type":"ContainerStarted","Data":"7cfa2af5e418e74781b5abeafa5d792bceacbddbc2bc25e1e25fd94e21acc493"} Mar 13 14:24:05 crc kubenswrapper[4898]: I0313 14:24:05.015524 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 13 14:24:05 crc kubenswrapper[4898]: I0313 14:24:05.015993 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 13 14:24:05 crc kubenswrapper[4898]: I0313 14:24:05.222517 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-tqzkv" Mar 13 14:24:05 crc kubenswrapper[4898]: I0313 14:24:05.277793 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phtst\" (UniqueName: \"kubernetes.io/projected/bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53-kube-api-access-phtst\") pod \"bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53\" (UID: \"bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53\") " Mar 13 14:24:05 crc kubenswrapper[4898]: I0313 14:24:05.278018 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53-combined-ca-bundle\") pod \"bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53\" (UID: \"bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53\") " Mar 13 14:24:05 crc kubenswrapper[4898]: I0313 14:24:05.278074 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53-config-data\") pod \"bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53\" (UID: \"bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53\") " Mar 13 14:24:05 crc kubenswrapper[4898]: I0313 14:24:05.278095 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53-scripts\") pod \"bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53\" (UID: \"bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53\") " Mar 13 14:24:05 crc kubenswrapper[4898]: I0313 14:24:05.284549 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53-scripts" (OuterVolumeSpecName: "scripts") pod "bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53" (UID: "bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:24:05 crc kubenswrapper[4898]: I0313 14:24:05.308047 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53-kube-api-access-phtst" (OuterVolumeSpecName: "kube-api-access-phtst") pod "bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53" (UID: "bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53"). InnerVolumeSpecName "kube-api-access-phtst". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:24:05 crc kubenswrapper[4898]: I0313 14:24:05.378180 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53" (UID: "bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:24:05 crc kubenswrapper[4898]: I0313 14:24:05.378231 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53-config-data" (OuterVolumeSpecName: "config-data") pod "bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53" (UID: "bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:24:05 crc kubenswrapper[4898]: I0313 14:24:05.380995 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-phtst\" (UniqueName: \"kubernetes.io/projected/bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53-kube-api-access-phtst\") on node \"crc\" DevicePath \"\"" Mar 13 14:24:05 crc kubenswrapper[4898]: I0313 14:24:05.381021 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:24:05 crc kubenswrapper[4898]: I0313 14:24:05.381033 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:24:05 crc kubenswrapper[4898]: I0313 14:24:05.381042 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:24:05 crc kubenswrapper[4898]: I0313 14:24:05.594992 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-tqzkv" event={"ID":"bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53","Type":"ContainerDied","Data":"0634c5ce6c0d169237c2c0e4d4768a8bd12dada932f968be1cb2685e61f49409"} Mar 13 14:24:05 crc kubenswrapper[4898]: I0313 14:24:05.595310 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0634c5ce6c0d169237c2c0e4d4768a8bd12dada932f968be1cb2685e61f49409" Mar 13 14:24:05 crc kubenswrapper[4898]: I0313 14:24:05.595167 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-tqzkv" Mar 13 14:24:05 crc kubenswrapper[4898]: I0313 14:24:05.607037 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7","Type":"ContainerStarted","Data":"1fd92eb6403fb0dbf337c5f7f0d057b61a6f0d6c15c82bc4970d631ddca294d6"} Mar 13 14:24:05 crc kubenswrapper[4898]: I0313 14:24:05.612250 4898 generic.go:334] "Generic (PLEG): container finished" podID="4e5f381c-bbd8-40d9-8c76-efee5fb7023a" containerID="4839c26bbb3360becfb71db51ead56738498d1508d530ccce036f032e975f9b4" exitCode=0 Mar 13 14:24:05 crc kubenswrapper[4898]: I0313 14:24:05.612364 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556864-bvpnd" event={"ID":"4e5f381c-bbd8-40d9-8c76-efee5fb7023a","Type":"ContainerDied","Data":"4839c26bbb3360becfb71db51ead56738498d1508d530ccce036f032e975f9b4"} Mar 13 14:24:05 crc kubenswrapper[4898]: I0313 14:24:05.617273 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-plhgx" event={"ID":"c75348dc-b6ff-43ff-bd9a-d84c91f23ea8","Type":"ContainerStarted","Data":"b4e6b7a97336f9db7d45fac3ccbf9ee16b3f41f5a6729150379834717d5666ca"} Mar 13 14:24:05 crc kubenswrapper[4898]: I0313 14:24:05.621117 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"3a036241-2013-494e-8c1f-7584e9af2bf4","Type":"ContainerStarted","Data":"e2cbd875c5a5e2c192552e4d11fe9c7d29be7ff5b49fb690e14597f7135e2adb"} Mar 13 14:24:05 crc kubenswrapper[4898]: I0313 14:24:05.621417 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="3a036241-2013-494e-8c1f-7584e9af2bf4" containerName="aodh-api" containerID="cri-o://1e2c142eba973e7412047a391872c9d25e5735c5b05576796ed64cb74c786bb5" gracePeriod=30 Mar 13 14:24:05 crc kubenswrapper[4898]: I0313 14:24:05.621592 4898 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/aodh-0" podUID="3a036241-2013-494e-8c1f-7584e9af2bf4" containerName="aodh-listener" containerID="cri-o://e2cbd875c5a5e2c192552e4d11fe9c7d29be7ff5b49fb690e14597f7135e2adb" gracePeriod=30 Mar 13 14:24:05 crc kubenswrapper[4898]: I0313 14:24:05.621735 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="3a036241-2013-494e-8c1f-7584e9af2bf4" containerName="aodh-notifier" containerID="cri-o://e1faed09b83d1750f543cd522c4a6bcbb14737623c5d254b367266c190bdbe2c" gracePeriod=30 Mar 13 14:24:05 crc kubenswrapper[4898]: I0313 14:24:05.622040 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="3a036241-2013-494e-8c1f-7584e9af2bf4" containerName="aodh-evaluator" containerID="cri-o://9263a833a783d5b9000e728c8a69913160176a9e9c946c16d7fa08425ffcc556" gracePeriod=30 Mar 13 14:24:05 crc kubenswrapper[4898]: I0313 14:24:05.683977 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.2397751599999998 podStartE2EDuration="8.683954037s" podCreationTimestamp="2026-03-13 14:23:57 +0000 UTC" firstStartedPulling="2026-03-13 14:23:58.493416325 +0000 UTC m=+1673.495004564" lastFinishedPulling="2026-03-13 14:24:04.937595202 +0000 UTC m=+1679.939183441" observedRunningTime="2026-03-13 14:24:05.671259878 +0000 UTC m=+1680.672848137" watchObservedRunningTime="2026-03-13 14:24:05.683954037 +0000 UTC m=+1680.685542276" Mar 13 14:24:05 crc kubenswrapper[4898]: I0313 14:24:05.875550 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 13 14:24:05 crc kubenswrapper[4898]: I0313 14:24:05.875773 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee" containerName="nova-api-log" 
containerID="cri-o://61b5d14b4c28a00305c933cc1ba3f69720704610d85741eff661575a32229c94" gracePeriod=30 Mar 13 14:24:05 crc kubenswrapper[4898]: I0313 14:24:05.876244 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee" containerName="nova-api-api" containerID="cri-o://7f380ed1e03850f9a5390c93de71ece12783543d722324b0cebc7392d71d1cf4" gracePeriod=30 Mar 13 14:24:05 crc kubenswrapper[4898]: I0313 14:24:05.907359 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 14:24:05 crc kubenswrapper[4898]: I0313 14:24:05.907795 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="9cfb3db3-7d46-4ab5-aecc-00ddd738d359" containerName="nova-scheduler-scheduler" containerID="cri-o://996fc46edbed8602f9eda3a09dc63ac36038779496a63c3ff3ba77a3b3a9e5b0" gracePeriod=30 Mar 13 14:24:05 crc kubenswrapper[4898]: I0313 14:24:05.961971 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 14:24:05 crc kubenswrapper[4898]: I0313 14:24:05.962599 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a8db736b-00b7-4251-a667-3b2138c6c928" containerName="nova-metadata-log" containerID="cri-o://c0fcc6916c9c7951ac6f57b54e64b861fe8be03a65443f0a0008c4f458405d78" gracePeriod=30 Mar 13 14:24:05 crc kubenswrapper[4898]: I0313 14:24:05.963250 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a8db736b-00b7-4251-a667-3b2138c6c928" containerName="nova-metadata-metadata" containerID="cri-o://9bd9f3f02e15571b11f72778527812906d168be88196cc4314aa88f5c276ac6c" gracePeriod=30 Mar 13 14:24:06 crc kubenswrapper[4898]: I0313 14:24:06.046107 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" 
podUID="9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.12:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 14:24:06 crc kubenswrapper[4898]: I0313 14:24:06.046413 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.12:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 14:24:06 crc kubenswrapper[4898]: I0313 14:24:06.632983 4898 generic.go:334] "Generic (PLEG): container finished" podID="a8db736b-00b7-4251-a667-3b2138c6c928" containerID="c0fcc6916c9c7951ac6f57b54e64b861fe8be03a65443f0a0008c4f458405d78" exitCode=143 Mar 13 14:24:06 crc kubenswrapper[4898]: I0313 14:24:06.633065 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a8db736b-00b7-4251-a667-3b2138c6c928","Type":"ContainerDied","Data":"c0fcc6916c9c7951ac6f57b54e64b861fe8be03a65443f0a0008c4f458405d78"} Mar 13 14:24:06 crc kubenswrapper[4898]: I0313 14:24:06.635609 4898 generic.go:334] "Generic (PLEG): container finished" podID="3a036241-2013-494e-8c1f-7584e9af2bf4" containerID="e1faed09b83d1750f543cd522c4a6bcbb14737623c5d254b367266c190bdbe2c" exitCode=0 Mar 13 14:24:06 crc kubenswrapper[4898]: I0313 14:24:06.635630 4898 generic.go:334] "Generic (PLEG): container finished" podID="3a036241-2013-494e-8c1f-7584e9af2bf4" containerID="9263a833a783d5b9000e728c8a69913160176a9e9c946c16d7fa08425ffcc556" exitCode=0 Mar 13 14:24:06 crc kubenswrapper[4898]: I0313 14:24:06.635642 4898 generic.go:334] "Generic (PLEG): container finished" podID="3a036241-2013-494e-8c1f-7584e9af2bf4" containerID="1e2c142eba973e7412047a391872c9d25e5735c5b05576796ed64cb74c786bb5" exitCode=0 Mar 13 14:24:06 crc kubenswrapper[4898]: I0313 14:24:06.635666 4898 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"3a036241-2013-494e-8c1f-7584e9af2bf4","Type":"ContainerDied","Data":"e1faed09b83d1750f543cd522c4a6bcbb14737623c5d254b367266c190bdbe2c"} Mar 13 14:24:06 crc kubenswrapper[4898]: I0313 14:24:06.635735 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"3a036241-2013-494e-8c1f-7584e9af2bf4","Type":"ContainerDied","Data":"9263a833a783d5b9000e728c8a69913160176a9e9c946c16d7fa08425ffcc556"} Mar 13 14:24:06 crc kubenswrapper[4898]: I0313 14:24:06.635750 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"3a036241-2013-494e-8c1f-7584e9af2bf4","Type":"ContainerDied","Data":"1e2c142eba973e7412047a391872c9d25e5735c5b05576796ed64cb74c786bb5"} Mar 13 14:24:06 crc kubenswrapper[4898]: I0313 14:24:06.640342 4898 generic.go:334] "Generic (PLEG): container finished" podID="9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee" containerID="61b5d14b4c28a00305c933cc1ba3f69720704610d85741eff661575a32229c94" exitCode=143 Mar 13 14:24:06 crc kubenswrapper[4898]: I0313 14:24:06.640638 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee","Type":"ContainerDied","Data":"61b5d14b4c28a00305c933cc1ba3f69720704610d85741eff661575a32229c94"} Mar 13 14:24:07 crc kubenswrapper[4898]: I0313 14:24:07.233159 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556864-bvpnd" Mar 13 14:24:07 crc kubenswrapper[4898]: E0313 14:24:07.334106 4898 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="996fc46edbed8602f9eda3a09dc63ac36038779496a63c3ff3ba77a3b3a9e5b0" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 13 14:24:07 crc kubenswrapper[4898]: E0313 14:24:07.335750 4898 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="996fc46edbed8602f9eda3a09dc63ac36038779496a63c3ff3ba77a3b3a9e5b0" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 13 14:24:07 crc kubenswrapper[4898]: I0313 14:24:07.336613 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59tc9\" (UniqueName: \"kubernetes.io/projected/4e5f381c-bbd8-40d9-8c76-efee5fb7023a-kube-api-access-59tc9\") pod \"4e5f381c-bbd8-40d9-8c76-efee5fb7023a\" (UID: \"4e5f381c-bbd8-40d9-8c76-efee5fb7023a\") " Mar 13 14:24:07 crc kubenswrapper[4898]: E0313 14:24:07.339118 4898 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="996fc46edbed8602f9eda3a09dc63ac36038779496a63c3ff3ba77a3b3a9e5b0" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 13 14:24:07 crc kubenswrapper[4898]: E0313 14:24:07.339159 4898 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="9cfb3db3-7d46-4ab5-aecc-00ddd738d359" 
containerName="nova-scheduler-scheduler" Mar 13 14:24:07 crc kubenswrapper[4898]: I0313 14:24:07.341019 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e5f381c-bbd8-40d9-8c76-efee5fb7023a-kube-api-access-59tc9" (OuterVolumeSpecName: "kube-api-access-59tc9") pod "4e5f381c-bbd8-40d9-8c76-efee5fb7023a" (UID: "4e5f381c-bbd8-40d9-8c76-efee5fb7023a"). InnerVolumeSpecName "kube-api-access-59tc9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:24:07 crc kubenswrapper[4898]: I0313 14:24:07.439664 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59tc9\" (UniqueName: \"kubernetes.io/projected/4e5f381c-bbd8-40d9-8c76-efee5fb7023a-kube-api-access-59tc9\") on node \"crc\" DevicePath \"\"" Mar 13 14:24:07 crc kubenswrapper[4898]: I0313 14:24:07.657139 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556864-bvpnd" event={"ID":"4e5f381c-bbd8-40d9-8c76-efee5fb7023a","Type":"ContainerDied","Data":"568aace14eba72f5295e81bc2141ff4ab6bf2d47b3914a6e46161ad7ea2b9751"} Mar 13 14:24:07 crc kubenswrapper[4898]: I0313 14:24:07.657506 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="568aace14eba72f5295e81bc2141ff4ab6bf2d47b3914a6e46161ad7ea2b9751" Mar 13 14:24:07 crc kubenswrapper[4898]: I0313 14:24:07.657388 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556864-bvpnd" Mar 13 14:24:07 crc kubenswrapper[4898]: I0313 14:24:07.660633 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7","Type":"ContainerStarted","Data":"795e7834e563381b54df39b3e37495b74bd94fd75981de1568268bd084ae046d"} Mar 13 14:24:08 crc kubenswrapper[4898]: I0313 14:24:08.242684 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556858-vdbkv"] Mar 13 14:24:08 crc kubenswrapper[4898]: I0313 14:24:08.257144 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556858-vdbkv"] Mar 13 14:24:08 crc kubenswrapper[4898]: I0313 14:24:08.682156 4898 generic.go:334] "Generic (PLEG): container finished" podID="c75348dc-b6ff-43ff-bd9a-d84c91f23ea8" containerID="b4e6b7a97336f9db7d45fac3ccbf9ee16b3f41f5a6729150379834717d5666ca" exitCode=0 Mar 13 14:24:08 crc kubenswrapper[4898]: I0313 14:24:08.682373 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6ef06426-d2da-4ad2-8168-1ca91c9ca2a7" containerName="ceilometer-central-agent" containerID="cri-o://66438918ebccde6ac554af32fd8660905c8d96c674c33aa3ef4ebeb984883811" gracePeriod=30 Mar 13 14:24:08 crc kubenswrapper[4898]: I0313 14:24:08.683010 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-plhgx" event={"ID":"c75348dc-b6ff-43ff-bd9a-d84c91f23ea8","Type":"ContainerDied","Data":"b4e6b7a97336f9db7d45fac3ccbf9ee16b3f41f5a6729150379834717d5666ca"} Mar 13 14:24:08 crc kubenswrapper[4898]: I0313 14:24:08.683143 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6ef06426-d2da-4ad2-8168-1ca91c9ca2a7" containerName="proxy-httpd" containerID="cri-o://795e7834e563381b54df39b3e37495b74bd94fd75981de1568268bd084ae046d" 
gracePeriod=30 Mar 13 14:24:08 crc kubenswrapper[4898]: I0313 14:24:08.683327 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6ef06426-d2da-4ad2-8168-1ca91c9ca2a7" containerName="ceilometer-notification-agent" containerID="cri-o://7cfa2af5e418e74781b5abeafa5d792bceacbddbc2bc25e1e25fd94e21acc493" gracePeriod=30 Mar 13 14:24:08 crc kubenswrapper[4898]: I0313 14:24:08.683673 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 13 14:24:08 crc kubenswrapper[4898]: I0313 14:24:08.683760 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6ef06426-d2da-4ad2-8168-1ca91c9ca2a7" containerName="sg-core" containerID="cri-o://1fd92eb6403fb0dbf337c5f7f0d057b61a6f0d6c15c82bc4970d631ddca294d6" gracePeriod=30 Mar 13 14:24:08 crc kubenswrapper[4898]: I0313 14:24:08.745708 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.386722591 podStartE2EDuration="7.745681948s" podCreationTimestamp="2026-03-13 14:24:01 +0000 UTC" firstStartedPulling="2026-03-13 14:24:02.645048969 +0000 UTC m=+1677.646637218" lastFinishedPulling="2026-03-13 14:24:07.004008336 +0000 UTC m=+1682.005596575" observedRunningTime="2026-03-13 14:24:08.731218073 +0000 UTC m=+1683.732806322" watchObservedRunningTime="2026-03-13 14:24:08.745681948 +0000 UTC m=+1683.747270207" Mar 13 14:24:09 crc kubenswrapper[4898]: I0313 14:24:09.699189 4898 generic.go:334] "Generic (PLEG): container finished" podID="6ef06426-d2da-4ad2-8168-1ca91c9ca2a7" containerID="795e7834e563381b54df39b3e37495b74bd94fd75981de1568268bd084ae046d" exitCode=0 Mar 13 14:24:09 crc kubenswrapper[4898]: I0313 14:24:09.699455 4898 generic.go:334] "Generic (PLEG): container finished" podID="6ef06426-d2da-4ad2-8168-1ca91c9ca2a7" containerID="1fd92eb6403fb0dbf337c5f7f0d057b61a6f0d6c15c82bc4970d631ddca294d6" exitCode=2 
Mar 13 14:24:09 crc kubenswrapper[4898]: I0313 14:24:09.699463 4898 generic.go:334] "Generic (PLEG): container finished" podID="6ef06426-d2da-4ad2-8168-1ca91c9ca2a7" containerID="7cfa2af5e418e74781b5abeafa5d792bceacbddbc2bc25e1e25fd94e21acc493" exitCode=0 Mar 13 14:24:09 crc kubenswrapper[4898]: I0313 14:24:09.699257 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7","Type":"ContainerDied","Data":"795e7834e563381b54df39b3e37495b74bd94fd75981de1568268bd084ae046d"} Mar 13 14:24:09 crc kubenswrapper[4898]: I0313 14:24:09.699530 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7","Type":"ContainerDied","Data":"1fd92eb6403fb0dbf337c5f7f0d057b61a6f0d6c15c82bc4970d631ddca294d6"} Mar 13 14:24:09 crc kubenswrapper[4898]: I0313 14:24:09.699544 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7","Type":"ContainerDied","Data":"7cfa2af5e418e74781b5abeafa5d792bceacbddbc2bc25e1e25fd94e21acc493"} Mar 13 14:24:09 crc kubenswrapper[4898]: I0313 14:24:09.703472 4898 generic.go:334] "Generic (PLEG): container finished" podID="a8db736b-00b7-4251-a667-3b2138c6c928" containerID="9bd9f3f02e15571b11f72778527812906d168be88196cc4314aa88f5c276ac6c" exitCode=0 Mar 13 14:24:09 crc kubenswrapper[4898]: I0313 14:24:09.703516 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a8db736b-00b7-4251-a667-3b2138c6c928","Type":"ContainerDied","Data":"9bd9f3f02e15571b11f72778527812906d168be88196cc4314aa88f5c276ac6c"} Mar 13 14:24:09 crc kubenswrapper[4898]: I0313 14:24:09.758656 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b0610af-1f13-4f43-9249-8d50a0dcbc14" path="/var/lib/kubelet/pods/1b0610af-1f13-4f43-9249-8d50a0dcbc14/volumes" Mar 13 14:24:10 crc kubenswrapper[4898]: I0313 
14:24:10.723493 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-plhgx" event={"ID":"c75348dc-b6ff-43ff-bd9a-d84c91f23ea8","Type":"ContainerStarted","Data":"201337bedc1dace7fd7574e59c5253e0ff92dacc7211debdde1f18e42c9daac1"} Mar 13 14:24:10 crc kubenswrapper[4898]: I0313 14:24:10.728877 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a8db736b-00b7-4251-a667-3b2138c6c928","Type":"ContainerDied","Data":"2f2f835653ef4ac86c3f5f419dc3f2bdfd6ae25a7d99a1468342cb1b296536ca"} Mar 13 14:24:10 crc kubenswrapper[4898]: I0313 14:24:10.728968 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f2f835653ef4ac86c3f5f419dc3f2bdfd6ae25a7d99a1468342cb1b296536ca" Mar 13 14:24:10 crc kubenswrapper[4898]: I0313 14:24:10.757198 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-plhgx" podStartSLOduration=3.610090929 podStartE2EDuration="9.757148084s" podCreationTimestamp="2026-03-13 14:24:01 +0000 UTC" firstStartedPulling="2026-03-13 14:24:03.561206532 +0000 UTC m=+1678.562794771" lastFinishedPulling="2026-03-13 14:24:09.708263687 +0000 UTC m=+1684.709851926" observedRunningTime="2026-03-13 14:24:10.74709579 +0000 UTC m=+1685.748684049" watchObservedRunningTime="2026-03-13 14:24:10.757148084 +0000 UTC m=+1685.758736333" Mar 13 14:24:10 crc kubenswrapper[4898]: I0313 14:24:10.776967 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 14:24:10 crc kubenswrapper[4898]: I0313 14:24:10.832510 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tg4w5\" (UniqueName: \"kubernetes.io/projected/a8db736b-00b7-4251-a667-3b2138c6c928-kube-api-access-tg4w5\") pod \"a8db736b-00b7-4251-a667-3b2138c6c928\" (UID: \"a8db736b-00b7-4251-a667-3b2138c6c928\") " Mar 13 14:24:10 crc kubenswrapper[4898]: I0313 14:24:10.832608 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8db736b-00b7-4251-a667-3b2138c6c928-nova-metadata-tls-certs\") pod \"a8db736b-00b7-4251-a667-3b2138c6c928\" (UID: \"a8db736b-00b7-4251-a667-3b2138c6c928\") " Mar 13 14:24:10 crc kubenswrapper[4898]: I0313 14:24:10.832715 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8db736b-00b7-4251-a667-3b2138c6c928-combined-ca-bundle\") pod \"a8db736b-00b7-4251-a667-3b2138c6c928\" (UID: \"a8db736b-00b7-4251-a667-3b2138c6c928\") " Mar 13 14:24:10 crc kubenswrapper[4898]: I0313 14:24:10.832834 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8db736b-00b7-4251-a667-3b2138c6c928-config-data\") pod \"a8db736b-00b7-4251-a667-3b2138c6c928\" (UID: \"a8db736b-00b7-4251-a667-3b2138c6c928\") " Mar 13 14:24:10 crc kubenswrapper[4898]: I0313 14:24:10.832876 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8db736b-00b7-4251-a667-3b2138c6c928-logs\") pod \"a8db736b-00b7-4251-a667-3b2138c6c928\" (UID: \"a8db736b-00b7-4251-a667-3b2138c6c928\") " Mar 13 14:24:10 crc kubenswrapper[4898]: I0313 14:24:10.838693 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/a8db736b-00b7-4251-a667-3b2138c6c928-logs" (OuterVolumeSpecName: "logs") pod "a8db736b-00b7-4251-a667-3b2138c6c928" (UID: "a8db736b-00b7-4251-a667-3b2138c6c928"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:24:10 crc kubenswrapper[4898]: I0313 14:24:10.846759 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8db736b-00b7-4251-a667-3b2138c6c928-kube-api-access-tg4w5" (OuterVolumeSpecName: "kube-api-access-tg4w5") pod "a8db736b-00b7-4251-a667-3b2138c6c928" (UID: "a8db736b-00b7-4251-a667-3b2138c6c928"). InnerVolumeSpecName "kube-api-access-tg4w5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:24:10 crc kubenswrapper[4898]: I0313 14:24:10.884708 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8db736b-00b7-4251-a667-3b2138c6c928-config-data" (OuterVolumeSpecName: "config-data") pod "a8db736b-00b7-4251-a667-3b2138c6c928" (UID: "a8db736b-00b7-4251-a667-3b2138c6c928"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:24:10 crc kubenswrapper[4898]: I0313 14:24:10.887428 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8db736b-00b7-4251-a667-3b2138c6c928-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a8db736b-00b7-4251-a667-3b2138c6c928" (UID: "a8db736b-00b7-4251-a667-3b2138c6c928"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:24:10 crc kubenswrapper[4898]: I0313 14:24:10.917083 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8db736b-00b7-4251-a667-3b2138c6c928-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "a8db736b-00b7-4251-a667-3b2138c6c928" (UID: "a8db736b-00b7-4251-a667-3b2138c6c928"). 
InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:24:10 crc kubenswrapper[4898]: I0313 14:24:10.936113 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tg4w5\" (UniqueName: \"kubernetes.io/projected/a8db736b-00b7-4251-a667-3b2138c6c928-kube-api-access-tg4w5\") on node \"crc\" DevicePath \"\"" Mar 13 14:24:10 crc kubenswrapper[4898]: I0313 14:24:10.936146 4898 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8db736b-00b7-4251-a667-3b2138c6c928-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 14:24:10 crc kubenswrapper[4898]: I0313 14:24:10.936156 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8db736b-00b7-4251-a667-3b2138c6c928-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:24:10 crc kubenswrapper[4898]: I0313 14:24:10.936165 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8db736b-00b7-4251-a667-3b2138c6c928-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:24:10 crc kubenswrapper[4898]: I0313 14:24:10.936175 4898 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8db736b-00b7-4251-a667-3b2138c6c928-logs\") on node \"crc\" DevicePath \"\"" Mar 13 14:24:11 crc kubenswrapper[4898]: I0313 14:24:11.741956 4898 scope.go:117] "RemoveContainer" containerID="31ae8593afc62c01205d22940ebe7136407da3a6c051010917ba949bb52866cc" Mar 13 14:24:11 crc kubenswrapper[4898]: E0313 14:24:11.745098 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:24:11 crc kubenswrapper[4898]: I0313 14:24:11.753764 4898 generic.go:334] "Generic (PLEG): container finished" podID="6ef06426-d2da-4ad2-8168-1ca91c9ca2a7" containerID="66438918ebccde6ac554af32fd8660905c8d96c674c33aa3ef4ebeb984883811" exitCode=0 Mar 13 14:24:11 crc kubenswrapper[4898]: I0313 14:24:11.755694 4898 generic.go:334] "Generic (PLEG): container finished" podID="9cfb3db3-7d46-4ab5-aecc-00ddd738d359" containerID="996fc46edbed8602f9eda3a09dc63ac36038779496a63c3ff3ba77a3b3a9e5b0" exitCode=0 Mar 13 14:24:11 crc kubenswrapper[4898]: I0313 14:24:11.755760 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 14:24:11 crc kubenswrapper[4898]: I0313 14:24:11.756704 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7","Type":"ContainerDied","Data":"66438918ebccde6ac554af32fd8660905c8d96c674c33aa3ef4ebeb984883811"} Mar 13 14:24:11 crc kubenswrapper[4898]: I0313 14:24:11.756733 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9cfb3db3-7d46-4ab5-aecc-00ddd738d359","Type":"ContainerDied","Data":"996fc46edbed8602f9eda3a09dc63ac36038779496a63c3ff3ba77a3b3a9e5b0"} Mar 13 14:24:11 crc kubenswrapper[4898]: I0313 14:24:11.802332 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-plhgx" Mar 13 14:24:11 crc kubenswrapper[4898]: I0313 14:24:11.802398 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-plhgx" Mar 13 14:24:11 crc kubenswrapper[4898]: I0313 14:24:11.817983 4898 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/nova-metadata-0"] Mar 13 14:24:11 crc kubenswrapper[4898]: I0313 14:24:11.842428 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 14:24:11 crc kubenswrapper[4898]: I0313 14:24:11.857344 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 13 14:24:11 crc kubenswrapper[4898]: E0313 14:24:11.858597 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e5f381c-bbd8-40d9-8c76-efee5fb7023a" containerName="oc" Mar 13 14:24:11 crc kubenswrapper[4898]: I0313 14:24:11.858616 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e5f381c-bbd8-40d9-8c76-efee5fb7023a" containerName="oc" Mar 13 14:24:11 crc kubenswrapper[4898]: E0313 14:24:11.858637 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8db736b-00b7-4251-a667-3b2138c6c928" containerName="nova-metadata-metadata" Mar 13 14:24:11 crc kubenswrapper[4898]: I0313 14:24:11.858644 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8db736b-00b7-4251-a667-3b2138c6c928" containerName="nova-metadata-metadata" Mar 13 14:24:11 crc kubenswrapper[4898]: E0313 14:24:11.858657 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53" containerName="nova-manage" Mar 13 14:24:11 crc kubenswrapper[4898]: I0313 14:24:11.858663 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53" containerName="nova-manage" Mar 13 14:24:11 crc kubenswrapper[4898]: E0313 14:24:11.858680 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8db736b-00b7-4251-a667-3b2138c6c928" containerName="nova-metadata-log" Mar 13 14:24:11 crc kubenswrapper[4898]: I0313 14:24:11.858687 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8db736b-00b7-4251-a667-3b2138c6c928" containerName="nova-metadata-log" Mar 13 14:24:11 crc kubenswrapper[4898]: I0313 14:24:11.858918 4898 
memory_manager.go:354] "RemoveStaleState removing state" podUID="a8db736b-00b7-4251-a667-3b2138c6c928" containerName="nova-metadata-log" Mar 13 14:24:11 crc kubenswrapper[4898]: I0313 14:24:11.858943 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53" containerName="nova-manage" Mar 13 14:24:11 crc kubenswrapper[4898]: I0313 14:24:11.858960 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8db736b-00b7-4251-a667-3b2138c6c928" containerName="nova-metadata-metadata" Mar 13 14:24:11 crc kubenswrapper[4898]: I0313 14:24:11.858972 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e5f381c-bbd8-40d9-8c76-efee5fb7023a" containerName="oc" Mar 13 14:24:11 crc kubenswrapper[4898]: I0313 14:24:11.860432 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 14:24:11 crc kubenswrapper[4898]: I0313 14:24:11.865298 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 13 14:24:11 crc kubenswrapper[4898]: I0313 14:24:11.865436 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 13 14:24:11 crc kubenswrapper[4898]: I0313 14:24:11.887367 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 14:24:11 crc kubenswrapper[4898]: I0313 14:24:11.965526 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a0631b3-ea9b-4e0a-bb3e-35b283f1ad17-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8a0631b3-ea9b-4e0a-bb3e-35b283f1ad17\") " pod="openstack/nova-metadata-0" Mar 13 14:24:11 crc kubenswrapper[4898]: I0313 14:24:11.965583 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/8a0631b3-ea9b-4e0a-bb3e-35b283f1ad17-logs\") pod \"nova-metadata-0\" (UID: \"8a0631b3-ea9b-4e0a-bb3e-35b283f1ad17\") " pod="openstack/nova-metadata-0" Mar 13 14:24:11 crc kubenswrapper[4898]: I0313 14:24:11.965646 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a0631b3-ea9b-4e0a-bb3e-35b283f1ad17-config-data\") pod \"nova-metadata-0\" (UID: \"8a0631b3-ea9b-4e0a-bb3e-35b283f1ad17\") " pod="openstack/nova-metadata-0" Mar 13 14:24:11 crc kubenswrapper[4898]: I0313 14:24:11.965693 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a0631b3-ea9b-4e0a-bb3e-35b283f1ad17-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8a0631b3-ea9b-4e0a-bb3e-35b283f1ad17\") " pod="openstack/nova-metadata-0" Mar 13 14:24:11 crc kubenswrapper[4898]: I0313 14:24:11.965816 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dlfg\" (UniqueName: \"kubernetes.io/projected/8a0631b3-ea9b-4e0a-bb3e-35b283f1ad17-kube-api-access-7dlfg\") pod \"nova-metadata-0\" (UID: \"8a0631b3-ea9b-4e0a-bb3e-35b283f1ad17\") " pod="openstack/nova-metadata-0" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.068305 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dlfg\" (UniqueName: \"kubernetes.io/projected/8a0631b3-ea9b-4e0a-bb3e-35b283f1ad17-kube-api-access-7dlfg\") pod \"nova-metadata-0\" (UID: \"8a0631b3-ea9b-4e0a-bb3e-35b283f1ad17\") " pod="openstack/nova-metadata-0" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.068347 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a0631b3-ea9b-4e0a-bb3e-35b283f1ad17-nova-metadata-tls-certs\") pod 
\"nova-metadata-0\" (UID: \"8a0631b3-ea9b-4e0a-bb3e-35b283f1ad17\") " pod="openstack/nova-metadata-0" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.068385 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a0631b3-ea9b-4e0a-bb3e-35b283f1ad17-logs\") pod \"nova-metadata-0\" (UID: \"8a0631b3-ea9b-4e0a-bb3e-35b283f1ad17\") " pod="openstack/nova-metadata-0" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.068438 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a0631b3-ea9b-4e0a-bb3e-35b283f1ad17-config-data\") pod \"nova-metadata-0\" (UID: \"8a0631b3-ea9b-4e0a-bb3e-35b283f1ad17\") " pod="openstack/nova-metadata-0" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.068484 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a0631b3-ea9b-4e0a-bb3e-35b283f1ad17-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8a0631b3-ea9b-4e0a-bb3e-35b283f1ad17\") " pod="openstack/nova-metadata-0" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.069198 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a0631b3-ea9b-4e0a-bb3e-35b283f1ad17-logs\") pod \"nova-metadata-0\" (UID: \"8a0631b3-ea9b-4e0a-bb3e-35b283f1ad17\") " pod="openstack/nova-metadata-0" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.076384 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a0631b3-ea9b-4e0a-bb3e-35b283f1ad17-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8a0631b3-ea9b-4e0a-bb3e-35b283f1ad17\") " pod="openstack/nova-metadata-0" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.078963 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/8a0631b3-ea9b-4e0a-bb3e-35b283f1ad17-config-data\") pod \"nova-metadata-0\" (UID: \"8a0631b3-ea9b-4e0a-bb3e-35b283f1ad17\") " pod="openstack/nova-metadata-0" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.079622 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a0631b3-ea9b-4e0a-bb3e-35b283f1ad17-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8a0631b3-ea9b-4e0a-bb3e-35b283f1ad17\") " pod="openstack/nova-metadata-0" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.083365 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dlfg\" (UniqueName: \"kubernetes.io/projected/8a0631b3-ea9b-4e0a-bb3e-35b283f1ad17-kube-api-access-7dlfg\") pod \"nova-metadata-0\" (UID: \"8a0631b3-ea9b-4e0a-bb3e-35b283f1ad17\") " pod="openstack/nova-metadata-0" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.183856 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.195876 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.202300 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.283805 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cfb3db3-7d46-4ab5-aecc-00ddd738d359-combined-ca-bundle\") pod \"9cfb3db3-7d46-4ab5-aecc-00ddd738d359\" (UID: \"9cfb3db3-7d46-4ab5-aecc-00ddd738d359\") " Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.283879 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26jcq\" (UniqueName: \"kubernetes.io/projected/9cfb3db3-7d46-4ab5-aecc-00ddd738d359-kube-api-access-26jcq\") pod \"9cfb3db3-7d46-4ab5-aecc-00ddd738d359\" (UID: \"9cfb3db3-7d46-4ab5-aecc-00ddd738d359\") " Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.283956 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-log-httpd\") pod \"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7\" (UID: \"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7\") " Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.284070 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-combined-ca-bundle\") pod \"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7\" (UID: \"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7\") " Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.284090 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-scripts\") pod \"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7\" (UID: \"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7\") " Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.284141 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-run-httpd\") pod \"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7\" (UID: \"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7\") " Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.284174 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-sg-core-conf-yaml\") pod \"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7\" (UID: \"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7\") " Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.284223 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cfb3db3-7d46-4ab5-aecc-00ddd738d359-config-data\") pod \"9cfb3db3-7d46-4ab5-aecc-00ddd738d359\" (UID: \"9cfb3db3-7d46-4ab5-aecc-00ddd738d359\") " Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.284298 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8cbv\" (UniqueName: \"kubernetes.io/projected/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-kube-api-access-r8cbv\") pod \"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7\" (UID: \"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7\") " Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.284319 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-config-data\") pod \"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7\" (UID: \"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7\") " Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.288151 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6ef06426-d2da-4ad2-8168-1ca91c9ca2a7" (UID: "6ef06426-d2da-4ad2-8168-1ca91c9ca2a7"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.289520 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-scripts" (OuterVolumeSpecName: "scripts") pod "6ef06426-d2da-4ad2-8168-1ca91c9ca2a7" (UID: "6ef06426-d2da-4ad2-8168-1ca91c9ca2a7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.298532 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cfb3db3-7d46-4ab5-aecc-00ddd738d359-kube-api-access-26jcq" (OuterVolumeSpecName: "kube-api-access-26jcq") pod "9cfb3db3-7d46-4ab5-aecc-00ddd738d359" (UID: "9cfb3db3-7d46-4ab5-aecc-00ddd738d359"). InnerVolumeSpecName "kube-api-access-26jcq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.298829 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6ef06426-d2da-4ad2-8168-1ca91c9ca2a7" (UID: "6ef06426-d2da-4ad2-8168-1ca91c9ca2a7"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.308124 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-kube-api-access-r8cbv" (OuterVolumeSpecName: "kube-api-access-r8cbv") pod "6ef06426-d2da-4ad2-8168-1ca91c9ca2a7" (UID: "6ef06426-d2da-4ad2-8168-1ca91c9ca2a7"). InnerVolumeSpecName "kube-api-access-r8cbv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.327410 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6ef06426-d2da-4ad2-8168-1ca91c9ca2a7" (UID: "6ef06426-d2da-4ad2-8168-1ca91c9ca2a7"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.327718 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cfb3db3-7d46-4ab5-aecc-00ddd738d359-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9cfb3db3-7d46-4ab5-aecc-00ddd738d359" (UID: "9cfb3db3-7d46-4ab5-aecc-00ddd738d359"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.362723 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cfb3db3-7d46-4ab5-aecc-00ddd738d359-config-data" (OuterVolumeSpecName: "config-data") pod "9cfb3db3-7d46-4ab5-aecc-00ddd738d359" (UID: "9cfb3db3-7d46-4ab5-aecc-00ddd738d359"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.389353 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cfb3db3-7d46-4ab5-aecc-00ddd738d359-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.389379 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26jcq\" (UniqueName: \"kubernetes.io/projected/9cfb3db3-7d46-4ab5-aecc-00ddd738d359-kube-api-access-26jcq\") on node \"crc\" DevicePath \"\"" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.389392 4898 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.389400 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.389409 4898 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.389416 4898 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.389424 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cfb3db3-7d46-4ab5-aecc-00ddd738d359-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.389432 4898 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-r8cbv\" (UniqueName: \"kubernetes.io/projected/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-kube-api-access-r8cbv\") on node \"crc\" DevicePath \"\"" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.424746 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6ef06426-d2da-4ad2-8168-1ca91c9ca2a7" (UID: "6ef06426-d2da-4ad2-8168-1ca91c9ca2a7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.460737 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-config-data" (OuterVolumeSpecName: "config-data") pod "6ef06426-d2da-4ad2-8168-1ca91c9ca2a7" (UID: "6ef06426-d2da-4ad2-8168-1ca91c9ca2a7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.491876 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.492191 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.739868 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.790698 4898 generic.go:334] "Generic (PLEG): container finished" podID="9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee" containerID="7f380ed1e03850f9a5390c93de71ece12783543d722324b0cebc7392d71d1cf4" exitCode=0 Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.790763 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee","Type":"ContainerDied","Data":"7f380ed1e03850f9a5390c93de71ece12783543d722324b0cebc7392d71d1cf4"} Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.795806 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.795937 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7","Type":"ContainerDied","Data":"14d2db4ac83fec75f96ff83117a0a7f835b4698b6e0a2c106a080e228485900f"} Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.796007 4898 scope.go:117] "RemoveContainer" containerID="795e7834e563381b54df39b3e37495b74bd94fd75981de1568268bd084ae046d" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.802445 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.802481 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9cfb3db3-7d46-4ab5-aecc-00ddd738d359","Type":"ContainerDied","Data":"4fb43fc1071513c1a034ec9b0dda28c7b02d6ddc884f93f9664159fa9c0ff74c"} Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.807224 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8a0631b3-ea9b-4e0a-bb3e-35b283f1ad17","Type":"ContainerStarted","Data":"6811861f93d3b2a6d535376e944114be9a28d87de96ba9faef32c89bda3c7c57"} Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.827557 4898 scope.go:117] "RemoveContainer" containerID="1fd92eb6403fb0dbf337c5f7f0d057b61a6f0d6c15c82bc4970d631ddca294d6" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.893197 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.907102 4898 scope.go:117] "RemoveContainer" containerID="7cfa2af5e418e74781b5abeafa5d792bceacbddbc2bc25e1e25fd94e21acc493" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.916844 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:24:12 crc 
kubenswrapper[4898]: I0313 14:24:12.934759 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.944009 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-plhgx" podUID="c75348dc-b6ff-43ff-bd9a-d84c91f23ea8" containerName="registry-server" probeResult="failure" output=< Mar 13 14:24:12 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 14:24:12 crc kubenswrapper[4898]: > Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.948386 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.968952 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:24:12 crc kubenswrapper[4898]: E0313 14:24:12.969445 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ef06426-d2da-4ad2-8168-1ca91c9ca2a7" containerName="ceilometer-notification-agent" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.969465 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ef06426-d2da-4ad2-8168-1ca91c9ca2a7" containerName="ceilometer-notification-agent" Mar 13 14:24:12 crc kubenswrapper[4898]: E0313 14:24:12.969475 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ef06426-d2da-4ad2-8168-1ca91c9ca2a7" containerName="sg-core" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.969482 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ef06426-d2da-4ad2-8168-1ca91c9ca2a7" containerName="sg-core" Mar 13 14:24:12 crc kubenswrapper[4898]: E0313 14:24:12.969515 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cfb3db3-7d46-4ab5-aecc-00ddd738d359" containerName="nova-scheduler-scheduler" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.969521 4898 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="9cfb3db3-7d46-4ab5-aecc-00ddd738d359" containerName="nova-scheduler-scheduler" Mar 13 14:24:12 crc kubenswrapper[4898]: E0313 14:24:12.969549 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ef06426-d2da-4ad2-8168-1ca91c9ca2a7" containerName="ceilometer-central-agent" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.969556 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ef06426-d2da-4ad2-8168-1ca91c9ca2a7" containerName="ceilometer-central-agent" Mar 13 14:24:12 crc kubenswrapper[4898]: E0313 14:24:12.969568 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ef06426-d2da-4ad2-8168-1ca91c9ca2a7" containerName="proxy-httpd" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.969574 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ef06426-d2da-4ad2-8168-1ca91c9ca2a7" containerName="proxy-httpd" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.969806 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ef06426-d2da-4ad2-8168-1ca91c9ca2a7" containerName="ceilometer-central-agent" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.969824 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ef06426-d2da-4ad2-8168-1ca91c9ca2a7" containerName="proxy-httpd" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.969846 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ef06426-d2da-4ad2-8168-1ca91c9ca2a7" containerName="ceilometer-notification-agent" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.969859 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cfb3db3-7d46-4ab5-aecc-00ddd738d359" containerName="nova-scheduler-scheduler" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.969867 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ef06426-d2da-4ad2-8168-1ca91c9ca2a7" containerName="sg-core" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.977060 4898 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.980161 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.980458 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.983187 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.985532 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.987161 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.991480 4898 scope.go:117] "RemoveContainer" containerID="66438918ebccde6ac554af32fd8660905c8d96c674c33aa3ef4ebeb984883811" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.006504 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0d56db73-0e9e-47af-b0bd-77231fe40077-log-httpd\") pod \"ceilometer-0\" (UID: \"0d56db73-0e9e-47af-b0bd-77231fe40077\") " pod="openstack/ceilometer-0" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.006569 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tk9pt\" (UniqueName: \"kubernetes.io/projected/0d56db73-0e9e-47af-b0bd-77231fe40077-kube-api-access-tk9pt\") pod \"ceilometer-0\" (UID: \"0d56db73-0e9e-47af-b0bd-77231fe40077\") " pod="openstack/ceilometer-0" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.006609 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0d56db73-0e9e-47af-b0bd-77231fe40077-run-httpd\") pod \"ceilometer-0\" (UID: \"0d56db73-0e9e-47af-b0bd-77231fe40077\") " pod="openstack/ceilometer-0" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.006656 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0d56db73-0e9e-47af-b0bd-77231fe40077-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0d56db73-0e9e-47af-b0bd-77231fe40077\") " pod="openstack/ceilometer-0" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.006710 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d56db73-0e9e-47af-b0bd-77231fe40077-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0d56db73-0e9e-47af-b0bd-77231fe40077\") " pod="openstack/ceilometer-0" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.006784 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d56db73-0e9e-47af-b0bd-77231fe40077-config-data\") pod \"ceilometer-0\" (UID: \"0d56db73-0e9e-47af-b0bd-77231fe40077\") " pod="openstack/ceilometer-0" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.006824 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d56db73-0e9e-47af-b0bd-77231fe40077-scripts\") pod \"ceilometer-0\" (UID: \"0d56db73-0e9e-47af-b0bd-77231fe40077\") " pod="openstack/ceilometer-0" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.009619 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.015705 4898 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.015758 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.036807 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.112749 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0d56db73-0e9e-47af-b0bd-77231fe40077-log-httpd\") pod \"ceilometer-0\" (UID: \"0d56db73-0e9e-47af-b0bd-77231fe40077\") " pod="openstack/ceilometer-0" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.112789 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tk9pt\" (UniqueName: \"kubernetes.io/projected/0d56db73-0e9e-47af-b0bd-77231fe40077-kube-api-access-tk9pt\") pod \"ceilometer-0\" (UID: \"0d56db73-0e9e-47af-b0bd-77231fe40077\") " pod="openstack/ceilometer-0" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.112821 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0d56db73-0e9e-47af-b0bd-77231fe40077-run-httpd\") pod \"ceilometer-0\" (UID: \"0d56db73-0e9e-47af-b0bd-77231fe40077\") " pod="openstack/ceilometer-0" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.112845 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0d56db73-0e9e-47af-b0bd-77231fe40077-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0d56db73-0e9e-47af-b0bd-77231fe40077\") " pod="openstack/ceilometer-0" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.112882 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0d56db73-0e9e-47af-b0bd-77231fe40077-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0d56db73-0e9e-47af-b0bd-77231fe40077\") " pod="openstack/ceilometer-0" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.112925 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97d388e1-b1b3-409d-b7c5-38b37734a8e6-config-data\") pod \"nova-scheduler-0\" (UID: \"97d388e1-b1b3-409d-b7c5-38b37734a8e6\") " pod="openstack/nova-scheduler-0" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.112984 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d56db73-0e9e-47af-b0bd-77231fe40077-config-data\") pod \"ceilometer-0\" (UID: \"0d56db73-0e9e-47af-b0bd-77231fe40077\") " pod="openstack/ceilometer-0" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.113007 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9whj8\" (UniqueName: \"kubernetes.io/projected/97d388e1-b1b3-409d-b7c5-38b37734a8e6-kube-api-access-9whj8\") pod \"nova-scheduler-0\" (UID: \"97d388e1-b1b3-409d-b7c5-38b37734a8e6\") " pod="openstack/nova-scheduler-0" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.113153 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d56db73-0e9e-47af-b0bd-77231fe40077-scripts\") pod \"ceilometer-0\" (UID: \"0d56db73-0e9e-47af-b0bd-77231fe40077\") " pod="openstack/ceilometer-0" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.113266 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97d388e1-b1b3-409d-b7c5-38b37734a8e6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"97d388e1-b1b3-409d-b7c5-38b37734a8e6\") " 
pod="openstack/nova-scheduler-0" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.113791 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0d56db73-0e9e-47af-b0bd-77231fe40077-log-httpd\") pod \"ceilometer-0\" (UID: \"0d56db73-0e9e-47af-b0bd-77231fe40077\") " pod="openstack/ceilometer-0" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.114329 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0d56db73-0e9e-47af-b0bd-77231fe40077-run-httpd\") pod \"ceilometer-0\" (UID: \"0d56db73-0e9e-47af-b0bd-77231fe40077\") " pod="openstack/ceilometer-0" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.118655 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d56db73-0e9e-47af-b0bd-77231fe40077-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0d56db73-0e9e-47af-b0bd-77231fe40077\") " pod="openstack/ceilometer-0" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.119357 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0d56db73-0e9e-47af-b0bd-77231fe40077-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0d56db73-0e9e-47af-b0bd-77231fe40077\") " pod="openstack/ceilometer-0" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.123086 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d56db73-0e9e-47af-b0bd-77231fe40077-scripts\") pod \"ceilometer-0\" (UID: \"0d56db73-0e9e-47af-b0bd-77231fe40077\") " pod="openstack/ceilometer-0" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.124675 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d56db73-0e9e-47af-b0bd-77231fe40077-config-data\") pod 
\"ceilometer-0\" (UID: \"0d56db73-0e9e-47af-b0bd-77231fe40077\") " pod="openstack/ceilometer-0" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.130560 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tk9pt\" (UniqueName: \"kubernetes.io/projected/0d56db73-0e9e-47af-b0bd-77231fe40077-kube-api-access-tk9pt\") pod \"ceilometer-0\" (UID: \"0d56db73-0e9e-47af-b0bd-77231fe40077\") " pod="openstack/ceilometer-0" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.147394 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.159663 4898 scope.go:117] "RemoveContainer" containerID="996fc46edbed8602f9eda3a09dc63ac36038779496a63c3ff3ba77a3b3a9e5b0" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.214110 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thsgw\" (UniqueName: \"kubernetes.io/projected/9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee-kube-api-access-thsgw\") pod \"9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee\" (UID: \"9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee\") " Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.214334 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee-internal-tls-certs\") pod \"9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee\" (UID: \"9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee\") " Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.214372 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee-logs\") pod \"9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee\" (UID: \"9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee\") " Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.214533 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee-combined-ca-bundle\") pod \"9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee\" (UID: \"9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee\") " Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.214794 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee-config-data\") pod \"9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee\" (UID: \"9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee\") " Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.214816 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee-public-tls-certs\") pod \"9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee\" (UID: \"9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee\") " Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.214852 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee-logs" (OuterVolumeSpecName: "logs") pod "9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee" (UID: "9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.215768 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97d388e1-b1b3-409d-b7c5-38b37734a8e6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"97d388e1-b1b3-409d-b7c5-38b37734a8e6\") " pod="openstack/nova-scheduler-0" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.216088 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97d388e1-b1b3-409d-b7c5-38b37734a8e6-config-data\") pod \"nova-scheduler-0\" (UID: \"97d388e1-b1b3-409d-b7c5-38b37734a8e6\") " pod="openstack/nova-scheduler-0" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.216195 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9whj8\" (UniqueName: \"kubernetes.io/projected/97d388e1-b1b3-409d-b7c5-38b37734a8e6-kube-api-access-9whj8\") pod \"nova-scheduler-0\" (UID: \"97d388e1-b1b3-409d-b7c5-38b37734a8e6\") " pod="openstack/nova-scheduler-0" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.216328 4898 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee-logs\") on node \"crc\" DevicePath \"\"" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.217434 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee-kube-api-access-thsgw" (OuterVolumeSpecName: "kube-api-access-thsgw") pod "9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee" (UID: "9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee"). InnerVolumeSpecName "kube-api-access-thsgw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.222190 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97d388e1-b1b3-409d-b7c5-38b37734a8e6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"97d388e1-b1b3-409d-b7c5-38b37734a8e6\") " pod="openstack/nova-scheduler-0" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.222519 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97d388e1-b1b3-409d-b7c5-38b37734a8e6-config-data\") pod \"nova-scheduler-0\" (UID: \"97d388e1-b1b3-409d-b7c5-38b37734a8e6\") " pod="openstack/nova-scheduler-0" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.235910 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9whj8\" (UniqueName: \"kubernetes.io/projected/97d388e1-b1b3-409d-b7c5-38b37734a8e6-kube-api-access-9whj8\") pod \"nova-scheduler-0\" (UID: \"97d388e1-b1b3-409d-b7c5-38b37734a8e6\") " pod="openstack/nova-scheduler-0" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.315555 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee-config-data" (OuterVolumeSpecName: "config-data") pod "9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee" (UID: "9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.319472 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.319503 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thsgw\" (UniqueName: \"kubernetes.io/projected/9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee-kube-api-access-thsgw\") on node \"crc\" DevicePath \"\"" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.371240 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee" (UID: "9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.397881 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee" (UID: "9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.408292 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee" (UID: "9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.421180 4898 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.421209 4898 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.421220 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.426755 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.438328 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.755632 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ef06426-d2da-4ad2-8168-1ca91c9ca2a7" path="/var/lib/kubelet/pods/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7/volumes" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.757174 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cfb3db3-7d46-4ab5-aecc-00ddd738d359" path="/var/lib/kubelet/pods/9cfb3db3-7d46-4ab5-aecc-00ddd738d359/volumes" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.757836 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8db736b-00b7-4251-a667-3b2138c6c928" path="/var/lib/kubelet/pods/a8db736b-00b7-4251-a667-3b2138c6c928/volumes" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.859714 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8a0631b3-ea9b-4e0a-bb3e-35b283f1ad17","Type":"ContainerStarted","Data":"af208a37c486aa4aae0677032ff0e628bf9613329753d31ce19db678648487fb"} Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.859785 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8a0631b3-ea9b-4e0a-bb3e-35b283f1ad17","Type":"ContainerStarted","Data":"97673219690d9afe13f9d5f67d427b244d491868266d4831070e61c4d58caaf1"} Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.864267 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.865017 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee","Type":"ContainerDied","Data":"e928948e8aff5ca3e9c4f8ba788b16341b26756ed4665c742a244420ba53b0dc"} Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.865447 4898 scope.go:117] "RemoveContainer" containerID="7f380ed1e03850f9a5390c93de71ece12783543d722324b0cebc7392d71d1cf4" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.897472 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.897453862 podStartE2EDuration="2.897453862s" podCreationTimestamp="2026-03-13 14:24:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:24:13.884556303 +0000 UTC m=+1688.886144562" watchObservedRunningTime="2026-03-13 14:24:13.897453862 +0000 UTC m=+1688.899042111" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.912726 4898 scope.go:117] "RemoveContainer" containerID="61b5d14b4c28a00305c933cc1ba3f69720704610d85741eff661575a32229c94" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.978048 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.991407 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 13 14:24:14 crc kubenswrapper[4898]: I0313 14:24:14.003572 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 13 14:24:14 crc kubenswrapper[4898]: E0313 14:24:14.004158 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee" containerName="nova-api-api" Mar 13 14:24:14 crc kubenswrapper[4898]: I0313 14:24:14.004172 4898 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee" containerName="nova-api-api" Mar 13 14:24:14 crc kubenswrapper[4898]: E0313 14:24:14.004187 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee" containerName="nova-api-log" Mar 13 14:24:14 crc kubenswrapper[4898]: I0313 14:24:14.004193 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee" containerName="nova-api-log" Mar 13 14:24:14 crc kubenswrapper[4898]: I0313 14:24:14.004459 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee" containerName="nova-api-log" Mar 13 14:24:14 crc kubenswrapper[4898]: I0313 14:24:14.004491 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee" containerName="nova-api-api" Mar 13 14:24:14 crc kubenswrapper[4898]: E0313 14:24:14.006282 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b7e79b7_e581_4429_b3d4_9dd7ec5e79ee.slice\": RecentStats: unable to find data in memory cache]" Mar 13 14:24:14 crc kubenswrapper[4898]: I0313 14:24:14.007817 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 13 14:24:14 crc kubenswrapper[4898]: I0313 14:24:14.011002 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 13 14:24:14 crc kubenswrapper[4898]: I0313 14:24:14.011864 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 13 14:24:14 crc kubenswrapper[4898]: I0313 14:24:14.012235 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 13 14:24:14 crc kubenswrapper[4898]: I0313 14:24:14.016216 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 13 14:24:14 crc kubenswrapper[4898]: I0313 14:24:14.029826 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:24:14 crc kubenswrapper[4898]: I0313 14:24:14.054428 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef7dd576-1005-4fdb-95c1-e5da9f04b177-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ef7dd576-1005-4fdb-95c1-e5da9f04b177\") " pod="openstack/nova-api-0" Mar 13 14:24:14 crc kubenswrapper[4898]: I0313 14:24:14.054496 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef7dd576-1005-4fdb-95c1-e5da9f04b177-config-data\") pod \"nova-api-0\" (UID: \"ef7dd576-1005-4fdb-95c1-e5da9f04b177\") " pod="openstack/nova-api-0" Mar 13 14:24:14 crc kubenswrapper[4898]: I0313 14:24:14.054525 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef7dd576-1005-4fdb-95c1-e5da9f04b177-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ef7dd576-1005-4fdb-95c1-e5da9f04b177\") " pod="openstack/nova-api-0" Mar 13 14:24:14 crc 
kubenswrapper[4898]: I0313 14:24:14.054626 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef7dd576-1005-4fdb-95c1-e5da9f04b177-logs\") pod \"nova-api-0\" (UID: \"ef7dd576-1005-4fdb-95c1-e5da9f04b177\") " pod="openstack/nova-api-0" Mar 13 14:24:14 crc kubenswrapper[4898]: I0313 14:24:14.054652 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef7dd576-1005-4fdb-95c1-e5da9f04b177-public-tls-certs\") pod \"nova-api-0\" (UID: \"ef7dd576-1005-4fdb-95c1-e5da9f04b177\") " pod="openstack/nova-api-0" Mar 13 14:24:14 crc kubenswrapper[4898]: I0313 14:24:14.054836 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7ds9\" (UniqueName: \"kubernetes.io/projected/ef7dd576-1005-4fdb-95c1-e5da9f04b177-kube-api-access-w7ds9\") pod \"nova-api-0\" (UID: \"ef7dd576-1005-4fdb-95c1-e5da9f04b177\") " pod="openstack/nova-api-0" Mar 13 14:24:14 crc kubenswrapper[4898]: I0313 14:24:14.087202 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 14:24:14 crc kubenswrapper[4898]: I0313 14:24:14.156822 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef7dd576-1005-4fdb-95c1-e5da9f04b177-config-data\") pod \"nova-api-0\" (UID: \"ef7dd576-1005-4fdb-95c1-e5da9f04b177\") " pod="openstack/nova-api-0" Mar 13 14:24:14 crc kubenswrapper[4898]: I0313 14:24:14.157234 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef7dd576-1005-4fdb-95c1-e5da9f04b177-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ef7dd576-1005-4fdb-95c1-e5da9f04b177\") " pod="openstack/nova-api-0" Mar 13 14:24:14 crc kubenswrapper[4898]: I0313 
14:24:14.157281 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef7dd576-1005-4fdb-95c1-e5da9f04b177-logs\") pod \"nova-api-0\" (UID: \"ef7dd576-1005-4fdb-95c1-e5da9f04b177\") " pod="openstack/nova-api-0" Mar 13 14:24:14 crc kubenswrapper[4898]: I0313 14:24:14.157307 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef7dd576-1005-4fdb-95c1-e5da9f04b177-public-tls-certs\") pod \"nova-api-0\" (UID: \"ef7dd576-1005-4fdb-95c1-e5da9f04b177\") " pod="openstack/nova-api-0" Mar 13 14:24:14 crc kubenswrapper[4898]: I0313 14:24:14.157474 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7ds9\" (UniqueName: \"kubernetes.io/projected/ef7dd576-1005-4fdb-95c1-e5da9f04b177-kube-api-access-w7ds9\") pod \"nova-api-0\" (UID: \"ef7dd576-1005-4fdb-95c1-e5da9f04b177\") " pod="openstack/nova-api-0" Mar 13 14:24:14 crc kubenswrapper[4898]: I0313 14:24:14.157562 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef7dd576-1005-4fdb-95c1-e5da9f04b177-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ef7dd576-1005-4fdb-95c1-e5da9f04b177\") " pod="openstack/nova-api-0" Mar 13 14:24:14 crc kubenswrapper[4898]: I0313 14:24:14.158499 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef7dd576-1005-4fdb-95c1-e5da9f04b177-logs\") pod \"nova-api-0\" (UID: \"ef7dd576-1005-4fdb-95c1-e5da9f04b177\") " pod="openstack/nova-api-0" Mar 13 14:24:14 crc kubenswrapper[4898]: I0313 14:24:14.161716 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef7dd576-1005-4fdb-95c1-e5da9f04b177-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"ef7dd576-1005-4fdb-95c1-e5da9f04b177\") " pod="openstack/nova-api-0" Mar 13 14:24:14 crc kubenswrapper[4898]: I0313 14:24:14.162771 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef7dd576-1005-4fdb-95c1-e5da9f04b177-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ef7dd576-1005-4fdb-95c1-e5da9f04b177\") " pod="openstack/nova-api-0" Mar 13 14:24:14 crc kubenswrapper[4898]: I0313 14:24:14.163009 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef7dd576-1005-4fdb-95c1-e5da9f04b177-public-tls-certs\") pod \"nova-api-0\" (UID: \"ef7dd576-1005-4fdb-95c1-e5da9f04b177\") " pod="openstack/nova-api-0" Mar 13 14:24:14 crc kubenswrapper[4898]: I0313 14:24:14.170114 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef7dd576-1005-4fdb-95c1-e5da9f04b177-config-data\") pod \"nova-api-0\" (UID: \"ef7dd576-1005-4fdb-95c1-e5da9f04b177\") " pod="openstack/nova-api-0" Mar 13 14:24:14 crc kubenswrapper[4898]: I0313 14:24:14.175298 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7ds9\" (UniqueName: \"kubernetes.io/projected/ef7dd576-1005-4fdb-95c1-e5da9f04b177-kube-api-access-w7ds9\") pod \"nova-api-0\" (UID: \"ef7dd576-1005-4fdb-95c1-e5da9f04b177\") " pod="openstack/nova-api-0" Mar 13 14:24:14 crc kubenswrapper[4898]: I0313 14:24:14.359392 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 13 14:24:14 crc kubenswrapper[4898]: I0313 14:24:14.860744 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 13 14:24:14 crc kubenswrapper[4898]: I0313 14:24:14.889631 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0d56db73-0e9e-47af-b0bd-77231fe40077","Type":"ContainerStarted","Data":"f93c4099691f41a073e964731ace4e3b38e62bd41c90cb8a30591394175252ce"} Mar 13 14:24:14 crc kubenswrapper[4898]: I0313 14:24:14.889834 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0d56db73-0e9e-47af-b0bd-77231fe40077","Type":"ContainerStarted","Data":"a79c1dd41b2637198b516933e99a0d19e49eb0f06da00b903e6a8bd5f7dcb1dc"} Mar 13 14:24:14 crc kubenswrapper[4898]: I0313 14:24:14.893110 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"97d388e1-b1b3-409d-b7c5-38b37734a8e6","Type":"ContainerStarted","Data":"f1f9f7f0ddc965e542d47521c647bde30cf7c3d61167e79e93c17ed0e017da76"} Mar 13 14:24:14 crc kubenswrapper[4898]: I0313 14:24:14.893160 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"97d388e1-b1b3-409d-b7c5-38b37734a8e6","Type":"ContainerStarted","Data":"bec9e188e91a6b36561be6565cd152626eaf887f736335a6a7ec4277b6b37808"} Mar 13 14:24:14 crc kubenswrapper[4898]: I0313 14:24:14.902599 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ef7dd576-1005-4fdb-95c1-e5da9f04b177","Type":"ContainerStarted","Data":"2450a754a8cdfb08dca6764a1e6216668ee4da861f43efd9d3d8821b9ac67477"} Mar 13 14:24:14 crc kubenswrapper[4898]: I0313 14:24:14.907659 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.907636507 podStartE2EDuration="2.907636507s" podCreationTimestamp="2026-03-13 14:24:12 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:24:14.907095382 +0000 UTC m=+1689.908683621" watchObservedRunningTime="2026-03-13 14:24:14.907636507 +0000 UTC m=+1689.909224746" Mar 13 14:24:15 crc kubenswrapper[4898]: I0313 14:24:15.757459 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee" path="/var/lib/kubelet/pods/9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee/volumes" Mar 13 14:24:15 crc kubenswrapper[4898]: I0313 14:24:15.917366 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ef7dd576-1005-4fdb-95c1-e5da9f04b177","Type":"ContainerStarted","Data":"5f2534883f510a8fdd5e23cb7dccb85906de337bd6a61fc10f1d9b37d0c03f02"} Mar 13 14:24:15 crc kubenswrapper[4898]: I0313 14:24:15.917409 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ef7dd576-1005-4fdb-95c1-e5da9f04b177","Type":"ContainerStarted","Data":"a596a8e0608b567b537c0d150b0e75f3ff075578d78549760a35e8f2f70708a3"} Mar 13 14:24:15 crc kubenswrapper[4898]: I0313 14:24:15.929476 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0d56db73-0e9e-47af-b0bd-77231fe40077","Type":"ContainerStarted","Data":"72a159d4ac7d8e712d9bffb75add791bdb6f8bee323ccb72d06431a42be68d77"} Mar 13 14:24:15 crc kubenswrapper[4898]: I0313 14:24:15.956085 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.956045875 podStartE2EDuration="2.956045875s" podCreationTimestamp="2026-03-13 14:24:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:24:15.945703773 +0000 UTC m=+1690.947292022" watchObservedRunningTime="2026-03-13 14:24:15.956045875 +0000 UTC m=+1690.957634114" Mar 13 14:24:16 crc 
kubenswrapper[4898]: I0313 14:24:16.943579 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0d56db73-0e9e-47af-b0bd-77231fe40077","Type":"ContainerStarted","Data":"f40862770a13b51268c0d1e7b9c2896a6335c6f3b3801074579c313c3d13577e"} Mar 13 14:24:17 crc kubenswrapper[4898]: I0313 14:24:17.957645 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0d56db73-0e9e-47af-b0bd-77231fe40077","Type":"ContainerStarted","Data":"a88dfb7f27e61d13c4de942fec09301aec287d6967f6ca991e690f9c9c77a8e1"} Mar 13 14:24:17 crc kubenswrapper[4898]: I0313 14:24:17.957950 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 13 14:24:17 crc kubenswrapper[4898]: I0313 14:24:17.996378 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.343281801 podStartE2EDuration="5.996358618s" podCreationTimestamp="2026-03-13 14:24:12 +0000 UTC" firstStartedPulling="2026-03-13 14:24:13.968445638 +0000 UTC m=+1688.970033877" lastFinishedPulling="2026-03-13 14:24:17.621522455 +0000 UTC m=+1692.623110694" observedRunningTime="2026-03-13 14:24:17.978263253 +0000 UTC m=+1692.979851502" watchObservedRunningTime="2026-03-13 14:24:17.996358618 +0000 UTC m=+1692.997946857" Mar 13 14:24:18 crc kubenswrapper[4898]: I0313 14:24:18.439878 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 13 14:24:21 crc kubenswrapper[4898]: I0313 14:24:21.897456 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-plhgx" Mar 13 14:24:21 crc kubenswrapper[4898]: I0313 14:24:21.966015 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-plhgx" Mar 13 14:24:22 crc kubenswrapper[4898]: I0313 14:24:22.148081 4898 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/certified-operators-plhgx"] Mar 13 14:24:22 crc kubenswrapper[4898]: I0313 14:24:22.185445 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 13 14:24:22 crc kubenswrapper[4898]: I0313 14:24:22.185519 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 13 14:24:23 crc kubenswrapper[4898]: I0313 14:24:23.018930 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-plhgx" podUID="c75348dc-b6ff-43ff-bd9a-d84c91f23ea8" containerName="registry-server" containerID="cri-o://201337bedc1dace7fd7574e59c5253e0ff92dacc7211debdde1f18e42c9daac1" gracePeriod=2 Mar 13 14:24:23 crc kubenswrapper[4898]: I0313 14:24:23.201061 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="8a0631b3-ea9b-4e0a-bb3e-35b283f1ad17" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.18:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 14:24:23 crc kubenswrapper[4898]: I0313 14:24:23.201092 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="8a0631b3-ea9b-4e0a-bb3e-35b283f1ad17" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.18:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 14:24:23 crc kubenswrapper[4898]: I0313 14:24:23.440149 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 13 14:24:23 crc kubenswrapper[4898]: I0313 14:24:23.470865 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 13 14:24:23 crc kubenswrapper[4898]: I0313 14:24:23.655770 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-plhgx" Mar 13 14:24:23 crc kubenswrapper[4898]: I0313 14:24:23.720279 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c75348dc-b6ff-43ff-bd9a-d84c91f23ea8-catalog-content\") pod \"c75348dc-b6ff-43ff-bd9a-d84c91f23ea8\" (UID: \"c75348dc-b6ff-43ff-bd9a-d84c91f23ea8\") " Mar 13 14:24:23 crc kubenswrapper[4898]: I0313 14:24:23.720446 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c75348dc-b6ff-43ff-bd9a-d84c91f23ea8-utilities\") pod \"c75348dc-b6ff-43ff-bd9a-d84c91f23ea8\" (UID: \"c75348dc-b6ff-43ff-bd9a-d84c91f23ea8\") " Mar 13 14:24:23 crc kubenswrapper[4898]: I0313 14:24:23.720650 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6945l\" (UniqueName: \"kubernetes.io/projected/c75348dc-b6ff-43ff-bd9a-d84c91f23ea8-kube-api-access-6945l\") pod \"c75348dc-b6ff-43ff-bd9a-d84c91f23ea8\" (UID: \"c75348dc-b6ff-43ff-bd9a-d84c91f23ea8\") " Mar 13 14:24:23 crc kubenswrapper[4898]: I0313 14:24:23.726102 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c75348dc-b6ff-43ff-bd9a-d84c91f23ea8-utilities" (OuterVolumeSpecName: "utilities") pod "c75348dc-b6ff-43ff-bd9a-d84c91f23ea8" (UID: "c75348dc-b6ff-43ff-bd9a-d84c91f23ea8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:24:23 crc kubenswrapper[4898]: I0313 14:24:23.728644 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c75348dc-b6ff-43ff-bd9a-d84c91f23ea8-kube-api-access-6945l" (OuterVolumeSpecName: "kube-api-access-6945l") pod "c75348dc-b6ff-43ff-bd9a-d84c91f23ea8" (UID: "c75348dc-b6ff-43ff-bd9a-d84c91f23ea8"). InnerVolumeSpecName "kube-api-access-6945l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:24:23 crc kubenswrapper[4898]: I0313 14:24:23.743872 4898 scope.go:117] "RemoveContainer" containerID="31ae8593afc62c01205d22940ebe7136407da3a6c051010917ba949bb52866cc" Mar 13 14:24:23 crc kubenswrapper[4898]: E0313 14:24:23.744216 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:24:23 crc kubenswrapper[4898]: I0313 14:24:23.781669 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c75348dc-b6ff-43ff-bd9a-d84c91f23ea8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c75348dc-b6ff-43ff-bd9a-d84c91f23ea8" (UID: "c75348dc-b6ff-43ff-bd9a-d84c91f23ea8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:24:23 crc kubenswrapper[4898]: I0313 14:24:23.824024 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6945l\" (UniqueName: \"kubernetes.io/projected/c75348dc-b6ff-43ff-bd9a-d84c91f23ea8-kube-api-access-6945l\") on node \"crc\" DevicePath \"\"" Mar 13 14:24:23 crc kubenswrapper[4898]: I0313 14:24:23.824058 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c75348dc-b6ff-43ff-bd9a-d84c91f23ea8-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 14:24:23 crc kubenswrapper[4898]: I0313 14:24:23.824067 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c75348dc-b6ff-43ff-bd9a-d84c91f23ea8-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 14:24:24 crc kubenswrapper[4898]: I0313 14:24:24.038272 4898 generic.go:334] "Generic (PLEG): container finished" podID="c75348dc-b6ff-43ff-bd9a-d84c91f23ea8" containerID="201337bedc1dace7fd7574e59c5253e0ff92dacc7211debdde1f18e42c9daac1" exitCode=0 Mar 13 14:24:24 crc kubenswrapper[4898]: I0313 14:24:24.038381 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-plhgx" Mar 13 14:24:24 crc kubenswrapper[4898]: I0313 14:24:24.038409 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-plhgx" event={"ID":"c75348dc-b6ff-43ff-bd9a-d84c91f23ea8","Type":"ContainerDied","Data":"201337bedc1dace7fd7574e59c5253e0ff92dacc7211debdde1f18e42c9daac1"} Mar 13 14:24:24 crc kubenswrapper[4898]: I0313 14:24:24.047571 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-plhgx" event={"ID":"c75348dc-b6ff-43ff-bd9a-d84c91f23ea8","Type":"ContainerDied","Data":"357efb5d307770e6ac4560f42dea2e3bc32a6f8f90350563a876139972493b91"} Mar 13 14:24:24 crc kubenswrapper[4898]: I0313 14:24:24.047633 4898 scope.go:117] "RemoveContainer" containerID="201337bedc1dace7fd7574e59c5253e0ff92dacc7211debdde1f18e42c9daac1" Mar 13 14:24:24 crc kubenswrapper[4898]: I0313 14:24:24.095301 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 13 14:24:24 crc kubenswrapper[4898]: I0313 14:24:24.095300 4898 scope.go:117] "RemoveContainer" containerID="b4e6b7a97336f9db7d45fac3ccbf9ee16b3f41f5a6729150379834717d5666ca" Mar 13 14:24:24 crc kubenswrapper[4898]: I0313 14:24:24.101679 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-plhgx"] Mar 13 14:24:24 crc kubenswrapper[4898]: I0313 14:24:24.128939 4898 scope.go:117] "RemoveContainer" containerID="327e55d62108c057fe7c12b2ee047d6a3b7e188b390c1e80ec12227125ce316c" Mar 13 14:24:24 crc kubenswrapper[4898]: I0313 14:24:24.139606 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-plhgx"] Mar 13 14:24:24 crc kubenswrapper[4898]: I0313 14:24:24.194765 4898 scope.go:117] "RemoveContainer" containerID="201337bedc1dace7fd7574e59c5253e0ff92dacc7211debdde1f18e42c9daac1" Mar 13 14:24:24 crc kubenswrapper[4898]: E0313 
14:24:24.195220 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"201337bedc1dace7fd7574e59c5253e0ff92dacc7211debdde1f18e42c9daac1\": container with ID starting with 201337bedc1dace7fd7574e59c5253e0ff92dacc7211debdde1f18e42c9daac1 not found: ID does not exist" containerID="201337bedc1dace7fd7574e59c5253e0ff92dacc7211debdde1f18e42c9daac1" Mar 13 14:24:24 crc kubenswrapper[4898]: I0313 14:24:24.195256 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"201337bedc1dace7fd7574e59c5253e0ff92dacc7211debdde1f18e42c9daac1"} err="failed to get container status \"201337bedc1dace7fd7574e59c5253e0ff92dacc7211debdde1f18e42c9daac1\": rpc error: code = NotFound desc = could not find container \"201337bedc1dace7fd7574e59c5253e0ff92dacc7211debdde1f18e42c9daac1\": container with ID starting with 201337bedc1dace7fd7574e59c5253e0ff92dacc7211debdde1f18e42c9daac1 not found: ID does not exist" Mar 13 14:24:24 crc kubenswrapper[4898]: I0313 14:24:24.195283 4898 scope.go:117] "RemoveContainer" containerID="b4e6b7a97336f9db7d45fac3ccbf9ee16b3f41f5a6729150379834717d5666ca" Mar 13 14:24:24 crc kubenswrapper[4898]: E0313 14:24:24.195591 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4e6b7a97336f9db7d45fac3ccbf9ee16b3f41f5a6729150379834717d5666ca\": container with ID starting with b4e6b7a97336f9db7d45fac3ccbf9ee16b3f41f5a6729150379834717d5666ca not found: ID does not exist" containerID="b4e6b7a97336f9db7d45fac3ccbf9ee16b3f41f5a6729150379834717d5666ca" Mar 13 14:24:24 crc kubenswrapper[4898]: I0313 14:24:24.195612 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4e6b7a97336f9db7d45fac3ccbf9ee16b3f41f5a6729150379834717d5666ca"} err="failed to get container status \"b4e6b7a97336f9db7d45fac3ccbf9ee16b3f41f5a6729150379834717d5666ca\": rpc 
error: code = NotFound desc = could not find container \"b4e6b7a97336f9db7d45fac3ccbf9ee16b3f41f5a6729150379834717d5666ca\": container with ID starting with b4e6b7a97336f9db7d45fac3ccbf9ee16b3f41f5a6729150379834717d5666ca not found: ID does not exist" Mar 13 14:24:24 crc kubenswrapper[4898]: I0313 14:24:24.195631 4898 scope.go:117] "RemoveContainer" containerID="327e55d62108c057fe7c12b2ee047d6a3b7e188b390c1e80ec12227125ce316c" Mar 13 14:24:24 crc kubenswrapper[4898]: E0313 14:24:24.195876 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"327e55d62108c057fe7c12b2ee047d6a3b7e188b390c1e80ec12227125ce316c\": container with ID starting with 327e55d62108c057fe7c12b2ee047d6a3b7e188b390c1e80ec12227125ce316c not found: ID does not exist" containerID="327e55d62108c057fe7c12b2ee047d6a3b7e188b390c1e80ec12227125ce316c" Mar 13 14:24:24 crc kubenswrapper[4898]: I0313 14:24:24.195911 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"327e55d62108c057fe7c12b2ee047d6a3b7e188b390c1e80ec12227125ce316c"} err="failed to get container status \"327e55d62108c057fe7c12b2ee047d6a3b7e188b390c1e80ec12227125ce316c\": rpc error: code = NotFound desc = could not find container \"327e55d62108c057fe7c12b2ee047d6a3b7e188b390c1e80ec12227125ce316c\": container with ID starting with 327e55d62108c057fe7c12b2ee047d6a3b7e188b390c1e80ec12227125ce316c not found: ID does not exist" Mar 13 14:24:24 crc kubenswrapper[4898]: I0313 14:24:24.360296 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 13 14:24:24 crc kubenswrapper[4898]: I0313 14:24:24.360361 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 13 14:24:25 crc kubenswrapper[4898]: I0313 14:24:25.372052 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" 
podUID="ef7dd576-1005-4fdb-95c1-e5da9f04b177" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.21:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 13 14:24:25 crc kubenswrapper[4898]: I0313 14:24:25.372078 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ef7dd576-1005-4fdb-95c1-e5da9f04b177" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.21:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 13 14:24:25 crc kubenswrapper[4898]: I0313 14:24:25.758356 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c75348dc-b6ff-43ff-bd9a-d84c91f23ea8" path="/var/lib/kubelet/pods/c75348dc-b6ff-43ff-bd9a-d84c91f23ea8/volumes"
Mar 13 14:24:30 crc kubenswrapper[4898]: I0313 14:24:30.187996 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 13 14:24:30 crc kubenswrapper[4898]: I0313 14:24:30.188617 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 13 14:24:32 crc kubenswrapper[4898]: I0313 14:24:32.192372 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Mar 13 14:24:32 crc kubenswrapper[4898]: I0313 14:24:32.194616 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Mar 13 14:24:32 crc kubenswrapper[4898]: I0313 14:24:32.200838 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Mar 13 14:24:32 crc kubenswrapper[4898]: I0313 14:24:32.201882 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Mar 13 14:24:32 crc kubenswrapper[4898]: I0313 14:24:32.360515 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Mar 13 14:24:32 crc kubenswrapper[4898]: I0313 14:24:32.360575 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Mar 13 14:24:34 crc kubenswrapper[4898]: I0313 14:24:34.369630 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Mar 13 14:24:34 crc kubenswrapper[4898]: I0313 14:24:34.372702 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Mar 13 14:24:34 crc kubenswrapper[4898]: I0313 14:24:34.381315 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Mar 13 14:24:34 crc kubenswrapper[4898]: I0313 14:24:34.442929 4898 scope.go:117] "RemoveContainer" containerID="c8c6599b57d68b7830c9784f9ac2322559fa7b500359ca53fa26e39b23292ec4"
Mar 13 14:24:35 crc kubenswrapper[4898]: I0313 14:24:35.208677 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Mar 13 14:24:36 crc kubenswrapper[4898]: I0313 14:24:36.214692 4898 generic.go:334] "Generic (PLEG): container finished" podID="3a036241-2013-494e-8c1f-7584e9af2bf4" containerID="e2cbd875c5a5e2c192552e4d11fe9c7d29be7ff5b49fb690e14597f7135e2adb" exitCode=137
Mar 13 14:24:36 crc kubenswrapper[4898]: I0313 14:24:36.214757 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"3a036241-2013-494e-8c1f-7584e9af2bf4","Type":"ContainerDied","Data":"e2cbd875c5a5e2c192552e4d11fe9c7d29be7ff5b49fb690e14597f7135e2adb"}
Mar 13 14:24:36 crc kubenswrapper[4898]: I0313 14:24:36.739950 4898 scope.go:117] "RemoveContainer" containerID="31ae8593afc62c01205d22940ebe7136407da3a6c051010917ba949bb52866cc"
Mar 13 14:24:36 crc kubenswrapper[4898]: E0313 14:24:36.740627 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d"
Mar 13 14:24:36 crc kubenswrapper[4898]: I0313 14:24:36.810002 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0"
Mar 13 14:24:36 crc kubenswrapper[4898]: I0313 14:24:36.878352 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sw6dd\" (UniqueName: \"kubernetes.io/projected/3a036241-2013-494e-8c1f-7584e9af2bf4-kube-api-access-sw6dd\") pod \"3a036241-2013-494e-8c1f-7584e9af2bf4\" (UID: \"3a036241-2013-494e-8c1f-7584e9af2bf4\") "
Mar 13 14:24:36 crc kubenswrapper[4898]: I0313 14:24:36.878411 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a036241-2013-494e-8c1f-7584e9af2bf4-scripts\") pod \"3a036241-2013-494e-8c1f-7584e9af2bf4\" (UID: \"3a036241-2013-494e-8c1f-7584e9af2bf4\") "
Mar 13 14:24:36 crc kubenswrapper[4898]: I0313 14:24:36.878644 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a036241-2013-494e-8c1f-7584e9af2bf4-config-data\") pod \"3a036241-2013-494e-8c1f-7584e9af2bf4\" (UID: \"3a036241-2013-494e-8c1f-7584e9af2bf4\") "
Mar 13 14:24:36 crc kubenswrapper[4898]: I0313 14:24:36.878752 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a036241-2013-494e-8c1f-7584e9af2bf4-combined-ca-bundle\") pod \"3a036241-2013-494e-8c1f-7584e9af2bf4\" (UID: \"3a036241-2013-494e-8c1f-7584e9af2bf4\") "
Mar 13 14:24:36 crc kubenswrapper[4898]: I0313 14:24:36.889283 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a036241-2013-494e-8c1f-7584e9af2bf4-scripts" (OuterVolumeSpecName: "scripts") pod "3a036241-2013-494e-8c1f-7584e9af2bf4" (UID: "3a036241-2013-494e-8c1f-7584e9af2bf4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:24:36 crc kubenswrapper[4898]: I0313 14:24:36.902156 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a036241-2013-494e-8c1f-7584e9af2bf4-kube-api-access-sw6dd" (OuterVolumeSpecName: "kube-api-access-sw6dd") pod "3a036241-2013-494e-8c1f-7584e9af2bf4" (UID: "3a036241-2013-494e-8c1f-7584e9af2bf4"). InnerVolumeSpecName "kube-api-access-sw6dd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:24:36 crc kubenswrapper[4898]: I0313 14:24:36.982637 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sw6dd\" (UniqueName: \"kubernetes.io/projected/3a036241-2013-494e-8c1f-7584e9af2bf4-kube-api-access-sw6dd\") on node \"crc\" DevicePath \"\""
Mar 13 14:24:36 crc kubenswrapper[4898]: I0313 14:24:36.982678 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a036241-2013-494e-8c1f-7584e9af2bf4-scripts\") on node \"crc\" DevicePath \"\""
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.048023 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a036241-2013-494e-8c1f-7584e9af2bf4-config-data" (OuterVolumeSpecName: "config-data") pod "3a036241-2013-494e-8c1f-7584e9af2bf4" (UID: "3a036241-2013-494e-8c1f-7584e9af2bf4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.086147 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a036241-2013-494e-8c1f-7584e9af2bf4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3a036241-2013-494e-8c1f-7584e9af2bf4" (UID: "3a036241-2013-494e-8c1f-7584e9af2bf4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.087138 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a036241-2013-494e-8c1f-7584e9af2bf4-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.087163 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a036241-2013-494e-8c1f-7584e9af2bf4-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.230077 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"3a036241-2013-494e-8c1f-7584e9af2bf4","Type":"ContainerDied","Data":"ded3b65c989e6a6e858ee713ff395a11604658cb153b63189d68172abd0b0293"}
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.230122 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.230133 4898 scope.go:117] "RemoveContainer" containerID="e2cbd875c5a5e2c192552e4d11fe9c7d29be7ff5b49fb690e14597f7135e2adb"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.262125 4898 scope.go:117] "RemoveContainer" containerID="e1faed09b83d1750f543cd522c4a6bcbb14737623c5d254b367266c190bdbe2c"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.310735 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"]
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.324880 4898 scope.go:117] "RemoveContainer" containerID="9263a833a783d5b9000e728c8a69913160176a9e9c946c16d7fa08425ffcc556"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.325089 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"]
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.339087 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"]
Mar 13 14:24:37 crc kubenswrapper[4898]: E0313 14:24:37.339675 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a036241-2013-494e-8c1f-7584e9af2bf4" containerName="aodh-listener"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.339697 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a036241-2013-494e-8c1f-7584e9af2bf4" containerName="aodh-listener"
Mar 13 14:24:37 crc kubenswrapper[4898]: E0313 14:24:37.339737 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c75348dc-b6ff-43ff-bd9a-d84c91f23ea8" containerName="extract-utilities"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.339747 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="c75348dc-b6ff-43ff-bd9a-d84c91f23ea8" containerName="extract-utilities"
Mar 13 14:24:37 crc kubenswrapper[4898]: E0313 14:24:37.339759 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c75348dc-b6ff-43ff-bd9a-d84c91f23ea8" containerName="registry-server"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.339767 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="c75348dc-b6ff-43ff-bd9a-d84c91f23ea8" containerName="registry-server"
Mar 13 14:24:37 crc kubenswrapper[4898]: E0313 14:24:37.339779 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c75348dc-b6ff-43ff-bd9a-d84c91f23ea8" containerName="extract-content"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.339787 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="c75348dc-b6ff-43ff-bd9a-d84c91f23ea8" containerName="extract-content"
Mar 13 14:24:37 crc kubenswrapper[4898]: E0313 14:24:37.339819 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a036241-2013-494e-8c1f-7584e9af2bf4" containerName="aodh-notifier"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.339827 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a036241-2013-494e-8c1f-7584e9af2bf4" containerName="aodh-notifier"
Mar 13 14:24:37 crc kubenswrapper[4898]: E0313 14:24:37.339839 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a036241-2013-494e-8c1f-7584e9af2bf4" containerName="aodh-api"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.339846 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a036241-2013-494e-8c1f-7584e9af2bf4" containerName="aodh-api"
Mar 13 14:24:37 crc kubenswrapper[4898]: E0313 14:24:37.339866 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a036241-2013-494e-8c1f-7584e9af2bf4" containerName="aodh-evaluator"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.339874 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a036241-2013-494e-8c1f-7584e9af2bf4" containerName="aodh-evaluator"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.340168 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a036241-2013-494e-8c1f-7584e9af2bf4" containerName="aodh-api"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.340182 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a036241-2013-494e-8c1f-7584e9af2bf4" containerName="aodh-listener"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.340215 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="c75348dc-b6ff-43ff-bd9a-d84c91f23ea8" containerName="registry-server"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.340234 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a036241-2013-494e-8c1f-7584e9af2bf4" containerName="aodh-notifier"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.340247 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a036241-2013-494e-8c1f-7584e9af2bf4" containerName="aodh-evaluator"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.345620 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.347814 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.347862 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.348010 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.348043 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-tnpwg"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.347821 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.351085 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"]
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.358869 4898 scope.go:117] "RemoveContainer" containerID="1e2c142eba973e7412047a391872c9d25e5735c5b05576796ed64cb74c786bb5"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.394501 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/88246540-ca61-4fb0-8934-c8ebb4559860-internal-tls-certs\") pod \"aodh-0\" (UID: \"88246540-ca61-4fb0-8934-c8ebb4559860\") " pod="openstack/aodh-0"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.394599 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88246540-ca61-4fb0-8934-c8ebb4559860-scripts\") pod \"aodh-0\" (UID: \"88246540-ca61-4fb0-8934-c8ebb4559860\") " pod="openstack/aodh-0"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.394636 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88246540-ca61-4fb0-8934-c8ebb4559860-config-data\") pod \"aodh-0\" (UID: \"88246540-ca61-4fb0-8934-c8ebb4559860\") " pod="openstack/aodh-0"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.394698 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/88246540-ca61-4fb0-8934-c8ebb4559860-public-tls-certs\") pod \"aodh-0\" (UID: \"88246540-ca61-4fb0-8934-c8ebb4559860\") " pod="openstack/aodh-0"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.394751 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88246540-ca61-4fb0-8934-c8ebb4559860-combined-ca-bundle\") pod \"aodh-0\" (UID: \"88246540-ca61-4fb0-8934-c8ebb4559860\") " pod="openstack/aodh-0"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.394851 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxbj9\" (UniqueName: \"kubernetes.io/projected/88246540-ca61-4fb0-8934-c8ebb4559860-kube-api-access-sxbj9\") pod \"aodh-0\" (UID: \"88246540-ca61-4fb0-8934-c8ebb4559860\") " pod="openstack/aodh-0"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.496833 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88246540-ca61-4fb0-8934-c8ebb4559860-config-data\") pod \"aodh-0\" (UID: \"88246540-ca61-4fb0-8934-c8ebb4559860\") " pod="openstack/aodh-0"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.496955 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/88246540-ca61-4fb0-8934-c8ebb4559860-public-tls-certs\") pod \"aodh-0\" (UID: \"88246540-ca61-4fb0-8934-c8ebb4559860\") " pod="openstack/aodh-0"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.497006 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88246540-ca61-4fb0-8934-c8ebb4559860-combined-ca-bundle\") pod \"aodh-0\" (UID: \"88246540-ca61-4fb0-8934-c8ebb4559860\") " pod="openstack/aodh-0"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.497087 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxbj9\" (UniqueName: \"kubernetes.io/projected/88246540-ca61-4fb0-8934-c8ebb4559860-kube-api-access-sxbj9\") pod \"aodh-0\" (UID: \"88246540-ca61-4fb0-8934-c8ebb4559860\") " pod="openstack/aodh-0"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.497135 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/88246540-ca61-4fb0-8934-c8ebb4559860-internal-tls-certs\") pod \"aodh-0\" (UID: \"88246540-ca61-4fb0-8934-c8ebb4559860\") " pod="openstack/aodh-0"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.497174 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88246540-ca61-4fb0-8934-c8ebb4559860-scripts\") pod \"aodh-0\" (UID: \"88246540-ca61-4fb0-8934-c8ebb4559860\") " pod="openstack/aodh-0"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.510504 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/88246540-ca61-4fb0-8934-c8ebb4559860-public-tls-certs\") pod \"aodh-0\" (UID: \"88246540-ca61-4fb0-8934-c8ebb4559860\") " pod="openstack/aodh-0"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.511019 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88246540-ca61-4fb0-8934-c8ebb4559860-config-data\") pod \"aodh-0\" (UID: \"88246540-ca61-4fb0-8934-c8ebb4559860\") " pod="openstack/aodh-0"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.511431 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88246540-ca61-4fb0-8934-c8ebb4559860-combined-ca-bundle\") pod \"aodh-0\" (UID: \"88246540-ca61-4fb0-8934-c8ebb4559860\") " pod="openstack/aodh-0"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.512875 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxbj9\" (UniqueName: \"kubernetes.io/projected/88246540-ca61-4fb0-8934-c8ebb4559860-kube-api-access-sxbj9\") pod \"aodh-0\" (UID: \"88246540-ca61-4fb0-8934-c8ebb4559860\") " pod="openstack/aodh-0"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.515343 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88246540-ca61-4fb0-8934-c8ebb4559860-scripts\") pod \"aodh-0\" (UID: \"88246540-ca61-4fb0-8934-c8ebb4559860\") " pod="openstack/aodh-0"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.518723 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/88246540-ca61-4fb0-8934-c8ebb4559860-internal-tls-certs\") pod \"aodh-0\" (UID: \"88246540-ca61-4fb0-8934-c8ebb4559860\") " pod="openstack/aodh-0"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.673302 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.759508 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a036241-2013-494e-8c1f-7584e9af2bf4" path="/var/lib/kubelet/pods/3a036241-2013-494e-8c1f-7584e9af2bf4/volumes"
Mar 13 14:24:38 crc kubenswrapper[4898]: W0313 14:24:38.205544 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88246540_ca61_4fb0_8934_c8ebb4559860.slice/crio-4f9554aea31a54e9ad03a3bc5d51fd2b9355c4b2f2434a00fdafabcd84f13b07 WatchSource:0}: Error finding container 4f9554aea31a54e9ad03a3bc5d51fd2b9355c4b2f2434a00fdafabcd84f13b07: Status 404 returned error can't find the container with id 4f9554aea31a54e9ad03a3bc5d51fd2b9355c4b2f2434a00fdafabcd84f13b07
Mar 13 14:24:38 crc kubenswrapper[4898]: I0313 14:24:38.208129 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"]
Mar 13 14:24:38 crc kubenswrapper[4898]: I0313 14:24:38.278840 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"88246540-ca61-4fb0-8934-c8ebb4559860","Type":"ContainerStarted","Data":"4f9554aea31a54e9ad03a3bc5d51fd2b9355c4b2f2434a00fdafabcd84f13b07"}
Mar 13 14:24:40 crc kubenswrapper[4898]: I0313 14:24:40.300404 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"88246540-ca61-4fb0-8934-c8ebb4559860","Type":"ContainerStarted","Data":"a274deee7baf2c38ad5a6692d8a099f9a97dda17b39d57d5a6fb5cd7aca71860"}
Mar 13 14:24:42 crc kubenswrapper[4898]: I0313 14:24:42.335834 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"88246540-ca61-4fb0-8934-c8ebb4559860","Type":"ContainerStarted","Data":"8453994fd2156143da3704e7af63c854727a63f89845c2f4e51b2efe260b622e"}
Mar 13 14:24:43 crc kubenswrapper[4898]: I0313 14:24:43.348716 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"88246540-ca61-4fb0-8934-c8ebb4559860","Type":"ContainerStarted","Data":"c83e4fda188ec43992d3ce1b3047566dea50f419ae2e6389d523891cfdc5bf75"}
Mar 13 14:24:43 crc kubenswrapper[4898]: I0313 14:24:43.436247 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Mar 13 14:24:44 crc kubenswrapper[4898]: I0313 14:24:44.361232 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"88246540-ca61-4fb0-8934-c8ebb4559860","Type":"ContainerStarted","Data":"f1a7b03523d4185dcbadb339dd340f86f0fe7637d1feff68130acfc4930e6831"}
Mar 13 14:24:44 crc kubenswrapper[4898]: I0313 14:24:44.399754 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=1.823534532 podStartE2EDuration="7.399733662s" podCreationTimestamp="2026-03-13 14:24:37 +0000 UTC" firstStartedPulling="2026-03-13 14:24:38.207799518 +0000 UTC m=+1713.209387757" lastFinishedPulling="2026-03-13 14:24:43.783998648 +0000 UTC m=+1718.785586887" observedRunningTime="2026-03-13 14:24:44.382937811 +0000 UTC m=+1719.384526070" watchObservedRunningTime="2026-03-13 14:24:44.399733662 +0000 UTC m=+1719.401321901"
Mar 13 14:24:48 crc kubenswrapper[4898]: I0313 14:24:48.129863 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 13 14:24:48 crc kubenswrapper[4898]: I0313 14:24:48.130574 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="4e010381-921d-4328-9027-ddb9a54a08bd" containerName="kube-state-metrics" containerID="cri-o://71740b094889fce6ef9ef07ec41cbfdf46a3a1807d9a456b2458bac02fa10682" gracePeriod=30
Mar 13 14:24:48 crc kubenswrapper[4898]: I0313 14:24:48.231830 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-0"]
Mar 13 14:24:48 crc kubenswrapper[4898]: I0313 14:24:48.232353 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mysqld-exporter-0" podUID="8c27f029-bffd-4f8f-bb24-c1c9c245d38c" containerName="mysqld-exporter" containerID="cri-o://169bac7a2a87f00863ce79b3524b65b62b7c3fd09f57a0d5d216a587ead2fc00" gracePeriod=30
Mar 13 14:24:48 crc kubenswrapper[4898]: I0313 14:24:48.434482 4898 generic.go:334] "Generic (PLEG): container finished" podID="4e010381-921d-4328-9027-ddb9a54a08bd" containerID="71740b094889fce6ef9ef07ec41cbfdf46a3a1807d9a456b2458bac02fa10682" exitCode=2
Mar 13 14:24:48 crc kubenswrapper[4898]: I0313 14:24:48.434568 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4e010381-921d-4328-9027-ddb9a54a08bd","Type":"ContainerDied","Data":"71740b094889fce6ef9ef07ec41cbfdf46a3a1807d9a456b2458bac02fa10682"}
Mar 13 14:24:48 crc kubenswrapper[4898]: I0313 14:24:48.436526 4898 generic.go:334] "Generic (PLEG): container finished" podID="8c27f029-bffd-4f8f-bb24-c1c9c245d38c" containerID="169bac7a2a87f00863ce79b3524b65b62b7c3fd09f57a0d5d216a587ead2fc00" exitCode=2
Mar 13 14:24:48 crc kubenswrapper[4898]: I0313 14:24:48.436575 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"8c27f029-bffd-4f8f-bb24-c1c9c245d38c","Type":"ContainerDied","Data":"169bac7a2a87f00863ce79b3524b65b62b7c3fd09f57a0d5d216a587ead2fc00"}
Mar 13 14:24:48 crc kubenswrapper[4898]: I0313 14:24:48.659576 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 13 14:24:48 crc kubenswrapper[4898]: I0313 14:24:48.818980 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5p58m\" (UniqueName: \"kubernetes.io/projected/4e010381-921d-4328-9027-ddb9a54a08bd-kube-api-access-5p58m\") pod \"4e010381-921d-4328-9027-ddb9a54a08bd\" (UID: \"4e010381-921d-4328-9027-ddb9a54a08bd\") "
Mar 13 14:24:48 crc kubenswrapper[4898]: I0313 14:24:48.828183 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e010381-921d-4328-9027-ddb9a54a08bd-kube-api-access-5p58m" (OuterVolumeSpecName: "kube-api-access-5p58m") pod "4e010381-921d-4328-9027-ddb9a54a08bd" (UID: "4e010381-921d-4328-9027-ddb9a54a08bd"). InnerVolumeSpecName "kube-api-access-5p58m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:24:48 crc kubenswrapper[4898]: I0313 14:24:48.855937 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0"
Mar 13 14:24:48 crc kubenswrapper[4898]: I0313 14:24:48.923963 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5p58m\" (UniqueName: \"kubernetes.io/projected/4e010381-921d-4328-9027-ddb9a54a08bd-kube-api-access-5p58m\") on node \"crc\" DevicePath \"\""
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.025147 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c27f029-bffd-4f8f-bb24-c1c9c245d38c-config-data\") pod \"8c27f029-bffd-4f8f-bb24-c1c9c245d38c\" (UID: \"8c27f029-bffd-4f8f-bb24-c1c9c245d38c\") "
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.025274 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c27f029-bffd-4f8f-bb24-c1c9c245d38c-combined-ca-bundle\") pod \"8c27f029-bffd-4f8f-bb24-c1c9c245d38c\" (UID: \"8c27f029-bffd-4f8f-bb24-c1c9c245d38c\") "
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.025309 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8775h\" (UniqueName: \"kubernetes.io/projected/8c27f029-bffd-4f8f-bb24-c1c9c245d38c-kube-api-access-8775h\") pod \"8c27f029-bffd-4f8f-bb24-c1c9c245d38c\" (UID: \"8c27f029-bffd-4f8f-bb24-c1c9c245d38c\") "
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.029241 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c27f029-bffd-4f8f-bb24-c1c9c245d38c-kube-api-access-8775h" (OuterVolumeSpecName: "kube-api-access-8775h") pod "8c27f029-bffd-4f8f-bb24-c1c9c245d38c" (UID: "8c27f029-bffd-4f8f-bb24-c1c9c245d38c"). InnerVolumeSpecName "kube-api-access-8775h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.078421 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c27f029-bffd-4f8f-bb24-c1c9c245d38c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8c27f029-bffd-4f8f-bb24-c1c9c245d38c" (UID: "8c27f029-bffd-4f8f-bb24-c1c9c245d38c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.082204 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c27f029-bffd-4f8f-bb24-c1c9c245d38c-config-data" (OuterVolumeSpecName: "config-data") pod "8c27f029-bffd-4f8f-bb24-c1c9c245d38c" (UID: "8c27f029-bffd-4f8f-bb24-c1c9c245d38c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.129053 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c27f029-bffd-4f8f-bb24-c1c9c245d38c-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.129105 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c27f029-bffd-4f8f-bb24-c1c9c245d38c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.129126 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8775h\" (UniqueName: \"kubernetes.io/projected/8c27f029-bffd-4f8f-bb24-c1c9c245d38c-kube-api-access-8775h\") on node \"crc\" DevicePath \"\""
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.451950 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4e010381-921d-4328-9027-ddb9a54a08bd","Type":"ContainerDied","Data":"a514287f4abd02abdb35f5efc576784d286f154281545d6e3b18397fdacfa325"}
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.452002 4898 scope.go:117] "RemoveContainer" containerID="71740b094889fce6ef9ef07ec41cbfdf46a3a1807d9a456b2458bac02fa10682"
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.452052 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.456154 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"8c27f029-bffd-4f8f-bb24-c1c9c245d38c","Type":"ContainerDied","Data":"fe553b08f29dc87c01c836389227794f6bc900596f5a85dd1ed792d64aa19876"}
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.456198 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0"
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.491682 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.509646 4898 scope.go:117] "RemoveContainer" containerID="169bac7a2a87f00863ce79b3524b65b62b7c3fd09f57a0d5d216a587ead2fc00"
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.538984 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.563950 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 13 14:24:49 crc kubenswrapper[4898]: E0313 14:24:49.564422 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e010381-921d-4328-9027-ddb9a54a08bd" containerName="kube-state-metrics"
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.564441 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e010381-921d-4328-9027-ddb9a54a08bd" containerName="kube-state-metrics"
Mar 13 14:24:49 crc kubenswrapper[4898]: E0313 14:24:49.564466 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c27f029-bffd-4f8f-bb24-c1c9c245d38c" containerName="mysqld-exporter"
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.564473 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c27f029-bffd-4f8f-bb24-c1c9c245d38c" containerName="mysqld-exporter"
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.564717 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c27f029-bffd-4f8f-bb24-c1c9c245d38c" containerName="mysqld-exporter"
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.565570 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e010381-921d-4328-9027-ddb9a54a08bd" containerName="kube-state-metrics"
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.566536 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.570483 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc"
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.573713 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config"
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.598870 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-0"]
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.623821 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-0"]
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.640805 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.653317 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-0"]
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.655384 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0"
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.658546 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-config-data"
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.658973 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-mysqld-exporter-svc"
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.664675 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a198c14-e13f-4858-87c4-de6be0fa8d0c-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"8a198c14-e13f-4858-87c4-de6be0fa8d0c\") " pod="openstack/mysqld-exporter-0"
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.664738 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a198c14-e13f-4858-87c4-de6be0fa8d0c-config-data\") pod \"mysqld-exporter-0\" (UID: \"8a198c14-e13f-4858-87c4-de6be0fa8d0c\") " pod="openstack/mysqld-exporter-0"
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.664782 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6w4q\" (UniqueName: \"kubernetes.io/projected/8a198c14-e13f-4858-87c4-de6be0fa8d0c-kube-api-access-b6w4q\") pod \"mysqld-exporter-0\" (UID: \"8a198c14-e13f-4858-87c4-de6be0fa8d0c\") " pod="openstack/mysqld-exporter-0"
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.664884 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7452a36-0169-4cfe-9ede-ef4d0ef072d9-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"b7452a36-0169-4cfe-9ede-ef4d0ef072d9\") " pod="openstack/kube-state-metrics-0"
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.664953 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a198c14-e13f-4858-87c4-de6be0fa8d0c-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"8a198c14-e13f-4858-87c4-de6be0fa8d0c\") " pod="openstack/mysqld-exporter-0"
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.665147 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcjgw\" (UniqueName: \"kubernetes.io/projected/b7452a36-0169-4cfe-9ede-ef4d0ef072d9-kube-api-access-zcjgw\") pod \"kube-state-metrics-0\" (UID: \"b7452a36-0169-4cfe-9ede-ef4d0ef072d9\") " pod="openstack/kube-state-metrics-0"
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.665251 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/b7452a36-0169-4cfe-9ede-ef4d0ef072d9-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"b7452a36-0169-4cfe-9ede-ef4d0ef072d9\") " pod="openstack/kube-state-metrics-0"
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.665275 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7452a36-0169-4cfe-9ede-ef4d0ef072d9-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"b7452a36-0169-4cfe-9ede-ef4d0ef072d9\") " pod="openstack/kube-state-metrics-0"
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.670225 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Mar 13
14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.751208 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e010381-921d-4328-9027-ddb9a54a08bd" path="/var/lib/kubelet/pods/4e010381-921d-4328-9027-ddb9a54a08bd/volumes" Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.751779 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c27f029-bffd-4f8f-bb24-c1c9c245d38c" path="/var/lib/kubelet/pods/8c27f029-bffd-4f8f-bb24-c1c9c245d38c/volumes" Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.767594 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a198c14-e13f-4858-87c4-de6be0fa8d0c-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"8a198c14-e13f-4858-87c4-de6be0fa8d0c\") " pod="openstack/mysqld-exporter-0" Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.767663 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a198c14-e13f-4858-87c4-de6be0fa8d0c-config-data\") pod \"mysqld-exporter-0\" (UID: \"8a198c14-e13f-4858-87c4-de6be0fa8d0c\") " pod="openstack/mysqld-exporter-0" Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.767702 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6w4q\" (UniqueName: \"kubernetes.io/projected/8a198c14-e13f-4858-87c4-de6be0fa8d0c-kube-api-access-b6w4q\") pod \"mysqld-exporter-0\" (UID: \"8a198c14-e13f-4858-87c4-de6be0fa8d0c\") " pod="openstack/mysqld-exporter-0" Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.767773 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7452a36-0169-4cfe-9ede-ef4d0ef072d9-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"b7452a36-0169-4cfe-9ede-ef4d0ef072d9\") " 
pod="openstack/kube-state-metrics-0" Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.767808 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a198c14-e13f-4858-87c4-de6be0fa8d0c-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"8a198c14-e13f-4858-87c4-de6be0fa8d0c\") " pod="openstack/mysqld-exporter-0" Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.767876 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcjgw\" (UniqueName: \"kubernetes.io/projected/b7452a36-0169-4cfe-9ede-ef4d0ef072d9-kube-api-access-zcjgw\") pod \"kube-state-metrics-0\" (UID: \"b7452a36-0169-4cfe-9ede-ef4d0ef072d9\") " pod="openstack/kube-state-metrics-0" Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.767949 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/b7452a36-0169-4cfe-9ede-ef4d0ef072d9-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"b7452a36-0169-4cfe-9ede-ef4d0ef072d9\") " pod="openstack/kube-state-metrics-0" Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.767980 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7452a36-0169-4cfe-9ede-ef4d0ef072d9-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"b7452a36-0169-4cfe-9ede-ef4d0ef072d9\") " pod="openstack/kube-state-metrics-0" Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.773303 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/b7452a36-0169-4cfe-9ede-ef4d0ef072d9-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"b7452a36-0169-4cfe-9ede-ef4d0ef072d9\") " pod="openstack/kube-state-metrics-0" Mar 13 14:24:49 crc 
kubenswrapper[4898]: I0313 14:24:49.773393 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7452a36-0169-4cfe-9ede-ef4d0ef072d9-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"b7452a36-0169-4cfe-9ede-ef4d0ef072d9\") " pod="openstack/kube-state-metrics-0" Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.773521 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a198c14-e13f-4858-87c4-de6be0fa8d0c-config-data\") pod \"mysqld-exporter-0\" (UID: \"8a198c14-e13f-4858-87c4-de6be0fa8d0c\") " pod="openstack/mysqld-exporter-0" Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.773521 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a198c14-e13f-4858-87c4-de6be0fa8d0c-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"8a198c14-e13f-4858-87c4-de6be0fa8d0c\") " pod="openstack/mysqld-exporter-0" Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.773588 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a198c14-e13f-4858-87c4-de6be0fa8d0c-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"8a198c14-e13f-4858-87c4-de6be0fa8d0c\") " pod="openstack/mysqld-exporter-0" Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.774124 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7452a36-0169-4cfe-9ede-ef4d0ef072d9-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"b7452a36-0169-4cfe-9ede-ef4d0ef072d9\") " pod="openstack/kube-state-metrics-0" Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.785821 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcjgw\" 
(UniqueName: \"kubernetes.io/projected/b7452a36-0169-4cfe-9ede-ef4d0ef072d9-kube-api-access-zcjgw\") pod \"kube-state-metrics-0\" (UID: \"b7452a36-0169-4cfe-9ede-ef4d0ef072d9\") " pod="openstack/kube-state-metrics-0" Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.787078 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6w4q\" (UniqueName: \"kubernetes.io/projected/8a198c14-e13f-4858-87c4-de6be0fa8d0c-kube-api-access-b6w4q\") pod \"mysqld-exporter-0\" (UID: \"8a198c14-e13f-4858-87c4-de6be0fa8d0c\") " pod="openstack/mysqld-exporter-0" Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.911399 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.976368 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Mar 13 14:24:50 crc kubenswrapper[4898]: I0313 14:24:50.260868 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:24:50 crc kubenswrapper[4898]: I0313 14:24:50.261463 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0d56db73-0e9e-47af-b0bd-77231fe40077" containerName="ceilometer-central-agent" containerID="cri-o://f93c4099691f41a073e964731ace4e3b38e62bd41c90cb8a30591394175252ce" gracePeriod=30 Mar 13 14:24:50 crc kubenswrapper[4898]: I0313 14:24:50.262040 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0d56db73-0e9e-47af-b0bd-77231fe40077" containerName="proxy-httpd" containerID="cri-o://a88dfb7f27e61d13c4de942fec09301aec287d6967f6ca991e690f9c9c77a8e1" gracePeriod=30 Mar 13 14:24:50 crc kubenswrapper[4898]: I0313 14:24:50.262075 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0d56db73-0e9e-47af-b0bd-77231fe40077" 
containerName="ceilometer-notification-agent" containerID="cri-o://72a159d4ac7d8e712d9bffb75add791bdb6f8bee323ccb72d06431a42be68d77" gracePeriod=30 Mar 13 14:24:50 crc kubenswrapper[4898]: I0313 14:24:50.262059 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0d56db73-0e9e-47af-b0bd-77231fe40077" containerName="sg-core" containerID="cri-o://f40862770a13b51268c0d1e7b9c2896a6335c6f3b3801074579c313c3d13577e" gracePeriod=30 Mar 13 14:24:50 crc kubenswrapper[4898]: I0313 14:24:50.444798 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 13 14:24:50 crc kubenswrapper[4898]: I0313 14:24:50.577830 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"b7452a36-0169-4cfe-9ede-ef4d0ef072d9","Type":"ContainerStarted","Data":"55f7eb4777f7efaa80315786f8c6ff46779e1a725ad27fb01a0326943ceba0cb"} Mar 13 14:24:50 crc kubenswrapper[4898]: I0313 14:24:50.632284 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Mar 13 14:24:50 crc kubenswrapper[4898]: I0313 14:24:50.660594 4898 generic.go:334] "Generic (PLEG): container finished" podID="0d56db73-0e9e-47af-b0bd-77231fe40077" containerID="a88dfb7f27e61d13c4de942fec09301aec287d6967f6ca991e690f9c9c77a8e1" exitCode=0 Mar 13 14:24:50 crc kubenswrapper[4898]: I0313 14:24:50.660641 4898 generic.go:334] "Generic (PLEG): container finished" podID="0d56db73-0e9e-47af-b0bd-77231fe40077" containerID="f40862770a13b51268c0d1e7b9c2896a6335c6f3b3801074579c313c3d13577e" exitCode=2 Mar 13 14:24:50 crc kubenswrapper[4898]: I0313 14:24:50.660663 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0d56db73-0e9e-47af-b0bd-77231fe40077","Type":"ContainerDied","Data":"a88dfb7f27e61d13c4de942fec09301aec287d6967f6ca991e690f9c9c77a8e1"} Mar 13 14:24:50 crc kubenswrapper[4898]: I0313 14:24:50.660688 4898 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0d56db73-0e9e-47af-b0bd-77231fe40077","Type":"ContainerDied","Data":"f40862770a13b51268c0d1e7b9c2896a6335c6f3b3801074579c313c3d13577e"} Mar 13 14:24:51 crc kubenswrapper[4898]: I0313 14:24:51.679414 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"8a198c14-e13f-4858-87c4-de6be0fa8d0c","Type":"ContainerStarted","Data":"27a74493bbe48a66536d1e7cb863f2ab01d0775f8744b183e0e5dbb10c8028c3"} Mar 13 14:24:51 crc kubenswrapper[4898]: I0313 14:24:51.679980 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"8a198c14-e13f-4858-87c4-de6be0fa8d0c","Type":"ContainerStarted","Data":"e5cbcba754d8ed9be93f24af64837c85b3284b2874ba52299b17de05cb9bfbbf"} Mar 13 14:24:51 crc kubenswrapper[4898]: I0313 14:24:51.705719 4898 generic.go:334] "Generic (PLEG): container finished" podID="0d56db73-0e9e-47af-b0bd-77231fe40077" containerID="f93c4099691f41a073e964731ace4e3b38e62bd41c90cb8a30591394175252ce" exitCode=0 Mar 13 14:24:51 crc kubenswrapper[4898]: I0313 14:24:51.705773 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0d56db73-0e9e-47af-b0bd-77231fe40077","Type":"ContainerDied","Data":"f93c4099691f41a073e964731ace4e3b38e62bd41c90cb8a30591394175252ce"} Mar 13 14:24:51 crc kubenswrapper[4898]: I0313 14:24:51.712820 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-0" podStartSLOduration=2.165032711 podStartE2EDuration="2.712793288s" podCreationTimestamp="2026-03-13 14:24:49 +0000 UTC" firstStartedPulling="2026-03-13 14:24:50.658528516 +0000 UTC m=+1725.660116755" lastFinishedPulling="2026-03-13 14:24:51.206289093 +0000 UTC m=+1726.207877332" observedRunningTime="2026-03-13 14:24:51.700195057 +0000 UTC m=+1726.701783306" watchObservedRunningTime="2026-03-13 14:24:51.712793288 +0000 UTC m=+1726.714381557" Mar 13 
14:24:51 crc kubenswrapper[4898]: I0313 14:24:51.742660 4898 scope.go:117] "RemoveContainer" containerID="31ae8593afc62c01205d22940ebe7136407da3a6c051010917ba949bb52866cc" Mar 13 14:24:51 crc kubenswrapper[4898]: E0313 14:24:51.742990 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:24:52 crc kubenswrapper[4898]: I0313 14:24:52.721107 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"b7452a36-0169-4cfe-9ede-ef4d0ef072d9","Type":"ContainerStarted","Data":"b4de4756fb05eb0a0367dd16335fa2c88e7c0f23e270a7655f16e0624156257d"} Mar 13 14:24:52 crc kubenswrapper[4898]: I0313 14:24:52.721530 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 13 14:24:52 crc kubenswrapper[4898]: I0313 14:24:52.729413 4898 generic.go:334] "Generic (PLEG): container finished" podID="0d56db73-0e9e-47af-b0bd-77231fe40077" containerID="72a159d4ac7d8e712d9bffb75add791bdb6f8bee323ccb72d06431a42be68d77" exitCode=0 Mar 13 14:24:52 crc kubenswrapper[4898]: I0313 14:24:52.729680 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0d56db73-0e9e-47af-b0bd-77231fe40077","Type":"ContainerDied","Data":"72a159d4ac7d8e712d9bffb75add791bdb6f8bee323ccb72d06431a42be68d77"} Mar 13 14:24:52 crc kubenswrapper[4898]: I0313 14:24:52.729725 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0d56db73-0e9e-47af-b0bd-77231fe40077","Type":"ContainerDied","Data":"a79c1dd41b2637198b516933e99a0d19e49eb0f06da00b903e6a8bd5f7dcb1dc"} Mar 13 
14:24:52 crc kubenswrapper[4898]: I0313 14:24:52.729738 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a79c1dd41b2637198b516933e99a0d19e49eb0f06da00b903e6a8bd5f7dcb1dc" Mar 13 14:24:52 crc kubenswrapper[4898]: I0313 14:24:52.771475 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:24:52 crc kubenswrapper[4898]: I0313 14:24:52.789728 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.435414337 podStartE2EDuration="3.789704836s" podCreationTimestamp="2026-03-13 14:24:49 +0000 UTC" firstStartedPulling="2026-03-13 14:24:50.496189958 +0000 UTC m=+1725.497778197" lastFinishedPulling="2026-03-13 14:24:51.850480457 +0000 UTC m=+1726.852068696" observedRunningTime="2026-03-13 14:24:52.741368456 +0000 UTC m=+1727.742956695" watchObservedRunningTime="2026-03-13 14:24:52.789704836 +0000 UTC m=+1727.791293095" Mar 13 14:24:52 crc kubenswrapper[4898]: I0313 14:24:52.903881 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0d56db73-0e9e-47af-b0bd-77231fe40077-run-httpd\") pod \"0d56db73-0e9e-47af-b0bd-77231fe40077\" (UID: \"0d56db73-0e9e-47af-b0bd-77231fe40077\") " Mar 13 14:24:52 crc kubenswrapper[4898]: I0313 14:24:52.904087 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d56db73-0e9e-47af-b0bd-77231fe40077-scripts\") pod \"0d56db73-0e9e-47af-b0bd-77231fe40077\" (UID: \"0d56db73-0e9e-47af-b0bd-77231fe40077\") " Mar 13 14:24:52 crc kubenswrapper[4898]: I0313 14:24:52.904156 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d56db73-0e9e-47af-b0bd-77231fe40077-combined-ca-bundle\") pod \"0d56db73-0e9e-47af-b0bd-77231fe40077\" (UID: 
\"0d56db73-0e9e-47af-b0bd-77231fe40077\") " Mar 13 14:24:52 crc kubenswrapper[4898]: I0313 14:24:52.904233 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d56db73-0e9e-47af-b0bd-77231fe40077-config-data\") pod \"0d56db73-0e9e-47af-b0bd-77231fe40077\" (UID: \"0d56db73-0e9e-47af-b0bd-77231fe40077\") " Mar 13 14:24:52 crc kubenswrapper[4898]: I0313 14:24:52.904386 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk9pt\" (UniqueName: \"kubernetes.io/projected/0d56db73-0e9e-47af-b0bd-77231fe40077-kube-api-access-tk9pt\") pod \"0d56db73-0e9e-47af-b0bd-77231fe40077\" (UID: \"0d56db73-0e9e-47af-b0bd-77231fe40077\") " Mar 13 14:24:52 crc kubenswrapper[4898]: I0313 14:24:52.904449 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0d56db73-0e9e-47af-b0bd-77231fe40077-log-httpd\") pod \"0d56db73-0e9e-47af-b0bd-77231fe40077\" (UID: \"0d56db73-0e9e-47af-b0bd-77231fe40077\") " Mar 13 14:24:52 crc kubenswrapper[4898]: I0313 14:24:52.904568 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d56db73-0e9e-47af-b0bd-77231fe40077-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0d56db73-0e9e-47af-b0bd-77231fe40077" (UID: "0d56db73-0e9e-47af-b0bd-77231fe40077"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:24:52 crc kubenswrapper[4898]: I0313 14:24:52.904605 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0d56db73-0e9e-47af-b0bd-77231fe40077-sg-core-conf-yaml\") pod \"0d56db73-0e9e-47af-b0bd-77231fe40077\" (UID: \"0d56db73-0e9e-47af-b0bd-77231fe40077\") " Mar 13 14:24:52 crc kubenswrapper[4898]: I0313 14:24:52.904851 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d56db73-0e9e-47af-b0bd-77231fe40077-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0d56db73-0e9e-47af-b0bd-77231fe40077" (UID: "0d56db73-0e9e-47af-b0bd-77231fe40077"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:24:52 crc kubenswrapper[4898]: I0313 14:24:52.909758 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d56db73-0e9e-47af-b0bd-77231fe40077-scripts" (OuterVolumeSpecName: "scripts") pod "0d56db73-0e9e-47af-b0bd-77231fe40077" (UID: "0d56db73-0e9e-47af-b0bd-77231fe40077"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:24:52 crc kubenswrapper[4898]: I0313 14:24:52.912236 4898 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0d56db73-0e9e-47af-b0bd-77231fe40077-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 14:24:52 crc kubenswrapper[4898]: I0313 14:24:52.912263 4898 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0d56db73-0e9e-47af-b0bd-77231fe40077-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 14:24:52 crc kubenswrapper[4898]: I0313 14:24:52.912272 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d56db73-0e9e-47af-b0bd-77231fe40077-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:24:52 crc kubenswrapper[4898]: I0313 14:24:52.918289 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d56db73-0e9e-47af-b0bd-77231fe40077-kube-api-access-tk9pt" (OuterVolumeSpecName: "kube-api-access-tk9pt") pod "0d56db73-0e9e-47af-b0bd-77231fe40077" (UID: "0d56db73-0e9e-47af-b0bd-77231fe40077"). InnerVolumeSpecName "kube-api-access-tk9pt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:24:52 crc kubenswrapper[4898]: I0313 14:24:52.942892 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d56db73-0e9e-47af-b0bd-77231fe40077-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0d56db73-0e9e-47af-b0bd-77231fe40077" (UID: "0d56db73-0e9e-47af-b0bd-77231fe40077"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:24:53 crc kubenswrapper[4898]: I0313 14:24:53.013229 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d56db73-0e9e-47af-b0bd-77231fe40077-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0d56db73-0e9e-47af-b0bd-77231fe40077" (UID: "0d56db73-0e9e-47af-b0bd-77231fe40077"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:24:53 crc kubenswrapper[4898]: I0313 14:24:53.014002 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d56db73-0e9e-47af-b0bd-77231fe40077-combined-ca-bundle\") pod \"0d56db73-0e9e-47af-b0bd-77231fe40077\" (UID: \"0d56db73-0e9e-47af-b0bd-77231fe40077\") " Mar 13 14:24:53 crc kubenswrapper[4898]: W0313 14:24:53.014135 4898 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/0d56db73-0e9e-47af-b0bd-77231fe40077/volumes/kubernetes.io~secret/combined-ca-bundle Mar 13 14:24:53 crc kubenswrapper[4898]: I0313 14:24:53.014152 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d56db73-0e9e-47af-b0bd-77231fe40077-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0d56db73-0e9e-47af-b0bd-77231fe40077" (UID: "0d56db73-0e9e-47af-b0bd-77231fe40077"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:24:53 crc kubenswrapper[4898]: I0313 14:24:53.014706 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d56db73-0e9e-47af-b0bd-77231fe40077-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:24:53 crc kubenswrapper[4898]: I0313 14:24:53.014726 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk9pt\" (UniqueName: \"kubernetes.io/projected/0d56db73-0e9e-47af-b0bd-77231fe40077-kube-api-access-tk9pt\") on node \"crc\" DevicePath \"\"" Mar 13 14:24:53 crc kubenswrapper[4898]: I0313 14:24:53.014739 4898 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0d56db73-0e9e-47af-b0bd-77231fe40077-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 13 14:24:53 crc kubenswrapper[4898]: I0313 14:24:53.032076 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d56db73-0e9e-47af-b0bd-77231fe40077-config-data" (OuterVolumeSpecName: "config-data") pod "0d56db73-0e9e-47af-b0bd-77231fe40077" (UID: "0d56db73-0e9e-47af-b0bd-77231fe40077"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:24:53 crc kubenswrapper[4898]: I0313 14:24:53.117072 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d56db73-0e9e-47af-b0bd-77231fe40077-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:24:53 crc kubenswrapper[4898]: I0313 14:24:53.739164 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:24:53 crc kubenswrapper[4898]: I0313 14:24:53.775829 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:24:53 crc kubenswrapper[4898]: I0313 14:24:53.787695 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:24:53 crc kubenswrapper[4898]: I0313 14:24:53.804463 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:24:53 crc kubenswrapper[4898]: E0313 14:24:53.805281 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d56db73-0e9e-47af-b0bd-77231fe40077" containerName="sg-core" Mar 13 14:24:53 crc kubenswrapper[4898]: I0313 14:24:53.805381 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d56db73-0e9e-47af-b0bd-77231fe40077" containerName="sg-core" Mar 13 14:24:53 crc kubenswrapper[4898]: E0313 14:24:53.805460 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d56db73-0e9e-47af-b0bd-77231fe40077" containerName="ceilometer-notification-agent" Mar 13 14:24:53 crc kubenswrapper[4898]: I0313 14:24:53.805512 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d56db73-0e9e-47af-b0bd-77231fe40077" containerName="ceilometer-notification-agent" Mar 13 14:24:53 crc kubenswrapper[4898]: E0313 14:24:53.805612 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d56db73-0e9e-47af-b0bd-77231fe40077" containerName="proxy-httpd" Mar 13 14:24:53 crc kubenswrapper[4898]: I0313 14:24:53.805670 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d56db73-0e9e-47af-b0bd-77231fe40077" containerName="proxy-httpd" Mar 13 14:24:53 crc kubenswrapper[4898]: E0313 14:24:53.805739 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d56db73-0e9e-47af-b0bd-77231fe40077" containerName="ceilometer-central-agent" Mar 13 14:24:53 crc kubenswrapper[4898]: I0313 14:24:53.805800 4898 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="0d56db73-0e9e-47af-b0bd-77231fe40077" containerName="ceilometer-central-agent" Mar 13 14:24:53 crc kubenswrapper[4898]: I0313 14:24:53.806093 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d56db73-0e9e-47af-b0bd-77231fe40077" containerName="proxy-httpd" Mar 13 14:24:53 crc kubenswrapper[4898]: I0313 14:24:53.806213 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d56db73-0e9e-47af-b0bd-77231fe40077" containerName="ceilometer-notification-agent" Mar 13 14:24:53 crc kubenswrapper[4898]: I0313 14:24:53.806290 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d56db73-0e9e-47af-b0bd-77231fe40077" containerName="ceilometer-central-agent" Mar 13 14:24:53 crc kubenswrapper[4898]: I0313 14:24:53.806351 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d56db73-0e9e-47af-b0bd-77231fe40077" containerName="sg-core" Mar 13 14:24:53 crc kubenswrapper[4898]: I0313 14:24:53.808630 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:24:53 crc kubenswrapper[4898]: I0313 14:24:53.813164 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 14:24:53 crc kubenswrapper[4898]: I0313 14:24:53.813549 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 14:24:53 crc kubenswrapper[4898]: I0313 14:24:53.813873 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 13 14:24:53 crc kubenswrapper[4898]: I0313 14:24:53.815334 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:24:53 crc kubenswrapper[4898]: I0313 14:24:53.936602 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0ac06d2-e2ea-4b4a-8201-83494b53b968-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b0ac06d2-e2ea-4b4a-8201-83494b53b968\") " pod="openstack/ceilometer-0" Mar 13 14:24:53 crc kubenswrapper[4898]: I0313 14:24:53.937089 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0ac06d2-e2ea-4b4a-8201-83494b53b968-log-httpd\") pod \"ceilometer-0\" (UID: \"b0ac06d2-e2ea-4b4a-8201-83494b53b968\") " pod="openstack/ceilometer-0" Mar 13 14:24:53 crc kubenswrapper[4898]: I0313 14:24:53.937349 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0ac06d2-e2ea-4b4a-8201-83494b53b968-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b0ac06d2-e2ea-4b4a-8201-83494b53b968\") " pod="openstack/ceilometer-0" Mar 13 14:24:53 crc kubenswrapper[4898]: I0313 14:24:53.937516 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/b0ac06d2-e2ea-4b4a-8201-83494b53b968-config-data\") pod \"ceilometer-0\" (UID: \"b0ac06d2-e2ea-4b4a-8201-83494b53b968\") " pod="openstack/ceilometer-0" Mar 13 14:24:53 crc kubenswrapper[4898]: I0313 14:24:53.937817 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b0ac06d2-e2ea-4b4a-8201-83494b53b968-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b0ac06d2-e2ea-4b4a-8201-83494b53b968\") " pod="openstack/ceilometer-0" Mar 13 14:24:53 crc kubenswrapper[4898]: I0313 14:24:53.938053 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0ac06d2-e2ea-4b4a-8201-83494b53b968-scripts\") pod \"ceilometer-0\" (UID: \"b0ac06d2-e2ea-4b4a-8201-83494b53b968\") " pod="openstack/ceilometer-0" Mar 13 14:24:53 crc kubenswrapper[4898]: I0313 14:24:53.938262 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlm6p\" (UniqueName: \"kubernetes.io/projected/b0ac06d2-e2ea-4b4a-8201-83494b53b968-kube-api-access-xlm6p\") pod \"ceilometer-0\" (UID: \"b0ac06d2-e2ea-4b4a-8201-83494b53b968\") " pod="openstack/ceilometer-0" Mar 13 14:24:53 crc kubenswrapper[4898]: I0313 14:24:53.938451 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0ac06d2-e2ea-4b4a-8201-83494b53b968-run-httpd\") pod \"ceilometer-0\" (UID: \"b0ac06d2-e2ea-4b4a-8201-83494b53b968\") " pod="openstack/ceilometer-0" Mar 13 14:24:54 crc kubenswrapper[4898]: I0313 14:24:54.041104 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0ac06d2-e2ea-4b4a-8201-83494b53b968-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: 
\"b0ac06d2-e2ea-4b4a-8201-83494b53b968\") " pod="openstack/ceilometer-0" Mar 13 14:24:54 crc kubenswrapper[4898]: I0313 14:24:54.041206 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0ac06d2-e2ea-4b4a-8201-83494b53b968-log-httpd\") pod \"ceilometer-0\" (UID: \"b0ac06d2-e2ea-4b4a-8201-83494b53b968\") " pod="openstack/ceilometer-0" Mar 13 14:24:54 crc kubenswrapper[4898]: I0313 14:24:54.041248 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0ac06d2-e2ea-4b4a-8201-83494b53b968-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b0ac06d2-e2ea-4b4a-8201-83494b53b968\") " pod="openstack/ceilometer-0" Mar 13 14:24:54 crc kubenswrapper[4898]: I0313 14:24:54.041271 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0ac06d2-e2ea-4b4a-8201-83494b53b968-config-data\") pod \"ceilometer-0\" (UID: \"b0ac06d2-e2ea-4b4a-8201-83494b53b968\") " pod="openstack/ceilometer-0" Mar 13 14:24:54 crc kubenswrapper[4898]: I0313 14:24:54.041415 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b0ac06d2-e2ea-4b4a-8201-83494b53b968-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b0ac06d2-e2ea-4b4a-8201-83494b53b968\") " pod="openstack/ceilometer-0" Mar 13 14:24:54 crc kubenswrapper[4898]: I0313 14:24:54.041478 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0ac06d2-e2ea-4b4a-8201-83494b53b968-scripts\") pod \"ceilometer-0\" (UID: \"b0ac06d2-e2ea-4b4a-8201-83494b53b968\") " pod="openstack/ceilometer-0" Mar 13 14:24:54 crc kubenswrapper[4898]: I0313 14:24:54.041530 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlm6p\" 
(UniqueName: \"kubernetes.io/projected/b0ac06d2-e2ea-4b4a-8201-83494b53b968-kube-api-access-xlm6p\") pod \"ceilometer-0\" (UID: \"b0ac06d2-e2ea-4b4a-8201-83494b53b968\") " pod="openstack/ceilometer-0" Mar 13 14:24:54 crc kubenswrapper[4898]: I0313 14:24:54.041620 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0ac06d2-e2ea-4b4a-8201-83494b53b968-run-httpd\") pod \"ceilometer-0\" (UID: \"b0ac06d2-e2ea-4b4a-8201-83494b53b968\") " pod="openstack/ceilometer-0" Mar 13 14:24:54 crc kubenswrapper[4898]: I0313 14:24:54.042103 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0ac06d2-e2ea-4b4a-8201-83494b53b968-run-httpd\") pod \"ceilometer-0\" (UID: \"b0ac06d2-e2ea-4b4a-8201-83494b53b968\") " pod="openstack/ceilometer-0" Mar 13 14:24:54 crc kubenswrapper[4898]: I0313 14:24:54.042433 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0ac06d2-e2ea-4b4a-8201-83494b53b968-log-httpd\") pod \"ceilometer-0\" (UID: \"b0ac06d2-e2ea-4b4a-8201-83494b53b968\") " pod="openstack/ceilometer-0" Mar 13 14:24:54 crc kubenswrapper[4898]: I0313 14:24:54.048537 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0ac06d2-e2ea-4b4a-8201-83494b53b968-config-data\") pod \"ceilometer-0\" (UID: \"b0ac06d2-e2ea-4b4a-8201-83494b53b968\") " pod="openstack/ceilometer-0" Mar 13 14:24:54 crc kubenswrapper[4898]: I0313 14:24:54.048690 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0ac06d2-e2ea-4b4a-8201-83494b53b968-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b0ac06d2-e2ea-4b4a-8201-83494b53b968\") " pod="openstack/ceilometer-0" Mar 13 14:24:54 crc kubenswrapper[4898]: I0313 14:24:54.048563 4898 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0ac06d2-e2ea-4b4a-8201-83494b53b968-scripts\") pod \"ceilometer-0\" (UID: \"b0ac06d2-e2ea-4b4a-8201-83494b53b968\") " pod="openstack/ceilometer-0" Mar 13 14:24:54 crc kubenswrapper[4898]: I0313 14:24:54.055637 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b0ac06d2-e2ea-4b4a-8201-83494b53b968-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b0ac06d2-e2ea-4b4a-8201-83494b53b968\") " pod="openstack/ceilometer-0" Mar 13 14:24:54 crc kubenswrapper[4898]: I0313 14:24:54.057832 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0ac06d2-e2ea-4b4a-8201-83494b53b968-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b0ac06d2-e2ea-4b4a-8201-83494b53b968\") " pod="openstack/ceilometer-0" Mar 13 14:24:54 crc kubenswrapper[4898]: I0313 14:24:54.059211 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlm6p\" (UniqueName: \"kubernetes.io/projected/b0ac06d2-e2ea-4b4a-8201-83494b53b968-kube-api-access-xlm6p\") pod \"ceilometer-0\" (UID: \"b0ac06d2-e2ea-4b4a-8201-83494b53b968\") " pod="openstack/ceilometer-0" Mar 13 14:24:54 crc kubenswrapper[4898]: I0313 14:24:54.134192 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:24:54 crc kubenswrapper[4898]: I0313 14:24:54.694435 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:24:54 crc kubenswrapper[4898]: I0313 14:24:54.754612 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0ac06d2-e2ea-4b4a-8201-83494b53b968","Type":"ContainerStarted","Data":"efc08f6e1f1c1bae444792fe7fa9bd5076b4f986e80e97128cc5f1ec8235c524"} Mar 13 14:24:55 crc kubenswrapper[4898]: I0313 14:24:55.763321 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d56db73-0e9e-47af-b0bd-77231fe40077" path="/var/lib/kubelet/pods/0d56db73-0e9e-47af-b0bd-77231fe40077/volumes" Mar 13 14:24:55 crc kubenswrapper[4898]: I0313 14:24:55.771688 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0ac06d2-e2ea-4b4a-8201-83494b53b968","Type":"ContainerStarted","Data":"dbe55f5873c440c467d8748cfaa995fee6ccd0abf9441bf1f70ed0dda90073d3"} Mar 13 14:24:56 crc kubenswrapper[4898]: I0313 14:24:56.785689 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0ac06d2-e2ea-4b4a-8201-83494b53b968","Type":"ContainerStarted","Data":"c66d5e607033edc265c5c4c3b44b5d453515d5500b6db940b367950853043279"} Mar 13 14:24:57 crc kubenswrapper[4898]: I0313 14:24:57.413484 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-zgt75"] Mar 13 14:24:57 crc kubenswrapper[4898]: I0313 14:24:57.424429 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-zgt75"] Mar 13 14:24:57 crc kubenswrapper[4898]: I0313 14:24:57.492518 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-kxtcf"] Mar 13 14:24:57 crc kubenswrapper[4898]: I0313 14:24:57.494119 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-kxtcf" Mar 13 14:24:57 crc kubenswrapper[4898]: I0313 14:24:57.512217 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-kxtcf"] Mar 13 14:24:57 crc kubenswrapper[4898]: I0313 14:24:57.626803 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cd78a2a-1bb4-461a-92cd-d705080b087a-combined-ca-bundle\") pod \"heat-db-sync-kxtcf\" (UID: \"2cd78a2a-1bb4-461a-92cd-d705080b087a\") " pod="openstack/heat-db-sync-kxtcf" Mar 13 14:24:57 crc kubenswrapper[4898]: I0313 14:24:57.626875 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z25rd\" (UniqueName: \"kubernetes.io/projected/2cd78a2a-1bb4-461a-92cd-d705080b087a-kube-api-access-z25rd\") pod \"heat-db-sync-kxtcf\" (UID: \"2cd78a2a-1bb4-461a-92cd-d705080b087a\") " pod="openstack/heat-db-sync-kxtcf" Mar 13 14:24:57 crc kubenswrapper[4898]: I0313 14:24:57.626920 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cd78a2a-1bb4-461a-92cd-d705080b087a-config-data\") pod \"heat-db-sync-kxtcf\" (UID: \"2cd78a2a-1bb4-461a-92cd-d705080b087a\") " pod="openstack/heat-db-sync-kxtcf" Mar 13 14:24:57 crc kubenswrapper[4898]: I0313 14:24:57.729039 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cd78a2a-1bb4-461a-92cd-d705080b087a-combined-ca-bundle\") pod \"heat-db-sync-kxtcf\" (UID: \"2cd78a2a-1bb4-461a-92cd-d705080b087a\") " pod="openstack/heat-db-sync-kxtcf" Mar 13 14:24:57 crc kubenswrapper[4898]: I0313 14:24:57.729145 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z25rd\" (UniqueName: 
\"kubernetes.io/projected/2cd78a2a-1bb4-461a-92cd-d705080b087a-kube-api-access-z25rd\") pod \"heat-db-sync-kxtcf\" (UID: \"2cd78a2a-1bb4-461a-92cd-d705080b087a\") " pod="openstack/heat-db-sync-kxtcf" Mar 13 14:24:57 crc kubenswrapper[4898]: I0313 14:24:57.729192 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cd78a2a-1bb4-461a-92cd-d705080b087a-config-data\") pod \"heat-db-sync-kxtcf\" (UID: \"2cd78a2a-1bb4-461a-92cd-d705080b087a\") " pod="openstack/heat-db-sync-kxtcf" Mar 13 14:24:57 crc kubenswrapper[4898]: I0313 14:24:57.735339 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cd78a2a-1bb4-461a-92cd-d705080b087a-combined-ca-bundle\") pod \"heat-db-sync-kxtcf\" (UID: \"2cd78a2a-1bb4-461a-92cd-d705080b087a\") " pod="openstack/heat-db-sync-kxtcf" Mar 13 14:24:57 crc kubenswrapper[4898]: I0313 14:24:57.759051 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84a7fd24-4320-4c0e-8ded-0d455252a549" path="/var/lib/kubelet/pods/84a7fd24-4320-4c0e-8ded-0d455252a549/volumes" Mar 13 14:24:57 crc kubenswrapper[4898]: I0313 14:24:57.759539 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z25rd\" (UniqueName: \"kubernetes.io/projected/2cd78a2a-1bb4-461a-92cd-d705080b087a-kube-api-access-z25rd\") pod \"heat-db-sync-kxtcf\" (UID: \"2cd78a2a-1bb4-461a-92cd-d705080b087a\") " pod="openstack/heat-db-sync-kxtcf" Mar 13 14:24:57 crc kubenswrapper[4898]: I0313 14:24:57.761745 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cd78a2a-1bb4-461a-92cd-d705080b087a-config-data\") pod \"heat-db-sync-kxtcf\" (UID: \"2cd78a2a-1bb4-461a-92cd-d705080b087a\") " pod="openstack/heat-db-sync-kxtcf" Mar 13 14:24:57 crc kubenswrapper[4898]: I0313 14:24:57.804528 4898 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0ac06d2-e2ea-4b4a-8201-83494b53b968","Type":"ContainerStarted","Data":"c0655b2adb5618887bc26f0a3bb0d551a636cca41a03c1baf5cc0685920b55bb"} Mar 13 14:24:57 crc kubenswrapper[4898]: I0313 14:24:57.814795 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-kxtcf" Mar 13 14:24:58 crc kubenswrapper[4898]: I0313 14:24:58.372286 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-kxtcf"] Mar 13 14:24:58 crc kubenswrapper[4898]: I0313 14:24:58.815817 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-kxtcf" event={"ID":"2cd78a2a-1bb4-461a-92cd-d705080b087a","Type":"ContainerStarted","Data":"f0ef8052f16886ece221ecf56528cf884da231c4fa187db604454c6c5925f956"} Mar 13 14:24:59 crc kubenswrapper[4898]: I0313 14:24:59.513441 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-2"] Mar 13 14:24:59 crc kubenswrapper[4898]: I0313 14:24:59.843205 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0ac06d2-e2ea-4b4a-8201-83494b53b968","Type":"ContainerStarted","Data":"d2f87292a607e1fb76b72fbf0fd5fba62057ee2d194e12f77b3db9510fddedf2"} Mar 13 14:24:59 crc kubenswrapper[4898]: I0313 14:24:59.843459 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 13 14:24:59 crc kubenswrapper[4898]: I0313 14:24:59.874030 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.855993989 podStartE2EDuration="6.874011189s" podCreationTimestamp="2026-03-13 14:24:53 +0000 UTC" firstStartedPulling="2026-03-13 14:24:54.69045719 +0000 UTC m=+1729.692045439" lastFinishedPulling="2026-03-13 14:24:58.7084744 +0000 UTC m=+1733.710062639" observedRunningTime="2026-03-13 14:24:59.869263114 +0000 UTC m=+1734.870851363" 
watchObservedRunningTime="2026-03-13 14:24:59.874011189 +0000 UTC m=+1734.875599428" Mar 13 14:24:59 crc kubenswrapper[4898]: I0313 14:24:59.924108 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 13 14:25:00 crc kubenswrapper[4898]: I0313 14:25:00.834984 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 14:25:01 crc kubenswrapper[4898]: I0313 14:25:01.037053 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:25:01 crc kubenswrapper[4898]: I0313 14:25:01.869773 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b0ac06d2-e2ea-4b4a-8201-83494b53b968" containerName="ceilometer-central-agent" containerID="cri-o://dbe55f5873c440c467d8748cfaa995fee6ccd0abf9441bf1f70ed0dda90073d3" gracePeriod=30 Mar 13 14:25:01 crc kubenswrapper[4898]: I0313 14:25:01.870249 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b0ac06d2-e2ea-4b4a-8201-83494b53b968" containerName="proxy-httpd" containerID="cri-o://d2f87292a607e1fb76b72fbf0fd5fba62057ee2d194e12f77b3db9510fddedf2" gracePeriod=30 Mar 13 14:25:01 crc kubenswrapper[4898]: I0313 14:25:01.870405 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b0ac06d2-e2ea-4b4a-8201-83494b53b968" containerName="sg-core" containerID="cri-o://c0655b2adb5618887bc26f0a3bb0d551a636cca41a03c1baf5cc0685920b55bb" gracePeriod=30 Mar 13 14:25:01 crc kubenswrapper[4898]: I0313 14:25:01.870456 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b0ac06d2-e2ea-4b4a-8201-83494b53b968" containerName="ceilometer-notification-agent" containerID="cri-o://c66d5e607033edc265c5c4c3b44b5d453515d5500b6db940b367950853043279" gracePeriod=30 Mar 13 14:25:02 crc 
kubenswrapper[4898]: I0313 14:25:02.918285 4898 generic.go:334] "Generic (PLEG): container finished" podID="b0ac06d2-e2ea-4b4a-8201-83494b53b968" containerID="d2f87292a607e1fb76b72fbf0fd5fba62057ee2d194e12f77b3db9510fddedf2" exitCode=0 Mar 13 14:25:02 crc kubenswrapper[4898]: I0313 14:25:02.919728 4898 generic.go:334] "Generic (PLEG): container finished" podID="b0ac06d2-e2ea-4b4a-8201-83494b53b968" containerID="c0655b2adb5618887bc26f0a3bb0d551a636cca41a03c1baf5cc0685920b55bb" exitCode=2 Mar 13 14:25:02 crc kubenswrapper[4898]: I0313 14:25:02.919744 4898 generic.go:334] "Generic (PLEG): container finished" podID="b0ac06d2-e2ea-4b4a-8201-83494b53b968" containerID="c66d5e607033edc265c5c4c3b44b5d453515d5500b6db940b367950853043279" exitCode=0 Mar 13 14:25:02 crc kubenswrapper[4898]: I0313 14:25:02.919769 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0ac06d2-e2ea-4b4a-8201-83494b53b968","Type":"ContainerDied","Data":"d2f87292a607e1fb76b72fbf0fd5fba62057ee2d194e12f77b3db9510fddedf2"} Mar 13 14:25:02 crc kubenswrapper[4898]: I0313 14:25:02.919801 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0ac06d2-e2ea-4b4a-8201-83494b53b968","Type":"ContainerDied","Data":"c0655b2adb5618887bc26f0a3bb0d551a636cca41a03c1baf5cc0685920b55bb"} Mar 13 14:25:02 crc kubenswrapper[4898]: I0313 14:25:02.919817 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0ac06d2-e2ea-4b4a-8201-83494b53b968","Type":"ContainerDied","Data":"c66d5e607033edc265c5c4c3b44b5d453515d5500b6db940b367950853043279"} Mar 13 14:25:04 crc kubenswrapper[4898]: I0313 14:25:04.690111 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-2" podUID="ee084354-4d32-4d3c-96a4-1e4e7eef5d85" containerName="rabbitmq" containerID="cri-o://fdd228971531e06c4cfdc0dd4d0052c10c0646d03035ab33629bba605b7a9d8b" gracePeriod=604795 Mar 13 
14:25:04 crc kubenswrapper[4898]: I0313 14:25:04.953406 4898 generic.go:334] "Generic (PLEG): container finished" podID="b0ac06d2-e2ea-4b4a-8201-83494b53b968" containerID="dbe55f5873c440c467d8748cfaa995fee6ccd0abf9441bf1f70ed0dda90073d3" exitCode=0 Mar 13 14:25:04 crc kubenswrapper[4898]: I0313 14:25:04.953482 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0ac06d2-e2ea-4b4a-8201-83494b53b968","Type":"ContainerDied","Data":"dbe55f5873c440c467d8748cfaa995fee6ccd0abf9441bf1f70ed0dda90073d3"} Mar 13 14:25:04 crc kubenswrapper[4898]: I0313 14:25:04.953781 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0ac06d2-e2ea-4b4a-8201-83494b53b968","Type":"ContainerDied","Data":"efc08f6e1f1c1bae444792fe7fa9bd5076b4f986e80e97128cc5f1ec8235c524"} Mar 13 14:25:04 crc kubenswrapper[4898]: I0313 14:25:04.953797 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="efc08f6e1f1c1bae444792fe7fa9bd5076b4f986e80e97128cc5f1ec8235c524" Mar 13 14:25:05 crc kubenswrapper[4898]: I0313 14:25:05.031754 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:25:05 crc kubenswrapper[4898]: I0313 14:25:05.164163 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0ac06d2-e2ea-4b4a-8201-83494b53b968-combined-ca-bundle\") pod \"b0ac06d2-e2ea-4b4a-8201-83494b53b968\" (UID: \"b0ac06d2-e2ea-4b4a-8201-83494b53b968\") " Mar 13 14:25:05 crc kubenswrapper[4898]: I0313 14:25:05.164399 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0ac06d2-e2ea-4b4a-8201-83494b53b968-log-httpd\") pod \"b0ac06d2-e2ea-4b4a-8201-83494b53b968\" (UID: \"b0ac06d2-e2ea-4b4a-8201-83494b53b968\") " Mar 13 14:25:05 crc kubenswrapper[4898]: I0313 14:25:05.164422 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b0ac06d2-e2ea-4b4a-8201-83494b53b968-sg-core-conf-yaml\") pod \"b0ac06d2-e2ea-4b4a-8201-83494b53b968\" (UID: \"b0ac06d2-e2ea-4b4a-8201-83494b53b968\") " Mar 13 14:25:05 crc kubenswrapper[4898]: I0313 14:25:05.164562 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0ac06d2-e2ea-4b4a-8201-83494b53b968-config-data\") pod \"b0ac06d2-e2ea-4b4a-8201-83494b53b968\" (UID: \"b0ac06d2-e2ea-4b4a-8201-83494b53b968\") " Mar 13 14:25:05 crc kubenswrapper[4898]: I0313 14:25:05.164734 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlm6p\" (UniqueName: \"kubernetes.io/projected/b0ac06d2-e2ea-4b4a-8201-83494b53b968-kube-api-access-xlm6p\") pod \"b0ac06d2-e2ea-4b4a-8201-83494b53b968\" (UID: \"b0ac06d2-e2ea-4b4a-8201-83494b53b968\") " Mar 13 14:25:05 crc kubenswrapper[4898]: I0313 14:25:05.164782 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/b0ac06d2-e2ea-4b4a-8201-83494b53b968-ceilometer-tls-certs\") pod \"b0ac06d2-e2ea-4b4a-8201-83494b53b968\" (UID: \"b0ac06d2-e2ea-4b4a-8201-83494b53b968\") " Mar 13 14:25:05 crc kubenswrapper[4898]: I0313 14:25:05.164797 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0ac06d2-e2ea-4b4a-8201-83494b53b968-run-httpd\") pod \"b0ac06d2-e2ea-4b4a-8201-83494b53b968\" (UID: \"b0ac06d2-e2ea-4b4a-8201-83494b53b968\") " Mar 13 14:25:05 crc kubenswrapper[4898]: I0313 14:25:05.164823 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0ac06d2-e2ea-4b4a-8201-83494b53b968-scripts\") pod \"b0ac06d2-e2ea-4b4a-8201-83494b53b968\" (UID: \"b0ac06d2-e2ea-4b4a-8201-83494b53b968\") " Mar 13 14:25:05 crc kubenswrapper[4898]: I0313 14:25:05.171805 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0ac06d2-e2ea-4b4a-8201-83494b53b968-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b0ac06d2-e2ea-4b4a-8201-83494b53b968" (UID: "b0ac06d2-e2ea-4b4a-8201-83494b53b968"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:25:05 crc kubenswrapper[4898]: I0313 14:25:05.172105 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0ac06d2-e2ea-4b4a-8201-83494b53b968-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b0ac06d2-e2ea-4b4a-8201-83494b53b968" (UID: "b0ac06d2-e2ea-4b4a-8201-83494b53b968"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:25:05 crc kubenswrapper[4898]: I0313 14:25:05.194022 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0ac06d2-e2ea-4b4a-8201-83494b53b968-scripts" (OuterVolumeSpecName: "scripts") pod "b0ac06d2-e2ea-4b4a-8201-83494b53b968" (UID: "b0ac06d2-e2ea-4b4a-8201-83494b53b968"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:25:05 crc kubenswrapper[4898]: I0313 14:25:05.195440 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0ac06d2-e2ea-4b4a-8201-83494b53b968-kube-api-access-xlm6p" (OuterVolumeSpecName: "kube-api-access-xlm6p") pod "b0ac06d2-e2ea-4b4a-8201-83494b53b968" (UID: "b0ac06d2-e2ea-4b4a-8201-83494b53b968"). InnerVolumeSpecName "kube-api-access-xlm6p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:25:05 crc kubenswrapper[4898]: I0313 14:25:05.267704 4898 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0ac06d2-e2ea-4b4a-8201-83494b53b968-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:05 crc kubenswrapper[4898]: I0313 14:25:05.267731 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xlm6p\" (UniqueName: \"kubernetes.io/projected/b0ac06d2-e2ea-4b4a-8201-83494b53b968-kube-api-access-xlm6p\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:05 crc kubenswrapper[4898]: I0313 14:25:05.267741 4898 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0ac06d2-e2ea-4b4a-8201-83494b53b968-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:05 crc kubenswrapper[4898]: I0313 14:25:05.267749 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0ac06d2-e2ea-4b4a-8201-83494b53b968-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:05 
crc kubenswrapper[4898]: I0313 14:25:05.283417 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0ac06d2-e2ea-4b4a-8201-83494b53b968-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b0ac06d2-e2ea-4b4a-8201-83494b53b968" (UID: "b0ac06d2-e2ea-4b4a-8201-83494b53b968"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:25:05 crc kubenswrapper[4898]: I0313 14:25:05.284453 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0ac06d2-e2ea-4b4a-8201-83494b53b968-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "b0ac06d2-e2ea-4b4a-8201-83494b53b968" (UID: "b0ac06d2-e2ea-4b4a-8201-83494b53b968"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:25:05 crc kubenswrapper[4898]: I0313 14:25:05.305190 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0ac06d2-e2ea-4b4a-8201-83494b53b968-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b0ac06d2-e2ea-4b4a-8201-83494b53b968" (UID: "b0ac06d2-e2ea-4b4a-8201-83494b53b968"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:25:05 crc kubenswrapper[4898]: I0313 14:25:05.352993 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0ac06d2-e2ea-4b4a-8201-83494b53b968-config-data" (OuterVolumeSpecName: "config-data") pod "b0ac06d2-e2ea-4b4a-8201-83494b53b968" (UID: "b0ac06d2-e2ea-4b4a-8201-83494b53b968"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:25:05 crc kubenswrapper[4898]: I0313 14:25:05.370124 4898 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b0ac06d2-e2ea-4b4a-8201-83494b53b968-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:05 crc kubenswrapper[4898]: I0313 14:25:05.370175 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0ac06d2-e2ea-4b4a-8201-83494b53b968-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:05 crc kubenswrapper[4898]: I0313 14:25:05.370184 4898 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0ac06d2-e2ea-4b4a-8201-83494b53b968-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:05 crc kubenswrapper[4898]: I0313 14:25:05.370194 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0ac06d2-e2ea-4b4a-8201-83494b53b968-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:05 crc kubenswrapper[4898]: I0313 14:25:05.586198 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="d56bd826-4f42-409d-ae41-9bfc70d1e038" containerName="rabbitmq" containerID="cri-o://d377b62f42012aae1789077dde2b4c09f8f770f73f941f01fe11eb21f5b88378" gracePeriod=604796 Mar 13 14:25:05 crc kubenswrapper[4898]: I0313 14:25:05.971305 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.022704 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.084336 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.117312 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:25:06 crc kubenswrapper[4898]: E0313 14:25:06.117874 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0ac06d2-e2ea-4b4a-8201-83494b53b968" containerName="ceilometer-central-agent" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.117892 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0ac06d2-e2ea-4b4a-8201-83494b53b968" containerName="ceilometer-central-agent" Mar 13 14:25:06 crc kubenswrapper[4898]: E0313 14:25:06.117943 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0ac06d2-e2ea-4b4a-8201-83494b53b968" containerName="proxy-httpd" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.117950 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0ac06d2-e2ea-4b4a-8201-83494b53b968" containerName="proxy-httpd" Mar 13 14:25:06 crc kubenswrapper[4898]: E0313 14:25:06.117968 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0ac06d2-e2ea-4b4a-8201-83494b53b968" containerName="ceilometer-notification-agent" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.117974 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0ac06d2-e2ea-4b4a-8201-83494b53b968" containerName="ceilometer-notification-agent" Mar 13 14:25:06 crc kubenswrapper[4898]: E0313 14:25:06.117995 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0ac06d2-e2ea-4b4a-8201-83494b53b968" containerName="sg-core" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.118002 4898 
state_mem.go:107] "Deleted CPUSet assignment" podUID="b0ac06d2-e2ea-4b4a-8201-83494b53b968" containerName="sg-core" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.118216 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0ac06d2-e2ea-4b4a-8201-83494b53b968" containerName="proxy-httpd" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.118235 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0ac06d2-e2ea-4b4a-8201-83494b53b968" containerName="ceilometer-central-agent" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.118263 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0ac06d2-e2ea-4b4a-8201-83494b53b968" containerName="sg-core" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.118277 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0ac06d2-e2ea-4b4a-8201-83494b53b968" containerName="ceilometer-notification-agent" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.120744 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.126793 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.127098 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.127412 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.134067 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.200721 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/02f7d483-aecb-4a39-babc-6d9598090c4b-run-httpd\") pod \"ceilometer-0\" (UID: \"02f7d483-aecb-4a39-babc-6d9598090c4b\") " pod="openstack/ceilometer-0" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.201235 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/02f7d483-aecb-4a39-babc-6d9598090c4b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"02f7d483-aecb-4a39-babc-6d9598090c4b\") " pod="openstack/ceilometer-0" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.201391 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/02f7d483-aecb-4a39-babc-6d9598090c4b-log-httpd\") pod \"ceilometer-0\" (UID: \"02f7d483-aecb-4a39-babc-6d9598090c4b\") " pod="openstack/ceilometer-0" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.201465 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-w25qw\" (UniqueName: \"kubernetes.io/projected/02f7d483-aecb-4a39-babc-6d9598090c4b-kube-api-access-w25qw\") pod \"ceilometer-0\" (UID: \"02f7d483-aecb-4a39-babc-6d9598090c4b\") " pod="openstack/ceilometer-0" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.201566 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/02f7d483-aecb-4a39-babc-6d9598090c4b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"02f7d483-aecb-4a39-babc-6d9598090c4b\") " pod="openstack/ceilometer-0" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.201675 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02f7d483-aecb-4a39-babc-6d9598090c4b-scripts\") pod \"ceilometer-0\" (UID: \"02f7d483-aecb-4a39-babc-6d9598090c4b\") " pod="openstack/ceilometer-0" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.201755 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02f7d483-aecb-4a39-babc-6d9598090c4b-config-data\") pod \"ceilometer-0\" (UID: \"02f7d483-aecb-4a39-babc-6d9598090c4b\") " pod="openstack/ceilometer-0" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.201880 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02f7d483-aecb-4a39-babc-6d9598090c4b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"02f7d483-aecb-4a39-babc-6d9598090c4b\") " pod="openstack/ceilometer-0" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.304597 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/02f7d483-aecb-4a39-babc-6d9598090c4b-ceilometer-tls-certs\") pod \"ceilometer-0\" 
(UID: \"02f7d483-aecb-4a39-babc-6d9598090c4b\") " pod="openstack/ceilometer-0" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.304680 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/02f7d483-aecb-4a39-babc-6d9598090c4b-log-httpd\") pod \"ceilometer-0\" (UID: \"02f7d483-aecb-4a39-babc-6d9598090c4b\") " pod="openstack/ceilometer-0" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.304701 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w25qw\" (UniqueName: \"kubernetes.io/projected/02f7d483-aecb-4a39-babc-6d9598090c4b-kube-api-access-w25qw\") pod \"ceilometer-0\" (UID: \"02f7d483-aecb-4a39-babc-6d9598090c4b\") " pod="openstack/ceilometer-0" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.304747 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/02f7d483-aecb-4a39-babc-6d9598090c4b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"02f7d483-aecb-4a39-babc-6d9598090c4b\") " pod="openstack/ceilometer-0" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.304764 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02f7d483-aecb-4a39-babc-6d9598090c4b-scripts\") pod \"ceilometer-0\" (UID: \"02f7d483-aecb-4a39-babc-6d9598090c4b\") " pod="openstack/ceilometer-0" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.304787 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02f7d483-aecb-4a39-babc-6d9598090c4b-config-data\") pod \"ceilometer-0\" (UID: \"02f7d483-aecb-4a39-babc-6d9598090c4b\") " pod="openstack/ceilometer-0" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.304814 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02f7d483-aecb-4a39-babc-6d9598090c4b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"02f7d483-aecb-4a39-babc-6d9598090c4b\") " pod="openstack/ceilometer-0" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.304867 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/02f7d483-aecb-4a39-babc-6d9598090c4b-run-httpd\") pod \"ceilometer-0\" (UID: \"02f7d483-aecb-4a39-babc-6d9598090c4b\") " pod="openstack/ceilometer-0" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.305338 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/02f7d483-aecb-4a39-babc-6d9598090c4b-run-httpd\") pod \"ceilometer-0\" (UID: \"02f7d483-aecb-4a39-babc-6d9598090c4b\") " pod="openstack/ceilometer-0" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.310052 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/02f7d483-aecb-4a39-babc-6d9598090c4b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"02f7d483-aecb-4a39-babc-6d9598090c4b\") " pod="openstack/ceilometer-0" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.313368 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02f7d483-aecb-4a39-babc-6d9598090c4b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"02f7d483-aecb-4a39-babc-6d9598090c4b\") " pod="openstack/ceilometer-0" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.316137 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02f7d483-aecb-4a39-babc-6d9598090c4b-scripts\") pod \"ceilometer-0\" (UID: \"02f7d483-aecb-4a39-babc-6d9598090c4b\") " pod="openstack/ceilometer-0" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 
14:25:06.316395 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/02f7d483-aecb-4a39-babc-6d9598090c4b-log-httpd\") pod \"ceilometer-0\" (UID: \"02f7d483-aecb-4a39-babc-6d9598090c4b\") " pod="openstack/ceilometer-0" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.318774 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/02f7d483-aecb-4a39-babc-6d9598090c4b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"02f7d483-aecb-4a39-babc-6d9598090c4b\") " pod="openstack/ceilometer-0" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.337475 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w25qw\" (UniqueName: \"kubernetes.io/projected/02f7d483-aecb-4a39-babc-6d9598090c4b-kube-api-access-w25qw\") pod \"ceilometer-0\" (UID: \"02f7d483-aecb-4a39-babc-6d9598090c4b\") " pod="openstack/ceilometer-0" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.339623 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02f7d483-aecb-4a39-babc-6d9598090c4b-config-data\") pod \"ceilometer-0\" (UID: \"02f7d483-aecb-4a39-babc-6d9598090c4b\") " pod="openstack/ceilometer-0" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.449223 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.739468 4898 scope.go:117] "RemoveContainer" containerID="31ae8593afc62c01205d22940ebe7136407da3a6c051010917ba949bb52866cc" Mar 13 14:25:06 crc kubenswrapper[4898]: E0313 14:25:06.740155 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.892050 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="d56bd826-4f42-409d-ae41-9bfc70d1e038" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.133:5671: connect: connection refused" Mar 13 14:25:07 crc kubenswrapper[4898]: I0313 14:25:07.009736 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:25:07 crc kubenswrapper[4898]: I0313 14:25:07.212469 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-2" podUID="ee084354-4d32-4d3c-96a4-1e4e7eef5d85" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.135:5671: connect: connection refused" Mar 13 14:25:07 crc kubenswrapper[4898]: I0313 14:25:07.753764 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0ac06d2-e2ea-4b4a-8201-83494b53b968" path="/var/lib/kubelet/pods/b0ac06d2-e2ea-4b4a-8201-83494b53b968/volumes" Mar 13 14:25:08 crc kubenswrapper[4898]: I0313 14:25:08.006657 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"02f7d483-aecb-4a39-babc-6d9598090c4b","Type":"ContainerStarted","Data":"a6232eb82b64b66b366d42e19a0e8f84b5b11c39d40c14d2624d531d50080332"} Mar 13 14:25:11 crc kubenswrapper[4898]: I0313 14:25:11.112201 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"ee084354-4d32-4d3c-96a4-1e4e7eef5d85","Type":"ContainerDied","Data":"fdd228971531e06c4cfdc0dd4d0052c10c0646d03035ab33629bba605b7a9d8b"} Mar 13 14:25:11 crc kubenswrapper[4898]: I0313 14:25:11.112381 4898 generic.go:334] "Generic (PLEG): container finished" podID="ee084354-4d32-4d3c-96a4-1e4e7eef5d85" containerID="fdd228971531e06c4cfdc0dd4d0052c10c0646d03035ab33629bba605b7a9d8b" exitCode=0 Mar 13 14:25:12 crc kubenswrapper[4898]: I0313 14:25:12.128685 4898 generic.go:334] "Generic (PLEG): container finished" podID="d56bd826-4f42-409d-ae41-9bfc70d1e038" containerID="d377b62f42012aae1789077dde2b4c09f8f770f73f941f01fe11eb21f5b88378" exitCode=0 Mar 13 14:25:12 crc kubenswrapper[4898]: I0313 14:25:12.128734 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d56bd826-4f42-409d-ae41-9bfc70d1e038","Type":"ContainerDied","Data":"d377b62f42012aae1789077dde2b4c09f8f770f73f941f01fe11eb21f5b88378"} Mar 13 14:25:13 crc kubenswrapper[4898]: I0313 14:25:13.958308 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7d84b4d45c-r4rvf"] Mar 13 14:25:13 crc kubenswrapper[4898]: I0313 14:25:13.960874 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d84b4d45c-r4rvf" Mar 13 14:25:13 crc kubenswrapper[4898]: I0313 14:25:13.965954 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Mar 13 14:25:13 crc kubenswrapper[4898]: I0313 14:25:13.975002 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d84b4d45c-r4rvf"] Mar 13 14:25:14 crc kubenswrapper[4898]: I0313 14:25:14.128850 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/703503be-2f03-4e95-b4ba-ebdd30b717ee-dns-svc\") pod \"dnsmasq-dns-7d84b4d45c-r4rvf\" (UID: \"703503be-2f03-4e95-b4ba-ebdd30b717ee\") " pod="openstack/dnsmasq-dns-7d84b4d45c-r4rvf" Mar 13 14:25:14 crc kubenswrapper[4898]: I0313 14:25:14.129059 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/703503be-2f03-4e95-b4ba-ebdd30b717ee-ovsdbserver-nb\") pod \"dnsmasq-dns-7d84b4d45c-r4rvf\" (UID: \"703503be-2f03-4e95-b4ba-ebdd30b717ee\") " pod="openstack/dnsmasq-dns-7d84b4d45c-r4rvf" Mar 13 14:25:14 crc kubenswrapper[4898]: I0313 14:25:14.129133 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vs27f\" (UniqueName: \"kubernetes.io/projected/703503be-2f03-4e95-b4ba-ebdd30b717ee-kube-api-access-vs27f\") pod \"dnsmasq-dns-7d84b4d45c-r4rvf\" (UID: \"703503be-2f03-4e95-b4ba-ebdd30b717ee\") " pod="openstack/dnsmasq-dns-7d84b4d45c-r4rvf" Mar 13 14:25:14 crc kubenswrapper[4898]: I0313 14:25:14.129185 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/703503be-2f03-4e95-b4ba-ebdd30b717ee-ovsdbserver-sb\") pod \"dnsmasq-dns-7d84b4d45c-r4rvf\" (UID: \"703503be-2f03-4e95-b4ba-ebdd30b717ee\") " 
pod="openstack/dnsmasq-dns-7d84b4d45c-r4rvf" Mar 13 14:25:14 crc kubenswrapper[4898]: I0313 14:25:14.129257 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/703503be-2f03-4e95-b4ba-ebdd30b717ee-config\") pod \"dnsmasq-dns-7d84b4d45c-r4rvf\" (UID: \"703503be-2f03-4e95-b4ba-ebdd30b717ee\") " pod="openstack/dnsmasq-dns-7d84b4d45c-r4rvf" Mar 13 14:25:14 crc kubenswrapper[4898]: I0313 14:25:14.129296 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/703503be-2f03-4e95-b4ba-ebdd30b717ee-dns-swift-storage-0\") pod \"dnsmasq-dns-7d84b4d45c-r4rvf\" (UID: \"703503be-2f03-4e95-b4ba-ebdd30b717ee\") " pod="openstack/dnsmasq-dns-7d84b4d45c-r4rvf" Mar 13 14:25:14 crc kubenswrapper[4898]: I0313 14:25:14.129378 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/703503be-2f03-4e95-b4ba-ebdd30b717ee-openstack-edpm-ipam\") pod \"dnsmasq-dns-7d84b4d45c-r4rvf\" (UID: \"703503be-2f03-4e95-b4ba-ebdd30b717ee\") " pod="openstack/dnsmasq-dns-7d84b4d45c-r4rvf" Mar 13 14:25:14 crc kubenswrapper[4898]: I0313 14:25:14.231669 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/703503be-2f03-4e95-b4ba-ebdd30b717ee-ovsdbserver-nb\") pod \"dnsmasq-dns-7d84b4d45c-r4rvf\" (UID: \"703503be-2f03-4e95-b4ba-ebdd30b717ee\") " pod="openstack/dnsmasq-dns-7d84b4d45c-r4rvf" Mar 13 14:25:14 crc kubenswrapper[4898]: I0313 14:25:14.231733 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vs27f\" (UniqueName: \"kubernetes.io/projected/703503be-2f03-4e95-b4ba-ebdd30b717ee-kube-api-access-vs27f\") pod \"dnsmasq-dns-7d84b4d45c-r4rvf\" (UID: 
\"703503be-2f03-4e95-b4ba-ebdd30b717ee\") " pod="openstack/dnsmasq-dns-7d84b4d45c-r4rvf" Mar 13 14:25:14 crc kubenswrapper[4898]: I0313 14:25:14.231754 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/703503be-2f03-4e95-b4ba-ebdd30b717ee-ovsdbserver-sb\") pod \"dnsmasq-dns-7d84b4d45c-r4rvf\" (UID: \"703503be-2f03-4e95-b4ba-ebdd30b717ee\") " pod="openstack/dnsmasq-dns-7d84b4d45c-r4rvf" Mar 13 14:25:14 crc kubenswrapper[4898]: I0313 14:25:14.231798 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/703503be-2f03-4e95-b4ba-ebdd30b717ee-config\") pod \"dnsmasq-dns-7d84b4d45c-r4rvf\" (UID: \"703503be-2f03-4e95-b4ba-ebdd30b717ee\") " pod="openstack/dnsmasq-dns-7d84b4d45c-r4rvf" Mar 13 14:25:14 crc kubenswrapper[4898]: I0313 14:25:14.231818 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/703503be-2f03-4e95-b4ba-ebdd30b717ee-dns-swift-storage-0\") pod \"dnsmasq-dns-7d84b4d45c-r4rvf\" (UID: \"703503be-2f03-4e95-b4ba-ebdd30b717ee\") " pod="openstack/dnsmasq-dns-7d84b4d45c-r4rvf" Mar 13 14:25:14 crc kubenswrapper[4898]: I0313 14:25:14.231863 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/703503be-2f03-4e95-b4ba-ebdd30b717ee-openstack-edpm-ipam\") pod \"dnsmasq-dns-7d84b4d45c-r4rvf\" (UID: \"703503be-2f03-4e95-b4ba-ebdd30b717ee\") " pod="openstack/dnsmasq-dns-7d84b4d45c-r4rvf" Mar 13 14:25:14 crc kubenswrapper[4898]: I0313 14:25:14.231972 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/703503be-2f03-4e95-b4ba-ebdd30b717ee-dns-svc\") pod \"dnsmasq-dns-7d84b4d45c-r4rvf\" (UID: \"703503be-2f03-4e95-b4ba-ebdd30b717ee\") " 
pod="openstack/dnsmasq-dns-7d84b4d45c-r4rvf" Mar 13 14:25:14 crc kubenswrapper[4898]: I0313 14:25:14.233341 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/703503be-2f03-4e95-b4ba-ebdd30b717ee-dns-svc\") pod \"dnsmasq-dns-7d84b4d45c-r4rvf\" (UID: \"703503be-2f03-4e95-b4ba-ebdd30b717ee\") " pod="openstack/dnsmasq-dns-7d84b4d45c-r4rvf" Mar 13 14:25:14 crc kubenswrapper[4898]: I0313 14:25:14.236485 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/703503be-2f03-4e95-b4ba-ebdd30b717ee-ovsdbserver-sb\") pod \"dnsmasq-dns-7d84b4d45c-r4rvf\" (UID: \"703503be-2f03-4e95-b4ba-ebdd30b717ee\") " pod="openstack/dnsmasq-dns-7d84b4d45c-r4rvf" Mar 13 14:25:14 crc kubenswrapper[4898]: I0313 14:25:14.241648 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/703503be-2f03-4e95-b4ba-ebdd30b717ee-dns-swift-storage-0\") pod \"dnsmasq-dns-7d84b4d45c-r4rvf\" (UID: \"703503be-2f03-4e95-b4ba-ebdd30b717ee\") " pod="openstack/dnsmasq-dns-7d84b4d45c-r4rvf" Mar 13 14:25:14 crc kubenswrapper[4898]: I0313 14:25:14.241871 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/703503be-2f03-4e95-b4ba-ebdd30b717ee-ovsdbserver-nb\") pod \"dnsmasq-dns-7d84b4d45c-r4rvf\" (UID: \"703503be-2f03-4e95-b4ba-ebdd30b717ee\") " pod="openstack/dnsmasq-dns-7d84b4d45c-r4rvf" Mar 13 14:25:14 crc kubenswrapper[4898]: I0313 14:25:14.242369 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/703503be-2f03-4e95-b4ba-ebdd30b717ee-config\") pod \"dnsmasq-dns-7d84b4d45c-r4rvf\" (UID: \"703503be-2f03-4e95-b4ba-ebdd30b717ee\") " pod="openstack/dnsmasq-dns-7d84b4d45c-r4rvf" Mar 13 14:25:14 crc kubenswrapper[4898]: I0313 14:25:14.242404 4898 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/703503be-2f03-4e95-b4ba-ebdd30b717ee-openstack-edpm-ipam\") pod \"dnsmasq-dns-7d84b4d45c-r4rvf\" (UID: \"703503be-2f03-4e95-b4ba-ebdd30b717ee\") " pod="openstack/dnsmasq-dns-7d84b4d45c-r4rvf" Mar 13 14:25:14 crc kubenswrapper[4898]: I0313 14:25:14.258541 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vs27f\" (UniqueName: \"kubernetes.io/projected/703503be-2f03-4e95-b4ba-ebdd30b717ee-kube-api-access-vs27f\") pod \"dnsmasq-dns-7d84b4d45c-r4rvf\" (UID: \"703503be-2f03-4e95-b4ba-ebdd30b717ee\") " pod="openstack/dnsmasq-dns-7d84b4d45c-r4rvf" Mar 13 14:25:14 crc kubenswrapper[4898]: I0313 14:25:14.281237 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d84b4d45c-r4rvf" Mar 13 14:25:16 crc kubenswrapper[4898]: I0313 14:25:16.502103 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-2" Mar 13 14:25:16 crc kubenswrapper[4898]: I0313 14:25:16.594504 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-pod-info\") pod \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " Mar 13 14:25:16 crc kubenswrapper[4898]: I0313 14:25:16.594656 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-plugins-conf\") pod \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " Mar 13 14:25:16 crc kubenswrapper[4898]: I0313 14:25:16.594709 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-rabbitmq-plugins\") pod \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " Mar 13 14:25:16 crc kubenswrapper[4898]: I0313 14:25:16.594743 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-config-data\") pod \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " Mar 13 14:25:16 crc kubenswrapper[4898]: I0313 14:25:16.594776 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-erlang-cookie-secret\") pod \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " Mar 13 14:25:16 crc kubenswrapper[4898]: I0313 14:25:16.594874 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" 
(UniqueName: \"kubernetes.io/projected/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-rabbitmq-confd\") pod \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " Mar 13 14:25:16 crc kubenswrapper[4898]: I0313 14:25:16.594918 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-rabbitmq-erlang-cookie\") pod \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " Mar 13 14:25:16 crc kubenswrapper[4898]: I0313 14:25:16.594949 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-rabbitmq-tls\") pod \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " Mar 13 14:25:16 crc kubenswrapper[4898]: I0313 14:25:16.595808 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10c025d7-e381-4716-bf38-98f5cf86aede\") pod \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " Mar 13 14:25:16 crc kubenswrapper[4898]: I0313 14:25:16.595840 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kgsn2\" (UniqueName: \"kubernetes.io/projected/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-kube-api-access-kgsn2\") pod \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " Mar 13 14:25:16 crc kubenswrapper[4898]: I0313 14:25:16.595867 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-server-conf\") pod \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " Mar 13 
14:25:16 crc kubenswrapper[4898]: I0313 14:25:16.595975 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "ee084354-4d32-4d3c-96a4-1e4e7eef5d85" (UID: "ee084354-4d32-4d3c-96a4-1e4e7eef5d85"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:25:16 crc kubenswrapper[4898]: I0313 14:25:16.600151 4898 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:16 crc kubenswrapper[4898]: I0313 14:25:16.601471 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-pod-info" (OuterVolumeSpecName: "pod-info") pod "ee084354-4d32-4d3c-96a4-1e4e7eef5d85" (UID: "ee084354-4d32-4d3c-96a4-1e4e7eef5d85"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 13 14:25:16 crc kubenswrapper[4898]: I0313 14:25:16.602413 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "ee084354-4d32-4d3c-96a4-1e4e7eef5d85" (UID: "ee084354-4d32-4d3c-96a4-1e4e7eef5d85"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:25:16 crc kubenswrapper[4898]: I0313 14:25:16.606404 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "ee084354-4d32-4d3c-96a4-1e4e7eef5d85" (UID: "ee084354-4d32-4d3c-96a4-1e4e7eef5d85"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:25:16 crc kubenswrapper[4898]: I0313 14:25:16.610402 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "ee084354-4d32-4d3c-96a4-1e4e7eef5d85" (UID: "ee084354-4d32-4d3c-96a4-1e4e7eef5d85"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:25:16 crc kubenswrapper[4898]: I0313 14:25:16.620448 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "ee084354-4d32-4d3c-96a4-1e4e7eef5d85" (UID: "ee084354-4d32-4d3c-96a4-1e4e7eef5d85"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:25:16 crc kubenswrapper[4898]: I0313 14:25:16.623438 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-kube-api-access-kgsn2" (OuterVolumeSpecName: "kube-api-access-kgsn2") pod "ee084354-4d32-4d3c-96a4-1e4e7eef5d85" (UID: "ee084354-4d32-4d3c-96a4-1e4e7eef5d85"). InnerVolumeSpecName "kube-api-access-kgsn2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:25:16 crc kubenswrapper[4898]: I0313 14:25:16.698413 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-server-conf" (OuterVolumeSpecName: "server-conf") pod "ee084354-4d32-4d3c-96a4-1e4e7eef5d85" (UID: "ee084354-4d32-4d3c-96a4-1e4e7eef5d85"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:25:16 crc kubenswrapper[4898]: I0313 14:25:16.704226 4898 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:16 crc kubenswrapper[4898]: I0313 14:25:16.704253 4898 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:16 crc kubenswrapper[4898]: I0313 14:25:16.704291 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kgsn2\" (UniqueName: \"kubernetes.io/projected/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-kube-api-access-kgsn2\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:16 crc kubenswrapper[4898]: I0313 14:25:16.704300 4898 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-server-conf\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:16 crc kubenswrapper[4898]: I0313 14:25:16.704308 4898 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-pod-info\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:16 crc kubenswrapper[4898]: I0313 14:25:16.704316 4898 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:16 crc kubenswrapper[4898]: I0313 14:25:16.704323 4898 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:16 crc kubenswrapper[4898]: 
I0313 14:25:16.792125 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-config-data" (OuterVolumeSpecName: "config-data") pod "ee084354-4d32-4d3c-96a4-1e4e7eef5d85" (UID: "ee084354-4d32-4d3c-96a4-1e4e7eef5d85"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:25:16 crc kubenswrapper[4898]: I0313 14:25:16.811718 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:16 crc kubenswrapper[4898]: I0313 14:25:16.957166 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "ee084354-4d32-4d3c-96a4-1e4e7eef5d85" (UID: "ee084354-4d32-4d3c-96a4-1e4e7eef5d85"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:25:16 crc kubenswrapper[4898]: I0313 14:25:16.962959 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10c025d7-e381-4716-bf38-98f5cf86aede" (OuterVolumeSpecName: "persistence") pod "ee084354-4d32-4d3c-96a4-1e4e7eef5d85" (UID: "ee084354-4d32-4d3c-96a4-1e4e7eef5d85"). InnerVolumeSpecName "pvc-10c025d7-e381-4716-bf38-98f5cf86aede". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.016121 4898 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.016183 4898 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-10c025d7-e381-4716-bf38-98f5cf86aede\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10c025d7-e381-4716-bf38-98f5cf86aede\") on node \"crc\" " Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.057468 4898 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.057602 4898 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-10c025d7-e381-4716-bf38-98f5cf86aede" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10c025d7-e381-4716-bf38-98f5cf86aede") on node "crc" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.118702 4898 reconciler_common.go:293] "Volume detached for volume \"pvc-10c025d7-e381-4716-bf38-98f5cf86aede\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10c025d7-e381-4716-bf38-98f5cf86aede\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.243260 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"ee084354-4d32-4d3c-96a4-1e4e7eef5d85","Type":"ContainerDied","Data":"f525c8a2018341f2e27494e3687eb7a4563181188f5faaa51225478656a928a1"} Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.243320 4898 scope.go:117] "RemoveContainer" containerID="fdd228971531e06c4cfdc0dd4d0052c10c0646d03035ab33629bba605b7a9d8b" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.243520 4898 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-2" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.291038 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-2"] Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.306237 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-2"] Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.324859 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-2"] Mar 13 14:25:17 crc kubenswrapper[4898]: E0313 14:25:17.325374 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee084354-4d32-4d3c-96a4-1e4e7eef5d85" containerName="setup-container" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.325391 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee084354-4d32-4d3c-96a4-1e4e7eef5d85" containerName="setup-container" Mar 13 14:25:17 crc kubenswrapper[4898]: E0313 14:25:17.325424 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee084354-4d32-4d3c-96a4-1e4e7eef5d85" containerName="rabbitmq" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.325431 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee084354-4d32-4d3c-96a4-1e4e7eef5d85" containerName="rabbitmq" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.325657 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee084354-4d32-4d3c-96a4-1e4e7eef5d85" containerName="rabbitmq" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.327225 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-2" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.344982 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-2"] Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.426621 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8d188301-848c-4cf6-a204-e1110714c1be-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"8d188301-848c-4cf6-a204-e1110714c1be\") " pod="openstack/rabbitmq-server-2" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.426694 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-10c025d7-e381-4716-bf38-98f5cf86aede\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10c025d7-e381-4716-bf38-98f5cf86aede\") pod \"rabbitmq-server-2\" (UID: \"8d188301-848c-4cf6-a204-e1110714c1be\") " pod="openstack/rabbitmq-server-2" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.426777 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8d188301-848c-4cf6-a204-e1110714c1be-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"8d188301-848c-4cf6-a204-e1110714c1be\") " pod="openstack/rabbitmq-server-2" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.426825 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8d188301-848c-4cf6-a204-e1110714c1be-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"8d188301-848c-4cf6-a204-e1110714c1be\") " pod="openstack/rabbitmq-server-2" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.426855 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" 
(UniqueName: \"kubernetes.io/projected/8d188301-848c-4cf6-a204-e1110714c1be-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"8d188301-848c-4cf6-a204-e1110714c1be\") " pod="openstack/rabbitmq-server-2" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.426898 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8d188301-848c-4cf6-a204-e1110714c1be-pod-info\") pod \"rabbitmq-server-2\" (UID: \"8d188301-848c-4cf6-a204-e1110714c1be\") " pod="openstack/rabbitmq-server-2" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.426948 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8d188301-848c-4cf6-a204-e1110714c1be-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"8d188301-848c-4cf6-a204-e1110714c1be\") " pod="openstack/rabbitmq-server-2" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.426974 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8d188301-848c-4cf6-a204-e1110714c1be-config-data\") pod \"rabbitmq-server-2\" (UID: \"8d188301-848c-4cf6-a204-e1110714c1be\") " pod="openstack/rabbitmq-server-2" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.426994 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8d188301-848c-4cf6-a204-e1110714c1be-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"8d188301-848c-4cf6-a204-e1110714c1be\") " pod="openstack/rabbitmq-server-2" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.427015 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/8d188301-848c-4cf6-a204-e1110714c1be-server-conf\") pod \"rabbitmq-server-2\" (UID: \"8d188301-848c-4cf6-a204-e1110714c1be\") " pod="openstack/rabbitmq-server-2" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.427149 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ztgr\" (UniqueName: \"kubernetes.io/projected/8d188301-848c-4cf6-a204-e1110714c1be-kube-api-access-9ztgr\") pod \"rabbitmq-server-2\" (UID: \"8d188301-848c-4cf6-a204-e1110714c1be\") " pod="openstack/rabbitmq-server-2" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.529631 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8d188301-848c-4cf6-a204-e1110714c1be-pod-info\") pod \"rabbitmq-server-2\" (UID: \"8d188301-848c-4cf6-a204-e1110714c1be\") " pod="openstack/rabbitmq-server-2" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.529694 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8d188301-848c-4cf6-a204-e1110714c1be-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"8d188301-848c-4cf6-a204-e1110714c1be\") " pod="openstack/rabbitmq-server-2" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.529720 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8d188301-848c-4cf6-a204-e1110714c1be-config-data\") pod \"rabbitmq-server-2\" (UID: \"8d188301-848c-4cf6-a204-e1110714c1be\") " pod="openstack/rabbitmq-server-2" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.529738 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8d188301-848c-4cf6-a204-e1110714c1be-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: 
\"8d188301-848c-4cf6-a204-e1110714c1be\") " pod="openstack/rabbitmq-server-2" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.529755 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8d188301-848c-4cf6-a204-e1110714c1be-server-conf\") pod \"rabbitmq-server-2\" (UID: \"8d188301-848c-4cf6-a204-e1110714c1be\") " pod="openstack/rabbitmq-server-2" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.529872 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ztgr\" (UniqueName: \"kubernetes.io/projected/8d188301-848c-4cf6-a204-e1110714c1be-kube-api-access-9ztgr\") pod \"rabbitmq-server-2\" (UID: \"8d188301-848c-4cf6-a204-e1110714c1be\") " pod="openstack/rabbitmq-server-2" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.529935 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8d188301-848c-4cf6-a204-e1110714c1be-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"8d188301-848c-4cf6-a204-e1110714c1be\") " pod="openstack/rabbitmq-server-2" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.529979 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-10c025d7-e381-4716-bf38-98f5cf86aede\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10c025d7-e381-4716-bf38-98f5cf86aede\") pod \"rabbitmq-server-2\" (UID: \"8d188301-848c-4cf6-a204-e1110714c1be\") " pod="openstack/rabbitmq-server-2" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.530041 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8d188301-848c-4cf6-a204-e1110714c1be-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"8d188301-848c-4cf6-a204-e1110714c1be\") " pod="openstack/rabbitmq-server-2" Mar 13 14:25:17 crc 
kubenswrapper[4898]: I0313 14:25:17.530079 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8d188301-848c-4cf6-a204-e1110714c1be-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"8d188301-848c-4cf6-a204-e1110714c1be\") " pod="openstack/rabbitmq-server-2" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.530101 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8d188301-848c-4cf6-a204-e1110714c1be-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"8d188301-848c-4cf6-a204-e1110714c1be\") " pod="openstack/rabbitmq-server-2" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.530549 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8d188301-848c-4cf6-a204-e1110714c1be-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"8d188301-848c-4cf6-a204-e1110714c1be\") " pod="openstack/rabbitmq-server-2" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.530970 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8d188301-848c-4cf6-a204-e1110714c1be-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"8d188301-848c-4cf6-a204-e1110714c1be\") " pod="openstack/rabbitmq-server-2" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.531311 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8d188301-848c-4cf6-a204-e1110714c1be-config-data\") pod \"rabbitmq-server-2\" (UID: \"8d188301-848c-4cf6-a204-e1110714c1be\") " pod="openstack/rabbitmq-server-2" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.531777 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/8d188301-848c-4cf6-a204-e1110714c1be-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"8d188301-848c-4cf6-a204-e1110714c1be\") " pod="openstack/rabbitmq-server-2" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.532068 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8d188301-848c-4cf6-a204-e1110714c1be-server-conf\") pod \"rabbitmq-server-2\" (UID: \"8d188301-848c-4cf6-a204-e1110714c1be\") " pod="openstack/rabbitmq-server-2" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.534022 4898 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.534057 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-10c025d7-e381-4716-bf38-98f5cf86aede\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10c025d7-e381-4716-bf38-98f5cf86aede\") pod \"rabbitmq-server-2\" (UID: \"8d188301-848c-4cf6-a204-e1110714c1be\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b5b057b78b5a76d291625b9af6af2e0e662115b1b100b445e2e40d0ac02a65c7/globalmount\"" pod="openstack/rabbitmq-server-2" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.534787 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8d188301-848c-4cf6-a204-e1110714c1be-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"8d188301-848c-4cf6-a204-e1110714c1be\") " pod="openstack/rabbitmq-server-2" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.537580 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8d188301-848c-4cf6-a204-e1110714c1be-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: 
\"8d188301-848c-4cf6-a204-e1110714c1be\") " pod="openstack/rabbitmq-server-2" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.545284 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8d188301-848c-4cf6-a204-e1110714c1be-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"8d188301-848c-4cf6-a204-e1110714c1be\") " pod="openstack/rabbitmq-server-2" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.546051 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8d188301-848c-4cf6-a204-e1110714c1be-pod-info\") pod \"rabbitmq-server-2\" (UID: \"8d188301-848c-4cf6-a204-e1110714c1be\") " pod="openstack/rabbitmq-server-2" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.549651 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ztgr\" (UniqueName: \"kubernetes.io/projected/8d188301-848c-4cf6-a204-e1110714c1be-kube-api-access-9ztgr\") pod \"rabbitmq-server-2\" (UID: \"8d188301-848c-4cf6-a204-e1110714c1be\") " pod="openstack/rabbitmq-server-2" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.601258 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-10c025d7-e381-4716-bf38-98f5cf86aede\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10c025d7-e381-4716-bf38-98f5cf86aede\") pod \"rabbitmq-server-2\" (UID: \"8d188301-848c-4cf6-a204-e1110714c1be\") " pod="openstack/rabbitmq-server-2" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.655259 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-2" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.754764 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee084354-4d32-4d3c-96a4-1e4e7eef5d85" path="/var/lib/kubelet/pods/ee084354-4d32-4d3c-96a4-1e4e7eef5d85/volumes" Mar 13 14:25:21 crc kubenswrapper[4898]: I0313 14:25:21.739404 4898 scope.go:117] "RemoveContainer" containerID="31ae8593afc62c01205d22940ebe7136407da3a6c051010917ba949bb52866cc" Mar 13 14:25:21 crc kubenswrapper[4898]: E0313 14:25:21.740386 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:25:21 crc kubenswrapper[4898]: I0313 14:25:21.892166 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="d56bd826-4f42-409d-ae41-9bfc70d1e038" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.133:5671: i/o timeout" Mar 13 14:25:23 crc kubenswrapper[4898]: I0313 14:25:23.581342 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:25:23 crc kubenswrapper[4898]: I0313 14:25:23.698574 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d56bd826-4f42-409d-ae41-9bfc70d1e038-config-data\") pod \"d56bd826-4f42-409d-ae41-9bfc70d1e038\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " Mar 13 14:25:23 crc kubenswrapper[4898]: I0313 14:25:23.704043 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d56bd826-4f42-409d-ae41-9bfc70d1e038-plugins-conf\") pod \"d56bd826-4f42-409d-ae41-9bfc70d1e038\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " Mar 13 14:25:23 crc kubenswrapper[4898]: I0313 14:25:23.704119 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d56bd826-4f42-409d-ae41-9bfc70d1e038-rabbitmq-plugins\") pod \"d56bd826-4f42-409d-ae41-9bfc70d1e038\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " Mar 13 14:25:23 crc kubenswrapper[4898]: I0313 14:25:23.704243 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d56bd826-4f42-409d-ae41-9bfc70d1e038-rabbitmq-confd\") pod \"d56bd826-4f42-409d-ae41-9bfc70d1e038\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " Mar 13 14:25:23 crc kubenswrapper[4898]: I0313 14:25:23.704283 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d56bd826-4f42-409d-ae41-9bfc70d1e038-rabbitmq-tls\") pod \"d56bd826-4f42-409d-ae41-9bfc70d1e038\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " Mar 13 14:25:23 crc kubenswrapper[4898]: I0313 14:25:23.705111 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e3ef05d8-9def-4ed4-8424-3de1bece7b2d\") pod \"d56bd826-4f42-409d-ae41-9bfc70d1e038\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " Mar 13 14:25:23 crc kubenswrapper[4898]: I0313 14:25:23.705167 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d56bd826-4f42-409d-ae41-9bfc70d1e038-server-conf\") pod \"d56bd826-4f42-409d-ae41-9bfc70d1e038\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " Mar 13 14:25:23 crc kubenswrapper[4898]: I0313 14:25:23.705227 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d56bd826-4f42-409d-ae41-9bfc70d1e038-erlang-cookie-secret\") pod \"d56bd826-4f42-409d-ae41-9bfc70d1e038\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " Mar 13 14:25:23 crc kubenswrapper[4898]: I0313 14:25:23.705267 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d56bd826-4f42-409d-ae41-9bfc70d1e038-pod-info\") pod \"d56bd826-4f42-409d-ae41-9bfc70d1e038\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " Mar 13 14:25:23 crc kubenswrapper[4898]: I0313 14:25:23.705293 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d56bd826-4f42-409d-ae41-9bfc70d1e038-rabbitmq-erlang-cookie\") pod \"d56bd826-4f42-409d-ae41-9bfc70d1e038\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " Mar 13 14:25:23 crc kubenswrapper[4898]: I0313 14:25:23.705362 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xk5mq\" (UniqueName: \"kubernetes.io/projected/d56bd826-4f42-409d-ae41-9bfc70d1e038-kube-api-access-xk5mq\") pod \"d56bd826-4f42-409d-ae41-9bfc70d1e038\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " Mar 13 
14:25:23 crc kubenswrapper[4898]: I0313 14:25:23.705897 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d56bd826-4f42-409d-ae41-9bfc70d1e038-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "d56bd826-4f42-409d-ae41-9bfc70d1e038" (UID: "d56bd826-4f42-409d-ae41-9bfc70d1e038"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:25:23 crc kubenswrapper[4898]: I0313 14:25:23.706198 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d56bd826-4f42-409d-ae41-9bfc70d1e038-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "d56bd826-4f42-409d-ae41-9bfc70d1e038" (UID: "d56bd826-4f42-409d-ae41-9bfc70d1e038"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:25:23 crc kubenswrapper[4898]: I0313 14:25:23.706727 4898 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d56bd826-4f42-409d-ae41-9bfc70d1e038-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:23 crc kubenswrapper[4898]: I0313 14:25:23.706755 4898 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d56bd826-4f42-409d-ae41-9bfc70d1e038-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:23 crc kubenswrapper[4898]: I0313 14:25:23.715076 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/d56bd826-4f42-409d-ae41-9bfc70d1e038-pod-info" (OuterVolumeSpecName: "pod-info") pod "d56bd826-4f42-409d-ae41-9bfc70d1e038" (UID: "d56bd826-4f42-409d-ae41-9bfc70d1e038"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 13 14:25:23 crc kubenswrapper[4898]: I0313 14:25:23.718014 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d56bd826-4f42-409d-ae41-9bfc70d1e038-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "d56bd826-4f42-409d-ae41-9bfc70d1e038" (UID: "d56bd826-4f42-409d-ae41-9bfc70d1e038"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:25:23 crc kubenswrapper[4898]: I0313 14:25:23.726337 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d56bd826-4f42-409d-ae41-9bfc70d1e038-kube-api-access-xk5mq" (OuterVolumeSpecName: "kube-api-access-xk5mq") pod "d56bd826-4f42-409d-ae41-9bfc70d1e038" (UID: "d56bd826-4f42-409d-ae41-9bfc70d1e038"). InnerVolumeSpecName "kube-api-access-xk5mq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:25:23 crc kubenswrapper[4898]: I0313 14:25:23.726385 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d56bd826-4f42-409d-ae41-9bfc70d1e038-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "d56bd826-4f42-409d-ae41-9bfc70d1e038" (UID: "d56bd826-4f42-409d-ae41-9bfc70d1e038"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:25:23 crc kubenswrapper[4898]: I0313 14:25:23.740586 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d56bd826-4f42-409d-ae41-9bfc70d1e038-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "d56bd826-4f42-409d-ae41-9bfc70d1e038" (UID: "d56bd826-4f42-409d-ae41-9bfc70d1e038"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:25:23 crc kubenswrapper[4898]: I0313 14:25:23.748169 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e3ef05d8-9def-4ed4-8424-3de1bece7b2d" (OuterVolumeSpecName: "persistence") pod "d56bd826-4f42-409d-ae41-9bfc70d1e038" (UID: "d56bd826-4f42-409d-ae41-9bfc70d1e038"). InnerVolumeSpecName "pvc-e3ef05d8-9def-4ed4-8424-3de1bece7b2d". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 13 14:25:23 crc kubenswrapper[4898]: I0313 14:25:23.780647 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d56bd826-4f42-409d-ae41-9bfc70d1e038-config-data" (OuterVolumeSpecName: "config-data") pod "d56bd826-4f42-409d-ae41-9bfc70d1e038" (UID: "d56bd826-4f42-409d-ae41-9bfc70d1e038"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:25:23 crc kubenswrapper[4898]: I0313 14:25:23.809058 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xk5mq\" (UniqueName: \"kubernetes.io/projected/d56bd826-4f42-409d-ae41-9bfc70d1e038-kube-api-access-xk5mq\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:23 crc kubenswrapper[4898]: I0313 14:25:23.809099 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d56bd826-4f42-409d-ae41-9bfc70d1e038-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:23 crc kubenswrapper[4898]: I0313 14:25:23.809111 4898 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d56bd826-4f42-409d-ae41-9bfc70d1e038-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:23 crc kubenswrapper[4898]: I0313 14:25:23.809146 4898 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-e3ef05d8-9def-4ed4-8424-3de1bece7b2d\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e3ef05d8-9def-4ed4-8424-3de1bece7b2d\") on node \"crc\" " Mar 13 14:25:23 crc kubenswrapper[4898]: I0313 14:25:23.809162 4898 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d56bd826-4f42-409d-ae41-9bfc70d1e038-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:23 crc kubenswrapper[4898]: I0313 14:25:23.809176 4898 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d56bd826-4f42-409d-ae41-9bfc70d1e038-pod-info\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:23 crc kubenswrapper[4898]: I0313 14:25:23.809189 4898 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d56bd826-4f42-409d-ae41-9bfc70d1e038-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:23 crc kubenswrapper[4898]: I0313 14:25:23.822583 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d56bd826-4f42-409d-ae41-9bfc70d1e038-server-conf" (OuterVolumeSpecName: "server-conf") pod "d56bd826-4f42-409d-ae41-9bfc70d1e038" (UID: "d56bd826-4f42-409d-ae41-9bfc70d1e038"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:25:23 crc kubenswrapper[4898]: I0313 14:25:23.875843 4898 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Mar 13 14:25:23 crc kubenswrapper[4898]: I0313 14:25:23.875998 4898 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-e3ef05d8-9def-4ed4-8424-3de1bece7b2d" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e3ef05d8-9def-4ed4-8424-3de1bece7b2d") on node "crc" Mar 13 14:25:23 crc kubenswrapper[4898]: I0313 14:25:23.901512 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d56bd826-4f42-409d-ae41-9bfc70d1e038-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "d56bd826-4f42-409d-ae41-9bfc70d1e038" (UID: "d56bd826-4f42-409d-ae41-9bfc70d1e038"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:25:23 crc kubenswrapper[4898]: I0313 14:25:23.910913 4898 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d56bd826-4f42-409d-ae41-9bfc70d1e038-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:23 crc kubenswrapper[4898]: I0313 14:25:23.910945 4898 reconciler_common.go:293] "Volume detached for volume \"pvc-e3ef05d8-9def-4ed4-8424-3de1bece7b2d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e3ef05d8-9def-4ed4-8424-3de1bece7b2d\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:23 crc kubenswrapper[4898]: I0313 14:25:23.910959 4898 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d56bd826-4f42-409d-ae41-9bfc70d1e038-server-conf\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:24 crc kubenswrapper[4898]: E0313 14:25:24.297035 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested" Mar 13 14:25:24 crc kubenswrapper[4898]: E0313 14:25:24.297319 4898 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = 
Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested" Mar 13 14:25:24 crc kubenswrapper[4898]: E0313 14:25:24.297459 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:heat-db-sync,Image:quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested,Command:[/bin/bash],Args:[-c /usr/bin/heat-manage --config-dir /etc/heat/heat.conf.d db_sync],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/heat/heat.conf.d/00-default.conf,SubPath:00-default.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/heat/heat.conf.d/01-custom.conf,SubPath:01-custom.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-z25rd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL 
MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42418,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42418,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-db-sync-kxtcf_openstack(2cd78a2a-1bb4-461a-92cd-d705080b087a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 14:25:24 crc kubenswrapper[4898]: E0313 14:25:24.298663 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/heat-db-sync-kxtcf" podUID="2cd78a2a-1bb4-461a-92cd-d705080b087a" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.348788 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d56bd826-4f42-409d-ae41-9bfc70d1e038","Type":"ContainerDied","Data":"8cd6b4a73f7f67c36783e2cd3de871dd93389c4f889e74a44de4a7253a7e9a9c"} Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.348843 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.367270 4898 scope.go:117] "RemoveContainer" containerID="319d11416db34d4c2bde21b35bf9b79fc6c55b22cfe14271a9be5dde11f3c078" Mar 13 14:25:24 crc kubenswrapper[4898]: E0313 14:25:24.371740 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested\\\"\"" pod="openstack/heat-db-sync-kxtcf" podUID="2cd78a2a-1bb4-461a-92cd-d705080b087a" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.411064 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.440926 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.459897 4898 scope.go:117] "RemoveContainer" containerID="d377b62f42012aae1789077dde2b4c09f8f770f73f941f01fe11eb21f5b88378" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.469690 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 14:25:24 crc kubenswrapper[4898]: E0313 14:25:24.470595 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d56bd826-4f42-409d-ae41-9bfc70d1e038" containerName="setup-container" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.470750 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="d56bd826-4f42-409d-ae41-9bfc70d1e038" containerName="setup-container" Mar 13 14:25:24 crc kubenswrapper[4898]: E0313 14:25:24.470801 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d56bd826-4f42-409d-ae41-9bfc70d1e038" containerName="rabbitmq" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.470809 4898 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="d56bd826-4f42-409d-ae41-9bfc70d1e038" containerName="rabbitmq" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.471074 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="d56bd826-4f42-409d-ae41-9bfc70d1e038" containerName="rabbitmq" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.472386 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.475848 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.476731 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.477113 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.477183 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.477121 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.477315 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-4m6nk" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.477512 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.482756 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.636217 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.636585 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.636645 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.636673 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.636745 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.636798 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.636828 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.636887 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.636951 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-e3ef05d8-9def-4ed4-8424-3de1bece7b2d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e3ef05d8-9def-4ed4-8424-3de1bece7b2d\") pod \"rabbitmq-cell1-server-0\" (UID: \"6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.637084 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.637114 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f42nr\" (UniqueName: \"kubernetes.io/projected/6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835-kube-api-access-f42nr\") pod \"rabbitmq-cell1-server-0\" (UID: \"6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.728631 4898 scope.go:117] "RemoveContainer" containerID="cb002d235371a7e7beebe07dd448307d31c6dae66e8fbd1dd6c0c499e634cca9" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.741287 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-e3ef05d8-9def-4ed4-8424-3de1bece7b2d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e3ef05d8-9def-4ed4-8424-3de1bece7b2d\") pod \"rabbitmq-cell1-server-0\" (UID: \"6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.741448 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.741490 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f42nr\" (UniqueName: \"kubernetes.io/projected/6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835-kube-api-access-f42nr\") pod \"rabbitmq-cell1-server-0\" (UID: \"6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.741590 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.741654 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.741701 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.741728 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.741792 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.741840 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:25:24 
crc kubenswrapper[4898]: I0313 14:25:24.741873 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.741970 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.746494 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.747372 4898 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.747509 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-e3ef05d8-9def-4ed4-8424-3de1bece7b2d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e3ef05d8-9def-4ed4-8424-3de1bece7b2d\") pod \"rabbitmq-cell1-server-0\" (UID: \"6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/3111f327615e010747f22a13f9378eff3b7d96c403da97ea4361402b1c85d196/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.765855 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.766134 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.771103 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.773585 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835-rabbitmq-plugins\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.783702 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.783923 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.785609 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f42nr\" (UniqueName: \"kubernetes.io/projected/6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835-kube-api-access-f42nr\") pod \"rabbitmq-cell1-server-0\" (UID: \"6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.785806 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.786045 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:25:24 crc 
kubenswrapper[4898]: I0313 14:25:24.823685 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-e3ef05d8-9def-4ed4-8424-3de1bece7b2d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e3ef05d8-9def-4ed4-8424-3de1bece7b2d\") pod \"rabbitmq-cell1-server-0\" (UID: \"6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.863312 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.978838 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d84b4d45c-r4rvf"] Mar 13 14:25:25 crc kubenswrapper[4898]: I0313 14:25:25.007732 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-2"] Mar 13 14:25:25 crc kubenswrapper[4898]: I0313 14:25:25.376114 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 14:25:25 crc kubenswrapper[4898]: I0313 14:25:25.376493 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"8d188301-848c-4cf6-a204-e1110714c1be","Type":"ContainerStarted","Data":"d6acba7cd4117378d7a97b387783250322356fae61cc48751a89151539d61d29"} Mar 13 14:25:25 crc kubenswrapper[4898]: I0313 14:25:25.378292 4898 generic.go:334] "Generic (PLEG): container finished" podID="703503be-2f03-4e95-b4ba-ebdd30b717ee" containerID="ab7dee171473df88511004b0f9cd06f3de427bbb59dba778bc1dbd3f8e29abeb" exitCode=0 Mar 13 14:25:25 crc kubenswrapper[4898]: I0313 14:25:25.378328 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d84b4d45c-r4rvf" event={"ID":"703503be-2f03-4e95-b4ba-ebdd30b717ee","Type":"ContainerDied","Data":"ab7dee171473df88511004b0f9cd06f3de427bbb59dba778bc1dbd3f8e29abeb"} Mar 13 14:25:25 crc kubenswrapper[4898]: I0313 14:25:25.378367 4898 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d84b4d45c-r4rvf" event={"ID":"703503be-2f03-4e95-b4ba-ebdd30b717ee","Type":"ContainerStarted","Data":"156e4c323450d64b57cc91d4cd576fcfcc3344435ba7b3b650ea24a1251763ee"} Mar 13 14:25:25 crc kubenswrapper[4898]: I0313 14:25:25.384650 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"02f7d483-aecb-4a39-babc-6d9598090c4b","Type":"ContainerStarted","Data":"7ce3438fd9d4e3db5da0a65bc2744dcbc537f4c6c98b27b1433e8ed964fd1ed3"} Mar 13 14:25:25 crc kubenswrapper[4898]: I0313 14:25:25.759954 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d56bd826-4f42-409d-ae41-9bfc70d1e038" path="/var/lib/kubelet/pods/d56bd826-4f42-409d-ae41-9bfc70d1e038/volumes" Mar 13 14:25:26 crc kubenswrapper[4898]: I0313 14:25:26.399582 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835","Type":"ContainerStarted","Data":"e8857c4721e5f8ce1de5ec7a35488e4664a881af5e5f3ad6d2772e453cd83c85"} Mar 13 14:25:26 crc kubenswrapper[4898]: I0313 14:25:26.402025 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d84b4d45c-r4rvf" event={"ID":"703503be-2f03-4e95-b4ba-ebdd30b717ee","Type":"ContainerStarted","Data":"0b355d36acba975b937e3513e16f6c9b056100e8a7cb8f4e84d596f542de18b5"} Mar 13 14:25:26 crc kubenswrapper[4898]: I0313 14:25:26.402229 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7d84b4d45c-r4rvf" Mar 13 14:25:26 crc kubenswrapper[4898]: I0313 14:25:26.404115 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"02f7d483-aecb-4a39-babc-6d9598090c4b","Type":"ContainerStarted","Data":"b4732cf2586a8c63bfd4f4a4eb216ad6c43d632d96ee4b66b2126f4895cf7dd1"} Mar 13 14:25:26 crc kubenswrapper[4898]: I0313 14:25:26.426953 4898 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7d84b4d45c-r4rvf" podStartSLOduration=13.426926024 podStartE2EDuration="13.426926024s" podCreationTimestamp="2026-03-13 14:25:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:25:26.420845365 +0000 UTC m=+1761.422433654" watchObservedRunningTime="2026-03-13 14:25:26.426926024 +0000 UTC m=+1761.428514263" Mar 13 14:25:27 crc kubenswrapper[4898]: I0313 14:25:27.416633 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"8d188301-848c-4cf6-a204-e1110714c1be","Type":"ContainerStarted","Data":"0d8797262833812626f4d3e0e1db3d064a9feac6dd8c4aab149c760269a9a573"} Mar 13 14:25:27 crc kubenswrapper[4898]: I0313 14:25:27.420090 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"02f7d483-aecb-4a39-babc-6d9598090c4b","Type":"ContainerStarted","Data":"20a7d66d4eadaac13a3d1530dfffc36cf690cb176befcda52b573d0e1cd9e142"} Mar 13 14:25:27 crc kubenswrapper[4898]: I0313 14:25:27.423470 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835","Type":"ContainerStarted","Data":"6ac994a64cbced8d5ed2ad37e427a3eeb5d4669d67bcb7a943f6946233be58c4"} Mar 13 14:25:29 crc kubenswrapper[4898]: I0313 14:25:29.453285 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"02f7d483-aecb-4a39-babc-6d9598090c4b","Type":"ContainerStarted","Data":"b4437a37a66416890e0b218d39962696089d044b5aa8cf8e7b428a548d5a2914"} Mar 13 14:25:29 crc kubenswrapper[4898]: I0313 14:25:29.454030 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 13 14:25:29 crc kubenswrapper[4898]: I0313 14:25:29.484822 4898 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.067253541 podStartE2EDuration="23.484791785s" podCreationTimestamp="2026-03-13 14:25:06 +0000 UTC" firstStartedPulling="2026-03-13 14:25:07.017676491 +0000 UTC m=+1742.019264740" lastFinishedPulling="2026-03-13 14:25:28.435214735 +0000 UTC m=+1763.436802984" observedRunningTime="2026-03-13 14:25:29.478071278 +0000 UTC m=+1764.479659607" watchObservedRunningTime="2026-03-13 14:25:29.484791785 +0000 UTC m=+1764.486380064" Mar 13 14:25:34 crc kubenswrapper[4898]: I0313 14:25:34.283044 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7d84b4d45c-r4rvf" Mar 13 14:25:34 crc kubenswrapper[4898]: I0313 14:25:34.377850 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-bsswg"] Mar 13 14:25:34 crc kubenswrapper[4898]: I0313 14:25:34.378115 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b7bbf7cf9-bsswg" podUID="c1d23b78-5402-47e0-8af6-851fcc71be6b" containerName="dnsmasq-dns" containerID="cri-o://4e87cc2a9eb5ac3a94e1731b9ead6b0d6cdba2ef55c6b43916f80ca58fe1a32b" gracePeriod=10 Mar 13 14:25:34 crc kubenswrapper[4898]: I0313 14:25:34.556323 4898 generic.go:334] "Generic (PLEG): container finished" podID="c1d23b78-5402-47e0-8af6-851fcc71be6b" containerID="4e87cc2a9eb5ac3a94e1731b9ead6b0d6cdba2ef55c6b43916f80ca58fe1a32b" exitCode=0 Mar 13 14:25:34 crc kubenswrapper[4898]: I0313 14:25:34.556375 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7bbf7cf9-bsswg" event={"ID":"c1d23b78-5402-47e0-8af6-851fcc71be6b","Type":"ContainerDied","Data":"4e87cc2a9eb5ac3a94e1731b9ead6b0d6cdba2ef55c6b43916f80ca58fe1a32b"} Mar 13 14:25:34 crc kubenswrapper[4898]: I0313 14:25:34.611957 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6f6df4f56c-k4ntr"] Mar 13 14:25:34 crc kubenswrapper[4898]: I0313 14:25:34.614039 4898 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f6df4f56c-k4ntr" Mar 13 14:25:34 crc kubenswrapper[4898]: I0313 14:25:34.631625 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f6df4f56c-k4ntr"] Mar 13 14:25:34 crc kubenswrapper[4898]: I0313 14:25:34.735811 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjhl4\" (UniqueName: \"kubernetes.io/projected/dd51a575-1651-4891-941f-3e0fe447e81d-kube-api-access-hjhl4\") pod \"dnsmasq-dns-6f6df4f56c-k4ntr\" (UID: \"dd51a575-1651-4891-941f-3e0fe447e81d\") " pod="openstack/dnsmasq-dns-6f6df4f56c-k4ntr" Mar 13 14:25:34 crc kubenswrapper[4898]: I0313 14:25:34.736116 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd51a575-1651-4891-941f-3e0fe447e81d-config\") pod \"dnsmasq-dns-6f6df4f56c-k4ntr\" (UID: \"dd51a575-1651-4891-941f-3e0fe447e81d\") " pod="openstack/dnsmasq-dns-6f6df4f56c-k4ntr" Mar 13 14:25:34 crc kubenswrapper[4898]: I0313 14:25:34.736147 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd51a575-1651-4891-941f-3e0fe447e81d-dns-svc\") pod \"dnsmasq-dns-6f6df4f56c-k4ntr\" (UID: \"dd51a575-1651-4891-941f-3e0fe447e81d\") " pod="openstack/dnsmasq-dns-6f6df4f56c-k4ntr" Mar 13 14:25:34 crc kubenswrapper[4898]: I0313 14:25:34.736239 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dd51a575-1651-4891-941f-3e0fe447e81d-ovsdbserver-nb\") pod \"dnsmasq-dns-6f6df4f56c-k4ntr\" (UID: \"dd51a575-1651-4891-941f-3e0fe447e81d\") " pod="openstack/dnsmasq-dns-6f6df4f56c-k4ntr" Mar 13 14:25:34 crc kubenswrapper[4898]: I0313 14:25:34.736281 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/dd51a575-1651-4891-941f-3e0fe447e81d-openstack-edpm-ipam\") pod \"dnsmasq-dns-6f6df4f56c-k4ntr\" (UID: \"dd51a575-1651-4891-941f-3e0fe447e81d\") " pod="openstack/dnsmasq-dns-6f6df4f56c-k4ntr" Mar 13 14:25:34 crc kubenswrapper[4898]: I0313 14:25:34.736307 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dd51a575-1651-4891-941f-3e0fe447e81d-ovsdbserver-sb\") pod \"dnsmasq-dns-6f6df4f56c-k4ntr\" (UID: \"dd51a575-1651-4891-941f-3e0fe447e81d\") " pod="openstack/dnsmasq-dns-6f6df4f56c-k4ntr" Mar 13 14:25:34 crc kubenswrapper[4898]: I0313 14:25:34.736412 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dd51a575-1651-4891-941f-3e0fe447e81d-dns-swift-storage-0\") pod \"dnsmasq-dns-6f6df4f56c-k4ntr\" (UID: \"dd51a575-1651-4891-941f-3e0fe447e81d\") " pod="openstack/dnsmasq-dns-6f6df4f56c-k4ntr" Mar 13 14:25:34 crc kubenswrapper[4898]: I0313 14:25:34.840861 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjhl4\" (UniqueName: \"kubernetes.io/projected/dd51a575-1651-4891-941f-3e0fe447e81d-kube-api-access-hjhl4\") pod \"dnsmasq-dns-6f6df4f56c-k4ntr\" (UID: \"dd51a575-1651-4891-941f-3e0fe447e81d\") " pod="openstack/dnsmasq-dns-6f6df4f56c-k4ntr" Mar 13 14:25:34 crc kubenswrapper[4898]: I0313 14:25:34.841308 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd51a575-1651-4891-941f-3e0fe447e81d-config\") pod \"dnsmasq-dns-6f6df4f56c-k4ntr\" (UID: \"dd51a575-1651-4891-941f-3e0fe447e81d\") " pod="openstack/dnsmasq-dns-6f6df4f56c-k4ntr" Mar 13 14:25:34 crc kubenswrapper[4898]: I0313 14:25:34.841331 4898 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd51a575-1651-4891-941f-3e0fe447e81d-dns-svc\") pod \"dnsmasq-dns-6f6df4f56c-k4ntr\" (UID: \"dd51a575-1651-4891-941f-3e0fe447e81d\") " pod="openstack/dnsmasq-dns-6f6df4f56c-k4ntr" Mar 13 14:25:34 crc kubenswrapper[4898]: I0313 14:25:34.841399 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dd51a575-1651-4891-941f-3e0fe447e81d-ovsdbserver-nb\") pod \"dnsmasq-dns-6f6df4f56c-k4ntr\" (UID: \"dd51a575-1651-4891-941f-3e0fe447e81d\") " pod="openstack/dnsmasq-dns-6f6df4f56c-k4ntr" Mar 13 14:25:34 crc kubenswrapper[4898]: I0313 14:25:34.841439 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/dd51a575-1651-4891-941f-3e0fe447e81d-openstack-edpm-ipam\") pod \"dnsmasq-dns-6f6df4f56c-k4ntr\" (UID: \"dd51a575-1651-4891-941f-3e0fe447e81d\") " pod="openstack/dnsmasq-dns-6f6df4f56c-k4ntr" Mar 13 14:25:34 crc kubenswrapper[4898]: I0313 14:25:34.841456 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dd51a575-1651-4891-941f-3e0fe447e81d-ovsdbserver-sb\") pod \"dnsmasq-dns-6f6df4f56c-k4ntr\" (UID: \"dd51a575-1651-4891-941f-3e0fe447e81d\") " pod="openstack/dnsmasq-dns-6f6df4f56c-k4ntr" Mar 13 14:25:34 crc kubenswrapper[4898]: I0313 14:25:34.841524 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dd51a575-1651-4891-941f-3e0fe447e81d-dns-swift-storage-0\") pod \"dnsmasq-dns-6f6df4f56c-k4ntr\" (UID: \"dd51a575-1651-4891-941f-3e0fe447e81d\") " pod="openstack/dnsmasq-dns-6f6df4f56c-k4ntr" Mar 13 14:25:34 crc kubenswrapper[4898]: I0313 14:25:34.842336 4898 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dd51a575-1651-4891-941f-3e0fe447e81d-dns-swift-storage-0\") pod \"dnsmasq-dns-6f6df4f56c-k4ntr\" (UID: \"dd51a575-1651-4891-941f-3e0fe447e81d\") " pod="openstack/dnsmasq-dns-6f6df4f56c-k4ntr" Mar 13 14:25:34 crc kubenswrapper[4898]: I0313 14:25:34.843076 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dd51a575-1651-4891-941f-3e0fe447e81d-ovsdbserver-nb\") pod \"dnsmasq-dns-6f6df4f56c-k4ntr\" (UID: \"dd51a575-1651-4891-941f-3e0fe447e81d\") " pod="openstack/dnsmasq-dns-6f6df4f56c-k4ntr" Mar 13 14:25:34 crc kubenswrapper[4898]: I0313 14:25:34.843265 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/dd51a575-1651-4891-941f-3e0fe447e81d-openstack-edpm-ipam\") pod \"dnsmasq-dns-6f6df4f56c-k4ntr\" (UID: \"dd51a575-1651-4891-941f-3e0fe447e81d\") " pod="openstack/dnsmasq-dns-6f6df4f56c-k4ntr" Mar 13 14:25:34 crc kubenswrapper[4898]: I0313 14:25:34.843761 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd51a575-1651-4891-941f-3e0fe447e81d-config\") pod \"dnsmasq-dns-6f6df4f56c-k4ntr\" (UID: \"dd51a575-1651-4891-941f-3e0fe447e81d\") " pod="openstack/dnsmasq-dns-6f6df4f56c-k4ntr" Mar 13 14:25:34 crc kubenswrapper[4898]: I0313 14:25:34.845378 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd51a575-1651-4891-941f-3e0fe447e81d-dns-svc\") pod \"dnsmasq-dns-6f6df4f56c-k4ntr\" (UID: \"dd51a575-1651-4891-941f-3e0fe447e81d\") " pod="openstack/dnsmasq-dns-6f6df4f56c-k4ntr" Mar 13 14:25:34 crc kubenswrapper[4898]: I0313 14:25:34.845474 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/dd51a575-1651-4891-941f-3e0fe447e81d-ovsdbserver-sb\") pod \"dnsmasq-dns-6f6df4f56c-k4ntr\" (UID: \"dd51a575-1651-4891-941f-3e0fe447e81d\") " pod="openstack/dnsmasq-dns-6f6df4f56c-k4ntr" Mar 13 14:25:34 crc kubenswrapper[4898]: I0313 14:25:34.874532 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjhl4\" (UniqueName: \"kubernetes.io/projected/dd51a575-1651-4891-941f-3e0fe447e81d-kube-api-access-hjhl4\") pod \"dnsmasq-dns-6f6df4f56c-k4ntr\" (UID: \"dd51a575-1651-4891-941f-3e0fe447e81d\") " pod="openstack/dnsmasq-dns-6f6df4f56c-k4ntr" Mar 13 14:25:34 crc kubenswrapper[4898]: I0313 14:25:34.942594 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f6df4f56c-k4ntr" Mar 13 14:25:35 crc kubenswrapper[4898]: I0313 14:25:35.299478 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7bbf7cf9-bsswg" Mar 13 14:25:35 crc kubenswrapper[4898]: I0313 14:25:35.393990 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zg4fs\" (UniqueName: \"kubernetes.io/projected/c1d23b78-5402-47e0-8af6-851fcc71be6b-kube-api-access-zg4fs\") pod \"c1d23b78-5402-47e0-8af6-851fcc71be6b\" (UID: \"c1d23b78-5402-47e0-8af6-851fcc71be6b\") " Mar 13 14:25:35 crc kubenswrapper[4898]: I0313 14:25:35.394080 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1d23b78-5402-47e0-8af6-851fcc71be6b-config\") pod \"c1d23b78-5402-47e0-8af6-851fcc71be6b\" (UID: \"c1d23b78-5402-47e0-8af6-851fcc71be6b\") " Mar 13 14:25:35 crc kubenswrapper[4898]: I0313 14:25:35.394147 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c1d23b78-5402-47e0-8af6-851fcc71be6b-dns-svc\") pod \"c1d23b78-5402-47e0-8af6-851fcc71be6b\" (UID: 
\"c1d23b78-5402-47e0-8af6-851fcc71be6b\") " Mar 13 14:25:35 crc kubenswrapper[4898]: I0313 14:25:35.394272 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c1d23b78-5402-47e0-8af6-851fcc71be6b-dns-swift-storage-0\") pod \"c1d23b78-5402-47e0-8af6-851fcc71be6b\" (UID: \"c1d23b78-5402-47e0-8af6-851fcc71be6b\") " Mar 13 14:25:35 crc kubenswrapper[4898]: I0313 14:25:35.394397 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c1d23b78-5402-47e0-8af6-851fcc71be6b-ovsdbserver-sb\") pod \"c1d23b78-5402-47e0-8af6-851fcc71be6b\" (UID: \"c1d23b78-5402-47e0-8af6-851fcc71be6b\") " Mar 13 14:25:35 crc kubenswrapper[4898]: I0313 14:25:35.394461 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c1d23b78-5402-47e0-8af6-851fcc71be6b-ovsdbserver-nb\") pod \"c1d23b78-5402-47e0-8af6-851fcc71be6b\" (UID: \"c1d23b78-5402-47e0-8af6-851fcc71be6b\") " Mar 13 14:25:35 crc kubenswrapper[4898]: I0313 14:25:35.419229 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1d23b78-5402-47e0-8af6-851fcc71be6b-kube-api-access-zg4fs" (OuterVolumeSpecName: "kube-api-access-zg4fs") pod "c1d23b78-5402-47e0-8af6-851fcc71be6b" (UID: "c1d23b78-5402-47e0-8af6-851fcc71be6b"). InnerVolumeSpecName "kube-api-access-zg4fs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:25:35 crc kubenswrapper[4898]: I0313 14:25:35.495085 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1d23b78-5402-47e0-8af6-851fcc71be6b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c1d23b78-5402-47e0-8af6-851fcc71be6b" (UID: "c1d23b78-5402-47e0-8af6-851fcc71be6b"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:25:35 crc kubenswrapper[4898]: I0313 14:25:35.496358 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c1d23b78-5402-47e0-8af6-851fcc71be6b-ovsdbserver-sb\") pod \"c1d23b78-5402-47e0-8af6-851fcc71be6b\" (UID: \"c1d23b78-5402-47e0-8af6-851fcc71be6b\") " Mar 13 14:25:35 crc kubenswrapper[4898]: W0313 14:25:35.496518 4898 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/c1d23b78-5402-47e0-8af6-851fcc71be6b/volumes/kubernetes.io~configmap/ovsdbserver-sb Mar 13 14:25:35 crc kubenswrapper[4898]: I0313 14:25:35.496552 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1d23b78-5402-47e0-8af6-851fcc71be6b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c1d23b78-5402-47e0-8af6-851fcc71be6b" (UID: "c1d23b78-5402-47e0-8af6-851fcc71be6b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:25:35 crc kubenswrapper[4898]: I0313 14:25:35.497254 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c1d23b78-5402-47e0-8af6-851fcc71be6b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:35 crc kubenswrapper[4898]: I0313 14:25:35.497276 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zg4fs\" (UniqueName: \"kubernetes.io/projected/c1d23b78-5402-47e0-8af6-851fcc71be6b-kube-api-access-zg4fs\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:35 crc kubenswrapper[4898]: I0313 14:25:35.497751 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1d23b78-5402-47e0-8af6-851fcc71be6b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c1d23b78-5402-47e0-8af6-851fcc71be6b" (UID: "c1d23b78-5402-47e0-8af6-851fcc71be6b"). 
InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:25:35 crc kubenswrapper[4898]: I0313 14:25:35.502324 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1d23b78-5402-47e0-8af6-851fcc71be6b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c1d23b78-5402-47e0-8af6-851fcc71be6b" (UID: "c1d23b78-5402-47e0-8af6-851fcc71be6b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:25:35 crc kubenswrapper[4898]: I0313 14:25:35.503602 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1d23b78-5402-47e0-8af6-851fcc71be6b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c1d23b78-5402-47e0-8af6-851fcc71be6b" (UID: "c1d23b78-5402-47e0-8af6-851fcc71be6b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:25:35 crc kubenswrapper[4898]: I0313 14:25:35.519022 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1d23b78-5402-47e0-8af6-851fcc71be6b-config" (OuterVolumeSpecName: "config") pod "c1d23b78-5402-47e0-8af6-851fcc71be6b" (UID: "c1d23b78-5402-47e0-8af6-851fcc71be6b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:25:35 crc kubenswrapper[4898]: I0313 14:25:35.568852 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7bbf7cf9-bsswg" event={"ID":"c1d23b78-5402-47e0-8af6-851fcc71be6b","Type":"ContainerDied","Data":"0c1b795a0a1b5e26820a1d2e36ef2d08e255211a4eb9dbe6c50123bc0c988977"} Mar 13 14:25:35 crc kubenswrapper[4898]: I0313 14:25:35.568923 4898 scope.go:117] "RemoveContainer" containerID="4e87cc2a9eb5ac3a94e1731b9ead6b0d6cdba2ef55c6b43916f80ca58fe1a32b" Mar 13 14:25:35 crc kubenswrapper[4898]: I0313 14:25:35.568966 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7bbf7cf9-bsswg" Mar 13 14:25:35 crc kubenswrapper[4898]: I0313 14:25:35.600074 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1d23b78-5402-47e0-8af6-851fcc71be6b-config\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:35 crc kubenswrapper[4898]: I0313 14:25:35.600107 4898 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c1d23b78-5402-47e0-8af6-851fcc71be6b-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:35 crc kubenswrapper[4898]: I0313 14:25:35.600118 4898 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c1d23b78-5402-47e0-8af6-851fcc71be6b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:35 crc kubenswrapper[4898]: I0313 14:25:35.600131 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c1d23b78-5402-47e0-8af6-851fcc71be6b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:35 crc kubenswrapper[4898]: I0313 14:25:35.647407 4898 scope.go:117] "RemoveContainer" containerID="7bda3f3e696b51ddd844a86367f80dfd5aee6675d49dc2996a124d70519d4d20" Mar 13 14:25:35 crc 
kubenswrapper[4898]: I0313 14:25:35.648430 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-bsswg"] Mar 13 14:25:35 crc kubenswrapper[4898]: I0313 14:25:35.660765 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-bsswg"] Mar 13 14:25:35 crc kubenswrapper[4898]: I0313 14:25:35.703019 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f6df4f56c-k4ntr"] Mar 13 14:25:35 crc kubenswrapper[4898]: I0313 14:25:35.769108 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1d23b78-5402-47e0-8af6-851fcc71be6b" path="/var/lib/kubelet/pods/c1d23b78-5402-47e0-8af6-851fcc71be6b/volumes" Mar 13 14:25:36 crc kubenswrapper[4898]: I0313 14:25:36.584770 4898 generic.go:334] "Generic (PLEG): container finished" podID="dd51a575-1651-4891-941f-3e0fe447e81d" containerID="30c39b4868287edbf1ca5987a81c4b7f0ae4a1a7b5fc27bf801a7c4d60e0b3b8" exitCode=0 Mar 13 14:25:36 crc kubenswrapper[4898]: I0313 14:25:36.584826 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f6df4f56c-k4ntr" event={"ID":"dd51a575-1651-4891-941f-3e0fe447e81d","Type":"ContainerDied","Data":"30c39b4868287edbf1ca5987a81c4b7f0ae4a1a7b5fc27bf801a7c4d60e0b3b8"} Mar 13 14:25:36 crc kubenswrapper[4898]: I0313 14:25:36.585309 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f6df4f56c-k4ntr" event={"ID":"dd51a575-1651-4891-941f-3e0fe447e81d","Type":"ContainerStarted","Data":"7bc470273f1aa147ee917555d3600cdc1fb0566974c6cb84da3c33025400b8dd"} Mar 13 14:25:36 crc kubenswrapper[4898]: I0313 14:25:36.739995 4898 scope.go:117] "RemoveContainer" containerID="31ae8593afc62c01205d22940ebe7136407da3a6c051010917ba949bb52866cc" Mar 13 14:25:36 crc kubenswrapper[4898]: E0313 14:25:36.741539 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:25:37 crc kubenswrapper[4898]: I0313 14:25:37.598344 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f6df4f56c-k4ntr" event={"ID":"dd51a575-1651-4891-941f-3e0fe447e81d","Type":"ContainerStarted","Data":"b16ba020ad81204120e532385217b9c5822d096a3364d32b3184ff5caa745829"} Mar 13 14:25:37 crc kubenswrapper[4898]: I0313 14:25:37.598590 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6f6df4f56c-k4ntr" Mar 13 14:25:37 crc kubenswrapper[4898]: I0313 14:25:37.600966 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-kxtcf" event={"ID":"2cd78a2a-1bb4-461a-92cd-d705080b087a","Type":"ContainerStarted","Data":"797b7877d34540edd204a0c5f49e93f47ceac114fc9a4ba968964ef3cae04ffa"} Mar 13 14:25:37 crc kubenswrapper[4898]: I0313 14:25:37.648504 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6f6df4f56c-k4ntr" podStartSLOduration=3.64848231 podStartE2EDuration="3.64848231s" podCreationTimestamp="2026-03-13 14:25:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:25:37.636936806 +0000 UTC m=+1772.638525045" watchObservedRunningTime="2026-03-13 14:25:37.64848231 +0000 UTC m=+1772.650070549" Mar 13 14:25:37 crc kubenswrapper[4898]: I0313 14:25:37.670039 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-kxtcf" podStartSLOduration=2.105986108 podStartE2EDuration="40.670023916s" podCreationTimestamp="2026-03-13 14:24:57 +0000 UTC" firstStartedPulling="2026-03-13 14:24:58.37560807 +0000 
UTC m=+1733.377196309" lastFinishedPulling="2026-03-13 14:25:36.939645878 +0000 UTC m=+1771.941234117" observedRunningTime="2026-03-13 14:25:37.662443487 +0000 UTC m=+1772.664031736" watchObservedRunningTime="2026-03-13 14:25:37.670023916 +0000 UTC m=+1772.671612155" Mar 13 14:25:39 crc kubenswrapper[4898]: I0313 14:25:39.629618 4898 generic.go:334] "Generic (PLEG): container finished" podID="2cd78a2a-1bb4-461a-92cd-d705080b087a" containerID="797b7877d34540edd204a0c5f49e93f47ceac114fc9a4ba968964ef3cae04ffa" exitCode=0 Mar 13 14:25:39 crc kubenswrapper[4898]: I0313 14:25:39.629689 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-kxtcf" event={"ID":"2cd78a2a-1bb4-461a-92cd-d705080b087a","Type":"ContainerDied","Data":"797b7877d34540edd204a0c5f49e93f47ceac114fc9a4ba968964ef3cae04ffa"} Mar 13 14:25:41 crc kubenswrapper[4898]: I0313 14:25:41.144860 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-kxtcf" Mar 13 14:25:41 crc kubenswrapper[4898]: I0313 14:25:41.248777 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z25rd\" (UniqueName: \"kubernetes.io/projected/2cd78a2a-1bb4-461a-92cd-d705080b087a-kube-api-access-z25rd\") pod \"2cd78a2a-1bb4-461a-92cd-d705080b087a\" (UID: \"2cd78a2a-1bb4-461a-92cd-d705080b087a\") " Mar 13 14:25:41 crc kubenswrapper[4898]: I0313 14:25:41.249013 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cd78a2a-1bb4-461a-92cd-d705080b087a-combined-ca-bundle\") pod \"2cd78a2a-1bb4-461a-92cd-d705080b087a\" (UID: \"2cd78a2a-1bb4-461a-92cd-d705080b087a\") " Mar 13 14:25:41 crc kubenswrapper[4898]: I0313 14:25:41.249042 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cd78a2a-1bb4-461a-92cd-d705080b087a-config-data\") pod 
\"2cd78a2a-1bb4-461a-92cd-d705080b087a\" (UID: \"2cd78a2a-1bb4-461a-92cd-d705080b087a\") " Mar 13 14:25:41 crc kubenswrapper[4898]: I0313 14:25:41.255328 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cd78a2a-1bb4-461a-92cd-d705080b087a-kube-api-access-z25rd" (OuterVolumeSpecName: "kube-api-access-z25rd") pod "2cd78a2a-1bb4-461a-92cd-d705080b087a" (UID: "2cd78a2a-1bb4-461a-92cd-d705080b087a"). InnerVolumeSpecName "kube-api-access-z25rd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:25:41 crc kubenswrapper[4898]: I0313 14:25:41.285681 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cd78a2a-1bb4-461a-92cd-d705080b087a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2cd78a2a-1bb4-461a-92cd-d705080b087a" (UID: "2cd78a2a-1bb4-461a-92cd-d705080b087a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:25:41 crc kubenswrapper[4898]: I0313 14:25:41.349874 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cd78a2a-1bb4-461a-92cd-d705080b087a-config-data" (OuterVolumeSpecName: "config-data") pod "2cd78a2a-1bb4-461a-92cd-d705080b087a" (UID: "2cd78a2a-1bb4-461a-92cd-d705080b087a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:25:41 crc kubenswrapper[4898]: I0313 14:25:41.352184 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z25rd\" (UniqueName: \"kubernetes.io/projected/2cd78a2a-1bb4-461a-92cd-d705080b087a-kube-api-access-z25rd\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:41 crc kubenswrapper[4898]: I0313 14:25:41.352221 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cd78a2a-1bb4-461a-92cd-d705080b087a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:41 crc kubenswrapper[4898]: I0313 14:25:41.352235 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cd78a2a-1bb4-461a-92cd-d705080b087a-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:41 crc kubenswrapper[4898]: I0313 14:25:41.657762 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-kxtcf" event={"ID":"2cd78a2a-1bb4-461a-92cd-d705080b087a","Type":"ContainerDied","Data":"f0ef8052f16886ece221ecf56528cf884da231c4fa187db604454c6c5925f956"} Mar 13 14:25:41 crc kubenswrapper[4898]: I0313 14:25:41.658141 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f0ef8052f16886ece221ecf56528cf884da231c4fa187db604454c6c5925f956" Mar 13 14:25:41 crc kubenswrapper[4898]: I0313 14:25:41.658065 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-kxtcf" Mar 13 14:25:42 crc kubenswrapper[4898]: I0313 14:25:42.705718 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-6b446d7755-5724r"] Mar 13 14:25:42 crc kubenswrapper[4898]: E0313 14:25:42.706350 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1d23b78-5402-47e0-8af6-851fcc71be6b" containerName="dnsmasq-dns" Mar 13 14:25:42 crc kubenswrapper[4898]: I0313 14:25:42.706367 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1d23b78-5402-47e0-8af6-851fcc71be6b" containerName="dnsmasq-dns" Mar 13 14:25:42 crc kubenswrapper[4898]: E0313 14:25:42.706402 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cd78a2a-1bb4-461a-92cd-d705080b087a" containerName="heat-db-sync" Mar 13 14:25:42 crc kubenswrapper[4898]: I0313 14:25:42.706414 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cd78a2a-1bb4-461a-92cd-d705080b087a" containerName="heat-db-sync" Mar 13 14:25:42 crc kubenswrapper[4898]: E0313 14:25:42.706434 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1d23b78-5402-47e0-8af6-851fcc71be6b" containerName="init" Mar 13 14:25:42 crc kubenswrapper[4898]: I0313 14:25:42.706446 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1d23b78-5402-47e0-8af6-851fcc71be6b" containerName="init" Mar 13 14:25:42 crc kubenswrapper[4898]: I0313 14:25:42.706788 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cd78a2a-1bb4-461a-92cd-d705080b087a" containerName="heat-db-sync" Mar 13 14:25:42 crc kubenswrapper[4898]: I0313 14:25:42.706817 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1d23b78-5402-47e0-8af6-851fcc71be6b" containerName="dnsmasq-dns" Mar 13 14:25:42 crc kubenswrapper[4898]: I0313 14:25:42.707916 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-6b446d7755-5724r" Mar 13 14:25:42 crc kubenswrapper[4898]: I0313 14:25:42.738862 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-6b446d7755-5724r"] Mar 13 14:25:42 crc kubenswrapper[4898]: I0313 14:25:42.789880 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-648cbb8b5f-4kb5b"] Mar 13 14:25:42 crc kubenswrapper[4898]: I0313 14:25:42.791964 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-648cbb8b5f-4kb5b" Mar 13 14:25:42 crc kubenswrapper[4898]: I0313 14:25:42.807924 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-648cbb8b5f-4kb5b"] Mar 13 14:25:42 crc kubenswrapper[4898]: I0313 14:25:42.828513 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-5df9b5999-7tt4b"] Mar 13 14:25:42 crc kubenswrapper[4898]: I0313 14:25:42.830432 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-5df9b5999-7tt4b" Mar 13 14:25:42 crc kubenswrapper[4898]: I0313 14:25:42.848093 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-5df9b5999-7tt4b"] Mar 13 14:25:42 crc kubenswrapper[4898]: I0313 14:25:42.894579 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/739e9c4a-9843-4edf-a045-2f7ef8d15b5e-public-tls-certs\") pod \"heat-api-648cbb8b5f-4kb5b\" (UID: \"739e9c4a-9843-4edf-a045-2f7ef8d15b5e\") " pod="openstack/heat-api-648cbb8b5f-4kb5b" Mar 13 14:25:42 crc kubenswrapper[4898]: I0313 14:25:42.894653 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwzqp\" (UniqueName: \"kubernetes.io/projected/0f20ec1d-823e-4695-859e-bdc538e602d9-kube-api-access-fwzqp\") pod \"heat-engine-6b446d7755-5724r\" (UID: \"0f20ec1d-823e-4695-859e-bdc538e602d9\") " pod="openstack/heat-engine-6b446d7755-5724r" Mar 13 14:25:42 crc kubenswrapper[4898]: I0313 14:25:42.894708 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f20ec1d-823e-4695-859e-bdc538e602d9-combined-ca-bundle\") pod \"heat-engine-6b446d7755-5724r\" (UID: \"0f20ec1d-823e-4695-859e-bdc538e602d9\") " pod="openstack/heat-engine-6b446d7755-5724r" Mar 13 14:25:42 crc kubenswrapper[4898]: I0313 14:25:42.894745 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f20ec1d-823e-4695-859e-bdc538e602d9-config-data\") pod \"heat-engine-6b446d7755-5724r\" (UID: \"0f20ec1d-823e-4695-859e-bdc538e602d9\") " pod="openstack/heat-engine-6b446d7755-5724r" Mar 13 14:25:42 crc kubenswrapper[4898]: I0313 14:25:42.894786 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/739e9c4a-9843-4edf-a045-2f7ef8d15b5e-config-data\") pod \"heat-api-648cbb8b5f-4kb5b\" (UID: \"739e9c4a-9843-4edf-a045-2f7ef8d15b5e\") " pod="openstack/heat-api-648cbb8b5f-4kb5b" Mar 13 14:25:42 crc kubenswrapper[4898]: I0313 14:25:42.894801 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/739e9c4a-9843-4edf-a045-2f7ef8d15b5e-combined-ca-bundle\") pod \"heat-api-648cbb8b5f-4kb5b\" (UID: \"739e9c4a-9843-4edf-a045-2f7ef8d15b5e\") " pod="openstack/heat-api-648cbb8b5f-4kb5b" Mar 13 14:25:42 crc kubenswrapper[4898]: I0313 14:25:42.894891 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0f20ec1d-823e-4695-859e-bdc538e602d9-config-data-custom\") pod \"heat-engine-6b446d7755-5724r\" (UID: \"0f20ec1d-823e-4695-859e-bdc538e602d9\") " pod="openstack/heat-engine-6b446d7755-5724r" Mar 13 14:25:42 crc kubenswrapper[4898]: I0313 14:25:42.894981 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/739e9c4a-9843-4edf-a045-2f7ef8d15b5e-internal-tls-certs\") pod \"heat-api-648cbb8b5f-4kb5b\" (UID: \"739e9c4a-9843-4edf-a045-2f7ef8d15b5e\") " pod="openstack/heat-api-648cbb8b5f-4kb5b" Mar 13 14:25:42 crc kubenswrapper[4898]: I0313 14:25:42.895021 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrjvx\" (UniqueName: \"kubernetes.io/projected/739e9c4a-9843-4edf-a045-2f7ef8d15b5e-kube-api-access-jrjvx\") pod \"heat-api-648cbb8b5f-4kb5b\" (UID: \"739e9c4a-9843-4edf-a045-2f7ef8d15b5e\") " pod="openstack/heat-api-648cbb8b5f-4kb5b" Mar 13 14:25:42 crc kubenswrapper[4898]: I0313 
14:25:42.895114 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/739e9c4a-9843-4edf-a045-2f7ef8d15b5e-config-data-custom\") pod \"heat-api-648cbb8b5f-4kb5b\" (UID: \"739e9c4a-9843-4edf-a045-2f7ef8d15b5e\") " pod="openstack/heat-api-648cbb8b5f-4kb5b" Mar 13 14:25:42 crc kubenswrapper[4898]: I0313 14:25:42.997515 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwh8g\" (UniqueName: \"kubernetes.io/projected/03c552ae-5860-4468-a612-7af3d3587df4-kube-api-access-xwh8g\") pod \"heat-cfnapi-5df9b5999-7tt4b\" (UID: \"03c552ae-5860-4468-a612-7af3d3587df4\") " pod="openstack/heat-cfnapi-5df9b5999-7tt4b" Mar 13 14:25:42 crc kubenswrapper[4898]: I0313 14:25:42.997586 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0f20ec1d-823e-4695-859e-bdc538e602d9-config-data-custom\") pod \"heat-engine-6b446d7755-5724r\" (UID: \"0f20ec1d-823e-4695-859e-bdc538e602d9\") " pod="openstack/heat-engine-6b446d7755-5724r" Mar 13 14:25:42 crc kubenswrapper[4898]: I0313 14:25:42.997761 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/739e9c4a-9843-4edf-a045-2f7ef8d15b5e-internal-tls-certs\") pod \"heat-api-648cbb8b5f-4kb5b\" (UID: \"739e9c4a-9843-4edf-a045-2f7ef8d15b5e\") " pod="openstack/heat-api-648cbb8b5f-4kb5b" Mar 13 14:25:42 crc kubenswrapper[4898]: I0313 14:25:42.997854 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/03c552ae-5860-4468-a612-7af3d3587df4-internal-tls-certs\") pod \"heat-cfnapi-5df9b5999-7tt4b\" (UID: \"03c552ae-5860-4468-a612-7af3d3587df4\") " pod="openstack/heat-cfnapi-5df9b5999-7tt4b" Mar 13 14:25:42 crc 
kubenswrapper[4898]: I0313 14:25:42.997943 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrjvx\" (UniqueName: \"kubernetes.io/projected/739e9c4a-9843-4edf-a045-2f7ef8d15b5e-kube-api-access-jrjvx\") pod \"heat-api-648cbb8b5f-4kb5b\" (UID: \"739e9c4a-9843-4edf-a045-2f7ef8d15b5e\") " pod="openstack/heat-api-648cbb8b5f-4kb5b" Mar 13 14:25:42 crc kubenswrapper[4898]: I0313 14:25:42.998034 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03c552ae-5860-4468-a612-7af3d3587df4-config-data\") pod \"heat-cfnapi-5df9b5999-7tt4b\" (UID: \"03c552ae-5860-4468-a612-7af3d3587df4\") " pod="openstack/heat-cfnapi-5df9b5999-7tt4b" Mar 13 14:25:42 crc kubenswrapper[4898]: I0313 14:25:42.998154 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/03c552ae-5860-4468-a612-7af3d3587df4-public-tls-certs\") pod \"heat-cfnapi-5df9b5999-7tt4b\" (UID: \"03c552ae-5860-4468-a612-7af3d3587df4\") " pod="openstack/heat-cfnapi-5df9b5999-7tt4b" Mar 13 14:25:42 crc kubenswrapper[4898]: I0313 14:25:42.998194 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/739e9c4a-9843-4edf-a045-2f7ef8d15b5e-config-data-custom\") pod \"heat-api-648cbb8b5f-4kb5b\" (UID: \"739e9c4a-9843-4edf-a045-2f7ef8d15b5e\") " pod="openstack/heat-api-648cbb8b5f-4kb5b" Mar 13 14:25:42 crc kubenswrapper[4898]: I0313 14:25:42.998322 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/739e9c4a-9843-4edf-a045-2f7ef8d15b5e-public-tls-certs\") pod \"heat-api-648cbb8b5f-4kb5b\" (UID: \"739e9c4a-9843-4edf-a045-2f7ef8d15b5e\") " pod="openstack/heat-api-648cbb8b5f-4kb5b" Mar 13 14:25:42 crc kubenswrapper[4898]: I0313 
14:25:42.998438 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwzqp\" (UniqueName: \"kubernetes.io/projected/0f20ec1d-823e-4695-859e-bdc538e602d9-kube-api-access-fwzqp\") pod \"heat-engine-6b446d7755-5724r\" (UID: \"0f20ec1d-823e-4695-859e-bdc538e602d9\") " pod="openstack/heat-engine-6b446d7755-5724r"
Mar 13 14:25:42 crc kubenswrapper[4898]: I0313 14:25:42.998474 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03c552ae-5860-4468-a612-7af3d3587df4-combined-ca-bundle\") pod \"heat-cfnapi-5df9b5999-7tt4b\" (UID: \"03c552ae-5860-4468-a612-7af3d3587df4\") " pod="openstack/heat-cfnapi-5df9b5999-7tt4b"
Mar 13 14:25:42 crc kubenswrapper[4898]: I0313 14:25:42.998604 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f20ec1d-823e-4695-859e-bdc538e602d9-combined-ca-bundle\") pod \"heat-engine-6b446d7755-5724r\" (UID: \"0f20ec1d-823e-4695-859e-bdc538e602d9\") " pod="openstack/heat-engine-6b446d7755-5724r"
Mar 13 14:25:42 crc kubenswrapper[4898]: I0313 14:25:42.998643 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/03c552ae-5860-4468-a612-7af3d3587df4-config-data-custom\") pod \"heat-cfnapi-5df9b5999-7tt4b\" (UID: \"03c552ae-5860-4468-a612-7af3d3587df4\") " pod="openstack/heat-cfnapi-5df9b5999-7tt4b"
Mar 13 14:25:42 crc kubenswrapper[4898]: I0313 14:25:42.998788 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f20ec1d-823e-4695-859e-bdc538e602d9-config-data\") pod \"heat-engine-6b446d7755-5724r\" (UID: \"0f20ec1d-823e-4695-859e-bdc538e602d9\") " pod="openstack/heat-engine-6b446d7755-5724r"
Mar 13 14:25:42 crc kubenswrapper[4898]: I0313 14:25:42.998888 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/739e9c4a-9843-4edf-a045-2f7ef8d15b5e-config-data\") pod \"heat-api-648cbb8b5f-4kb5b\" (UID: \"739e9c4a-9843-4edf-a045-2f7ef8d15b5e\") " pod="openstack/heat-api-648cbb8b5f-4kb5b"
Mar 13 14:25:42 crc kubenswrapper[4898]: I0313 14:25:42.998924 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/739e9c4a-9843-4edf-a045-2f7ef8d15b5e-combined-ca-bundle\") pod \"heat-api-648cbb8b5f-4kb5b\" (UID: \"739e9c4a-9843-4edf-a045-2f7ef8d15b5e\") " pod="openstack/heat-api-648cbb8b5f-4kb5b"
Mar 13 14:25:43 crc kubenswrapper[4898]: I0313 14:25:43.003311 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/739e9c4a-9843-4edf-a045-2f7ef8d15b5e-config-data\") pod \"heat-api-648cbb8b5f-4kb5b\" (UID: \"739e9c4a-9843-4edf-a045-2f7ef8d15b5e\") " pod="openstack/heat-api-648cbb8b5f-4kb5b"
Mar 13 14:25:43 crc kubenswrapper[4898]: I0313 14:25:43.003679 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/739e9c4a-9843-4edf-a045-2f7ef8d15b5e-public-tls-certs\") pod \"heat-api-648cbb8b5f-4kb5b\" (UID: \"739e9c4a-9843-4edf-a045-2f7ef8d15b5e\") " pod="openstack/heat-api-648cbb8b5f-4kb5b"
Mar 13 14:25:43 crc kubenswrapper[4898]: I0313 14:25:43.003800 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/739e9c4a-9843-4edf-a045-2f7ef8d15b5e-config-data-custom\") pod \"heat-api-648cbb8b5f-4kb5b\" (UID: \"739e9c4a-9843-4edf-a045-2f7ef8d15b5e\") " pod="openstack/heat-api-648cbb8b5f-4kb5b"
Mar 13 14:25:43 crc kubenswrapper[4898]: I0313 14:25:43.004032 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0f20ec1d-823e-4695-859e-bdc538e602d9-config-data-custom\") pod \"heat-engine-6b446d7755-5724r\" (UID: \"0f20ec1d-823e-4695-859e-bdc538e602d9\") " pod="openstack/heat-engine-6b446d7755-5724r"
Mar 13 14:25:43 crc kubenswrapper[4898]: I0313 14:25:43.004835 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/739e9c4a-9843-4edf-a045-2f7ef8d15b5e-combined-ca-bundle\") pod \"heat-api-648cbb8b5f-4kb5b\" (UID: \"739e9c4a-9843-4edf-a045-2f7ef8d15b5e\") " pod="openstack/heat-api-648cbb8b5f-4kb5b"
Mar 13 14:25:43 crc kubenswrapper[4898]: I0313 14:25:43.005524 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f20ec1d-823e-4695-859e-bdc538e602d9-config-data\") pod \"heat-engine-6b446d7755-5724r\" (UID: \"0f20ec1d-823e-4695-859e-bdc538e602d9\") " pod="openstack/heat-engine-6b446d7755-5724r"
Mar 13 14:25:43 crc kubenswrapper[4898]: I0313 14:25:43.006554 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f20ec1d-823e-4695-859e-bdc538e602d9-combined-ca-bundle\") pod \"heat-engine-6b446d7755-5724r\" (UID: \"0f20ec1d-823e-4695-859e-bdc538e602d9\") " pod="openstack/heat-engine-6b446d7755-5724r"
Mar 13 14:25:43 crc kubenswrapper[4898]: I0313 14:25:43.007106 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/739e9c4a-9843-4edf-a045-2f7ef8d15b5e-internal-tls-certs\") pod \"heat-api-648cbb8b5f-4kb5b\" (UID: \"739e9c4a-9843-4edf-a045-2f7ef8d15b5e\") " pod="openstack/heat-api-648cbb8b5f-4kb5b"
Mar 13 14:25:43 crc kubenswrapper[4898]: I0313 14:25:43.015486 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrjvx\" (UniqueName: \"kubernetes.io/projected/739e9c4a-9843-4edf-a045-2f7ef8d15b5e-kube-api-access-jrjvx\") pod \"heat-api-648cbb8b5f-4kb5b\" (UID: \"739e9c4a-9843-4edf-a045-2f7ef8d15b5e\") " pod="openstack/heat-api-648cbb8b5f-4kb5b"
Mar 13 14:25:43 crc kubenswrapper[4898]: I0313 14:25:43.019328 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwzqp\" (UniqueName: \"kubernetes.io/projected/0f20ec1d-823e-4695-859e-bdc538e602d9-kube-api-access-fwzqp\") pod \"heat-engine-6b446d7755-5724r\" (UID: \"0f20ec1d-823e-4695-859e-bdc538e602d9\") " pod="openstack/heat-engine-6b446d7755-5724r"
Mar 13 14:25:43 crc kubenswrapper[4898]: I0313 14:25:43.047161 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-6b446d7755-5724r"
Mar 13 14:25:43 crc kubenswrapper[4898]: I0313 14:25:43.100519 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03c552ae-5860-4468-a612-7af3d3587df4-combined-ca-bundle\") pod \"heat-cfnapi-5df9b5999-7tt4b\" (UID: \"03c552ae-5860-4468-a612-7af3d3587df4\") " pod="openstack/heat-cfnapi-5df9b5999-7tt4b"
Mar 13 14:25:43 crc kubenswrapper[4898]: I0313 14:25:43.100574 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/03c552ae-5860-4468-a612-7af3d3587df4-config-data-custom\") pod \"heat-cfnapi-5df9b5999-7tt4b\" (UID: \"03c552ae-5860-4468-a612-7af3d3587df4\") " pod="openstack/heat-cfnapi-5df9b5999-7tt4b"
Mar 13 14:25:43 crc kubenswrapper[4898]: I0313 14:25:43.100646 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwh8g\" (UniqueName: \"kubernetes.io/projected/03c552ae-5860-4468-a612-7af3d3587df4-kube-api-access-xwh8g\") pod \"heat-cfnapi-5df9b5999-7tt4b\" (UID: \"03c552ae-5860-4468-a612-7af3d3587df4\") " pod="openstack/heat-cfnapi-5df9b5999-7tt4b"
Mar 13 14:25:43 crc kubenswrapper[4898]: I0313 14:25:43.100710 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/03c552ae-5860-4468-a612-7af3d3587df4-internal-tls-certs\") pod \"heat-cfnapi-5df9b5999-7tt4b\" (UID: \"03c552ae-5860-4468-a612-7af3d3587df4\") " pod="openstack/heat-cfnapi-5df9b5999-7tt4b"
Mar 13 14:25:43 crc kubenswrapper[4898]: I0313 14:25:43.100749 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03c552ae-5860-4468-a612-7af3d3587df4-config-data\") pod \"heat-cfnapi-5df9b5999-7tt4b\" (UID: \"03c552ae-5860-4468-a612-7af3d3587df4\") " pod="openstack/heat-cfnapi-5df9b5999-7tt4b"
Mar 13 14:25:43 crc kubenswrapper[4898]: I0313 14:25:43.100778 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/03c552ae-5860-4468-a612-7af3d3587df4-public-tls-certs\") pod \"heat-cfnapi-5df9b5999-7tt4b\" (UID: \"03c552ae-5860-4468-a612-7af3d3587df4\") " pod="openstack/heat-cfnapi-5df9b5999-7tt4b"
Mar 13 14:25:43 crc kubenswrapper[4898]: I0313 14:25:43.110313 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-648cbb8b5f-4kb5b"
Mar 13 14:25:43 crc kubenswrapper[4898]: I0313 14:25:43.110989 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/03c552ae-5860-4468-a612-7af3d3587df4-public-tls-certs\") pod \"heat-cfnapi-5df9b5999-7tt4b\" (UID: \"03c552ae-5860-4468-a612-7af3d3587df4\") " pod="openstack/heat-cfnapi-5df9b5999-7tt4b"
Mar 13 14:25:43 crc kubenswrapper[4898]: I0313 14:25:43.123512 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/03c552ae-5860-4468-a612-7af3d3587df4-internal-tls-certs\") pod \"heat-cfnapi-5df9b5999-7tt4b\" (UID: \"03c552ae-5860-4468-a612-7af3d3587df4\") " pod="openstack/heat-cfnapi-5df9b5999-7tt4b"
Mar 13 14:25:43 crc kubenswrapper[4898]: I0313 14:25:43.128981 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03c552ae-5860-4468-a612-7af3d3587df4-combined-ca-bundle\") pod \"heat-cfnapi-5df9b5999-7tt4b\" (UID: \"03c552ae-5860-4468-a612-7af3d3587df4\") " pod="openstack/heat-cfnapi-5df9b5999-7tt4b"
Mar 13 14:25:43 crc kubenswrapper[4898]: I0313 14:25:43.128986 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03c552ae-5860-4468-a612-7af3d3587df4-config-data\") pod \"heat-cfnapi-5df9b5999-7tt4b\" (UID: \"03c552ae-5860-4468-a612-7af3d3587df4\") " pod="openstack/heat-cfnapi-5df9b5999-7tt4b"
Mar 13 14:25:43 crc kubenswrapper[4898]: I0313 14:25:43.131117 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/03c552ae-5860-4468-a612-7af3d3587df4-config-data-custom\") pod \"heat-cfnapi-5df9b5999-7tt4b\" (UID: \"03c552ae-5860-4468-a612-7af3d3587df4\") " pod="openstack/heat-cfnapi-5df9b5999-7tt4b"
Mar 13 14:25:43 crc kubenswrapper[4898]: I0313 14:25:43.134129 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwh8g\" (UniqueName: \"kubernetes.io/projected/03c552ae-5860-4468-a612-7af3d3587df4-kube-api-access-xwh8g\") pod \"heat-cfnapi-5df9b5999-7tt4b\" (UID: \"03c552ae-5860-4468-a612-7af3d3587df4\") " pod="openstack/heat-cfnapi-5df9b5999-7tt4b"
Mar 13 14:25:43 crc kubenswrapper[4898]: I0313 14:25:43.148255 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-5df9b5999-7tt4b"
Mar 13 14:25:43 crc kubenswrapper[4898]: I0313 14:25:43.634201 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-6b446d7755-5724r"]
Mar 13 14:25:43 crc kubenswrapper[4898]: I0313 14:25:43.696098 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6b446d7755-5724r" event={"ID":"0f20ec1d-823e-4695-859e-bdc538e602d9","Type":"ContainerStarted","Data":"751319ce2a8b58f9c1a11c8fdc9a52a3f25980adb3450d0601d1a75bc662b822"}
Mar 13 14:25:43 crc kubenswrapper[4898]: I0313 14:25:43.781789 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-648cbb8b5f-4kb5b"]
Mar 13 14:25:43 crc kubenswrapper[4898]: I0313 14:25:43.783144 4898 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 13 14:25:43 crc kubenswrapper[4898]: I0313 14:25:43.796956 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-5df9b5999-7tt4b"]
Mar 13 14:25:44 crc kubenswrapper[4898]: I0313 14:25:44.719453 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-648cbb8b5f-4kb5b" event={"ID":"739e9c4a-9843-4edf-a045-2f7ef8d15b5e","Type":"ContainerStarted","Data":"18c3266dabf2b505cbe40a224487d5d8f44b40eb5270476d38970908a109e756"}
Mar 13 14:25:44 crc kubenswrapper[4898]: I0313 14:25:44.721244 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5df9b5999-7tt4b" event={"ID":"03c552ae-5860-4468-a612-7af3d3587df4","Type":"ContainerStarted","Data":"1c2cf2943d5fd39baea4d24bd14b6f6875fc9986a3f0e1ebc3fe279a301f3d4d"}
Mar 13 14:25:44 crc kubenswrapper[4898]: I0313 14:25:44.723041 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6b446d7755-5724r" event={"ID":"0f20ec1d-823e-4695-859e-bdc538e602d9","Type":"ContainerStarted","Data":"e8252fcff7a9c8f96c118fb65e192dfd92a0dc9ac03dbb932eea7baa1e6ec887"}
Mar 13 14:25:44 crc kubenswrapper[4898]: I0313 14:25:44.723206 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-6b446d7755-5724r"
Mar 13 14:25:44 crc kubenswrapper[4898]: I0313 14:25:44.945724 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6f6df4f56c-k4ntr"
Mar 13 14:25:44 crc kubenswrapper[4898]: I0313 14:25:44.967250 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-6b446d7755-5724r" podStartSLOduration=2.967230565 podStartE2EDuration="2.967230565s" podCreationTimestamp="2026-03-13 14:25:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:25:44.746436941 +0000 UTC m=+1779.748025210" watchObservedRunningTime="2026-03-13 14:25:44.967230565 +0000 UTC m=+1779.968818804"
Mar 13 14:25:45 crc kubenswrapper[4898]: I0313 14:25:45.067014 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d84b4d45c-r4rvf"]
Mar 13 14:25:45 crc kubenswrapper[4898]: I0313 14:25:45.067349 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7d84b4d45c-r4rvf" podUID="703503be-2f03-4e95-b4ba-ebdd30b717ee" containerName="dnsmasq-dns" containerID="cri-o://0b355d36acba975b937e3513e16f6c9b056100e8a7cb8f4e84d596f542de18b5" gracePeriod=10
Mar 13 14:25:45 crc kubenswrapper[4898]: I0313 14:25:45.738397 4898 generic.go:334] "Generic (PLEG): container finished" podID="703503be-2f03-4e95-b4ba-ebdd30b717ee" containerID="0b355d36acba975b937e3513e16f6c9b056100e8a7cb8f4e84d596f542de18b5" exitCode=0
Mar 13 14:25:45 crc kubenswrapper[4898]: I0313 14:25:45.738415 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d84b4d45c-r4rvf" event={"ID":"703503be-2f03-4e95-b4ba-ebdd30b717ee","Type":"ContainerDied","Data":"0b355d36acba975b937e3513e16f6c9b056100e8a7cb8f4e84d596f542de18b5"}
Mar 13 14:25:46 crc kubenswrapper[4898]: I0313 14:25:46.619309 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d84b4d45c-r4rvf"
Mar 13 14:25:46 crc kubenswrapper[4898]: I0313 14:25:46.710169 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/703503be-2f03-4e95-b4ba-ebdd30b717ee-ovsdbserver-nb\") pod \"703503be-2f03-4e95-b4ba-ebdd30b717ee\" (UID: \"703503be-2f03-4e95-b4ba-ebdd30b717ee\") "
Mar 13 14:25:46 crc kubenswrapper[4898]: I0313 14:25:46.710479 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/703503be-2f03-4e95-b4ba-ebdd30b717ee-openstack-edpm-ipam\") pod \"703503be-2f03-4e95-b4ba-ebdd30b717ee\" (UID: \"703503be-2f03-4e95-b4ba-ebdd30b717ee\") "
Mar 13 14:25:46 crc kubenswrapper[4898]: I0313 14:25:46.710547 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/703503be-2f03-4e95-b4ba-ebdd30b717ee-dns-swift-storage-0\") pod \"703503be-2f03-4e95-b4ba-ebdd30b717ee\" (UID: \"703503be-2f03-4e95-b4ba-ebdd30b717ee\") "
Mar 13 14:25:46 crc kubenswrapper[4898]: I0313 14:25:46.710666 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/703503be-2f03-4e95-b4ba-ebdd30b717ee-dns-svc\") pod \"703503be-2f03-4e95-b4ba-ebdd30b717ee\" (UID: \"703503be-2f03-4e95-b4ba-ebdd30b717ee\") "
Mar 13 14:25:46 crc kubenswrapper[4898]: I0313 14:25:46.710699 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/703503be-2f03-4e95-b4ba-ebdd30b717ee-ovsdbserver-sb\") pod \"703503be-2f03-4e95-b4ba-ebdd30b717ee\" (UID: \"703503be-2f03-4e95-b4ba-ebdd30b717ee\") "
Mar 13 14:25:46 crc kubenswrapper[4898]: I0313 14:25:46.710821 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vs27f\" (UniqueName: \"kubernetes.io/projected/703503be-2f03-4e95-b4ba-ebdd30b717ee-kube-api-access-vs27f\") pod \"703503be-2f03-4e95-b4ba-ebdd30b717ee\" (UID: \"703503be-2f03-4e95-b4ba-ebdd30b717ee\") "
Mar 13 14:25:46 crc kubenswrapper[4898]: I0313 14:25:46.710977 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/703503be-2f03-4e95-b4ba-ebdd30b717ee-config\") pod \"703503be-2f03-4e95-b4ba-ebdd30b717ee\" (UID: \"703503be-2f03-4e95-b4ba-ebdd30b717ee\") "
Mar 13 14:25:46 crc kubenswrapper[4898]: I0313 14:25:46.721401 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/703503be-2f03-4e95-b4ba-ebdd30b717ee-kube-api-access-vs27f" (OuterVolumeSpecName: "kube-api-access-vs27f") pod "703503be-2f03-4e95-b4ba-ebdd30b717ee" (UID: "703503be-2f03-4e95-b4ba-ebdd30b717ee"). InnerVolumeSpecName "kube-api-access-vs27f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:25:46 crc kubenswrapper[4898]: I0313 14:25:46.782941 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d84b4d45c-r4rvf" event={"ID":"703503be-2f03-4e95-b4ba-ebdd30b717ee","Type":"ContainerDied","Data":"156e4c323450d64b57cc91d4cd576fcfcc3344435ba7b3b650ea24a1251763ee"}
Mar 13 14:25:46 crc kubenswrapper[4898]: I0313 14:25:46.782994 4898 scope.go:117] "RemoveContainer" containerID="0b355d36acba975b937e3513e16f6c9b056100e8a7cb8f4e84d596f542de18b5"
Mar 13 14:25:46 crc kubenswrapper[4898]: I0313 14:25:46.783144 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d84b4d45c-r4rvf"
Mar 13 14:25:46 crc kubenswrapper[4898]: I0313 14:25:46.792698 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-648cbb8b5f-4kb5b"
Mar 13 14:25:46 crc kubenswrapper[4898]: I0313 14:25:46.815204 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vs27f\" (UniqueName: \"kubernetes.io/projected/703503be-2f03-4e95-b4ba-ebdd30b717ee-kube-api-access-vs27f\") on node \"crc\" DevicePath \"\""
Mar 13 14:25:46 crc kubenswrapper[4898]: I0313 14:25:46.858098 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/703503be-2f03-4e95-b4ba-ebdd30b717ee-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "703503be-2f03-4e95-b4ba-ebdd30b717ee" (UID: "703503be-2f03-4e95-b4ba-ebdd30b717ee"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 14:25:46 crc kubenswrapper[4898]: I0313 14:25:46.861543 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-648cbb8b5f-4kb5b" podStartSLOduration=2.252938428 podStartE2EDuration="4.86152689s" podCreationTimestamp="2026-03-13 14:25:42 +0000 UTC" firstStartedPulling="2026-03-13 14:25:43.782892333 +0000 UTC m=+1778.784480572" lastFinishedPulling="2026-03-13 14:25:46.391480755 +0000 UTC m=+1781.393069034" observedRunningTime="2026-03-13 14:25:46.840710053 +0000 UTC m=+1781.842298292" watchObservedRunningTime="2026-03-13 14:25:46.86152689 +0000 UTC m=+1781.863115129"
Mar 13 14:25:46 crc kubenswrapper[4898]: I0313 14:25:46.875361 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/703503be-2f03-4e95-b4ba-ebdd30b717ee-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "703503be-2f03-4e95-b4ba-ebdd30b717ee" (UID: "703503be-2f03-4e95-b4ba-ebdd30b717ee"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 14:25:46 crc kubenswrapper[4898]: I0313 14:25:46.875880 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/703503be-2f03-4e95-b4ba-ebdd30b717ee-config" (OuterVolumeSpecName: "config") pod "703503be-2f03-4e95-b4ba-ebdd30b717ee" (UID: "703503be-2f03-4e95-b4ba-ebdd30b717ee"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 14:25:46 crc kubenswrapper[4898]: I0313 14:25:46.879524 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/703503be-2f03-4e95-b4ba-ebdd30b717ee-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "703503be-2f03-4e95-b4ba-ebdd30b717ee" (UID: "703503be-2f03-4e95-b4ba-ebdd30b717ee"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 14:25:46 crc kubenswrapper[4898]: I0313 14:25:46.888938 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/703503be-2f03-4e95-b4ba-ebdd30b717ee-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "703503be-2f03-4e95-b4ba-ebdd30b717ee" (UID: "703503be-2f03-4e95-b4ba-ebdd30b717ee"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 14:25:46 crc kubenswrapper[4898]: I0313 14:25:46.921507 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/703503be-2f03-4e95-b4ba-ebdd30b717ee-config\") on node \"crc\" DevicePath \"\""
Mar 13 14:25:46 crc kubenswrapper[4898]: I0313 14:25:46.921539 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/703503be-2f03-4e95-b4ba-ebdd30b717ee-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 13 14:25:46 crc kubenswrapper[4898]: I0313 14:25:46.921579 4898 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/703503be-2f03-4e95-b4ba-ebdd30b717ee-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 13 14:25:46 crc kubenswrapper[4898]: I0313 14:25:46.921592 4898 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/703503be-2f03-4e95-b4ba-ebdd30b717ee-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 13 14:25:46 crc kubenswrapper[4898]: I0313 14:25:46.921603 4898 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/703503be-2f03-4e95-b4ba-ebdd30b717ee-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 13 14:25:46 crc kubenswrapper[4898]: I0313 14:25:46.949114 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/703503be-2f03-4e95-b4ba-ebdd30b717ee-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "703503be-2f03-4e95-b4ba-ebdd30b717ee" (UID: "703503be-2f03-4e95-b4ba-ebdd30b717ee"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 14:25:47 crc kubenswrapper[4898]: I0313 14:25:47.023916 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/703503be-2f03-4e95-b4ba-ebdd30b717ee-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 13 14:25:47 crc kubenswrapper[4898]: I0313 14:25:47.055085 4898 scope.go:117] "RemoveContainer" containerID="ab7dee171473df88511004b0f9cd06f3de427bbb59dba778bc1dbd3f8e29abeb"
Mar 13 14:25:47 crc kubenswrapper[4898]: I0313 14:25:47.117524 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d84b4d45c-r4rvf"]
Mar 13 14:25:47 crc kubenswrapper[4898]: I0313 14:25:47.149233 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7d84b4d45c-r4rvf"]
Mar 13 14:25:47 crc kubenswrapper[4898]: I0313 14:25:47.756711 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="703503be-2f03-4e95-b4ba-ebdd30b717ee" path="/var/lib/kubelet/pods/703503be-2f03-4e95-b4ba-ebdd30b717ee/volumes"
Mar 13 14:25:47 crc kubenswrapper[4898]: I0313 14:25:47.809462 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-648cbb8b5f-4kb5b" event={"ID":"739e9c4a-9843-4edf-a045-2f7ef8d15b5e","Type":"ContainerStarted","Data":"02515cf8c295d1ea1e4a384e101d892c550963757aae50ec9210b86c71be25a1"}
Mar 13 14:25:47 crc kubenswrapper[4898]: I0313 14:25:47.811833 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5df9b5999-7tt4b" event={"ID":"03c552ae-5860-4468-a612-7af3d3587df4","Type":"ContainerStarted","Data":"e35296bb12f025606d86ea55beb23f4933c6b981cfa16d4b258e576d830c3ddd"}
Mar 13 14:25:47 crc kubenswrapper[4898]: I0313 14:25:47.812111 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-5df9b5999-7tt4b"
Mar 13 14:25:47 crc kubenswrapper[4898]: I0313 14:25:47.851383 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-5df9b5999-7tt4b" podStartSLOduration=3.254786594 podStartE2EDuration="5.85135465s" podCreationTimestamp="2026-03-13 14:25:42 +0000 UTC" firstStartedPulling="2026-03-13 14:25:43.796788998 +0000 UTC m=+1778.798377237" lastFinishedPulling="2026-03-13 14:25:46.393357054 +0000 UTC m=+1781.394945293" observedRunningTime="2026-03-13 14:25:47.837627829 +0000 UTC m=+1782.839216068" watchObservedRunningTime="2026-03-13 14:25:47.85135465 +0000 UTC m=+1782.852942929"
Mar 13 14:25:48 crc kubenswrapper[4898]: I0313 14:25:48.739399 4898 scope.go:117] "RemoveContainer" containerID="31ae8593afc62c01205d22940ebe7136407da3a6c051010917ba949bb52866cc"
Mar 13 14:25:48 crc kubenswrapper[4898]: E0313 14:25:48.740130 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d"
Mar 13 14:25:53 crc kubenswrapper[4898]: I0313 14:25:53.116507 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-6b446d7755-5724r"
Mar 13 14:25:53 crc kubenswrapper[4898]: I0313 14:25:53.202096 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-5b6c75676b-jx6kl"]
Mar 13 14:25:53 crc kubenswrapper[4898]: I0313 14:25:53.202465 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-engine-5b6c75676b-jx6kl" podUID="ad94280e-6f02-4129-9cdc-c35499f5d5e4" containerName="heat-engine" containerID="cri-o://75f3f93f088797c8af38650d54ae73681b337d83c946b53b20ea72e33b4509c0" gracePeriod=60
Mar 13 14:25:54 crc kubenswrapper[4898]: I0313 14:25:54.897164 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-648cbb8b5f-4kb5b"
Mar 13 14:25:54 crc kubenswrapper[4898]: I0313 14:25:54.986828 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-5f97b49ff6-67dbr"]
Mar 13 14:25:54 crc kubenswrapper[4898]: I0313 14:25:54.987140 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-api-5f97b49ff6-67dbr" podUID="0a9180e2-91e9-4063-83a5-5b4ba75ca011" containerName="heat-api" containerID="cri-o://357d95fc5d8e0d7678f3ea6e54b764f618b159ca6eaa129aa3499f515c52ea42" gracePeriod=60
Mar 13 14:25:55 crc kubenswrapper[4898]: I0313 14:25:55.156681 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs"]
Mar 13 14:25:55 crc kubenswrapper[4898]: E0313 14:25:55.157263 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="703503be-2f03-4e95-b4ba-ebdd30b717ee" containerName="init"
Mar 13 14:25:55 crc kubenswrapper[4898]: I0313 14:25:55.157281 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="703503be-2f03-4e95-b4ba-ebdd30b717ee" containerName="init"
Mar 13 14:25:55 crc kubenswrapper[4898]: E0313 14:25:55.157308 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="703503be-2f03-4e95-b4ba-ebdd30b717ee" containerName="dnsmasq-dns"
Mar 13 14:25:55 crc kubenswrapper[4898]: I0313 14:25:55.157315 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="703503be-2f03-4e95-b4ba-ebdd30b717ee" containerName="dnsmasq-dns"
Mar 13 14:25:55 crc kubenswrapper[4898]: I0313 14:25:55.157542 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="703503be-2f03-4e95-b4ba-ebdd30b717ee" containerName="dnsmasq-dns"
Mar 13 14:25:55 crc kubenswrapper[4898]: I0313 14:25:55.158419 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs"
Mar 13 14:25:55 crc kubenswrapper[4898]: I0313 14:25:55.161057 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zsddr"
Mar 13 14:25:55 crc kubenswrapper[4898]: I0313 14:25:55.167284 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 13 14:25:55 crc kubenswrapper[4898]: I0313 14:25:55.167491 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 13 14:25:55 crc kubenswrapper[4898]: I0313 14:25:55.167491 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 13 14:25:55 crc kubenswrapper[4898]: I0313 14:25:55.182667 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs"]
Mar 13 14:25:55 crc kubenswrapper[4898]: I0313 14:25:55.238913 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98336335-4b60-4ddf-8fe8-4ea6b69d47ef-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs\" (UID: \"98336335-4b60-4ddf-8fe8-4ea6b69d47ef\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs"
Mar 13 14:25:55 crc kubenswrapper[4898]: I0313 14:25:55.239415 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ff55\" (UniqueName: \"kubernetes.io/projected/98336335-4b60-4ddf-8fe8-4ea6b69d47ef-kube-api-access-7ff55\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs\" (UID: \"98336335-4b60-4ddf-8fe8-4ea6b69d47ef\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs"
Mar 13 14:25:55 crc kubenswrapper[4898]: I0313 14:25:55.239582 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98336335-4b60-4ddf-8fe8-4ea6b69d47ef-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs\" (UID: \"98336335-4b60-4ddf-8fe8-4ea6b69d47ef\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs"
Mar 13 14:25:55 crc kubenswrapper[4898]: I0313 14:25:55.239663 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/98336335-4b60-4ddf-8fe8-4ea6b69d47ef-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs\" (UID: \"98336335-4b60-4ddf-8fe8-4ea6b69d47ef\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs"
Mar 13 14:25:55 crc kubenswrapper[4898]: I0313 14:25:55.341647 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/98336335-4b60-4ddf-8fe8-4ea6b69d47ef-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs\" (UID: \"98336335-4b60-4ddf-8fe8-4ea6b69d47ef\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs"
Mar 13 14:25:55 crc kubenswrapper[4898]: I0313 14:25:55.341751 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98336335-4b60-4ddf-8fe8-4ea6b69d47ef-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs\" (UID: \"98336335-4b60-4ddf-8fe8-4ea6b69d47ef\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs"
Mar 13 14:25:55 crc kubenswrapper[4898]: I0313 14:25:55.341788 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ff55\" (UniqueName: \"kubernetes.io/projected/98336335-4b60-4ddf-8fe8-4ea6b69d47ef-kube-api-access-7ff55\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs\" (UID: \"98336335-4b60-4ddf-8fe8-4ea6b69d47ef\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs"
Mar 13 14:25:55 crc kubenswrapper[4898]: I0313 14:25:55.342293 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98336335-4b60-4ddf-8fe8-4ea6b69d47ef-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs\" (UID: \"98336335-4b60-4ddf-8fe8-4ea6b69d47ef\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs"
Mar 13 14:25:55 crc kubenswrapper[4898]: I0313 14:25:55.349524 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/98336335-4b60-4ddf-8fe8-4ea6b69d47ef-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs\" (UID: \"98336335-4b60-4ddf-8fe8-4ea6b69d47ef\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs"
Mar 13 14:25:55 crc kubenswrapper[4898]: I0313 14:25:55.353492 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98336335-4b60-4ddf-8fe8-4ea6b69d47ef-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs\" (UID: \"98336335-4b60-4ddf-8fe8-4ea6b69d47ef\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs"
Mar 13 14:25:55 crc kubenswrapper[4898]: I0313 14:25:55.353954 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98336335-4b60-4ddf-8fe8-4ea6b69d47ef-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs\" (UID: \"98336335-4b60-4ddf-8fe8-4ea6b69d47ef\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs"
Mar 13 14:25:55 crc kubenswrapper[4898]: I0313 14:25:55.363155 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ff55\" (UniqueName: \"kubernetes.io/projected/98336335-4b60-4ddf-8fe8-4ea6b69d47ef-kube-api-access-7ff55\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs\" (UID: \"98336335-4b60-4ddf-8fe8-4ea6b69d47ef\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs"
Mar 13 14:25:55 crc kubenswrapper[4898]: I0313 14:25:55.491992 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs"
Mar 13 14:25:55 crc kubenswrapper[4898]: I0313 14:25:55.624628 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-5df9b5999-7tt4b"
Mar 13 14:25:55 crc kubenswrapper[4898]: I0313 14:25:55.707959 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-76b5758c54-vpp67"]
Mar 13 14:25:55 crc kubenswrapper[4898]: I0313 14:25:55.708437 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-76b5758c54-vpp67" podUID="bd18ec2e-1196-4e66-a1c5-9e3daefd7171" containerName="heat-cfnapi" containerID="cri-o://458d14b015605aca08fe7cf01f1621cdf8583aae425994a9a4ecae32ee37064e" gracePeriod=60
Mar 13 14:25:56 crc kubenswrapper[4898]: I0313 14:25:56.251685 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs"]
Mar 13 14:25:56 crc kubenswrapper[4898]: W0313 14:25:56.255005 4898 manager.go:1169] Failed to process watch event {EventType:0
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98336335_4b60_4ddf_8fe8_4ea6b69d47ef.slice/crio-6844f40cb1716e51fac4b5bf0efd698139e685b8b7f6778ffb18ee6dd869ba69 WatchSource:0}: Error finding container 6844f40cb1716e51fac4b5bf0efd698139e685b8b7f6778ffb18ee6dd869ba69: Status 404 returned error can't find the container with id 6844f40cb1716e51fac4b5bf0efd698139e685b8b7f6778ffb18ee6dd869ba69 Mar 13 14:25:56 crc kubenswrapper[4898]: E0313 14:25:56.814095 4898 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="75f3f93f088797c8af38650d54ae73681b337d83c946b53b20ea72e33b4509c0" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 13 14:25:56 crc kubenswrapper[4898]: E0313 14:25:56.816295 4898 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="75f3f93f088797c8af38650d54ae73681b337d83c946b53b20ea72e33b4509c0" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 13 14:25:56 crc kubenswrapper[4898]: E0313 14:25:56.817935 4898 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="75f3f93f088797c8af38650d54ae73681b337d83c946b53b20ea72e33b4509c0" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 13 14:25:56 crc kubenswrapper[4898]: E0313 14:25:56.817984 4898 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-5b6c75676b-jx6kl" podUID="ad94280e-6f02-4129-9cdc-c35499f5d5e4" containerName="heat-engine" Mar 13 14:25:56 crc kubenswrapper[4898]: 
I0313 14:25:56.937230 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs" event={"ID":"98336335-4b60-4ddf-8fe8-4ea6b69d47ef","Type":"ContainerStarted","Data":"6844f40cb1716e51fac4b5bf0efd698139e685b8b7f6778ffb18ee6dd869ba69"} Mar 13 14:25:57 crc kubenswrapper[4898]: I0313 14:25:57.760354 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-rr9kw"] Mar 13 14:25:57 crc kubenswrapper[4898]: I0313 14:25:57.772708 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-rr9kw"] Mar 13 14:25:57 crc kubenswrapper[4898]: I0313 14:25:57.956568 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-rzjjp"] Mar 13 14:25:57 crc kubenswrapper[4898]: I0313 14:25:57.958216 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-rzjjp" Mar 13 14:25:57 crc kubenswrapper[4898]: I0313 14:25:57.962360 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 13 14:25:57 crc kubenswrapper[4898]: I0313 14:25:57.971068 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-rzjjp"] Mar 13 14:25:58 crc kubenswrapper[4898]: I0313 14:25:58.054076 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdshh\" (UniqueName: \"kubernetes.io/projected/c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe-kube-api-access-kdshh\") pod \"aodh-db-sync-rzjjp\" (UID: \"c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe\") " pod="openstack/aodh-db-sync-rzjjp" Mar 13 14:25:58 crc kubenswrapper[4898]: I0313 14:25:58.054192 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe-combined-ca-bundle\") pod \"aodh-db-sync-rzjjp\" (UID: \"c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe\") " 
pod="openstack/aodh-db-sync-rzjjp" Mar 13 14:25:58 crc kubenswrapper[4898]: I0313 14:25:58.054234 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe-scripts\") pod \"aodh-db-sync-rzjjp\" (UID: \"c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe\") " pod="openstack/aodh-db-sync-rzjjp" Mar 13 14:25:58 crc kubenswrapper[4898]: I0313 14:25:58.054314 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe-config-data\") pod \"aodh-db-sync-rzjjp\" (UID: \"c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe\") " pod="openstack/aodh-db-sync-rzjjp" Mar 13 14:25:58 crc kubenswrapper[4898]: I0313 14:25:58.156673 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdshh\" (UniqueName: \"kubernetes.io/projected/c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe-kube-api-access-kdshh\") pod \"aodh-db-sync-rzjjp\" (UID: \"c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe\") " pod="openstack/aodh-db-sync-rzjjp" Mar 13 14:25:58 crc kubenswrapper[4898]: I0313 14:25:58.156749 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe-combined-ca-bundle\") pod \"aodh-db-sync-rzjjp\" (UID: \"c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe\") " pod="openstack/aodh-db-sync-rzjjp" Mar 13 14:25:58 crc kubenswrapper[4898]: I0313 14:25:58.156781 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe-scripts\") pod \"aodh-db-sync-rzjjp\" (UID: \"c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe\") " pod="openstack/aodh-db-sync-rzjjp" Mar 13 14:25:58 crc kubenswrapper[4898]: I0313 14:25:58.156887 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe-config-data\") pod \"aodh-db-sync-rzjjp\" (UID: \"c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe\") " pod="openstack/aodh-db-sync-rzjjp" Mar 13 14:25:58 crc kubenswrapper[4898]: I0313 14:25:58.162701 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe-combined-ca-bundle\") pod \"aodh-db-sync-rzjjp\" (UID: \"c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe\") " pod="openstack/aodh-db-sync-rzjjp" Mar 13 14:25:58 crc kubenswrapper[4898]: I0313 14:25:58.163564 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe-scripts\") pod \"aodh-db-sync-rzjjp\" (UID: \"c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe\") " pod="openstack/aodh-db-sync-rzjjp" Mar 13 14:25:58 crc kubenswrapper[4898]: I0313 14:25:58.163979 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe-config-data\") pod \"aodh-db-sync-rzjjp\" (UID: \"c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe\") " pod="openstack/aodh-db-sync-rzjjp" Mar 13 14:25:58 crc kubenswrapper[4898]: I0313 14:25:58.185611 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdshh\" (UniqueName: \"kubernetes.io/projected/c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe-kube-api-access-kdshh\") pod \"aodh-db-sync-rzjjp\" (UID: \"c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe\") " pod="openstack/aodh-db-sync-rzjjp" Mar 13 14:25:58 crc kubenswrapper[4898]: I0313 14:25:58.279739 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-rzjjp" Mar 13 14:25:58 crc kubenswrapper[4898]: I0313 14:25:58.486424 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-5f97b49ff6-67dbr" podUID="0a9180e2-91e9-4063-83a5-5b4ba75ca011" containerName="heat-api" probeResult="failure" output="Get \"https://10.217.0.231:8004/healthcheck\": read tcp 10.217.0.2:57276->10.217.0.231:8004: read: connection reset by peer" Mar 13 14:25:58 crc kubenswrapper[4898]: I0313 14:25:58.879829 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-rzjjp"] Mar 13 14:25:58 crc kubenswrapper[4898]: I0313 14:25:58.988917 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-rzjjp" event={"ID":"c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe","Type":"ContainerStarted","Data":"f822b9751765ca4646d812007e541e13e282235ebbcdb7da0060744bf5f1941d"} Mar 13 14:25:58 crc kubenswrapper[4898]: I0313 14:25:58.996461 4898 generic.go:334] "Generic (PLEG): container finished" podID="0a9180e2-91e9-4063-83a5-5b4ba75ca011" containerID="357d95fc5d8e0d7678f3ea6e54b764f618b159ca6eaa129aa3499f515c52ea42" exitCode=0 Mar 13 14:25:58 crc kubenswrapper[4898]: I0313 14:25:58.996510 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5f97b49ff6-67dbr" event={"ID":"0a9180e2-91e9-4063-83a5-5b4ba75ca011","Type":"ContainerDied","Data":"357d95fc5d8e0d7678f3ea6e54b764f618b159ca6eaa129aa3499f515c52ea42"} Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.133488 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-5f97b49ff6-67dbr" Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.333858 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a9180e2-91e9-4063-83a5-5b4ba75ca011-config-data\") pod \"0a9180e2-91e9-4063-83a5-5b4ba75ca011\" (UID: \"0a9180e2-91e9-4063-83a5-5b4ba75ca011\") " Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.334012 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a9180e2-91e9-4063-83a5-5b4ba75ca011-combined-ca-bundle\") pod \"0a9180e2-91e9-4063-83a5-5b4ba75ca011\" (UID: \"0a9180e2-91e9-4063-83a5-5b4ba75ca011\") " Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.334041 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a9180e2-91e9-4063-83a5-5b4ba75ca011-public-tls-certs\") pod \"0a9180e2-91e9-4063-83a5-5b4ba75ca011\" (UID: \"0a9180e2-91e9-4063-83a5-5b4ba75ca011\") " Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.334094 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0a9180e2-91e9-4063-83a5-5b4ba75ca011-config-data-custom\") pod \"0a9180e2-91e9-4063-83a5-5b4ba75ca011\" (UID: \"0a9180e2-91e9-4063-83a5-5b4ba75ca011\") " Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.334118 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bz9q2\" (UniqueName: \"kubernetes.io/projected/0a9180e2-91e9-4063-83a5-5b4ba75ca011-kube-api-access-bz9q2\") pod \"0a9180e2-91e9-4063-83a5-5b4ba75ca011\" (UID: \"0a9180e2-91e9-4063-83a5-5b4ba75ca011\") " Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.334230 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a9180e2-91e9-4063-83a5-5b4ba75ca011-internal-tls-certs\") pod \"0a9180e2-91e9-4063-83a5-5b4ba75ca011\" (UID: \"0a9180e2-91e9-4063-83a5-5b4ba75ca011\") " Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.340462 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a9180e2-91e9-4063-83a5-5b4ba75ca011-kube-api-access-bz9q2" (OuterVolumeSpecName: "kube-api-access-bz9q2") pod "0a9180e2-91e9-4063-83a5-5b4ba75ca011" (UID: "0a9180e2-91e9-4063-83a5-5b4ba75ca011"). InnerVolumeSpecName "kube-api-access-bz9q2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.370026 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a9180e2-91e9-4063-83a5-5b4ba75ca011-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "0a9180e2-91e9-4063-83a5-5b4ba75ca011" (UID: "0a9180e2-91e9-4063-83a5-5b4ba75ca011"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.436931 4898 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0a9180e2-91e9-4063-83a5-5b4ba75ca011-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.436999 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bz9q2\" (UniqueName: \"kubernetes.io/projected/0a9180e2-91e9-4063-83a5-5b4ba75ca011-kube-api-access-bz9q2\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.472704 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a9180e2-91e9-4063-83a5-5b4ba75ca011-config-data" (OuterVolumeSpecName: "config-data") pod "0a9180e2-91e9-4063-83a5-5b4ba75ca011" (UID: "0a9180e2-91e9-4063-83a5-5b4ba75ca011"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.474756 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a9180e2-91e9-4063-83a5-5b4ba75ca011-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0a9180e2-91e9-4063-83a5-5b4ba75ca011" (UID: "0a9180e2-91e9-4063-83a5-5b4ba75ca011"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.503529 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a9180e2-91e9-4063-83a5-5b4ba75ca011-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "0a9180e2-91e9-4063-83a5-5b4ba75ca011" (UID: "0a9180e2-91e9-4063-83a5-5b4ba75ca011"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.511682 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a9180e2-91e9-4063-83a5-5b4ba75ca011-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "0a9180e2-91e9-4063-83a5-5b4ba75ca011" (UID: "0a9180e2-91e9-4063-83a5-5b4ba75ca011"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.539052 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a9180e2-91e9-4063-83a5-5b4ba75ca011-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.539074 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a9180e2-91e9-4063-83a5-5b4ba75ca011-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.539085 4898 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a9180e2-91e9-4063-83a5-5b4ba75ca011-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.539094 4898 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a9180e2-91e9-4063-83a5-5b4ba75ca011-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.742735 4898 scope.go:117] "RemoveContainer" containerID="31ae8593afc62c01205d22940ebe7136407da3a6c051010917ba949bb52866cc" Mar 13 14:25:59 crc kubenswrapper[4898]: E0313 14:25:59.743109 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.754065 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-76b5758c54-vpp67" Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.784251 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ed6478d-e1a3-4587-813f-222e6c4e54d7" path="/var/lib/kubelet/pods/6ed6478d-e1a3-4587-813f-222e6c4e54d7/volumes" Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.845565 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bd18ec2e-1196-4e66-a1c5-9e3daefd7171-config-data-custom\") pod \"bd18ec2e-1196-4e66-a1c5-9e3daefd7171\" (UID: \"bd18ec2e-1196-4e66-a1c5-9e3daefd7171\") " Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.845671 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd18ec2e-1196-4e66-a1c5-9e3daefd7171-public-tls-certs\") pod \"bd18ec2e-1196-4e66-a1c5-9e3daefd7171\" (UID: \"bd18ec2e-1196-4e66-a1c5-9e3daefd7171\") " Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.846099 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tkpj4\" (UniqueName: \"kubernetes.io/projected/bd18ec2e-1196-4e66-a1c5-9e3daefd7171-kube-api-access-tkpj4\") pod \"bd18ec2e-1196-4e66-a1c5-9e3daefd7171\" (UID: \"bd18ec2e-1196-4e66-a1c5-9e3daefd7171\") " Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.846378 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/bd18ec2e-1196-4e66-a1c5-9e3daefd7171-config-data\") pod \"bd18ec2e-1196-4e66-a1c5-9e3daefd7171\" (UID: \"bd18ec2e-1196-4e66-a1c5-9e3daefd7171\") " Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.846805 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd18ec2e-1196-4e66-a1c5-9e3daefd7171-internal-tls-certs\") pod \"bd18ec2e-1196-4e66-a1c5-9e3daefd7171\" (UID: \"bd18ec2e-1196-4e66-a1c5-9e3daefd7171\") " Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.846885 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd18ec2e-1196-4e66-a1c5-9e3daefd7171-combined-ca-bundle\") pod \"bd18ec2e-1196-4e66-a1c5-9e3daefd7171\" (UID: \"bd18ec2e-1196-4e66-a1c5-9e3daefd7171\") " Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.851235 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd18ec2e-1196-4e66-a1c5-9e3daefd7171-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "bd18ec2e-1196-4e66-a1c5-9e3daefd7171" (UID: "bd18ec2e-1196-4e66-a1c5-9e3daefd7171"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.851429 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd18ec2e-1196-4e66-a1c5-9e3daefd7171-kube-api-access-tkpj4" (OuterVolumeSpecName: "kube-api-access-tkpj4") pod "bd18ec2e-1196-4e66-a1c5-9e3daefd7171" (UID: "bd18ec2e-1196-4e66-a1c5-9e3daefd7171"). InnerVolumeSpecName "kube-api-access-tkpj4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.902703 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd18ec2e-1196-4e66-a1c5-9e3daefd7171-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bd18ec2e-1196-4e66-a1c5-9e3daefd7171" (UID: "bd18ec2e-1196-4e66-a1c5-9e3daefd7171"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.906495 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd18ec2e-1196-4e66-a1c5-9e3daefd7171-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "bd18ec2e-1196-4e66-a1c5-9e3daefd7171" (UID: "bd18ec2e-1196-4e66-a1c5-9e3daefd7171"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.925880 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd18ec2e-1196-4e66-a1c5-9e3daefd7171-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "bd18ec2e-1196-4e66-a1c5-9e3daefd7171" (UID: "bd18ec2e-1196-4e66-a1c5-9e3daefd7171"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.937105 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd18ec2e-1196-4e66-a1c5-9e3daefd7171-config-data" (OuterVolumeSpecName: "config-data") pod "bd18ec2e-1196-4e66-a1c5-9e3daefd7171" (UID: "bd18ec2e-1196-4e66-a1c5-9e3daefd7171"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.951848 4898 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd18ec2e-1196-4e66-a1c5-9e3daefd7171-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.951884 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd18ec2e-1196-4e66-a1c5-9e3daefd7171-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.951992 4898 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bd18ec2e-1196-4e66-a1c5-9e3daefd7171-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.952011 4898 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd18ec2e-1196-4e66-a1c5-9e3daefd7171-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.952023 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tkpj4\" (UniqueName: \"kubernetes.io/projected/bd18ec2e-1196-4e66-a1c5-9e3daefd7171-kube-api-access-tkpj4\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.952036 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd18ec2e-1196-4e66-a1c5-9e3daefd7171-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:26:00 crc kubenswrapper[4898]: I0313 14:26:00.010030 4898 generic.go:334] "Generic (PLEG): container finished" podID="6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835" containerID="6ac994a64cbced8d5ed2ad37e427a3eeb5d4669d67bcb7a943f6946233be58c4" exitCode=0 Mar 13 14:26:00 crc kubenswrapper[4898]: I0313 
14:26:00.010095 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835","Type":"ContainerDied","Data":"6ac994a64cbced8d5ed2ad37e427a3eeb5d4669d67bcb7a943f6946233be58c4"} Mar 13 14:26:00 crc kubenswrapper[4898]: I0313 14:26:00.022491 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5f97b49ff6-67dbr" Mar 13 14:26:00 crc kubenswrapper[4898]: I0313 14:26:00.022486 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5f97b49ff6-67dbr" event={"ID":"0a9180e2-91e9-4063-83a5-5b4ba75ca011","Type":"ContainerDied","Data":"f8532eef577636e4f0bae3c5d04fb7af834ed690d3b4263f9713e1e00c75cbcb"} Mar 13 14:26:00 crc kubenswrapper[4898]: I0313 14:26:00.023034 4898 scope.go:117] "RemoveContainer" containerID="357d95fc5d8e0d7678f3ea6e54b764f618b159ca6eaa129aa3499f515c52ea42" Mar 13 14:26:00 crc kubenswrapper[4898]: I0313 14:26:00.028527 4898 generic.go:334] "Generic (PLEG): container finished" podID="8d188301-848c-4cf6-a204-e1110714c1be" containerID="0d8797262833812626f4d3e0e1db3d064a9feac6dd8c4aab149c760269a9a573" exitCode=0 Mar 13 14:26:00 crc kubenswrapper[4898]: I0313 14:26:00.028595 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"8d188301-848c-4cf6-a204-e1110714c1be","Type":"ContainerDied","Data":"0d8797262833812626f4d3e0e1db3d064a9feac6dd8c4aab149c760269a9a573"} Mar 13 14:26:00 crc kubenswrapper[4898]: I0313 14:26:00.033879 4898 generic.go:334] "Generic (PLEG): container finished" podID="bd18ec2e-1196-4e66-a1c5-9e3daefd7171" containerID="458d14b015605aca08fe7cf01f1621cdf8583aae425994a9a4ecae32ee37064e" exitCode=0 Mar 13 14:26:00 crc kubenswrapper[4898]: I0313 14:26:00.033964 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-76b5758c54-vpp67" 
event={"ID":"bd18ec2e-1196-4e66-a1c5-9e3daefd7171","Type":"ContainerDied","Data":"458d14b015605aca08fe7cf01f1621cdf8583aae425994a9a4ecae32ee37064e"} Mar 13 14:26:00 crc kubenswrapper[4898]: I0313 14:26:00.034000 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-76b5758c54-vpp67" event={"ID":"bd18ec2e-1196-4e66-a1c5-9e3daefd7171","Type":"ContainerDied","Data":"910b4f0096d337a99b11342bf407ba83fa521765a628b5f816cab829a8a6b279"} Mar 13 14:26:00 crc kubenswrapper[4898]: I0313 14:26:00.034065 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-76b5758c54-vpp67" Mar 13 14:26:00 crc kubenswrapper[4898]: I0313 14:26:00.078321 4898 scope.go:117] "RemoveContainer" containerID="458d14b015605aca08fe7cf01f1621cdf8583aae425994a9a4ecae32ee37064e" Mar 13 14:26:00 crc kubenswrapper[4898]: I0313 14:26:00.086225 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-5f97b49ff6-67dbr"] Mar 13 14:26:00 crc kubenswrapper[4898]: I0313 14:26:00.106705 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-5f97b49ff6-67dbr"] Mar 13 14:26:00 crc kubenswrapper[4898]: I0313 14:26:00.132300 4898 scope.go:117] "RemoveContainer" containerID="458d14b015605aca08fe7cf01f1621cdf8583aae425994a9a4ecae32ee37064e" Mar 13 14:26:00 crc kubenswrapper[4898]: E0313 14:26:00.151794 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"458d14b015605aca08fe7cf01f1621cdf8583aae425994a9a4ecae32ee37064e\": container with ID starting with 458d14b015605aca08fe7cf01f1621cdf8583aae425994a9a4ecae32ee37064e not found: ID does not exist" containerID="458d14b015605aca08fe7cf01f1621cdf8583aae425994a9a4ecae32ee37064e" Mar 13 14:26:00 crc kubenswrapper[4898]: I0313 14:26:00.151852 4898 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"458d14b015605aca08fe7cf01f1621cdf8583aae425994a9a4ecae32ee37064e"} err="failed to get container status \"458d14b015605aca08fe7cf01f1621cdf8583aae425994a9a4ecae32ee37064e\": rpc error: code = NotFound desc = could not find container \"458d14b015605aca08fe7cf01f1621cdf8583aae425994a9a4ecae32ee37064e\": container with ID starting with 458d14b015605aca08fe7cf01f1621cdf8583aae425994a9a4ecae32ee37064e not found: ID does not exist" Mar 13 14:26:00 crc kubenswrapper[4898]: I0313 14:26:00.164545 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-76b5758c54-vpp67"] Mar 13 14:26:00 crc kubenswrapper[4898]: I0313 14:26:00.188002 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-76b5758c54-vpp67"] Mar 13 14:26:00 crc kubenswrapper[4898]: I0313 14:26:00.285035 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556866-wnmcp"] Mar 13 14:26:00 crc kubenswrapper[4898]: E0313 14:26:00.285692 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a9180e2-91e9-4063-83a5-5b4ba75ca011" containerName="heat-api" Mar 13 14:26:00 crc kubenswrapper[4898]: I0313 14:26:00.285711 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a9180e2-91e9-4063-83a5-5b4ba75ca011" containerName="heat-api" Mar 13 14:26:00 crc kubenswrapper[4898]: E0313 14:26:00.285760 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd18ec2e-1196-4e66-a1c5-9e3daefd7171" containerName="heat-cfnapi" Mar 13 14:26:00 crc kubenswrapper[4898]: I0313 14:26:00.285770 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd18ec2e-1196-4e66-a1c5-9e3daefd7171" containerName="heat-cfnapi" Mar 13 14:26:00 crc kubenswrapper[4898]: I0313 14:26:00.286188 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a9180e2-91e9-4063-83a5-5b4ba75ca011" containerName="heat-api" Mar 13 14:26:00 crc kubenswrapper[4898]: I0313 14:26:00.286242 4898 
memory_manager.go:354] "RemoveStaleState removing state" podUID="bd18ec2e-1196-4e66-a1c5-9e3daefd7171" containerName="heat-cfnapi" Mar 13 14:26:00 crc kubenswrapper[4898]: I0313 14:26:00.287664 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556866-wnmcp" Mar 13 14:26:00 crc kubenswrapper[4898]: I0313 14:26:00.328710 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps" Mar 13 14:26:00 crc kubenswrapper[4898]: I0313 14:26:00.328965 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 14:26:00 crc kubenswrapper[4898]: I0313 14:26:00.329156 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 14:26:00 crc kubenswrapper[4898]: I0313 14:26:00.349961 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556866-wnmcp"] Mar 13 14:26:00 crc kubenswrapper[4898]: I0313 14:26:00.472277 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xs9zh\" (UniqueName: \"kubernetes.io/projected/45988deb-1057-4d89-a977-35978404b407-kube-api-access-xs9zh\") pod \"auto-csr-approver-29556866-wnmcp\" (UID: \"45988deb-1057-4d89-a977-35978404b407\") " pod="openshift-infra/auto-csr-approver-29556866-wnmcp" Mar 13 14:26:00 crc kubenswrapper[4898]: I0313 14:26:00.578545 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xs9zh\" (UniqueName: \"kubernetes.io/projected/45988deb-1057-4d89-a977-35978404b407-kube-api-access-xs9zh\") pod \"auto-csr-approver-29556866-wnmcp\" (UID: \"45988deb-1057-4d89-a977-35978404b407\") " pod="openshift-infra/auto-csr-approver-29556866-wnmcp" Mar 13 14:26:00 crc kubenswrapper[4898]: I0313 14:26:00.597516 4898 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-xs9zh\" (UniqueName: \"kubernetes.io/projected/45988deb-1057-4d89-a977-35978404b407-kube-api-access-xs9zh\") pod \"auto-csr-approver-29556866-wnmcp\" (UID: \"45988deb-1057-4d89-a977-35978404b407\") " pod="openshift-infra/auto-csr-approver-29556866-wnmcp" Mar 13 14:26:00 crc kubenswrapper[4898]: I0313 14:26:00.677000 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556866-wnmcp" Mar 13 14:26:01 crc kubenswrapper[4898]: I0313 14:26:01.049424 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835","Type":"ContainerStarted","Data":"27bc08fa0deb28c03994a83019ce147192bbbaed31f89a254828eb70eb13ce4c"} Mar 13 14:26:01 crc kubenswrapper[4898]: I0313 14:26:01.052040 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:26:01 crc kubenswrapper[4898]: I0313 14:26:01.057111 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"8d188301-848c-4cf6-a204-e1110714c1be","Type":"ContainerStarted","Data":"a7d95af5df861d956495ba689443c8235e4cfd127a9bbf700f87d52e9cd44b0f"} Mar 13 14:26:01 crc kubenswrapper[4898]: I0313 14:26:01.058540 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-2" Mar 13 14:26:01 crc kubenswrapper[4898]: I0313 14:26:01.103981 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.103963255 podStartE2EDuration="37.103963255s" podCreationTimestamp="2026-03-13 14:25:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:26:01.079222395 +0000 UTC m=+1796.080810644" watchObservedRunningTime="2026-03-13 14:26:01.103963255 +0000 UTC 
m=+1796.105551494" Mar 13 14:26:01 crc kubenswrapper[4898]: I0313 14:26:01.112048 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-2" podStartSLOduration=44.112001707 podStartE2EDuration="44.112001707s" podCreationTimestamp="2026-03-13 14:25:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:26:01.102360313 +0000 UTC m=+1796.103948552" watchObservedRunningTime="2026-03-13 14:26:01.112001707 +0000 UTC m=+1796.113589946" Mar 13 14:26:01 crc kubenswrapper[4898]: I0313 14:26:01.218075 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556866-wnmcp"] Mar 13 14:26:01 crc kubenswrapper[4898]: I0313 14:26:01.752124 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a9180e2-91e9-4063-83a5-5b4ba75ca011" path="/var/lib/kubelet/pods/0a9180e2-91e9-4063-83a5-5b4ba75ca011/volumes" Mar 13 14:26:01 crc kubenswrapper[4898]: I0313 14:26:01.752685 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd18ec2e-1196-4e66-a1c5-9e3daefd7171" path="/var/lib/kubelet/pods/bd18ec2e-1196-4e66-a1c5-9e3daefd7171/volumes" Mar 13 14:26:02 crc kubenswrapper[4898]: I0313 14:26:02.106503 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556866-wnmcp" event={"ID":"45988deb-1057-4d89-a977-35978404b407","Type":"ContainerStarted","Data":"6d6ef88327f35d5663c9790d3655f3042fd1192ba9cb3ae98574961b538d2fe6"} Mar 13 14:26:06 crc kubenswrapper[4898]: I0313 14:26:06.462849 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 13 14:26:06 crc kubenswrapper[4898]: E0313 14:26:06.813436 4898 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit 
code -1" containerID="75f3f93f088797c8af38650d54ae73681b337d83c946b53b20ea72e33b4509c0" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 13 14:26:06 crc kubenswrapper[4898]: E0313 14:26:06.815039 4898 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="75f3f93f088797c8af38650d54ae73681b337d83c946b53b20ea72e33b4509c0" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 13 14:26:06 crc kubenswrapper[4898]: E0313 14:26:06.818500 4898 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="75f3f93f088797c8af38650d54ae73681b337d83c946b53b20ea72e33b4509c0" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 13 14:26:06 crc kubenswrapper[4898]: E0313 14:26:06.818539 4898 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-5b6c75676b-jx6kl" podUID="ad94280e-6f02-4129-9cdc-c35499f5d5e4" containerName="heat-engine" Mar 13 14:26:12 crc kubenswrapper[4898]: I0313 14:26:12.271466 4898 generic.go:334] "Generic (PLEG): container finished" podID="ad94280e-6f02-4129-9cdc-c35499f5d5e4" containerID="75f3f93f088797c8af38650d54ae73681b337d83c946b53b20ea72e33b4509c0" exitCode=0 Mar 13 14:26:12 crc kubenswrapper[4898]: I0313 14:26:12.271697 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-5b6c75676b-jx6kl" event={"ID":"ad94280e-6f02-4129-9cdc-c35499f5d5e4","Type":"ContainerDied","Data":"75f3f93f088797c8af38650d54ae73681b337d83c946b53b20ea72e33b4509c0"} Mar 13 14:26:12 crc kubenswrapper[4898]: I0313 14:26:12.740668 4898 scope.go:117] "RemoveContainer" 
containerID="31ae8593afc62c01205d22940ebe7136407da3a6c051010917ba949bb52866cc" Mar 13 14:26:12 crc kubenswrapper[4898]: E0313 14:26:12.741734 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:26:14 crc kubenswrapper[4898]: E0313 14:26:14.561211 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest" Mar 13 14:26:14 crc kubenswrapper[4898]: E0313 14:26:14.563757 4898 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 13 14:26:14 crc kubenswrapper[4898]: container &Container{Name:repo-setup-edpm-deployment-openstack-edpm-ipam,Image:quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest,Command:[],Args:[ansible-runner run /runner -p playbook.yaml -i repo-setup-edpm-deployment-openstack-edpm-ipam],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ANSIBLE_VERBOSITY,Value:2,ValueFrom:nil,},EnvVar{Name:RUNNER_PLAYBOOK,Value: Mar 13 14:26:14 crc kubenswrapper[4898]: - hosts: all Mar 13 14:26:14 crc kubenswrapper[4898]: strategy: linear Mar 13 14:26:14 crc kubenswrapper[4898]: tasks: Mar 13 14:26:14 crc kubenswrapper[4898]: - name: Enable podified-repos Mar 13 14:26:14 crc kubenswrapper[4898]: become: true Mar 13 14:26:14 crc kubenswrapper[4898]: ansible.builtin.shell: | Mar 13 14:26:14 crc kubenswrapper[4898]: set -euxo pipefail Mar 13 14:26:14 crc kubenswrapper[4898]: pushd /var/tmp Mar 13 14:26:14 crc kubenswrapper[4898]: curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | 
tar -xz Mar 13 14:26:14 crc kubenswrapper[4898]: pushd repo-setup-main Mar 13 14:26:14 crc kubenswrapper[4898]: python3 -m venv ./venv Mar 13 14:26:14 crc kubenswrapper[4898]: PBR_VERSION=0.0.0 ./venv/bin/pip install ./ Mar 13 14:26:14 crc kubenswrapper[4898]: ./venv/bin/repo-setup current-podified -b antelope Mar 13 14:26:14 crc kubenswrapper[4898]: popd Mar 13 14:26:14 crc kubenswrapper[4898]: rm -rf repo-setup-main Mar 13 14:26:14 crc kubenswrapper[4898]: Mar 13 14:26:14 crc kubenswrapper[4898]: Mar 13 14:26:14 crc kubenswrapper[4898]: ,ValueFrom:nil,},EnvVar{Name:RUNNER_EXTRA_VARS,Value: Mar 13 14:26:14 crc kubenswrapper[4898]: edpm_override_hosts: openstack-edpm-ipam Mar 13 14:26:14 crc kubenswrapper[4898]: edpm_service_type: repo-setup Mar 13 14:26:14 crc kubenswrapper[4898]: Mar 13 14:26:14 crc kubenswrapper[4898]: Mar 13 14:26:14 crc kubenswrapper[4898]: ,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:repo-setup-combined-ca-bundle,ReadOnly:false,MountPath:/var/lib/openstack/cacerts/repo-setup,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key-openstack-edpm-ipam,ReadOnly:false,MountPath:/runner/env/ssh_key/ssh_key_openstack-edpm-ipam,SubPath:ssh_key_openstack-edpm-ipam,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:inventory,ReadOnly:false,MountPath:/runner/inventory/hosts,SubPath:inventory,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7ff55,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,Ru
nAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:openstack-aee-default-env,},Optional:*true,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs_openstack(98336335-4b60-4ddf-8fe8-4ea6b69d47ef): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled Mar 13 14:26:14 crc kubenswrapper[4898]: > logger="UnhandledError" Mar 13 14:26:14 crc kubenswrapper[4898]: E0313 14:26:14.567249 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"repo-setup-edpm-deployment-openstack-edpm-ipam\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs" podUID="98336335-4b60-4ddf-8fe8-4ea6b69d47ef" Mar 13 14:26:14 crc kubenswrapper[4898]: I0313 14:26:14.600261 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 13 14:26:14 crc kubenswrapper[4898]: I0313 14:26:14.869324 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:26:15 crc kubenswrapper[4898]: I0313 14:26:15.021122 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-5b6c75676b-jx6kl" Mar 13 14:26:15 crc kubenswrapper[4898]: I0313 14:26:15.128935 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad94280e-6f02-4129-9cdc-c35499f5d5e4-config-data\") pod \"ad94280e-6f02-4129-9cdc-c35499f5d5e4\" (UID: \"ad94280e-6f02-4129-9cdc-c35499f5d5e4\") " Mar 13 14:26:15 crc kubenswrapper[4898]: I0313 14:26:15.129290 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kskc6\" (UniqueName: \"kubernetes.io/projected/ad94280e-6f02-4129-9cdc-c35499f5d5e4-kube-api-access-kskc6\") pod \"ad94280e-6f02-4129-9cdc-c35499f5d5e4\" (UID: \"ad94280e-6f02-4129-9cdc-c35499f5d5e4\") " Mar 13 14:26:15 crc kubenswrapper[4898]: I0313 14:26:15.129498 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ad94280e-6f02-4129-9cdc-c35499f5d5e4-config-data-custom\") pod \"ad94280e-6f02-4129-9cdc-c35499f5d5e4\" (UID: \"ad94280e-6f02-4129-9cdc-c35499f5d5e4\") " Mar 13 14:26:15 crc kubenswrapper[4898]: I0313 14:26:15.129569 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad94280e-6f02-4129-9cdc-c35499f5d5e4-combined-ca-bundle\") pod \"ad94280e-6f02-4129-9cdc-c35499f5d5e4\" (UID: \"ad94280e-6f02-4129-9cdc-c35499f5d5e4\") " Mar 13 14:26:15 crc kubenswrapper[4898]: I0313 14:26:15.135620 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad94280e-6f02-4129-9cdc-c35499f5d5e4-kube-api-access-kskc6" (OuterVolumeSpecName: "kube-api-access-kskc6") pod "ad94280e-6f02-4129-9cdc-c35499f5d5e4" (UID: "ad94280e-6f02-4129-9cdc-c35499f5d5e4"). InnerVolumeSpecName "kube-api-access-kskc6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:26:15 crc kubenswrapper[4898]: I0313 14:26:15.136297 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad94280e-6f02-4129-9cdc-c35499f5d5e4-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ad94280e-6f02-4129-9cdc-c35499f5d5e4" (UID: "ad94280e-6f02-4129-9cdc-c35499f5d5e4"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:26:15 crc kubenswrapper[4898]: I0313 14:26:15.192345 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad94280e-6f02-4129-9cdc-c35499f5d5e4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ad94280e-6f02-4129-9cdc-c35499f5d5e4" (UID: "ad94280e-6f02-4129-9cdc-c35499f5d5e4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:26:15 crc kubenswrapper[4898]: I0313 14:26:15.206692 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad94280e-6f02-4129-9cdc-c35499f5d5e4-config-data" (OuterVolumeSpecName: "config-data") pod "ad94280e-6f02-4129-9cdc-c35499f5d5e4" (UID: "ad94280e-6f02-4129-9cdc-c35499f5d5e4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:26:15 crc kubenswrapper[4898]: I0313 14:26:15.233643 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad94280e-6f02-4129-9cdc-c35499f5d5e4-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:26:15 crc kubenswrapper[4898]: I0313 14:26:15.233678 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kskc6\" (UniqueName: \"kubernetes.io/projected/ad94280e-6f02-4129-9cdc-c35499f5d5e4-kube-api-access-kskc6\") on node \"crc\" DevicePath \"\"" Mar 13 14:26:15 crc kubenswrapper[4898]: I0313 14:26:15.233688 4898 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ad94280e-6f02-4129-9cdc-c35499f5d5e4-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 13 14:26:15 crc kubenswrapper[4898]: I0313 14:26:15.233696 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad94280e-6f02-4129-9cdc-c35499f5d5e4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:26:15 crc kubenswrapper[4898]: I0313 14:26:15.327769 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-5b6c75676b-jx6kl" event={"ID":"ad94280e-6f02-4129-9cdc-c35499f5d5e4","Type":"ContainerDied","Data":"49d0d8e35c38306e9d9d2a68f113990c44c74cb3b5a7d200ee672f1ee07d5629"} Mar 13 14:26:15 crc kubenswrapper[4898]: I0313 14:26:15.327833 4898 scope.go:117] "RemoveContainer" containerID="75f3f93f088797c8af38650d54ae73681b337d83c946b53b20ea72e33b4509c0" Mar 13 14:26:15 crc kubenswrapper[4898]: I0313 14:26:15.327792 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-5b6c75676b-jx6kl" Mar 13 14:26:15 crc kubenswrapper[4898]: I0313 14:26:15.330708 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556866-wnmcp" event={"ID":"45988deb-1057-4d89-a977-35978404b407","Type":"ContainerStarted","Data":"81721ca65e97448cfc7621215d98c3ea95987d960df48d9fdaef1b644754dcaf"} Mar 13 14:26:15 crc kubenswrapper[4898]: I0313 14:26:15.333968 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-rzjjp" event={"ID":"c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe","Type":"ContainerStarted","Data":"86e66a360586b19f53ca12cefc2c560bd5016283db35bb1e56ee1d68892fd634"} Mar 13 14:26:15 crc kubenswrapper[4898]: E0313 14:26:15.338041 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"repo-setup-edpm-deployment-openstack-edpm-ipam\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest\\\"\"" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs" podUID="98336335-4b60-4ddf-8fe8-4ea6b69d47ef" Mar 13 14:26:15 crc kubenswrapper[4898]: I0313 14:26:15.356567 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556866-wnmcp" podStartSLOduration=1.957105782 podStartE2EDuration="15.356548787s" podCreationTimestamp="2026-03-13 14:26:00 +0000 UTC" firstStartedPulling="2026-03-13 14:26:01.22816168 +0000 UTC m=+1796.229749919" lastFinishedPulling="2026-03-13 14:26:14.627604675 +0000 UTC m=+1809.629192924" observedRunningTime="2026-03-13 14:26:15.345344962 +0000 UTC m=+1810.346933221" watchObservedRunningTime="2026-03-13 14:26:15.356548787 +0000 UTC m=+1810.358137016" Mar 13 14:26:15 crc kubenswrapper[4898]: I0313 14:26:15.374769 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-rzjjp" podStartSLOduration=2.72011789 
podStartE2EDuration="18.374749435s" podCreationTimestamp="2026-03-13 14:25:57 +0000 UTC" firstStartedPulling="2026-03-13 14:25:58.941710288 +0000 UTC m=+1793.943298527" lastFinishedPulling="2026-03-13 14:26:14.596341833 +0000 UTC m=+1809.597930072" observedRunningTime="2026-03-13 14:26:15.362844692 +0000 UTC m=+1810.364432931" watchObservedRunningTime="2026-03-13 14:26:15.374749435 +0000 UTC m=+1810.376337674" Mar 13 14:26:15 crc kubenswrapper[4898]: I0313 14:26:15.404840 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-5b6c75676b-jx6kl"] Mar 13 14:26:15 crc kubenswrapper[4898]: I0313 14:26:15.415655 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-5b6c75676b-jx6kl"] Mar 13 14:26:15 crc kubenswrapper[4898]: I0313 14:26:15.755484 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad94280e-6f02-4129-9cdc-c35499f5d5e4" path="/var/lib/kubelet/pods/ad94280e-6f02-4129-9cdc-c35499f5d5e4/volumes" Mar 13 14:26:16 crc kubenswrapper[4898]: I0313 14:26:16.345217 4898 generic.go:334] "Generic (PLEG): container finished" podID="45988deb-1057-4d89-a977-35978404b407" containerID="81721ca65e97448cfc7621215d98c3ea95987d960df48d9fdaef1b644754dcaf" exitCode=0 Mar 13 14:26:16 crc kubenswrapper[4898]: I0313 14:26:16.345317 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556866-wnmcp" event={"ID":"45988deb-1057-4d89-a977-35978404b407","Type":"ContainerDied","Data":"81721ca65e97448cfc7621215d98c3ea95987d960df48d9fdaef1b644754dcaf"} Mar 13 14:26:17 crc kubenswrapper[4898]: I0313 14:26:17.658072 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-2" Mar 13 14:26:17 crc kubenswrapper[4898]: I0313 14:26:17.728288 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-1"] Mar 13 14:26:17 crc kubenswrapper[4898]: I0313 14:26:17.873314 4898 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556866-wnmcp" Mar 13 14:26:18 crc kubenswrapper[4898]: I0313 14:26:18.013646 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xs9zh\" (UniqueName: \"kubernetes.io/projected/45988deb-1057-4d89-a977-35978404b407-kube-api-access-xs9zh\") pod \"45988deb-1057-4d89-a977-35978404b407\" (UID: \"45988deb-1057-4d89-a977-35978404b407\") " Mar 13 14:26:18 crc kubenswrapper[4898]: I0313 14:26:18.021122 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45988deb-1057-4d89-a977-35978404b407-kube-api-access-xs9zh" (OuterVolumeSpecName: "kube-api-access-xs9zh") pod "45988deb-1057-4d89-a977-35978404b407" (UID: "45988deb-1057-4d89-a977-35978404b407"). InnerVolumeSpecName "kube-api-access-xs9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:26:18 crc kubenswrapper[4898]: I0313 14:26:18.117112 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xs9zh\" (UniqueName: \"kubernetes.io/projected/45988deb-1057-4d89-a977-35978404b407-kube-api-access-xs9zh\") on node \"crc\" DevicePath \"\"" Mar 13 14:26:18 crc kubenswrapper[4898]: I0313 14:26:18.381109 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556866-wnmcp" event={"ID":"45988deb-1057-4d89-a977-35978404b407","Type":"ContainerDied","Data":"6d6ef88327f35d5663c9790d3655f3042fd1192ba9cb3ae98574961b538d2fe6"} Mar 13 14:26:18 crc kubenswrapper[4898]: I0313 14:26:18.381174 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d6ef88327f35d5663c9790d3655f3042fd1192ba9cb3ae98574961b538d2fe6" Mar 13 14:26:18 crc kubenswrapper[4898]: I0313 14:26:18.381176 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556866-wnmcp" Mar 13 14:26:18 crc kubenswrapper[4898]: I0313 14:26:18.435362 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556860-xbwgj"] Mar 13 14:26:18 crc kubenswrapper[4898]: I0313 14:26:18.450481 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556860-xbwgj"] Mar 13 14:26:19 crc kubenswrapper[4898]: I0313 14:26:19.392263 4898 generic.go:334] "Generic (PLEG): container finished" podID="c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe" containerID="86e66a360586b19f53ca12cefc2c560bd5016283db35bb1e56ee1d68892fd634" exitCode=0 Mar 13 14:26:19 crc kubenswrapper[4898]: I0313 14:26:19.393316 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-rzjjp" event={"ID":"c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe","Type":"ContainerDied","Data":"86e66a360586b19f53ca12cefc2c560bd5016283db35bb1e56ee1d68892fd634"} Mar 13 14:26:19 crc kubenswrapper[4898]: I0313 14:26:19.760130 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02521dff-1dee-4839-ab35-a4bfa82bc405" path="/var/lib/kubelet/pods/02521dff-1dee-4839-ab35-a4bfa82bc405/volumes" Mar 13 14:26:20 crc kubenswrapper[4898]: I0313 14:26:20.873073 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-rzjjp" Mar 13 14:26:20 crc kubenswrapper[4898]: I0313 14:26:20.983927 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe-config-data\") pod \"c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe\" (UID: \"c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe\") " Mar 13 14:26:20 crc kubenswrapper[4898]: I0313 14:26:20.984311 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdshh\" (UniqueName: \"kubernetes.io/projected/c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe-kube-api-access-kdshh\") pod \"c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe\" (UID: \"c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe\") " Mar 13 14:26:20 crc kubenswrapper[4898]: I0313 14:26:20.984373 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe-scripts\") pod \"c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe\" (UID: \"c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe\") " Mar 13 14:26:20 crc kubenswrapper[4898]: I0313 14:26:20.984581 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe-combined-ca-bundle\") pod \"c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe\" (UID: \"c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe\") " Mar 13 14:26:20 crc kubenswrapper[4898]: I0313 14:26:20.989401 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe-scripts" (OuterVolumeSpecName: "scripts") pod "c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe" (UID: "c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:26:20 crc kubenswrapper[4898]: I0313 14:26:20.989469 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe-kube-api-access-kdshh" (OuterVolumeSpecName: "kube-api-access-kdshh") pod "c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe" (UID: "c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe"). InnerVolumeSpecName "kube-api-access-kdshh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:26:21 crc kubenswrapper[4898]: I0313 14:26:21.019940 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe" (UID: "c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:26:21 crc kubenswrapper[4898]: I0313 14:26:21.034131 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe-config-data" (OuterVolumeSpecName: "config-data") pod "c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe" (UID: "c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:26:21 crc kubenswrapper[4898]: I0313 14:26:21.087511 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:26:21 crc kubenswrapper[4898]: I0313 14:26:21.087543 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:26:21 crc kubenswrapper[4898]: I0313 14:26:21.087555 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:26:21 crc kubenswrapper[4898]: I0313 14:26:21.087564 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdshh\" (UniqueName: \"kubernetes.io/projected/c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe-kube-api-access-kdshh\") on node \"crc\" DevicePath \"\"" Mar 13 14:26:21 crc kubenswrapper[4898]: I0313 14:26:21.416738 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-rzjjp" event={"ID":"c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe","Type":"ContainerDied","Data":"f822b9751765ca4646d812007e541e13e282235ebbcdb7da0060744bf5f1941d"} Mar 13 14:26:21 crc kubenswrapper[4898]: I0313 14:26:21.416777 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f822b9751765ca4646d812007e541e13e282235ebbcdb7da0060744bf5f1941d" Mar 13 14:26:21 crc kubenswrapper[4898]: I0313 14:26:21.416834 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-rzjjp" Mar 13 14:26:22 crc kubenswrapper[4898]: I0313 14:26:22.131163 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-1" podUID="818e3f41-30c4-4a49-b490-0d868fc2b2b8" containerName="rabbitmq" containerID="cri-o://122b0ca68068adfd3963153faa26d42ce9ae7ae836229a17a2096dab37be0af4" gracePeriod=604796 Mar 13 14:26:22 crc kubenswrapper[4898]: I0313 14:26:22.960456 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Mar 13 14:26:22 crc kubenswrapper[4898]: I0313 14:26:22.961215 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="88246540-ca61-4fb0-8934-c8ebb4559860" containerName="aodh-api" containerID="cri-o://a274deee7baf2c38ad5a6692d8a099f9a97dda17b39d57d5a6fb5cd7aca71860" gracePeriod=30 Mar 13 14:26:22 crc kubenswrapper[4898]: I0313 14:26:22.961323 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="88246540-ca61-4fb0-8934-c8ebb4559860" containerName="aodh-notifier" containerID="cri-o://c83e4fda188ec43992d3ce1b3047566dea50f419ae2e6389d523891cfdc5bf75" gracePeriod=30 Mar 13 14:26:22 crc kubenswrapper[4898]: I0313 14:26:22.961322 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="88246540-ca61-4fb0-8934-c8ebb4559860" containerName="aodh-listener" containerID="cri-o://f1a7b03523d4185dcbadb339dd340f86f0fe7637d1feff68130acfc4930e6831" gracePeriod=30 Mar 13 14:26:22 crc kubenswrapper[4898]: I0313 14:26:22.961358 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="88246540-ca61-4fb0-8934-c8ebb4559860" containerName="aodh-evaluator" containerID="cri-o://8453994fd2156143da3704e7af63c854727a63f89845c2f4e51b2efe260b622e" gracePeriod=30 Mar 13 14:26:23 crc kubenswrapper[4898]: I0313 14:26:23.468963 4898 generic.go:334] "Generic 
(PLEG): container finished" podID="88246540-ca61-4fb0-8934-c8ebb4559860" containerID="8453994fd2156143da3704e7af63c854727a63f89845c2f4e51b2efe260b622e" exitCode=0 Mar 13 14:26:23 crc kubenswrapper[4898]: I0313 14:26:23.469006 4898 generic.go:334] "Generic (PLEG): container finished" podID="88246540-ca61-4fb0-8934-c8ebb4559860" containerID="a274deee7baf2c38ad5a6692d8a099f9a97dda17b39d57d5a6fb5cd7aca71860" exitCode=0 Mar 13 14:26:23 crc kubenswrapper[4898]: I0313 14:26:23.469038 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"88246540-ca61-4fb0-8934-c8ebb4559860","Type":"ContainerDied","Data":"8453994fd2156143da3704e7af63c854727a63f89845c2f4e51b2efe260b622e"} Mar 13 14:26:23 crc kubenswrapper[4898]: I0313 14:26:23.469077 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"88246540-ca61-4fb0-8934-c8ebb4559860","Type":"ContainerDied","Data":"a274deee7baf2c38ad5a6692d8a099f9a97dda17b39d57d5a6fb5cd7aca71860"} Mar 13 14:26:24 crc kubenswrapper[4898]: I0313 14:26:24.739748 4898 scope.go:117] "RemoveContainer" containerID="31ae8593afc62c01205d22940ebe7136407da3a6c051010917ba949bb52866cc" Mar 13 14:26:24 crc kubenswrapper[4898]: E0313 14:26:24.741471 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:26:26 crc kubenswrapper[4898]: I0313 14:26:26.509927 4898 generic.go:334] "Generic (PLEG): container finished" podID="88246540-ca61-4fb0-8934-c8ebb4559860" containerID="f1a7b03523d4185dcbadb339dd340f86f0fe7637d1feff68130acfc4930e6831" exitCode=0 Mar 13 14:26:26 crc kubenswrapper[4898]: I0313 14:26:26.510489 4898 
generic.go:334] "Generic (PLEG): container finished" podID="88246540-ca61-4fb0-8934-c8ebb4559860" containerID="c83e4fda188ec43992d3ce1b3047566dea50f419ae2e6389d523891cfdc5bf75" exitCode=0 Mar 13 14:26:26 crc kubenswrapper[4898]: I0313 14:26:26.510517 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"88246540-ca61-4fb0-8934-c8ebb4559860","Type":"ContainerDied","Data":"f1a7b03523d4185dcbadb339dd340f86f0fe7637d1feff68130acfc4930e6831"} Mar 13 14:26:26 crc kubenswrapper[4898]: I0313 14:26:26.510550 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"88246540-ca61-4fb0-8934-c8ebb4559860","Type":"ContainerDied","Data":"c83e4fda188ec43992d3ce1b3047566dea50f419ae2e6389d523891cfdc5bf75"} Mar 13 14:26:26 crc kubenswrapper[4898]: I0313 14:26:26.933588 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.032236 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88246540-ca61-4fb0-8934-c8ebb4559860-config-data\") pod \"88246540-ca61-4fb0-8934-c8ebb4559860\" (UID: \"88246540-ca61-4fb0-8934-c8ebb4559860\") " Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.032371 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/88246540-ca61-4fb0-8934-c8ebb4559860-internal-tls-certs\") pod \"88246540-ca61-4fb0-8934-c8ebb4559860\" (UID: \"88246540-ca61-4fb0-8934-c8ebb4559860\") " Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.032407 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/88246540-ca61-4fb0-8934-c8ebb4559860-public-tls-certs\") pod \"88246540-ca61-4fb0-8934-c8ebb4559860\" (UID: \"88246540-ca61-4fb0-8934-c8ebb4559860\") " Mar 
13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.032454 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxbj9\" (UniqueName: \"kubernetes.io/projected/88246540-ca61-4fb0-8934-c8ebb4559860-kube-api-access-sxbj9\") pod \"88246540-ca61-4fb0-8934-c8ebb4559860\" (UID: \"88246540-ca61-4fb0-8934-c8ebb4559860\") " Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.032544 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88246540-ca61-4fb0-8934-c8ebb4559860-combined-ca-bundle\") pod \"88246540-ca61-4fb0-8934-c8ebb4559860\" (UID: \"88246540-ca61-4fb0-8934-c8ebb4559860\") " Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.032738 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88246540-ca61-4fb0-8934-c8ebb4559860-scripts\") pod \"88246540-ca61-4fb0-8934-c8ebb4559860\" (UID: \"88246540-ca61-4fb0-8934-c8ebb4559860\") " Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.039609 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88246540-ca61-4fb0-8934-c8ebb4559860-scripts" (OuterVolumeSpecName: "scripts") pod "88246540-ca61-4fb0-8934-c8ebb4559860" (UID: "88246540-ca61-4fb0-8934-c8ebb4559860"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.070361 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88246540-ca61-4fb0-8934-c8ebb4559860-kube-api-access-sxbj9" (OuterVolumeSpecName: "kube-api-access-sxbj9") pod "88246540-ca61-4fb0-8934-c8ebb4559860" (UID: "88246540-ca61-4fb0-8934-c8ebb4559860"). InnerVolumeSpecName "kube-api-access-sxbj9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.112256 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88246540-ca61-4fb0-8934-c8ebb4559860-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "88246540-ca61-4fb0-8934-c8ebb4559860" (UID: "88246540-ca61-4fb0-8934-c8ebb4559860"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.136602 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88246540-ca61-4fb0-8934-c8ebb4559860-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.136632 4898 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/88246540-ca61-4fb0-8934-c8ebb4559860-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.136643 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxbj9\" (UniqueName: \"kubernetes.io/projected/88246540-ca61-4fb0-8934-c8ebb4559860-kube-api-access-sxbj9\") on node \"crc\" DevicePath \"\"" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.188083 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88246540-ca61-4fb0-8934-c8ebb4559860-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "88246540-ca61-4fb0-8934-c8ebb4559860" (UID: "88246540-ca61-4fb0-8934-c8ebb4559860"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.215609 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88246540-ca61-4fb0-8934-c8ebb4559860-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "88246540-ca61-4fb0-8934-c8ebb4559860" (UID: "88246540-ca61-4fb0-8934-c8ebb4559860"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.238773 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88246540-ca61-4fb0-8934-c8ebb4559860-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.238807 4898 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/88246540-ca61-4fb0-8934-c8ebb4559860-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.272765 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-1" podUID="818e3f41-30c4-4a49-b490-0d868fc2b2b8" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.136:5671: connect: connection refused" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.276758 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88246540-ca61-4fb0-8934-c8ebb4559860-config-data" (OuterVolumeSpecName: "config-data") pod "88246540-ca61-4fb0-8934-c8ebb4559860" (UID: "88246540-ca61-4fb0-8934-c8ebb4559860"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.340616 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88246540-ca61-4fb0-8934-c8ebb4559860-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.524650 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"88246540-ca61-4fb0-8934-c8ebb4559860","Type":"ContainerDied","Data":"4f9554aea31a54e9ad03a3bc5d51fd2b9355c4b2f2434a00fdafabcd84f13b07"} Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.525715 4898 scope.go:117] "RemoveContainer" containerID="f1a7b03523d4185dcbadb339dd340f86f0fe7637d1feff68130acfc4930e6831" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.524711 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.564288 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.566272 4898 scope.go:117] "RemoveContainer" containerID="c83e4fda188ec43992d3ce1b3047566dea50f419ae2e6389d523891cfdc5bf75" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.582855 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"] Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.595520 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Mar 13 14:26:27 crc kubenswrapper[4898]: E0313 14:26:27.596082 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88246540-ca61-4fb0-8934-c8ebb4559860" containerName="aodh-listener" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.596102 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="88246540-ca61-4fb0-8934-c8ebb4559860" containerName="aodh-listener" Mar 13 14:26:27 crc kubenswrapper[4898]: E0313 
14:26:27.596116 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45988deb-1057-4d89-a977-35978404b407" containerName="oc" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.596124 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="45988deb-1057-4d89-a977-35978404b407" containerName="oc" Mar 13 14:26:27 crc kubenswrapper[4898]: E0313 14:26:27.596141 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad94280e-6f02-4129-9cdc-c35499f5d5e4" containerName="heat-engine" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.596147 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad94280e-6f02-4129-9cdc-c35499f5d5e4" containerName="heat-engine" Mar 13 14:26:27 crc kubenswrapper[4898]: E0313 14:26:27.596158 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88246540-ca61-4fb0-8934-c8ebb4559860" containerName="aodh-notifier" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.596164 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="88246540-ca61-4fb0-8934-c8ebb4559860" containerName="aodh-notifier" Mar 13 14:26:27 crc kubenswrapper[4898]: E0313 14:26:27.596174 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88246540-ca61-4fb0-8934-c8ebb4559860" containerName="aodh-evaluator" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.596180 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="88246540-ca61-4fb0-8934-c8ebb4559860" containerName="aodh-evaluator" Mar 13 14:26:27 crc kubenswrapper[4898]: E0313 14:26:27.596196 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88246540-ca61-4fb0-8934-c8ebb4559860" containerName="aodh-api" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.596203 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="88246540-ca61-4fb0-8934-c8ebb4559860" containerName="aodh-api" Mar 13 14:26:27 crc kubenswrapper[4898]: E0313 14:26:27.596226 4898 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe" containerName="aodh-db-sync" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.596233 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe" containerName="aodh-db-sync" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.596463 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="45988deb-1057-4d89-a977-35978404b407" containerName="oc" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.596477 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="88246540-ca61-4fb0-8934-c8ebb4559860" containerName="aodh-api" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.596516 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="88246540-ca61-4fb0-8934-c8ebb4559860" containerName="aodh-evaluator" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.596533 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad94280e-6f02-4129-9cdc-c35499f5d5e4" containerName="heat-engine" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.596545 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="88246540-ca61-4fb0-8934-c8ebb4559860" containerName="aodh-listener" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.596557 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe" containerName="aodh-db-sync" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.596583 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="88246540-ca61-4fb0-8934-c8ebb4559860" containerName="aodh-notifier" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.599137 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.602502 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.602608 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.602831 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.603008 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-tnpwg" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.605795 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.609359 4898 scope.go:117] "RemoveContainer" containerID="8453994fd2156143da3704e7af63c854727a63f89845c2f4e51b2efe260b622e" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.613974 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.655917 4898 scope.go:117] "RemoveContainer" containerID="a274deee7baf2c38ad5a6692d8a099f9a97dda17b39d57d5a6fb5cd7aca71860" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.759441 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a27645af-4d4a-4a73-ba8a-488a9ae199ac-combined-ca-bundle\") pod \"aodh-0\" (UID: \"a27645af-4d4a-4a73-ba8a-488a9ae199ac\") " pod="openstack/aodh-0" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.759761 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/a27645af-4d4a-4a73-ba8a-488a9ae199ac-public-tls-certs\") pod \"aodh-0\" (UID: \"a27645af-4d4a-4a73-ba8a-488a9ae199ac\") " pod="openstack/aodh-0" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.759800 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a27645af-4d4a-4a73-ba8a-488a9ae199ac-config-data\") pod \"aodh-0\" (UID: \"a27645af-4d4a-4a73-ba8a-488a9ae199ac\") " pod="openstack/aodh-0" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.759832 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a27645af-4d4a-4a73-ba8a-488a9ae199ac-scripts\") pod \"aodh-0\" (UID: \"a27645af-4d4a-4a73-ba8a-488a9ae199ac\") " pod="openstack/aodh-0" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.760095 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a27645af-4d4a-4a73-ba8a-488a9ae199ac-internal-tls-certs\") pod \"aodh-0\" (UID: \"a27645af-4d4a-4a73-ba8a-488a9ae199ac\") " pod="openstack/aodh-0" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.760169 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2vv6\" (UniqueName: \"kubernetes.io/projected/a27645af-4d4a-4a73-ba8a-488a9ae199ac-kube-api-access-q2vv6\") pod \"aodh-0\" (UID: \"a27645af-4d4a-4a73-ba8a-488a9ae199ac\") " pod="openstack/aodh-0" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.774088 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88246540-ca61-4fb0-8934-c8ebb4559860" path="/var/lib/kubelet/pods/88246540-ca61-4fb0-8934-c8ebb4559860/volumes" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.862606 4898 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a27645af-4d4a-4a73-ba8a-488a9ae199ac-internal-tls-certs\") pod \"aodh-0\" (UID: \"a27645af-4d4a-4a73-ba8a-488a9ae199ac\") " pod="openstack/aodh-0" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.862695 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2vv6\" (UniqueName: \"kubernetes.io/projected/a27645af-4d4a-4a73-ba8a-488a9ae199ac-kube-api-access-q2vv6\") pod \"aodh-0\" (UID: \"a27645af-4d4a-4a73-ba8a-488a9ae199ac\") " pod="openstack/aodh-0" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.862767 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a27645af-4d4a-4a73-ba8a-488a9ae199ac-combined-ca-bundle\") pod \"aodh-0\" (UID: \"a27645af-4d4a-4a73-ba8a-488a9ae199ac\") " pod="openstack/aodh-0" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.862789 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a27645af-4d4a-4a73-ba8a-488a9ae199ac-public-tls-certs\") pod \"aodh-0\" (UID: \"a27645af-4d4a-4a73-ba8a-488a9ae199ac\") " pod="openstack/aodh-0" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.862812 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a27645af-4d4a-4a73-ba8a-488a9ae199ac-config-data\") pod \"aodh-0\" (UID: \"a27645af-4d4a-4a73-ba8a-488a9ae199ac\") " pod="openstack/aodh-0" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.862828 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a27645af-4d4a-4a73-ba8a-488a9ae199ac-scripts\") pod \"aodh-0\" (UID: \"a27645af-4d4a-4a73-ba8a-488a9ae199ac\") " pod="openstack/aodh-0" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.866486 
4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a27645af-4d4a-4a73-ba8a-488a9ae199ac-public-tls-certs\") pod \"aodh-0\" (UID: \"a27645af-4d4a-4a73-ba8a-488a9ae199ac\") " pod="openstack/aodh-0" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.867608 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a27645af-4d4a-4a73-ba8a-488a9ae199ac-scripts\") pod \"aodh-0\" (UID: \"a27645af-4d4a-4a73-ba8a-488a9ae199ac\") " pod="openstack/aodh-0" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.867682 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a27645af-4d4a-4a73-ba8a-488a9ae199ac-config-data\") pod \"aodh-0\" (UID: \"a27645af-4d4a-4a73-ba8a-488a9ae199ac\") " pod="openstack/aodh-0" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.866510 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a27645af-4d4a-4a73-ba8a-488a9ae199ac-combined-ca-bundle\") pod \"aodh-0\" (UID: \"a27645af-4d4a-4a73-ba8a-488a9ae199ac\") " pod="openstack/aodh-0" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.868587 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a27645af-4d4a-4a73-ba8a-488a9ae199ac-internal-tls-certs\") pod \"aodh-0\" (UID: \"a27645af-4d4a-4a73-ba8a-488a9ae199ac\") " pod="openstack/aodh-0" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.879471 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2vv6\" (UniqueName: \"kubernetes.io/projected/a27645af-4d4a-4a73-ba8a-488a9ae199ac-kube-api-access-q2vv6\") pod \"aodh-0\" (UID: \"a27645af-4d4a-4a73-ba8a-488a9ae199ac\") " pod="openstack/aodh-0" Mar 13 14:26:27 crc kubenswrapper[4898]: 
I0313 14:26:27.935026 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Mar 13 14:26:28 crc kubenswrapper[4898]: I0313 14:26:28.541079 4898 generic.go:334] "Generic (PLEG): container finished" podID="818e3f41-30c4-4a49-b490-0d868fc2b2b8" containerID="122b0ca68068adfd3963153faa26d42ce9ae7ae836229a17a2096dab37be0af4" exitCode=0 Mar 13 14:26:28 crc kubenswrapper[4898]: I0313 14:26:28.541154 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"818e3f41-30c4-4a49-b490-0d868fc2b2b8","Type":"ContainerDied","Data":"122b0ca68068adfd3963153faa26d42ce9ae7ae836229a17a2096dab37be0af4"} Mar 13 14:26:28 crc kubenswrapper[4898]: I0313 14:26:28.571674 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 13 14:26:28 crc kubenswrapper[4898]: W0313 14:26:28.578268 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda27645af_4d4a_4a73_ba8a_488a9ae199ac.slice/crio-069caa01f29af2cd1825eb69d7e0a639ef9e88fb94efb0e624df33b776125cc4 WatchSource:0}: Error finding container 069caa01f29af2cd1825eb69d7e0a639ef9e88fb94efb0e624df33b776125cc4: Status 404 returned error can't find the container with id 069caa01f29af2cd1825eb69d7e0a639ef9e88fb94efb0e624df33b776125cc4 Mar 13 14:26:28 crc kubenswrapper[4898]: I0313 14:26:28.802214 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-1" Mar 13 14:26:28 crc kubenswrapper[4898]: I0313 14:26:28.891137 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6f1f63f-28ac-4fb1-bd87-ea037a28d6cd\") pod \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " Mar 13 14:26:28 crc kubenswrapper[4898]: I0313 14:26:28.891202 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47mbl\" (UniqueName: \"kubernetes.io/projected/818e3f41-30c4-4a49-b490-0d868fc2b2b8-kube-api-access-47mbl\") pod \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " Mar 13 14:26:28 crc kubenswrapper[4898]: I0313 14:26:28.891228 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/818e3f41-30c4-4a49-b490-0d868fc2b2b8-rabbitmq-confd\") pod \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " Mar 13 14:26:28 crc kubenswrapper[4898]: I0313 14:26:28.891288 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/818e3f41-30c4-4a49-b490-0d868fc2b2b8-plugins-conf\") pod \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " Mar 13 14:26:28 crc kubenswrapper[4898]: I0313 14:26:28.891363 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/818e3f41-30c4-4a49-b490-0d868fc2b2b8-rabbitmq-erlang-cookie\") pod \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " Mar 13 14:26:28 crc kubenswrapper[4898]: I0313 14:26:28.891398 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/818e3f41-30c4-4a49-b490-0d868fc2b2b8-rabbitmq-tls\") pod \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " Mar 13 14:26:28 crc kubenswrapper[4898]: I0313 14:26:28.891425 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/818e3f41-30c4-4a49-b490-0d868fc2b2b8-erlang-cookie-secret\") pod \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " Mar 13 14:26:28 crc kubenswrapper[4898]: I0313 14:26:28.891530 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/818e3f41-30c4-4a49-b490-0d868fc2b2b8-pod-info\") pod \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " Mar 13 14:26:28 crc kubenswrapper[4898]: I0313 14:26:28.891585 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/818e3f41-30c4-4a49-b490-0d868fc2b2b8-config-data\") pod \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " Mar 13 14:26:28 crc kubenswrapper[4898]: I0313 14:26:28.891655 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/818e3f41-30c4-4a49-b490-0d868fc2b2b8-server-conf\") pod \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " Mar 13 14:26:28 crc kubenswrapper[4898]: I0313 14:26:28.891725 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/818e3f41-30c4-4a49-b490-0d868fc2b2b8-rabbitmq-plugins\") pod \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " Mar 13 
14:26:28 crc kubenswrapper[4898]: I0313 14:26:28.898702 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/818e3f41-30c4-4a49-b490-0d868fc2b2b8-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "818e3f41-30c4-4a49-b490-0d868fc2b2b8" (UID: "818e3f41-30c4-4a49-b490-0d868fc2b2b8"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:26:28 crc kubenswrapper[4898]: I0313 14:26:28.898772 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/818e3f41-30c4-4a49-b490-0d868fc2b2b8-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "818e3f41-30c4-4a49-b490-0d868fc2b2b8" (UID: "818e3f41-30c4-4a49-b490-0d868fc2b2b8"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:26:28 crc kubenswrapper[4898]: I0313 14:26:28.899150 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/818e3f41-30c4-4a49-b490-0d868fc2b2b8-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "818e3f41-30c4-4a49-b490-0d868fc2b2b8" (UID: "818e3f41-30c4-4a49-b490-0d868fc2b2b8"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:26:28 crc kubenswrapper[4898]: I0313 14:26:28.900513 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/818e3f41-30c4-4a49-b490-0d868fc2b2b8-kube-api-access-47mbl" (OuterVolumeSpecName: "kube-api-access-47mbl") pod "818e3f41-30c4-4a49-b490-0d868fc2b2b8" (UID: "818e3f41-30c4-4a49-b490-0d868fc2b2b8"). InnerVolumeSpecName "kube-api-access-47mbl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:26:28 crc kubenswrapper[4898]: I0313 14:26:28.909000 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/818e3f41-30c4-4a49-b490-0d868fc2b2b8-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "818e3f41-30c4-4a49-b490-0d868fc2b2b8" (UID: "818e3f41-30c4-4a49-b490-0d868fc2b2b8"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:26:28 crc kubenswrapper[4898]: I0313 14:26:28.921513 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/818e3f41-30c4-4a49-b490-0d868fc2b2b8-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "818e3f41-30c4-4a49-b490-0d868fc2b2b8" (UID: "818e3f41-30c4-4a49-b490-0d868fc2b2b8"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:26:28 crc kubenswrapper[4898]: I0313 14:26:28.923090 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/818e3f41-30c4-4a49-b490-0d868fc2b2b8-pod-info" (OuterVolumeSpecName: "pod-info") pod "818e3f41-30c4-4a49-b490-0d868fc2b2b8" (UID: "818e3f41-30c4-4a49-b490-0d868fc2b2b8"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 13 14:26:28 crc kubenswrapper[4898]: I0313 14:26:28.994738 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-47mbl\" (UniqueName: \"kubernetes.io/projected/818e3f41-30c4-4a49-b490-0d868fc2b2b8-kube-api-access-47mbl\") on node \"crc\" DevicePath \"\"" Mar 13 14:26:28 crc kubenswrapper[4898]: I0313 14:26:28.994765 4898 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/818e3f41-30c4-4a49-b490-0d868fc2b2b8-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 13 14:26:28 crc kubenswrapper[4898]: I0313 14:26:28.994775 4898 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/818e3f41-30c4-4a49-b490-0d868fc2b2b8-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 13 14:26:28 crc kubenswrapper[4898]: I0313 14:26:28.994784 4898 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/818e3f41-30c4-4a49-b490-0d868fc2b2b8-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 13 14:26:28 crc kubenswrapper[4898]: I0313 14:26:28.994791 4898 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/818e3f41-30c4-4a49-b490-0d868fc2b2b8-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 13 14:26:28 crc kubenswrapper[4898]: I0313 14:26:28.994799 4898 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/818e3f41-30c4-4a49-b490-0d868fc2b2b8-pod-info\") on node \"crc\" DevicePath \"\"" Mar 13 14:26:28 crc kubenswrapper[4898]: I0313 14:26:28.994809 4898 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/818e3f41-30c4-4a49-b490-0d868fc2b2b8-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 13 14:26:29 crc 
kubenswrapper[4898]: I0313 14:26:29.000295 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6f1f63f-28ac-4fb1-bd87-ea037a28d6cd" (OuterVolumeSpecName: "persistence") pod "818e3f41-30c4-4a49-b490-0d868fc2b2b8" (UID: "818e3f41-30c4-4a49-b490-0d868fc2b2b8"). InnerVolumeSpecName "pvc-a6f1f63f-28ac-4fb1-bd87-ea037a28d6cd". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.035372 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/818e3f41-30c4-4a49-b490-0d868fc2b2b8-config-data" (OuterVolumeSpecName: "config-data") pod "818e3f41-30c4-4a49-b490-0d868fc2b2b8" (UID: "818e3f41-30c4-4a49-b490-0d868fc2b2b8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.060370 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/818e3f41-30c4-4a49-b490-0d868fc2b2b8-server-conf" (OuterVolumeSpecName: "server-conf") pod "818e3f41-30c4-4a49-b490-0d868fc2b2b8" (UID: "818e3f41-30c4-4a49-b490-0d868fc2b2b8"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.097025 4898 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-a6f1f63f-28ac-4fb1-bd87-ea037a28d6cd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6f1f63f-28ac-4fb1-bd87-ea037a28d6cd\") on node \"crc\" " Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.097058 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/818e3f41-30c4-4a49-b490-0d868fc2b2b8-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.097068 4898 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/818e3f41-30c4-4a49-b490-0d868fc2b2b8-server-conf\") on node \"crc\" DevicePath \"\"" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.132052 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/818e3f41-30c4-4a49-b490-0d868fc2b2b8-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "818e3f41-30c4-4a49-b490-0d868fc2b2b8" (UID: "818e3f41-30c4-4a49-b490-0d868fc2b2b8"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.136212 4898 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.136378 4898 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-a6f1f63f-28ac-4fb1-bd87-ea037a28d6cd" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6f1f63f-28ac-4fb1-bd87-ea037a28d6cd") on node "crc" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.198922 4898 reconciler_common.go:293] "Volume detached for volume \"pvc-a6f1f63f-28ac-4fb1-bd87-ea037a28d6cd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6f1f63f-28ac-4fb1-bd87-ea037a28d6cd\") on node \"crc\" DevicePath \"\"" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.199045 4898 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/818e3f41-30c4-4a49-b490-0d868fc2b2b8-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.245304 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.557188 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs" event={"ID":"98336335-4b60-4ddf-8fe8-4ea6b69d47ef","Type":"ContainerStarted","Data":"f46cdf1ffe4a1df3b8f85aa80dd08fcc8e1c7e3fe707ecfa106895fc9d2db9c6"} Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.561167 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"a27645af-4d4a-4a73-ba8a-488a9ae199ac","Type":"ContainerStarted","Data":"7f9b37fdf2eb88e248e4ae72f997700dd163658a6cd5a8a4795733e36e8a3376"} Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.561237 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"a27645af-4d4a-4a73-ba8a-488a9ae199ac","Type":"ContainerStarted","Data":"069caa01f29af2cd1825eb69d7e0a639ef9e88fb94efb0e624df33b776125cc4"} Mar 13 14:26:29 crc 
kubenswrapper[4898]: I0313 14:26:29.568986 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"818e3f41-30c4-4a49-b490-0d868fc2b2b8","Type":"ContainerDied","Data":"6fe4bdbf2db945955ec1dd2e86e519172f05f6c43d7d6ac216668fd59e9bda42"} Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.569049 4898 scope.go:117] "RemoveContainer" containerID="122b0ca68068adfd3963153faa26d42ce9ae7ae836229a17a2096dab37be0af4" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.569059 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-1" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.587949 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs" podStartSLOduration=1.606927483 podStartE2EDuration="34.587922692s" podCreationTimestamp="2026-03-13 14:25:55 +0000 UTC" firstStartedPulling="2026-03-13 14:25:56.258951796 +0000 UTC m=+1791.260540045" lastFinishedPulling="2026-03-13 14:26:29.239947015 +0000 UTC m=+1824.241535254" observedRunningTime="2026-03-13 14:26:29.576836001 +0000 UTC m=+1824.578424240" watchObservedRunningTime="2026-03-13 14:26:29.587922692 +0000 UTC m=+1824.589510931" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.620633 4898 scope.go:117] "RemoveContainer" containerID="6ac94c751f27a4d12d02923377c883f4669b7b2f835e8c6d8eb98e37f2b620ef" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.636405 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-1"] Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.661922 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-1"] Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.681203 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-1"] Mar 13 14:26:29 crc kubenswrapper[4898]: E0313 14:26:29.682013 4898 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="818e3f41-30c4-4a49-b490-0d868fc2b2b8" containerName="rabbitmq" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.682124 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="818e3f41-30c4-4a49-b490-0d868fc2b2b8" containerName="rabbitmq" Mar 13 14:26:29 crc kubenswrapper[4898]: E0313 14:26:29.682199 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="818e3f41-30c4-4a49-b490-0d868fc2b2b8" containerName="setup-container" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.682258 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="818e3f41-30c4-4a49-b490-0d868fc2b2b8" containerName="setup-container" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.682670 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="818e3f41-30c4-4a49-b490-0d868fc2b2b8" containerName="rabbitmq" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.684161 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-1" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.705720 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-1"] Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.766309 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="818e3f41-30c4-4a49-b490-0d868fc2b2b8" path="/var/lib/kubelet/pods/818e3f41-30c4-4a49-b490-0d868fc2b2b8/volumes" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.812556 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ec19264c-1313-492d-b59b-4e5916b988f5-config-data\") pod \"rabbitmq-server-1\" (UID: \"ec19264c-1313-492d-b59b-4e5916b988f5\") " pod="openstack/rabbitmq-server-1" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.812596 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drcbz\" (UniqueName: \"kubernetes.io/projected/ec19264c-1313-492d-b59b-4e5916b988f5-kube-api-access-drcbz\") pod \"rabbitmq-server-1\" (UID: \"ec19264c-1313-492d-b59b-4e5916b988f5\") " pod="openstack/rabbitmq-server-1" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.812629 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ec19264c-1313-492d-b59b-4e5916b988f5-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"ec19264c-1313-492d-b59b-4e5916b988f5\") " pod="openstack/rabbitmq-server-1" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.812711 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ec19264c-1313-492d-b59b-4e5916b988f5-server-conf\") pod \"rabbitmq-server-1\" (UID: \"ec19264c-1313-492d-b59b-4e5916b988f5\") " 
pod="openstack/rabbitmq-server-1" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.812762 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ec19264c-1313-492d-b59b-4e5916b988f5-pod-info\") pod \"rabbitmq-server-1\" (UID: \"ec19264c-1313-492d-b59b-4e5916b988f5\") " pod="openstack/rabbitmq-server-1" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.812779 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ec19264c-1313-492d-b59b-4e5916b988f5-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"ec19264c-1313-492d-b59b-4e5916b988f5\") " pod="openstack/rabbitmq-server-1" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.812813 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ec19264c-1313-492d-b59b-4e5916b988f5-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"ec19264c-1313-492d-b59b-4e5916b988f5\") " pod="openstack/rabbitmq-server-1" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.812866 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ec19264c-1313-492d-b59b-4e5916b988f5-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"ec19264c-1313-492d-b59b-4e5916b988f5\") " pod="openstack/rabbitmq-server-1" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.812908 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ec19264c-1313-492d-b59b-4e5916b988f5-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"ec19264c-1313-492d-b59b-4e5916b988f5\") " pod="openstack/rabbitmq-server-1" Mar 13 14:26:29 crc 
kubenswrapper[4898]: I0313 14:26:29.812925 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ec19264c-1313-492d-b59b-4e5916b988f5-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"ec19264c-1313-492d-b59b-4e5916b988f5\") " pod="openstack/rabbitmq-server-1" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.812947 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a6f1f63f-28ac-4fb1-bd87-ea037a28d6cd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6f1f63f-28ac-4fb1-bd87-ea037a28d6cd\") pod \"rabbitmq-server-1\" (UID: \"ec19264c-1313-492d-b59b-4e5916b988f5\") " pod="openstack/rabbitmq-server-1" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.916978 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ec19264c-1313-492d-b59b-4e5916b988f5-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"ec19264c-1313-492d-b59b-4e5916b988f5\") " pod="openstack/rabbitmq-server-1" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.917107 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ec19264c-1313-492d-b59b-4e5916b988f5-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"ec19264c-1313-492d-b59b-4e5916b988f5\") " pod="openstack/rabbitmq-server-1" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.917165 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ec19264c-1313-492d-b59b-4e5916b988f5-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"ec19264c-1313-492d-b59b-4e5916b988f5\") " pod="openstack/rabbitmq-server-1" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.917193 4898 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ec19264c-1313-492d-b59b-4e5916b988f5-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"ec19264c-1313-492d-b59b-4e5916b988f5\") " pod="openstack/rabbitmq-server-1" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.917228 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a6f1f63f-28ac-4fb1-bd87-ea037a28d6cd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6f1f63f-28ac-4fb1-bd87-ea037a28d6cd\") pod \"rabbitmq-server-1\" (UID: \"ec19264c-1313-492d-b59b-4e5916b988f5\") " pod="openstack/rabbitmq-server-1" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.917352 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ec19264c-1313-492d-b59b-4e5916b988f5-config-data\") pod \"rabbitmq-server-1\" (UID: \"ec19264c-1313-492d-b59b-4e5916b988f5\") " pod="openstack/rabbitmq-server-1" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.917377 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drcbz\" (UniqueName: \"kubernetes.io/projected/ec19264c-1313-492d-b59b-4e5916b988f5-kube-api-access-drcbz\") pod \"rabbitmq-server-1\" (UID: \"ec19264c-1313-492d-b59b-4e5916b988f5\") " pod="openstack/rabbitmq-server-1" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.917416 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ec19264c-1313-492d-b59b-4e5916b988f5-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"ec19264c-1313-492d-b59b-4e5916b988f5\") " pod="openstack/rabbitmq-server-1" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.917504 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/ec19264c-1313-492d-b59b-4e5916b988f5-server-conf\") pod \"rabbitmq-server-1\" (UID: \"ec19264c-1313-492d-b59b-4e5916b988f5\") " pod="openstack/rabbitmq-server-1" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.917589 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ec19264c-1313-492d-b59b-4e5916b988f5-pod-info\") pod \"rabbitmq-server-1\" (UID: \"ec19264c-1313-492d-b59b-4e5916b988f5\") " pod="openstack/rabbitmq-server-1" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.917614 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ec19264c-1313-492d-b59b-4e5916b988f5-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"ec19264c-1313-492d-b59b-4e5916b988f5\") " pod="openstack/rabbitmq-server-1" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.918106 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ec19264c-1313-492d-b59b-4e5916b988f5-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"ec19264c-1313-492d-b59b-4e5916b988f5\") " pod="openstack/rabbitmq-server-1" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.918852 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ec19264c-1313-492d-b59b-4e5916b988f5-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"ec19264c-1313-492d-b59b-4e5916b988f5\") " pod="openstack/rabbitmq-server-1" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.919966 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ec19264c-1313-492d-b59b-4e5916b988f5-server-conf\") pod \"rabbitmq-server-1\" (UID: \"ec19264c-1313-492d-b59b-4e5916b988f5\") " pod="openstack/rabbitmq-server-1" Mar 13 
14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.921777 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ec19264c-1313-492d-b59b-4e5916b988f5-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"ec19264c-1313-492d-b59b-4e5916b988f5\") " pod="openstack/rabbitmq-server-1" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.921851 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ec19264c-1313-492d-b59b-4e5916b988f5-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"ec19264c-1313-492d-b59b-4e5916b988f5\") " pod="openstack/rabbitmq-server-1" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.924473 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ec19264c-1313-492d-b59b-4e5916b988f5-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"ec19264c-1313-492d-b59b-4e5916b988f5\") " pod="openstack/rabbitmq-server-1" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.928764 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ec19264c-1313-492d-b59b-4e5916b988f5-pod-info\") pod \"rabbitmq-server-1\" (UID: \"ec19264c-1313-492d-b59b-4e5916b988f5\") " pod="openstack/rabbitmq-server-1" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.930091 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ec19264c-1313-492d-b59b-4e5916b988f5-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"ec19264c-1313-492d-b59b-4e5916b988f5\") " pod="openstack/rabbitmq-server-1" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.930619 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/ec19264c-1313-492d-b59b-4e5916b988f5-config-data\") pod \"rabbitmq-server-1\" (UID: \"ec19264c-1313-492d-b59b-4e5916b988f5\") " pod="openstack/rabbitmq-server-1" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.937172 4898 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.937201 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a6f1f63f-28ac-4fb1-bd87-ea037a28d6cd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6f1f63f-28ac-4fb1-bd87-ea037a28d6cd\") pod \"rabbitmq-server-1\" (UID: \"ec19264c-1313-492d-b59b-4e5916b988f5\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/33865dbdc5fe61694c30892e6300309b59f04bdd0b35aa3fd0f17da3ba922194/globalmount\"" pod="openstack/rabbitmq-server-1" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.938002 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drcbz\" (UniqueName: \"kubernetes.io/projected/ec19264c-1313-492d-b59b-4e5916b988f5-kube-api-access-drcbz\") pod \"rabbitmq-server-1\" (UID: \"ec19264c-1313-492d-b59b-4e5916b988f5\") " pod="openstack/rabbitmq-server-1" Mar 13 14:26:30 crc kubenswrapper[4898]: I0313 14:26:30.017222 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a6f1f63f-28ac-4fb1-bd87-ea037a28d6cd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6f1f63f-28ac-4fb1-bd87-ea037a28d6cd\") pod \"rabbitmq-server-1\" (UID: \"ec19264c-1313-492d-b59b-4e5916b988f5\") " pod="openstack/rabbitmq-server-1" Mar 13 14:26:30 crc kubenswrapper[4898]: I0313 14:26:30.033234 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-1" Mar 13 14:26:30 crc kubenswrapper[4898]: I0313 14:26:30.817572 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-1"] Mar 13 14:26:31 crc kubenswrapper[4898]: I0313 14:26:31.604512 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"a27645af-4d4a-4a73-ba8a-488a9ae199ac","Type":"ContainerStarted","Data":"1d5290d82c438f5f18f9057d3b331c8974b7eacb9a57fbdafce15f3e7ff99476"} Mar 13 14:26:31 crc kubenswrapper[4898]: I0313 14:26:31.607504 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"ec19264c-1313-492d-b59b-4e5916b988f5","Type":"ContainerStarted","Data":"d99fbf55656e2524d19167512de0b192f6409ba22350297a7849a541960322f8"} Mar 13 14:26:32 crc kubenswrapper[4898]: I0313 14:26:32.622716 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"a27645af-4d4a-4a73-ba8a-488a9ae199ac","Type":"ContainerStarted","Data":"d0820d7a1af073092965965e086611242e5d1849b11f6c9bdac24f9d8c8f5a45"} Mar 13 14:26:33 crc kubenswrapper[4898]: I0313 14:26:33.639876 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"ec19264c-1313-492d-b59b-4e5916b988f5","Type":"ContainerStarted","Data":"20f2f7b753b1aa62a0a0192986eb7c604c4b52e002e15ffc2518d76b86a4ad34"} Mar 13 14:26:33 crc kubenswrapper[4898]: I0313 14:26:33.649430 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"a27645af-4d4a-4a73-ba8a-488a9ae199ac","Type":"ContainerStarted","Data":"bf5cc7bd1cfdfd09db9bfd9a3a36f84d9ea243b38d5957b5e6359606cf63cae2"} Mar 13 14:26:33 crc kubenswrapper[4898]: I0313 14:26:33.690983 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.169570725 podStartE2EDuration="6.690878715s" podCreationTimestamp="2026-03-13 14:26:27 +0000 UTC" 
firstStartedPulling="2026-03-13 14:26:28.583080888 +0000 UTC m=+1823.584669127" lastFinishedPulling="2026-03-13 14:26:33.104388878 +0000 UTC m=+1828.105977117" observedRunningTime="2026-03-13 14:26:33.690096414 +0000 UTC m=+1828.691684673" watchObservedRunningTime="2026-03-13 14:26:33.690878715 +0000 UTC m=+1828.692466954" Mar 13 14:26:35 crc kubenswrapper[4898]: I0313 14:26:35.139809 4898 scope.go:117] "RemoveContainer" containerID="c91cc1f40aaa9775749654fff4fe79567271db256673b0ded8ef7bcbbed0be52" Mar 13 14:26:35 crc kubenswrapper[4898]: I0313 14:26:35.290161 4898 scope.go:117] "RemoveContainer" containerID="2dec706ec3d47e7f4d03ac7b64859e218da33cd45852b8891c58c2ae0bd97657" Mar 13 14:26:35 crc kubenswrapper[4898]: I0313 14:26:35.405301 4898 scope.go:117] "RemoveContainer" containerID="ae2833d21b214266cf912f562e6ce0013ae2d27327887e2c914fb19e35f69b2e" Mar 13 14:26:35 crc kubenswrapper[4898]: I0313 14:26:35.486008 4898 scope.go:117] "RemoveContainer" containerID="33dd2d6e0ac7d2f137fe32246deb8758bfab8a7e6e24808a6205586e1001969c" Mar 13 14:26:35 crc kubenswrapper[4898]: I0313 14:26:35.527437 4898 scope.go:117] "RemoveContainer" containerID="36c5ec42fae468ab48602c979ddc403d899fdca66a51026d30aea73028cc2339" Mar 13 14:26:35 crc kubenswrapper[4898]: I0313 14:26:35.606179 4898 scope.go:117] "RemoveContainer" containerID="41174a57eeced7f987ec1ea67f79e818399b7ca23c9e404468462af2e7f7d393" Mar 13 14:26:38 crc kubenswrapper[4898]: I0313 14:26:38.740211 4898 scope.go:117] "RemoveContainer" containerID="31ae8593afc62c01205d22940ebe7136407da3a6c051010917ba949bb52866cc" Mar 13 14:26:38 crc kubenswrapper[4898]: E0313 14:26:38.741506 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:26:42 crc kubenswrapper[4898]: I0313 14:26:42.776801 4898 generic.go:334] "Generic (PLEG): container finished" podID="98336335-4b60-4ddf-8fe8-4ea6b69d47ef" containerID="f46cdf1ffe4a1df3b8f85aa80dd08fcc8e1c7e3fe707ecfa106895fc9d2db9c6" exitCode=0 Mar 13 14:26:42 crc kubenswrapper[4898]: I0313 14:26:42.776886 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs" event={"ID":"98336335-4b60-4ddf-8fe8-4ea6b69d47ef","Type":"ContainerDied","Data":"f46cdf1ffe4a1df3b8f85aa80dd08fcc8e1c7e3fe707ecfa106895fc9d2db9c6"} Mar 13 14:26:44 crc kubenswrapper[4898]: I0313 14:26:44.406023 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs" Mar 13 14:26:44 crc kubenswrapper[4898]: I0313 14:26:44.500868 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98336335-4b60-4ddf-8fe8-4ea6b69d47ef-repo-setup-combined-ca-bundle\") pod \"98336335-4b60-4ddf-8fe8-4ea6b69d47ef\" (UID: \"98336335-4b60-4ddf-8fe8-4ea6b69d47ef\") " Mar 13 14:26:44 crc kubenswrapper[4898]: I0313 14:26:44.500964 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98336335-4b60-4ddf-8fe8-4ea6b69d47ef-inventory\") pod \"98336335-4b60-4ddf-8fe8-4ea6b69d47ef\" (UID: \"98336335-4b60-4ddf-8fe8-4ea6b69d47ef\") " Mar 13 14:26:44 crc kubenswrapper[4898]: I0313 14:26:44.500987 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/98336335-4b60-4ddf-8fe8-4ea6b69d47ef-ssh-key-openstack-edpm-ipam\") pod \"98336335-4b60-4ddf-8fe8-4ea6b69d47ef\" (UID: 
\"98336335-4b60-4ddf-8fe8-4ea6b69d47ef\") " Mar 13 14:26:44 crc kubenswrapper[4898]: I0313 14:26:44.501013 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ff55\" (UniqueName: \"kubernetes.io/projected/98336335-4b60-4ddf-8fe8-4ea6b69d47ef-kube-api-access-7ff55\") pod \"98336335-4b60-4ddf-8fe8-4ea6b69d47ef\" (UID: \"98336335-4b60-4ddf-8fe8-4ea6b69d47ef\") " Mar 13 14:26:44 crc kubenswrapper[4898]: I0313 14:26:44.508032 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98336335-4b60-4ddf-8fe8-4ea6b69d47ef-kube-api-access-7ff55" (OuterVolumeSpecName: "kube-api-access-7ff55") pod "98336335-4b60-4ddf-8fe8-4ea6b69d47ef" (UID: "98336335-4b60-4ddf-8fe8-4ea6b69d47ef"). InnerVolumeSpecName "kube-api-access-7ff55". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:26:44 crc kubenswrapper[4898]: I0313 14:26:44.508266 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98336335-4b60-4ddf-8fe8-4ea6b69d47ef-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "98336335-4b60-4ddf-8fe8-4ea6b69d47ef" (UID: "98336335-4b60-4ddf-8fe8-4ea6b69d47ef"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:26:44 crc kubenswrapper[4898]: I0313 14:26:44.544258 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98336335-4b60-4ddf-8fe8-4ea6b69d47ef-inventory" (OuterVolumeSpecName: "inventory") pod "98336335-4b60-4ddf-8fe8-4ea6b69d47ef" (UID: "98336335-4b60-4ddf-8fe8-4ea6b69d47ef"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:26:44 crc kubenswrapper[4898]: I0313 14:26:44.552375 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98336335-4b60-4ddf-8fe8-4ea6b69d47ef-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "98336335-4b60-4ddf-8fe8-4ea6b69d47ef" (UID: "98336335-4b60-4ddf-8fe8-4ea6b69d47ef"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:26:44 crc kubenswrapper[4898]: I0313 14:26:44.604994 4898 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98336335-4b60-4ddf-8fe8-4ea6b69d47ef-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 14:26:44 crc kubenswrapper[4898]: I0313 14:26:44.605045 4898 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98336335-4b60-4ddf-8fe8-4ea6b69d47ef-inventory\") on node \"crc\" DevicePath \"\""
Mar 13 14:26:44 crc kubenswrapper[4898]: I0313 14:26:44.605061 4898 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/98336335-4b60-4ddf-8fe8-4ea6b69d47ef-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 13 14:26:44 crc kubenswrapper[4898]: I0313 14:26:44.605077 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ff55\" (UniqueName: \"kubernetes.io/projected/98336335-4b60-4ddf-8fe8-4ea6b69d47ef-kube-api-access-7ff55\") on node \"crc\" DevicePath \"\""
Mar 13 14:26:44 crc kubenswrapper[4898]: I0313 14:26:44.805553 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs" event={"ID":"98336335-4b60-4ddf-8fe8-4ea6b69d47ef","Type":"ContainerDied","Data":"6844f40cb1716e51fac4b5bf0efd698139e685b8b7f6778ffb18ee6dd869ba69"}
Mar 13 14:26:44 crc kubenswrapper[4898]: I0313 14:26:44.805618 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6844f40cb1716e51fac4b5bf0efd698139e685b8b7f6778ffb18ee6dd869ba69"
Mar 13 14:26:44 crc kubenswrapper[4898]: I0313 14:26:44.806130 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs"
Mar 13 14:26:44 crc kubenswrapper[4898]: I0313 14:26:44.897625 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-g6vnq"]
Mar 13 14:26:44 crc kubenswrapper[4898]: E0313 14:26:44.898441 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98336335-4b60-4ddf-8fe8-4ea6b69d47ef" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Mar 13 14:26:44 crc kubenswrapper[4898]: I0313 14:26:44.898475 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="98336335-4b60-4ddf-8fe8-4ea6b69d47ef" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Mar 13 14:26:44 crc kubenswrapper[4898]: I0313 14:26:44.899011 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="98336335-4b60-4ddf-8fe8-4ea6b69d47ef" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Mar 13 14:26:44 crc kubenswrapper[4898]: I0313 14:26:44.900311 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-g6vnq"
Mar 13 14:26:44 crc kubenswrapper[4898]: I0313 14:26:44.902675 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 13 14:26:44 crc kubenswrapper[4898]: I0313 14:26:44.903557 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 13 14:26:44 crc kubenswrapper[4898]: I0313 14:26:44.903785 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zsddr"
Mar 13 14:26:44 crc kubenswrapper[4898]: I0313 14:26:44.903861 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 13 14:26:44 crc kubenswrapper[4898]: I0313 14:26:44.914999 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-g6vnq"]
Mar 13 14:26:45 crc kubenswrapper[4898]: I0313 14:26:45.018280 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrmdq\" (UniqueName: \"kubernetes.io/projected/6329b434-b1be-4490-9a50-351366b18d79-kube-api-access-nrmdq\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-g6vnq\" (UID: \"6329b434-b1be-4490-9a50-351366b18d79\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-g6vnq"
Mar 13 14:26:45 crc kubenswrapper[4898]: I0313 14:26:45.018371 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6329b434-b1be-4490-9a50-351366b18d79-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-g6vnq\" (UID: \"6329b434-b1be-4490-9a50-351366b18d79\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-g6vnq"
Mar 13 14:26:45 crc kubenswrapper[4898]: I0313 14:26:45.018977 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6329b434-b1be-4490-9a50-351366b18d79-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-g6vnq\" (UID: \"6329b434-b1be-4490-9a50-351366b18d79\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-g6vnq"
Mar 13 14:26:45 crc kubenswrapper[4898]: I0313 14:26:45.122376 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6329b434-b1be-4490-9a50-351366b18d79-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-g6vnq\" (UID: \"6329b434-b1be-4490-9a50-351366b18d79\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-g6vnq"
Mar 13 14:26:45 crc kubenswrapper[4898]: I0313 14:26:45.122650 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrmdq\" (UniqueName: \"kubernetes.io/projected/6329b434-b1be-4490-9a50-351366b18d79-kube-api-access-nrmdq\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-g6vnq\" (UID: \"6329b434-b1be-4490-9a50-351366b18d79\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-g6vnq"
Mar 13 14:26:45 crc kubenswrapper[4898]: I0313 14:26:45.122775 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6329b434-b1be-4490-9a50-351366b18d79-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-g6vnq\" (UID: \"6329b434-b1be-4490-9a50-351366b18d79\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-g6vnq"
Mar 13 14:26:45 crc kubenswrapper[4898]: I0313 14:26:45.127104 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6329b434-b1be-4490-9a50-351366b18d79-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-g6vnq\" (UID: \"6329b434-b1be-4490-9a50-351366b18d79\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-g6vnq"
Mar 13 14:26:45 crc kubenswrapper[4898]: I0313 14:26:45.127550 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6329b434-b1be-4490-9a50-351366b18d79-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-g6vnq\" (UID: \"6329b434-b1be-4490-9a50-351366b18d79\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-g6vnq"
Mar 13 14:26:45 crc kubenswrapper[4898]: I0313 14:26:45.153500 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrmdq\" (UniqueName: \"kubernetes.io/projected/6329b434-b1be-4490-9a50-351366b18d79-kube-api-access-nrmdq\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-g6vnq\" (UID: \"6329b434-b1be-4490-9a50-351366b18d79\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-g6vnq"
Mar 13 14:26:45 crc kubenswrapper[4898]: I0313 14:26:45.253082 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-g6vnq"
Mar 13 14:26:45 crc kubenswrapper[4898]: I0313 14:26:45.870284 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-g6vnq"]
Mar 13 14:26:46 crc kubenswrapper[4898]: I0313 14:26:46.836363 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-g6vnq" event={"ID":"6329b434-b1be-4490-9a50-351366b18d79","Type":"ContainerStarted","Data":"aa01dc9810d534e87b07f8107f67665e2f7fd6f556aa8ce55e8be5e8b95511ec"}
Mar 13 14:26:46 crc kubenswrapper[4898]: I0313 14:26:46.836819 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-g6vnq" event={"ID":"6329b434-b1be-4490-9a50-351366b18d79","Type":"ContainerStarted","Data":"493fe35f64e9adc97e419785daa4c7d010345822b39f4ae16c4767f16e2efad0"}
Mar 13 14:26:46 crc kubenswrapper[4898]: I0313 14:26:46.857338 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-g6vnq" podStartSLOduration=2.3750855189999998 podStartE2EDuration="2.857317846s" podCreationTimestamp="2026-03-13 14:26:44 +0000 UTC" firstStartedPulling="2026-03-13 14:26:45.882757237 +0000 UTC m=+1840.884345476" lastFinishedPulling="2026-03-13 14:26:46.364989554 +0000 UTC m=+1841.366577803" observedRunningTime="2026-03-13 14:26:46.852648143 +0000 UTC m=+1841.854236402" watchObservedRunningTime="2026-03-13 14:26:46.857317846 +0000 UTC m=+1841.858906085"
Mar 13 14:26:49 crc kubenswrapper[4898]: I0313 14:26:49.887427 4898 generic.go:334] "Generic (PLEG): container finished" podID="6329b434-b1be-4490-9a50-351366b18d79" containerID="aa01dc9810d534e87b07f8107f67665e2f7fd6f556aa8ce55e8be5e8b95511ec" exitCode=0
Mar 13 14:26:49 crc kubenswrapper[4898]: I0313 14:26:49.888073 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-g6vnq" event={"ID":"6329b434-b1be-4490-9a50-351366b18d79","Type":"ContainerDied","Data":"aa01dc9810d534e87b07f8107f67665e2f7fd6f556aa8ce55e8be5e8b95511ec"}
Mar 13 14:26:51 crc kubenswrapper[4898]: I0313 14:26:51.477959 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-g6vnq"
Mar 13 14:26:51 crc kubenswrapper[4898]: I0313 14:26:51.623756 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6329b434-b1be-4490-9a50-351366b18d79-ssh-key-openstack-edpm-ipam\") pod \"6329b434-b1be-4490-9a50-351366b18d79\" (UID: \"6329b434-b1be-4490-9a50-351366b18d79\") "
Mar 13 14:26:51 crc kubenswrapper[4898]: I0313 14:26:51.623928 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6329b434-b1be-4490-9a50-351366b18d79-inventory\") pod \"6329b434-b1be-4490-9a50-351366b18d79\" (UID: \"6329b434-b1be-4490-9a50-351366b18d79\") "
Mar 13 14:26:51 crc kubenswrapper[4898]: I0313 14:26:51.624000 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrmdq\" (UniqueName: \"kubernetes.io/projected/6329b434-b1be-4490-9a50-351366b18d79-kube-api-access-nrmdq\") pod \"6329b434-b1be-4490-9a50-351366b18d79\" (UID: \"6329b434-b1be-4490-9a50-351366b18d79\") "
Mar 13 14:26:51 crc kubenswrapper[4898]: I0313 14:26:51.630438 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6329b434-b1be-4490-9a50-351366b18d79-kube-api-access-nrmdq" (OuterVolumeSpecName: "kube-api-access-nrmdq") pod "6329b434-b1be-4490-9a50-351366b18d79" (UID: "6329b434-b1be-4490-9a50-351366b18d79"). InnerVolumeSpecName "kube-api-access-nrmdq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:26:51 crc kubenswrapper[4898]: I0313 14:26:51.655853 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6329b434-b1be-4490-9a50-351366b18d79-inventory" (OuterVolumeSpecName: "inventory") pod "6329b434-b1be-4490-9a50-351366b18d79" (UID: "6329b434-b1be-4490-9a50-351366b18d79"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:26:51 crc kubenswrapper[4898]: I0313 14:26:51.656244 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6329b434-b1be-4490-9a50-351366b18d79-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "6329b434-b1be-4490-9a50-351366b18d79" (UID: "6329b434-b1be-4490-9a50-351366b18d79"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:26:51 crc kubenswrapper[4898]: I0313 14:26:51.728077 4898 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6329b434-b1be-4490-9a50-351366b18d79-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 13 14:26:51 crc kubenswrapper[4898]: I0313 14:26:51.728140 4898 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6329b434-b1be-4490-9a50-351366b18d79-inventory\") on node \"crc\" DevicePath \"\""
Mar 13 14:26:51 crc kubenswrapper[4898]: I0313 14:26:51.728159 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrmdq\" (UniqueName: \"kubernetes.io/projected/6329b434-b1be-4490-9a50-351366b18d79-kube-api-access-nrmdq\") on node \"crc\" DevicePath \"\""
Mar 13 14:26:51 crc kubenswrapper[4898]: I0313 14:26:51.740209 4898 scope.go:117] "RemoveContainer" containerID="31ae8593afc62c01205d22940ebe7136407da3a6c051010917ba949bb52866cc"
Mar 13 14:26:51 crc kubenswrapper[4898]: E0313 14:26:51.740883 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d"
Mar 13 14:26:51 crc kubenswrapper[4898]: I0313 14:26:51.912455 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-g6vnq" event={"ID":"6329b434-b1be-4490-9a50-351366b18d79","Type":"ContainerDied","Data":"493fe35f64e9adc97e419785daa4c7d010345822b39f4ae16c4767f16e2efad0"}
Mar 13 14:26:51 crc kubenswrapper[4898]: I0313 14:26:51.912979 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="493fe35f64e9adc97e419785daa4c7d010345822b39f4ae16c4767f16e2efad0"
Mar 13 14:26:51 crc kubenswrapper[4898]: I0313 14:26:51.912541 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-g6vnq"
Mar 13 14:26:51 crc kubenswrapper[4898]: I0313 14:26:51.996889 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xphxf"]
Mar 13 14:26:51 crc kubenswrapper[4898]: E0313 14:26:51.997775 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6329b434-b1be-4490-9a50-351366b18d79" containerName="redhat-edpm-deployment-openstack-edpm-ipam"
Mar 13 14:26:51 crc kubenswrapper[4898]: I0313 14:26:51.997927 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="6329b434-b1be-4490-9a50-351366b18d79" containerName="redhat-edpm-deployment-openstack-edpm-ipam"
Mar 13 14:26:51 crc kubenswrapper[4898]: I0313 14:26:51.998367 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="6329b434-b1be-4490-9a50-351366b18d79" containerName="redhat-edpm-deployment-openstack-edpm-ipam"
Mar 13 14:26:51 crc kubenswrapper[4898]: I0313 14:26:51.999503 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xphxf"
Mar 13 14:26:52 crc kubenswrapper[4898]: I0313 14:26:52.002806 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 13 14:26:52 crc kubenswrapper[4898]: I0313 14:26:52.003121 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 13 14:26:52 crc kubenswrapper[4898]: I0313 14:26:52.004201 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zsddr"
Mar 13 14:26:52 crc kubenswrapper[4898]: I0313 14:26:52.004463 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 13 14:26:52 crc kubenswrapper[4898]: I0313 14:26:52.014773 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xphxf"]
Mar 13 14:26:52 crc kubenswrapper[4898]: I0313 14:26:52.145516 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cs68d\" (UniqueName: \"kubernetes.io/projected/6d8bbc5a-39da-48b8-82d1-6df496fda612-kube-api-access-cs68d\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xphxf\" (UID: \"6d8bbc5a-39da-48b8-82d1-6df496fda612\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xphxf"
Mar 13 14:26:52 crc kubenswrapper[4898]: I0313 14:26:52.145605 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6d8bbc5a-39da-48b8-82d1-6df496fda612-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xphxf\" (UID: \"6d8bbc5a-39da-48b8-82d1-6df496fda612\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xphxf"
Mar 13 14:26:52 crc kubenswrapper[4898]: I0313 14:26:52.145751 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6d8bbc5a-39da-48b8-82d1-6df496fda612-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xphxf\" (UID: \"6d8bbc5a-39da-48b8-82d1-6df496fda612\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xphxf"
Mar 13 14:26:52 crc kubenswrapper[4898]: I0313 14:26:52.145775 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d8bbc5a-39da-48b8-82d1-6df496fda612-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xphxf\" (UID: \"6d8bbc5a-39da-48b8-82d1-6df496fda612\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xphxf"
Mar 13 14:26:52 crc kubenswrapper[4898]: I0313 14:26:52.249198 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6d8bbc5a-39da-48b8-82d1-6df496fda612-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xphxf\" (UID: \"6d8bbc5a-39da-48b8-82d1-6df496fda612\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xphxf"
Mar 13 14:26:52 crc kubenswrapper[4898]: I0313 14:26:52.249363 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6d8bbc5a-39da-48b8-82d1-6df496fda612-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xphxf\" (UID: \"6d8bbc5a-39da-48b8-82d1-6df496fda612\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xphxf"
Mar 13 14:26:52 crc kubenswrapper[4898]: I0313 14:26:52.249391 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d8bbc5a-39da-48b8-82d1-6df496fda612-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xphxf\" (UID: \"6d8bbc5a-39da-48b8-82d1-6df496fda612\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xphxf"
Mar 13 14:26:52 crc kubenswrapper[4898]: I0313 14:26:52.249486 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cs68d\" (UniqueName: \"kubernetes.io/projected/6d8bbc5a-39da-48b8-82d1-6df496fda612-kube-api-access-cs68d\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xphxf\" (UID: \"6d8bbc5a-39da-48b8-82d1-6df496fda612\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xphxf"
Mar 13 14:26:52 crc kubenswrapper[4898]: I0313 14:26:52.256073 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6d8bbc5a-39da-48b8-82d1-6df496fda612-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xphxf\" (UID: \"6d8bbc5a-39da-48b8-82d1-6df496fda612\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xphxf"
Mar 13 14:26:52 crc kubenswrapper[4898]: I0313 14:26:52.258732 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6d8bbc5a-39da-48b8-82d1-6df496fda612-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xphxf\" (UID: \"6d8bbc5a-39da-48b8-82d1-6df496fda612\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xphxf"
Mar 13 14:26:52 crc kubenswrapper[4898]: I0313 14:26:52.264023 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d8bbc5a-39da-48b8-82d1-6df496fda612-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xphxf\" (UID: \"6d8bbc5a-39da-48b8-82d1-6df496fda612\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xphxf"
Mar 13 14:26:52 crc kubenswrapper[4898]: I0313 14:26:52.269256 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cs68d\" (UniqueName: \"kubernetes.io/projected/6d8bbc5a-39da-48b8-82d1-6df496fda612-kube-api-access-cs68d\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xphxf\" (UID: \"6d8bbc5a-39da-48b8-82d1-6df496fda612\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xphxf"
Mar 13 14:26:52 crc kubenswrapper[4898]: I0313 14:26:52.330360 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xphxf"
Mar 13 14:26:52 crc kubenswrapper[4898]: I0313 14:26:52.912458 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xphxf"]
Mar 13 14:26:52 crc kubenswrapper[4898]: W0313 14:26:52.923177 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d8bbc5a_39da_48b8_82d1_6df496fda612.slice/crio-8b67963947b518457f04c307420f088501a95221935f61209e26215c56765170 WatchSource:0}: Error finding container 8b67963947b518457f04c307420f088501a95221935f61209e26215c56765170: Status 404 returned error can't find the container with id 8b67963947b518457f04c307420f088501a95221935f61209e26215c56765170
Mar 13 14:26:53 crc kubenswrapper[4898]: I0313 14:26:53.937384 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xphxf" event={"ID":"6d8bbc5a-39da-48b8-82d1-6df496fda612","Type":"ContainerStarted","Data":"c31d294d1bee94e3955b57cdf27c7683e29803f38822e314dcaef8a3865b3cc6"}
Mar 13 14:26:53 crc kubenswrapper[4898]: I0313 14:26:53.937918 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xphxf" event={"ID":"6d8bbc5a-39da-48b8-82d1-6df496fda612","Type":"ContainerStarted","Data":"8b67963947b518457f04c307420f088501a95221935f61209e26215c56765170"}
Mar 13 14:26:53 crc kubenswrapper[4898]: I0313 14:26:53.961402 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xphxf" podStartSLOduration=2.508694848 podStartE2EDuration="2.961378787s" podCreationTimestamp="2026-03-13 14:26:51 +0000 UTC" firstStartedPulling="2026-03-13 14:26:52.929967245 +0000 UTC m=+1847.931555484" lastFinishedPulling="2026-03-13 14:26:53.382651174 +0000 UTC m=+1848.384239423" observedRunningTime="2026-03-13 14:26:53.951864887 +0000 UTC m=+1848.953453126" watchObservedRunningTime="2026-03-13 14:26:53.961378787 +0000 UTC m=+1848.962967026"
Mar 13 14:27:05 crc kubenswrapper[4898]: I0313 14:27:05.098336 4898 generic.go:334] "Generic (PLEG): container finished" podID="ec19264c-1313-492d-b59b-4e5916b988f5" containerID="20f2f7b753b1aa62a0a0192986eb7c604c4b52e002e15ffc2518d76b86a4ad34" exitCode=0
Mar 13 14:27:05 crc kubenswrapper[4898]: I0313 14:27:05.098465 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"ec19264c-1313-492d-b59b-4e5916b988f5","Type":"ContainerDied","Data":"20f2f7b753b1aa62a0a0192986eb7c604c4b52e002e15ffc2518d76b86a4ad34"}
Mar 13 14:27:06 crc kubenswrapper[4898]: I0313 14:27:06.114644 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"ec19264c-1313-492d-b59b-4e5916b988f5","Type":"ContainerStarted","Data":"65f80466add157bce9629bf6cef532d4e04d34dc2f2d0e6d38d22a1f8fc585ec"}
Mar 13 14:27:06 crc kubenswrapper[4898]: I0313 14:27:06.115178 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-1"
Mar 13 14:27:06 crc kubenswrapper[4898]: I0313 14:27:06.151056 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-1" podStartSLOduration=37.151034723 podStartE2EDuration="37.151034723s" podCreationTimestamp="2026-03-13 14:26:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:27:06.142885468 +0000 UTC m=+1861.144473737" watchObservedRunningTime="2026-03-13 14:27:06.151034723 +0000 UTC m=+1861.152622962"
Mar 13 14:27:06 crc kubenswrapper[4898]: I0313 14:27:06.740107 4898 scope.go:117] "RemoveContainer" containerID="31ae8593afc62c01205d22940ebe7136407da3a6c051010917ba949bb52866cc"
Mar 13 14:27:06 crc kubenswrapper[4898]: E0313 14:27:06.740765 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d"
Mar 13 14:27:20 crc kubenswrapper[4898]: I0313 14:27:20.037190 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-1"
Mar 13 14:27:20 crc kubenswrapper[4898]: I0313 14:27:20.147042 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 13 14:27:20 crc kubenswrapper[4898]: I0313 14:27:20.739305 4898 scope.go:117] "RemoveContainer" containerID="31ae8593afc62c01205d22940ebe7136407da3a6c051010917ba949bb52866cc"
Mar 13 14:27:21 crc kubenswrapper[4898]: I0313 14:27:21.335384 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerStarted","Data":"544b53b5cdd0293005863b343628de53b83869ce5cd2c798b19c01abba2b5bc8"}
Mar 13 14:27:24 crc kubenswrapper[4898]: I0313 14:27:24.695294 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b" containerName="rabbitmq" containerID="cri-o://1483af6a8a31cc8d457a629a4f59e975d8c4af4c9fe53636f469b1c72aabc655" gracePeriod=604796
Mar 13 14:27:27 crc kubenswrapper[4898]: I0313 14:27:27.130264 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.134:5671: connect: connection refused"
Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.463192 4898 generic.go:334] "Generic (PLEG): container finished" podID="0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b" containerID="1483af6a8a31cc8d457a629a4f59e975d8c4af4c9fe53636f469b1c72aabc655" exitCode=0
Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.463779 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b","Type":"ContainerDied","Data":"1483af6a8a31cc8d457a629a4f59e975d8c4af4c9fe53636f469b1c72aabc655"}
Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.463856 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b","Type":"ContainerDied","Data":"6d987febc3dce3076fcb022da351b81dbf39766acfe2182896067f106a438fc2"}
Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.463867 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d987febc3dce3076fcb022da351b81dbf39766acfe2182896067f106a438fc2"
Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.466531 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.511889 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-config-data\") pod \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") "
Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.513114 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-96b86561-77fa-478a-bf61-f7beca9d80fe\") pod \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") "
Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.513270 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-rabbitmq-plugins\") pod \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") "
Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.513302 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8xbw\" (UniqueName: \"kubernetes.io/projected/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-kube-api-access-q8xbw\") pod \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") "
Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.513335 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-erlang-cookie-secret\") pod \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") "
Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.513399 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-rabbitmq-tls\") pod \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") "
Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.513459 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-rabbitmq-erlang-cookie\") pod \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") "
Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.513499 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-plugins-conf\") pod \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") "
Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.513532 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-pod-info\") pod \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") "
Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.513590 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-rabbitmq-confd\") pod \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") "
Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.513734 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-server-conf\") pod \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") "
Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.514655 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b" (UID: "0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.514940 4898 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-plugins-conf\") on node \"crc\" DevicePath \"\""
Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.515300 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b" (UID: "0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.516501 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b" (UID: "0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.523080 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-pod-info" (OuterVolumeSpecName: "pod-info") pod "0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b" (UID: "0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.529664 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b" (UID: "0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.544069 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-kube-api-access-q8xbw" (OuterVolumeSpecName: "kube-api-access-q8xbw") pod "0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b" (UID: "0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b"). InnerVolumeSpecName "kube-api-access-q8xbw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.544747 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b" (UID: "0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.566706 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-96b86561-77fa-478a-bf61-f7beca9d80fe" (OuterVolumeSpecName: "persistence") pod "0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b" (UID: "0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b"). InnerVolumeSpecName "pvc-96b86561-77fa-478a-bf61-f7beca9d80fe". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.591469 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-config-data" (OuterVolumeSpecName: "config-data") pod "0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b" (UID: "0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.617163 4898 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.617197 4898 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-pod-info\") on node \"crc\" DevicePath \"\""
Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.617208 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.617238 4898 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-96b86561-77fa-478a-bf61-f7beca9d80fe\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-96b86561-77fa-478a-bf61-f7beca9d80fe\") on node \"crc\" "
Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.617249 4898 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.617260 4898 reconciler_common.go:293] "Volume detached for
volume \"kube-api-access-q8xbw\" (UniqueName: \"kubernetes.io/projected/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-kube-api-access-q8xbw\") on node \"crc\" DevicePath \"\"" Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.617269 4898 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.617278 4898 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.633560 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-server-conf" (OuterVolumeSpecName: "server-conf") pod "0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b" (UID: "0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.664183 4898 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.664369 4898 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-96b86561-77fa-478a-bf61-f7beca9d80fe" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-96b86561-77fa-478a-bf61-f7beca9d80fe") on node "crc" Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.694858 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b" (UID: "0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b"). 
InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.719780 4898 reconciler_common.go:293] "Volume detached for volume \"pvc-96b86561-77fa-478a-bf61-f7beca9d80fe\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-96b86561-77fa-478a-bf61-f7beca9d80fe\") on node \"crc\" DevicePath \"\"" Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.719813 4898 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.719823 4898 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-server-conf\") on node \"crc\" DevicePath \"\"" Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.475129 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.569003 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.601511 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.619934 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 13 14:27:32 crc kubenswrapper[4898]: E0313 14:27:32.620653 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b" containerName="rabbitmq" Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.620677 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b" containerName="rabbitmq" Mar 13 14:27:32 crc kubenswrapper[4898]: E0313 14:27:32.620726 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b" containerName="setup-container" Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.620736 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b" containerName="setup-container" Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.621047 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b" containerName="rabbitmq" Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.622590 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.661640 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.740579 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/10c321a0-5ea5-4b5c-8695-1f7b2dcad32b-config-data\") pod \"rabbitmq-server-0\" (UID: \"10c321a0-5ea5-4b5c-8695-1f7b2dcad32b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.740652 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/10c321a0-5ea5-4b5c-8695-1f7b2dcad32b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"10c321a0-5ea5-4b5c-8695-1f7b2dcad32b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.740686 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-96b86561-77fa-478a-bf61-f7beca9d80fe\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-96b86561-77fa-478a-bf61-f7beca9d80fe\") pod \"rabbitmq-server-0\" (UID: \"10c321a0-5ea5-4b5c-8695-1f7b2dcad32b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.740727 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/10c321a0-5ea5-4b5c-8695-1f7b2dcad32b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"10c321a0-5ea5-4b5c-8695-1f7b2dcad32b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.740762 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/10c321a0-5ea5-4b5c-8695-1f7b2dcad32b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"10c321a0-5ea5-4b5c-8695-1f7b2dcad32b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.740864 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/10c321a0-5ea5-4b5c-8695-1f7b2dcad32b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"10c321a0-5ea5-4b5c-8695-1f7b2dcad32b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.740954 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/10c321a0-5ea5-4b5c-8695-1f7b2dcad32b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"10c321a0-5ea5-4b5c-8695-1f7b2dcad32b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.740978 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/10c321a0-5ea5-4b5c-8695-1f7b2dcad32b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"10c321a0-5ea5-4b5c-8695-1f7b2dcad32b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.740994 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/10c321a0-5ea5-4b5c-8695-1f7b2dcad32b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"10c321a0-5ea5-4b5c-8695-1f7b2dcad32b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.741010 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/10c321a0-5ea5-4b5c-8695-1f7b2dcad32b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"10c321a0-5ea5-4b5c-8695-1f7b2dcad32b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.741030 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-km2c4\" (UniqueName: \"kubernetes.io/projected/10c321a0-5ea5-4b5c-8695-1f7b2dcad32b-kube-api-access-km2c4\") pod \"rabbitmq-server-0\" (UID: \"10c321a0-5ea5-4b5c-8695-1f7b2dcad32b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.842590 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/10c321a0-5ea5-4b5c-8695-1f7b2dcad32b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"10c321a0-5ea5-4b5c-8695-1f7b2dcad32b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.842699 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/10c321a0-5ea5-4b5c-8695-1f7b2dcad32b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"10c321a0-5ea5-4b5c-8695-1f7b2dcad32b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.842738 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/10c321a0-5ea5-4b5c-8695-1f7b2dcad32b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"10c321a0-5ea5-4b5c-8695-1f7b2dcad32b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.842753 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/10c321a0-5ea5-4b5c-8695-1f7b2dcad32b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: 
\"10c321a0-5ea5-4b5c-8695-1f7b2dcad32b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.842774 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/10c321a0-5ea5-4b5c-8695-1f7b2dcad32b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"10c321a0-5ea5-4b5c-8695-1f7b2dcad32b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.842790 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-km2c4\" (UniqueName: \"kubernetes.io/projected/10c321a0-5ea5-4b5c-8695-1f7b2dcad32b-kube-api-access-km2c4\") pod \"rabbitmq-server-0\" (UID: \"10c321a0-5ea5-4b5c-8695-1f7b2dcad32b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.842885 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/10c321a0-5ea5-4b5c-8695-1f7b2dcad32b-config-data\") pod \"rabbitmq-server-0\" (UID: \"10c321a0-5ea5-4b5c-8695-1f7b2dcad32b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.843038 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/10c321a0-5ea5-4b5c-8695-1f7b2dcad32b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"10c321a0-5ea5-4b5c-8695-1f7b2dcad32b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.843067 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-96b86561-77fa-478a-bf61-f7beca9d80fe\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-96b86561-77fa-478a-bf61-f7beca9d80fe\") pod \"rabbitmq-server-0\" (UID: \"10c321a0-5ea5-4b5c-8695-1f7b2dcad32b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:27:32 crc 
kubenswrapper[4898]: I0313 14:27:32.843090 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/10c321a0-5ea5-4b5c-8695-1f7b2dcad32b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"10c321a0-5ea5-4b5c-8695-1f7b2dcad32b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.843128 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/10c321a0-5ea5-4b5c-8695-1f7b2dcad32b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"10c321a0-5ea5-4b5c-8695-1f7b2dcad32b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.846987 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/10c321a0-5ea5-4b5c-8695-1f7b2dcad32b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"10c321a0-5ea5-4b5c-8695-1f7b2dcad32b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.847283 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/10c321a0-5ea5-4b5c-8695-1f7b2dcad32b-config-data\") pod \"rabbitmq-server-0\" (UID: \"10c321a0-5ea5-4b5c-8695-1f7b2dcad32b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.848062 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/10c321a0-5ea5-4b5c-8695-1f7b2dcad32b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"10c321a0-5ea5-4b5c-8695-1f7b2dcad32b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.848704 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/10c321a0-5ea5-4b5c-8695-1f7b2dcad32b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"10c321a0-5ea5-4b5c-8695-1f7b2dcad32b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.850710 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/10c321a0-5ea5-4b5c-8695-1f7b2dcad32b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"10c321a0-5ea5-4b5c-8695-1f7b2dcad32b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.851021 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/10c321a0-5ea5-4b5c-8695-1f7b2dcad32b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"10c321a0-5ea5-4b5c-8695-1f7b2dcad32b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.852029 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/10c321a0-5ea5-4b5c-8695-1f7b2dcad32b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"10c321a0-5ea5-4b5c-8695-1f7b2dcad32b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.853792 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/10c321a0-5ea5-4b5c-8695-1f7b2dcad32b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"10c321a0-5ea5-4b5c-8695-1f7b2dcad32b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.854836 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/10c321a0-5ea5-4b5c-8695-1f7b2dcad32b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"10c321a0-5ea5-4b5c-8695-1f7b2dcad32b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:27:32 crc 
kubenswrapper[4898]: I0313 14:27:32.855839 4898 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.855866 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-96b86561-77fa-478a-bf61-f7beca9d80fe\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-96b86561-77fa-478a-bf61-f7beca9d80fe\") pod \"rabbitmq-server-0\" (UID: \"10c321a0-5ea5-4b5c-8695-1f7b2dcad32b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/235b7df56c251cb078c850d3b743a7085fdda6b090aa4cee8a1308b947278440/globalmount\"" pod="openstack/rabbitmq-server-0" Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.870626 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-km2c4\" (UniqueName: \"kubernetes.io/projected/10c321a0-5ea5-4b5c-8695-1f7b2dcad32b-kube-api-access-km2c4\") pod \"rabbitmq-server-0\" (UID: \"10c321a0-5ea5-4b5c-8695-1f7b2dcad32b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.925025 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-96b86561-77fa-478a-bf61-f7beca9d80fe\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-96b86561-77fa-478a-bf61-f7beca9d80fe\") pod \"rabbitmq-server-0\" (UID: \"10c321a0-5ea5-4b5c-8695-1f7b2dcad32b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.948500 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 13 14:27:33 crc kubenswrapper[4898]: I0313 14:27:33.536338 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 13 14:27:33 crc kubenswrapper[4898]: I0313 14:27:33.753351 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b" path="/var/lib/kubelet/pods/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b/volumes" Mar 13 14:27:34 crc kubenswrapper[4898]: I0313 14:27:34.507329 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"10c321a0-5ea5-4b5c-8695-1f7b2dcad32b","Type":"ContainerStarted","Data":"05b22667549dcdbfdb3e670ac4505629368dda11f5bddf5b6fe16358c1ffbb17"} Mar 13 14:27:35 crc kubenswrapper[4898]: I0313 14:27:35.901140 4898 scope.go:117] "RemoveContainer" containerID="213db1fb491a2ed6d8dc3d15759978456b725ffef29e1be38661bf279db1daf8" Mar 13 14:27:35 crc kubenswrapper[4898]: I0313 14:27:35.938036 4898 scope.go:117] "RemoveContainer" containerID="6584acdbfa3b269b10be5eacbee652dc5b87853d5dd4647683e70850466d55d5" Mar 13 14:27:36 crc kubenswrapper[4898]: I0313 14:27:36.024112 4898 scope.go:117] "RemoveContainer" containerID="686c9d5260a554140660fb899d995f27d4b2bd420d76c710ada3057e3122cfaf" Mar 13 14:27:36 crc kubenswrapper[4898]: I0313 14:27:36.053451 4898 scope.go:117] "RemoveContainer" containerID="54431181e3eb3b359d0852277dd7b5d798e5ebd64616fc6928567613ce28f709" Mar 13 14:27:36 crc kubenswrapper[4898]: I0313 14:27:36.105818 4898 scope.go:117] "RemoveContainer" containerID="8e18090ad1757c0b15ba6a519121358ec8fea5c9816c6426d3dd165832b431af" Mar 13 14:27:36 crc kubenswrapper[4898]: I0313 14:27:36.139721 4898 scope.go:117] "RemoveContainer" containerID="e1e9eae3fcad1899b5e4ca6e0e525d3dd9661ab59290e1ba7355e6176cbc69f0" Mar 13 14:27:36 crc kubenswrapper[4898]: I0313 14:27:36.186861 4898 scope.go:117] "RemoveContainer" 
containerID="9dee453ab58c346884f20f75be928701596b9ae0bdbbb025979e1e2daaa907c1" Mar 13 14:27:36 crc kubenswrapper[4898]: I0313 14:27:36.232625 4898 scope.go:117] "RemoveContainer" containerID="1483af6a8a31cc8d457a629a4f59e975d8c4af4c9fe53636f469b1c72aabc655" Mar 13 14:27:36 crc kubenswrapper[4898]: I0313 14:27:36.256078 4898 scope.go:117] "RemoveContainer" containerID="8699a974ed046c0e546ec06ad74b2baeebb668b44968353911ae1775a60f7c87" Mar 13 14:27:36 crc kubenswrapper[4898]: I0313 14:27:36.538211 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"10c321a0-5ea5-4b5c-8695-1f7b2dcad32b","Type":"ContainerStarted","Data":"816bc1a0bbcb90a875482c4dd5f17ddca4bce3587b6ba90f30bd3e5b12de16d2"} Mar 13 14:28:00 crc kubenswrapper[4898]: I0313 14:28:00.160888 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556868-9gt46"] Mar 13 14:28:00 crc kubenswrapper[4898]: I0313 14:28:00.164257 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556868-9gt46" Mar 13 14:28:00 crc kubenswrapper[4898]: I0313 14:28:00.170296 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 14:28:00 crc kubenswrapper[4898]: I0313 14:28:00.170395 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 14:28:00 crc kubenswrapper[4898]: I0313 14:28:00.170550 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps" Mar 13 14:28:00 crc kubenswrapper[4898]: I0313 14:28:00.177549 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556868-9gt46"] Mar 13 14:28:00 crc kubenswrapper[4898]: I0313 14:28:00.201764 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkd24\" (UniqueName: \"kubernetes.io/projected/9565fbbb-2765-4ffb-a934-e5ddf9be1d17-kube-api-access-gkd24\") pod \"auto-csr-approver-29556868-9gt46\" (UID: \"9565fbbb-2765-4ffb-a934-e5ddf9be1d17\") " pod="openshift-infra/auto-csr-approver-29556868-9gt46" Mar 13 14:28:00 crc kubenswrapper[4898]: I0313 14:28:00.304295 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkd24\" (UniqueName: \"kubernetes.io/projected/9565fbbb-2765-4ffb-a934-e5ddf9be1d17-kube-api-access-gkd24\") pod \"auto-csr-approver-29556868-9gt46\" (UID: \"9565fbbb-2765-4ffb-a934-e5ddf9be1d17\") " pod="openshift-infra/auto-csr-approver-29556868-9gt46" Mar 13 14:28:00 crc kubenswrapper[4898]: I0313 14:28:00.330628 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkd24\" (UniqueName: \"kubernetes.io/projected/9565fbbb-2765-4ffb-a934-e5ddf9be1d17-kube-api-access-gkd24\") pod \"auto-csr-approver-29556868-9gt46\" (UID: \"9565fbbb-2765-4ffb-a934-e5ddf9be1d17\") " 
pod="openshift-infra/auto-csr-approver-29556868-9gt46" Mar 13 14:28:00 crc kubenswrapper[4898]: I0313 14:28:00.505362 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556868-9gt46" Mar 13 14:28:00 crc kubenswrapper[4898]: I0313 14:28:00.996596 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556868-9gt46"] Mar 13 14:28:00 crc kubenswrapper[4898]: W0313 14:28:00.999634 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9565fbbb_2765_4ffb_a934_e5ddf9be1d17.slice/crio-b291bcca631f9ab111749ea520821bf3bb3a186499771a638e9ab9007b740104 WatchSource:0}: Error finding container b291bcca631f9ab111749ea520821bf3bb3a186499771a638e9ab9007b740104: Status 404 returned error can't find the container with id b291bcca631f9ab111749ea520821bf3bb3a186499771a638e9ab9007b740104 Mar 13 14:28:01 crc kubenswrapper[4898]: I0313 14:28:01.903193 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556868-9gt46" event={"ID":"9565fbbb-2765-4ffb-a934-e5ddf9be1d17","Type":"ContainerStarted","Data":"b291bcca631f9ab111749ea520821bf3bb3a186499771a638e9ab9007b740104"} Mar 13 14:28:02 crc kubenswrapper[4898]: I0313 14:28:02.948316 4898 generic.go:334] "Generic (PLEG): container finished" podID="9565fbbb-2765-4ffb-a934-e5ddf9be1d17" containerID="23e456f4a6227ca0f6e6f99f4c35a21b09d57519ec2a733d94a113420fb1a340" exitCode=0 Mar 13 14:28:02 crc kubenswrapper[4898]: I0313 14:28:02.948371 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556868-9gt46" event={"ID":"9565fbbb-2765-4ffb-a934-e5ddf9be1d17","Type":"ContainerDied","Data":"23e456f4a6227ca0f6e6f99f4c35a21b09d57519ec2a733d94a113420fb1a340"} Mar 13 14:28:04 crc kubenswrapper[4898]: I0313 14:28:04.480288 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556868-9gt46"
Mar 13 14:28:04 crc kubenswrapper[4898]: I0313 14:28:04.531563 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gkd24\" (UniqueName: \"kubernetes.io/projected/9565fbbb-2765-4ffb-a934-e5ddf9be1d17-kube-api-access-gkd24\") pod \"9565fbbb-2765-4ffb-a934-e5ddf9be1d17\" (UID: \"9565fbbb-2765-4ffb-a934-e5ddf9be1d17\") "
Mar 13 14:28:04 crc kubenswrapper[4898]: I0313 14:28:04.555624 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9565fbbb-2765-4ffb-a934-e5ddf9be1d17-kube-api-access-gkd24" (OuterVolumeSpecName: "kube-api-access-gkd24") pod "9565fbbb-2765-4ffb-a934-e5ddf9be1d17" (UID: "9565fbbb-2765-4ffb-a934-e5ddf9be1d17"). InnerVolumeSpecName "kube-api-access-gkd24". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:28:04 crc kubenswrapper[4898]: I0313 14:28:04.636396 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gkd24\" (UniqueName: \"kubernetes.io/projected/9565fbbb-2765-4ffb-a934-e5ddf9be1d17-kube-api-access-gkd24\") on node \"crc\" DevicePath \"\""
Mar 13 14:28:04 crc kubenswrapper[4898]: I0313 14:28:04.981665 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556868-9gt46" event={"ID":"9565fbbb-2765-4ffb-a934-e5ddf9be1d17","Type":"ContainerDied","Data":"b291bcca631f9ab111749ea520821bf3bb3a186499771a638e9ab9007b740104"}
Mar 13 14:28:04 crc kubenswrapper[4898]: I0313 14:28:04.982019 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b291bcca631f9ab111749ea520821bf3bb3a186499771a638e9ab9007b740104"
Mar 13 14:28:04 crc kubenswrapper[4898]: I0313 14:28:04.981725 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556868-9gt46"
Mar 13 14:28:05 crc kubenswrapper[4898]: I0313 14:28:05.583918 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556862-mpx4w"]
Mar 13 14:28:05 crc kubenswrapper[4898]: I0313 14:28:05.599757 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556862-mpx4w"]
Mar 13 14:28:05 crc kubenswrapper[4898]: I0313 14:28:05.756921 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c85cb04-363e-45d6-a14b-79c249e8f469" path="/var/lib/kubelet/pods/3c85cb04-363e-45d6-a14b-79c249e8f469/volumes"
Mar 13 14:28:09 crc kubenswrapper[4898]: I0313 14:28:09.037645 4898 generic.go:334] "Generic (PLEG): container finished" podID="10c321a0-5ea5-4b5c-8695-1f7b2dcad32b" containerID="816bc1a0bbcb90a875482c4dd5f17ddca4bce3587b6ba90f30bd3e5b12de16d2" exitCode=0
Mar 13 14:28:09 crc kubenswrapper[4898]: I0313 14:28:09.037703 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"10c321a0-5ea5-4b5c-8695-1f7b2dcad32b","Type":"ContainerDied","Data":"816bc1a0bbcb90a875482c4dd5f17ddca4bce3587b6ba90f30bd3e5b12de16d2"}
Mar 13 14:28:10 crc kubenswrapper[4898]: I0313 14:28:10.053456 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"10c321a0-5ea5-4b5c-8695-1f7b2dcad32b","Type":"ContainerStarted","Data":"1975e8515d40d0e0882f9032c3cf285be83ffad9d05ae4d3245e76efc73f0dda"}
Mar 13 14:28:10 crc kubenswrapper[4898]: I0313 14:28:10.054263 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Mar 13 14:28:10 crc kubenswrapper[4898]: I0313 14:28:10.109028 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=38.109003836 podStartE2EDuration="38.109003836s" podCreationTimestamp="2026-03-13 14:27:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:28:10.086340291 +0000 UTC m=+1925.087928560" watchObservedRunningTime="2026-03-13 14:28:10.109003836 +0000 UTC m=+1925.110592075"
Mar 13 14:28:22 crc kubenswrapper[4898]: I0313 14:28:22.952119 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Mar 13 14:28:36 crc kubenswrapper[4898]: I0313 14:28:36.419432 4898 scope.go:117] "RemoveContainer" containerID="b62d7bd0ca3497c43d915b6212935946bc82ac1a3defe8c89eeb3779d6ce9770"
Mar 13 14:28:36 crc kubenswrapper[4898]: I0313 14:28:36.455968 4898 scope.go:117] "RemoveContainer" containerID="ba94a825cfb36ee16c3e15907274f9276083ba448d310d471374f19c54cc116c"
Mar 13 14:28:36 crc kubenswrapper[4898]: I0313 14:28:36.487311 4898 scope.go:117] "RemoveContainer" containerID="0d9a86d7e906b1015484fb8d8fc360af30fd3aeeed5cfca12314a74be70b110a"
Mar 13 14:28:36 crc kubenswrapper[4898]: I0313 14:28:36.547459 4898 scope.go:117] "RemoveContainer" containerID="cea1936f2758016544cbefa24e4ca686c3e33acfdaf019898c501a90320d0242"
Mar 13 14:28:36 crc kubenswrapper[4898]: I0313 14:28:36.582165 4898 scope.go:117] "RemoveContainer" containerID="a55576c9a44e83505bf8757afc0e1e19424b4717e80f08c508a794c81f2cfdb0"
Mar 13 14:28:36 crc kubenswrapper[4898]: I0313 14:28:36.608478 4898 scope.go:117] "RemoveContainer" containerID="c6cbd243a4ab0ae3ee88d2b14b07e0b9c8fda594949edb73fc92248b8f25ddf9"
Mar 13 14:29:36 crc kubenswrapper[4898]: I0313 14:29:36.729997 4898 scope.go:117] "RemoveContainer" containerID="c0fcc6916c9c7951ac6f57b54e64b861fe8be03a65443f0a0008c4f458405d78"
Mar 13 14:29:36 crc kubenswrapper[4898]: I0313 14:29:36.756091 4898 scope.go:117] "RemoveContainer" containerID="9bd9f3f02e15571b11f72778527812906d168be88196cc4314aa88f5c276ac6c"
Mar 13 14:29:36 crc kubenswrapper[4898]: I0313 14:29:36.784231 4898 scope.go:117] "RemoveContainer" containerID="ceacbe8778fcc11e62876d98d598259e725fa8302adf85af1d1ddc9df4d62ff6"
Mar 13 14:29:36 crc kubenswrapper[4898]: I0313 14:29:36.823200 4898 scope.go:117] "RemoveContainer" containerID="a0be402bfe00c68e23ab47a73d2d201566aad9d451ecaff23e8fc2d99923064b"
Mar 13 14:29:36 crc kubenswrapper[4898]: I0313 14:29:36.908416 4898 scope.go:117] "RemoveContainer" containerID="7d83563b664dd524f060763bd5deadd7009b47fedbc88f53844517f4c00a64ea"
Mar 13 14:29:36 crc kubenswrapper[4898]: I0313 14:29:36.957241 4898 scope.go:117] "RemoveContainer" containerID="4d8bf97f6df1c8578abf9d8b2ff9a16d6f36d0a198628e241eb7ec672c4d77c5"
Mar 13 14:29:37 crc kubenswrapper[4898]: I0313 14:29:37.008257 4898 scope.go:117] "RemoveContainer" containerID="b97169c17bbe4e26153e2ab8b910eb60fbafe00f2669ecd0f29ce4eb8dba08e0"
Mar 13 14:29:37 crc kubenswrapper[4898]: I0313 14:29:37.070398 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-baf6-account-create-update-xhptm"]
Mar 13 14:29:37 crc kubenswrapper[4898]: I0313 14:29:37.082795 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-n7qmc"]
Mar 13 14:29:37 crc kubenswrapper[4898]: I0313 14:29:37.102825 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-n7qmc"]
Mar 13 14:29:37 crc kubenswrapper[4898]: I0313 14:29:37.121599 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-baf6-account-create-update-xhptm"]
Mar 13 14:29:37 crc kubenswrapper[4898]: I0313 14:29:37.765399 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45215dff-dfeb-4b68-bc5c-d36aba0ea6b8" path="/var/lib/kubelet/pods/45215dff-dfeb-4b68-bc5c-d36aba0ea6b8/volumes"
Mar 13 14:29:37 crc kubenswrapper[4898]: I0313 14:29:37.770060 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81f4ee1a-c4d2-415d-9021-6503f03f8441" path="/var/lib/kubelet/pods/81f4ee1a-c4d2-415d-9021-6503f03f8441/volumes"
Mar 13 14:29:40 crc kubenswrapper[4898]: I0313 14:29:40.037472 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-621e-account-create-update-dksd9"]
Mar 13 14:29:40 crc kubenswrapper[4898]: I0313 14:29:40.054932 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-cdnq7"]
Mar 13 14:29:40 crc kubenswrapper[4898]: I0313 14:29:40.067711 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-621e-account-create-update-dksd9"]
Mar 13 14:29:40 crc kubenswrapper[4898]: I0313 14:29:40.079985 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-cdnq7"]
Mar 13 14:29:41 crc kubenswrapper[4898]: I0313 14:29:41.308632 4898 generic.go:334] "Generic (PLEG): container finished" podID="6d8bbc5a-39da-48b8-82d1-6df496fda612" containerID="c31d294d1bee94e3955b57cdf27c7683e29803f38822e314dcaef8a3865b3cc6" exitCode=0
Mar 13 14:29:41 crc kubenswrapper[4898]: I0313 14:29:41.309006 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xphxf" event={"ID":"6d8bbc5a-39da-48b8-82d1-6df496fda612","Type":"ContainerDied","Data":"c31d294d1bee94e3955b57cdf27c7683e29803f38822e314dcaef8a3865b3cc6"}
Mar 13 14:29:41 crc kubenswrapper[4898]: I0313 14:29:41.764652 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59bdafe7-9c43-4acc-a212-864bdf38d5b4" path="/var/lib/kubelet/pods/59bdafe7-9c43-4acc-a212-864bdf38d5b4/volumes"
Mar 13 14:29:41 crc kubenswrapper[4898]: I0313 14:29:41.767708 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8c46fcc-fd9b-4073-99e6-28aadcdd823e" path="/var/lib/kubelet/pods/a8c46fcc-fd9b-4073-99e6-28aadcdd823e/volumes"
Mar 13 14:29:42 crc kubenswrapper[4898]: I0313 14:29:42.938316 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xphxf"
Mar 13 14:29:43 crc kubenswrapper[4898]: I0313 14:29:43.032855 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6d8bbc5a-39da-48b8-82d1-6df496fda612-inventory\") pod \"6d8bbc5a-39da-48b8-82d1-6df496fda612\" (UID: \"6d8bbc5a-39da-48b8-82d1-6df496fda612\") "
Mar 13 14:29:43 crc kubenswrapper[4898]: I0313 14:29:43.033208 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6d8bbc5a-39da-48b8-82d1-6df496fda612-ssh-key-openstack-edpm-ipam\") pod \"6d8bbc5a-39da-48b8-82d1-6df496fda612\" (UID: \"6d8bbc5a-39da-48b8-82d1-6df496fda612\") "
Mar 13 14:29:43 crc kubenswrapper[4898]: I0313 14:29:43.033837 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d8bbc5a-39da-48b8-82d1-6df496fda612-bootstrap-combined-ca-bundle\") pod \"6d8bbc5a-39da-48b8-82d1-6df496fda612\" (UID: \"6d8bbc5a-39da-48b8-82d1-6df496fda612\") "
Mar 13 14:29:43 crc kubenswrapper[4898]: I0313 14:29:43.034025 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cs68d\" (UniqueName: \"kubernetes.io/projected/6d8bbc5a-39da-48b8-82d1-6df496fda612-kube-api-access-cs68d\") pod \"6d8bbc5a-39da-48b8-82d1-6df496fda612\" (UID: \"6d8bbc5a-39da-48b8-82d1-6df496fda612\") "
Mar 13 14:29:43 crc kubenswrapper[4898]: I0313 14:29:43.039275 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d8bbc5a-39da-48b8-82d1-6df496fda612-kube-api-access-cs68d" (OuterVolumeSpecName: "kube-api-access-cs68d") pod "6d8bbc5a-39da-48b8-82d1-6df496fda612" (UID: "6d8bbc5a-39da-48b8-82d1-6df496fda612"). InnerVolumeSpecName "kube-api-access-cs68d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:29:43 crc kubenswrapper[4898]: I0313 14:29:43.041079 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d8bbc5a-39da-48b8-82d1-6df496fda612-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "6d8bbc5a-39da-48b8-82d1-6df496fda612" (UID: "6d8bbc5a-39da-48b8-82d1-6df496fda612"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:29:43 crc kubenswrapper[4898]: I0313 14:29:43.068199 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d8bbc5a-39da-48b8-82d1-6df496fda612-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "6d8bbc5a-39da-48b8-82d1-6df496fda612" (UID: "6d8bbc5a-39da-48b8-82d1-6df496fda612"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:29:43 crc kubenswrapper[4898]: I0313 14:29:43.081767 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d8bbc5a-39da-48b8-82d1-6df496fda612-inventory" (OuterVolumeSpecName: "inventory") pod "6d8bbc5a-39da-48b8-82d1-6df496fda612" (UID: "6d8bbc5a-39da-48b8-82d1-6df496fda612"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:29:43 crc kubenswrapper[4898]: I0313 14:29:43.136288 4898 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6d8bbc5a-39da-48b8-82d1-6df496fda612-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 13 14:29:43 crc kubenswrapper[4898]: I0313 14:29:43.136323 4898 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d8bbc5a-39da-48b8-82d1-6df496fda612-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 14:29:43 crc kubenswrapper[4898]: I0313 14:29:43.136338 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cs68d\" (UniqueName: \"kubernetes.io/projected/6d8bbc5a-39da-48b8-82d1-6df496fda612-kube-api-access-cs68d\") on node \"crc\" DevicePath \"\""
Mar 13 14:29:43 crc kubenswrapper[4898]: I0313 14:29:43.136351 4898 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6d8bbc5a-39da-48b8-82d1-6df496fda612-inventory\") on node \"crc\" DevicePath \"\""
Mar 13 14:29:43 crc kubenswrapper[4898]: I0313 14:29:43.337414 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xphxf" event={"ID":"6d8bbc5a-39da-48b8-82d1-6df496fda612","Type":"ContainerDied","Data":"8b67963947b518457f04c307420f088501a95221935f61209e26215c56765170"}
Mar 13 14:29:43 crc kubenswrapper[4898]: I0313 14:29:43.337724 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b67963947b518457f04c307420f088501a95221935f61209e26215c56765170"
Mar 13 14:29:43 crc kubenswrapper[4898]: I0313 14:29:43.337921 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xphxf"
Mar 13 14:29:43 crc kubenswrapper[4898]: I0313 14:29:43.457961 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rbttg"]
Mar 13 14:29:43 crc kubenswrapper[4898]: E0313 14:29:43.458613 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9565fbbb-2765-4ffb-a934-e5ddf9be1d17" containerName="oc"
Mar 13 14:29:43 crc kubenswrapper[4898]: I0313 14:29:43.458634 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="9565fbbb-2765-4ffb-a934-e5ddf9be1d17" containerName="oc"
Mar 13 14:29:43 crc kubenswrapper[4898]: E0313 14:29:43.458702 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d8bbc5a-39da-48b8-82d1-6df496fda612" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Mar 13 14:29:43 crc kubenswrapper[4898]: I0313 14:29:43.458712 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d8bbc5a-39da-48b8-82d1-6df496fda612" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Mar 13 14:29:43 crc kubenswrapper[4898]: I0313 14:29:43.459032 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="9565fbbb-2765-4ffb-a934-e5ddf9be1d17" containerName="oc"
Mar 13 14:29:43 crc kubenswrapper[4898]: I0313 14:29:43.459058 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d8bbc5a-39da-48b8-82d1-6df496fda612" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Mar 13 14:29:43 crc kubenswrapper[4898]: I0313 14:29:43.460937 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rbttg"
Mar 13 14:29:43 crc kubenswrapper[4898]: I0313 14:29:43.463832 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 13 14:29:43 crc kubenswrapper[4898]: I0313 14:29:43.464059 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zsddr"
Mar 13 14:29:43 crc kubenswrapper[4898]: I0313 14:29:43.464356 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 13 14:29:43 crc kubenswrapper[4898]: I0313 14:29:43.464764 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 13 14:29:43 crc kubenswrapper[4898]: I0313 14:29:43.470933 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rbttg"]
Mar 13 14:29:43 crc kubenswrapper[4898]: I0313 14:29:43.646667 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2v49j\" (UniqueName: \"kubernetes.io/projected/05e315eb-34b1-4099-b676-b0238f3cb5c5-kube-api-access-2v49j\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-rbttg\" (UID: \"05e315eb-34b1-4099-b676-b0238f3cb5c5\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rbttg"
Mar 13 14:29:43 crc kubenswrapper[4898]: I0313 14:29:43.646768 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/05e315eb-34b1-4099-b676-b0238f3cb5c5-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-rbttg\" (UID: \"05e315eb-34b1-4099-b676-b0238f3cb5c5\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rbttg"
Mar 13 14:29:43 crc kubenswrapper[4898]: I0313 14:29:43.646864 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/05e315eb-34b1-4099-b676-b0238f3cb5c5-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-rbttg\" (UID: \"05e315eb-34b1-4099-b676-b0238f3cb5c5\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rbttg"
Mar 13 14:29:43 crc kubenswrapper[4898]: I0313 14:29:43.748889 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2v49j\" (UniqueName: \"kubernetes.io/projected/05e315eb-34b1-4099-b676-b0238f3cb5c5-kube-api-access-2v49j\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-rbttg\" (UID: \"05e315eb-34b1-4099-b676-b0238f3cb5c5\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rbttg"
Mar 13 14:29:43 crc kubenswrapper[4898]: I0313 14:29:43.749699 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/05e315eb-34b1-4099-b676-b0238f3cb5c5-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-rbttg\" (UID: \"05e315eb-34b1-4099-b676-b0238f3cb5c5\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rbttg"
Mar 13 14:29:43 crc kubenswrapper[4898]: I0313 14:29:43.749973 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/05e315eb-34b1-4099-b676-b0238f3cb5c5-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-rbttg\" (UID: \"05e315eb-34b1-4099-b676-b0238f3cb5c5\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rbttg"
Mar 13 14:29:43 crc kubenswrapper[4898]: I0313 14:29:43.756119 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/05e315eb-34b1-4099-b676-b0238f3cb5c5-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-rbttg\" (UID: \"05e315eb-34b1-4099-b676-b0238f3cb5c5\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rbttg"
Mar 13 14:29:43 crc kubenswrapper[4898]: I0313 14:29:43.757230 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/05e315eb-34b1-4099-b676-b0238f3cb5c5-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-rbttg\" (UID: \"05e315eb-34b1-4099-b676-b0238f3cb5c5\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rbttg"
Mar 13 14:29:43 crc kubenswrapper[4898]: I0313 14:29:43.779349 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2v49j\" (UniqueName: \"kubernetes.io/projected/05e315eb-34b1-4099-b676-b0238f3cb5c5-kube-api-access-2v49j\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-rbttg\" (UID: \"05e315eb-34b1-4099-b676-b0238f3cb5c5\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rbttg"
Mar 13 14:29:43 crc kubenswrapper[4898]: I0313 14:29:43.789309 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rbttg"
Mar 13 14:29:44 crc kubenswrapper[4898]: W0313 14:29:44.465509 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod05e315eb_34b1_4099_b676_b0238f3cb5c5.slice/crio-56c75088356ea1f877eddaa52133cb1069449c5912814655f61681146f527468 WatchSource:0}: Error finding container 56c75088356ea1f877eddaa52133cb1069449c5912814655f61681146f527468: Status 404 returned error can't find the container with id 56c75088356ea1f877eddaa52133cb1069449c5912814655f61681146f527468
Mar 13 14:29:44 crc kubenswrapper[4898]: I0313 14:29:44.474333 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rbttg"]
Mar 13 14:29:45 crc kubenswrapper[4898]: I0313 14:29:45.369842 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rbttg" event={"ID":"05e315eb-34b1-4099-b676-b0238f3cb5c5","Type":"ContainerStarted","Data":"56c75088356ea1f877eddaa52133cb1069449c5912814655f61681146f527468"}
Mar 13 14:29:46 crc kubenswrapper[4898]: I0313 14:29:46.050139 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-d2b7-account-create-update-ggzw8"]
Mar 13 14:29:46 crc kubenswrapper[4898]: I0313 14:29:46.065374 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-ppqg7"]
Mar 13 14:29:46 crc kubenswrapper[4898]: I0313 14:29:46.078716 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-d2b7-account-create-update-ggzw8"]
Mar 13 14:29:46 crc kubenswrapper[4898]: I0313 14:29:46.090727 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-ppqg7"]
Mar 13 14:29:46 crc kubenswrapper[4898]: I0313 14:29:46.403719 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rbttg" event={"ID":"05e315eb-34b1-4099-b676-b0238f3cb5c5","Type":"ContainerStarted","Data":"19401614e37155924147d2b68ea4d922bec3bbc6d22b8dd77602f823a33552a5"}
Mar 13 14:29:46 crc kubenswrapper[4898]: I0313 14:29:46.433867 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rbttg" podStartSLOduration=2.937411699 podStartE2EDuration="3.433837683s" podCreationTimestamp="2026-03-13 14:29:43 +0000 UTC" firstStartedPulling="2026-03-13 14:29:44.46841864 +0000 UTC m=+2019.470006889" lastFinishedPulling="2026-03-13 14:29:44.964844584 +0000 UTC m=+2019.966432873" observedRunningTime="2026-03-13 14:29:46.429708066 +0000 UTC m=+2021.431296345" watchObservedRunningTime="2026-03-13 14:29:46.433837683 +0000 UTC m=+2021.435425952"
Mar 13 14:29:47 crc kubenswrapper[4898]: I0313 14:29:47.051421 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-35b4-account-create-update-7rdfs"]
Mar 13 14:29:47 crc kubenswrapper[4898]: I0313 14:29:47.068235 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-35b4-account-create-update-7rdfs"]
Mar 13 14:29:47 crc kubenswrapper[4898]: I0313 14:29:47.083320 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-zzflk"]
Mar 13 14:29:47 crc kubenswrapper[4898]: I0313 14:29:47.097877 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-zzflk"]
Mar 13 14:29:47 crc kubenswrapper[4898]: I0313 14:29:47.753886 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0bf799d3-e4d4-439d-b3da-d5467064f6f1" path="/var/lib/kubelet/pods/0bf799d3-e4d4-439d-b3da-d5467064f6f1/volumes"
Mar 13 14:29:47 crc kubenswrapper[4898]: I0313 14:29:47.754824 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32a060a9-dd52-4192-bc48-b9ea7a918458" path="/var/lib/kubelet/pods/32a060a9-dd52-4192-bc48-b9ea7a918458/volumes"
Mar 13 14:29:47 crc kubenswrapper[4898]: I0313 14:29:47.755785 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc61df36-ac68-4cf0-9456-140bccb5435c" path="/var/lib/kubelet/pods/bc61df36-ac68-4cf0-9456-140bccb5435c/volumes"
Mar 13 14:29:47 crc kubenswrapper[4898]: I0313 14:29:47.757309 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f58c984f-f43f-42dc-90a5-aebbe79a47a5" path="/var/lib/kubelet/pods/f58c984f-f43f-42dc-90a5-aebbe79a47a5/volumes"
Mar 13 14:29:48 crc kubenswrapper[4898]: I0313 14:29:48.053772 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-ce03-account-create-update-425qr"]
Mar 13 14:29:48 crc kubenswrapper[4898]: I0313 14:29:48.062933 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-n82b8"]
Mar 13 14:29:48 crc kubenswrapper[4898]: I0313 14:29:48.071806 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-ce03-account-create-update-425qr"]
Mar 13 14:29:48 crc kubenswrapper[4898]: I0313 14:29:48.106207 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-n82b8"]
Mar 13 14:29:49 crc kubenswrapper[4898]: I0313 14:29:49.134226 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 14:29:49 crc kubenswrapper[4898]: I0313 14:29:49.134511 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 14:29:49 crc kubenswrapper[4898]: I0313 14:29:49.762645 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="586ccc66-1989-46e5-98ad-b70c7e88e6bc" path="/var/lib/kubelet/pods/586ccc66-1989-46e5-98ad-b70c7e88e6bc/volumes"
Mar 13 14:29:49 crc kubenswrapper[4898]: I0313 14:29:49.764133 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba5ed93a-91b4-4942-a32c-ab02a536e3d4" path="/var/lib/kubelet/pods/ba5ed93a-91b4-4942-a32c-ab02a536e3d4/volumes"
Mar 13 14:30:00 crc kubenswrapper[4898]: I0313 14:30:00.173785 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556870-v7r2p"]
Mar 13 14:30:00 crc kubenswrapper[4898]: I0313 14:30:00.181814 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556870-v7r2p"
Mar 13 14:30:00 crc kubenswrapper[4898]: I0313 14:30:00.188498 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 13 14:30:00 crc kubenswrapper[4898]: I0313 14:30:00.188736 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 13 14:30:00 crc kubenswrapper[4898]: I0313 14:30:00.195601 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556870-ndlzh"]
Mar 13 14:30:00 crc kubenswrapper[4898]: I0313 14:30:00.198077 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556870-ndlzh"
Mar 13 14:30:00 crc kubenswrapper[4898]: I0313 14:30:00.207546 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 13 14:30:00 crc kubenswrapper[4898]: I0313 14:30:00.207606 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 13 14:30:00 crc kubenswrapper[4898]: I0313 14:30:00.207543 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps"
Mar 13 14:30:00 crc kubenswrapper[4898]: I0313 14:30:00.233415 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556870-v7r2p"]
Mar 13 14:30:00 crc kubenswrapper[4898]: I0313 14:30:00.238266 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rf6q\" (UniqueName: \"kubernetes.io/projected/924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19-kube-api-access-5rf6q\") pod \"collect-profiles-29556870-v7r2p\" (UID: \"924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556870-v7r2p"
Mar 13 14:30:00 crc kubenswrapper[4898]: I0313 14:30:00.238356 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19-secret-volume\") pod \"collect-profiles-29556870-v7r2p\" (UID: \"924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556870-v7r2p"
Mar 13 14:30:00 crc kubenswrapper[4898]: I0313 14:30:00.238381 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19-config-volume\") pod \"collect-profiles-29556870-v7r2p\" (UID: \"924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556870-v7r2p"
Mar 13 14:30:00 crc kubenswrapper[4898]: I0313 14:30:00.244652 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556870-ndlzh"]
Mar 13 14:30:00 crc kubenswrapper[4898]: I0313 14:30:00.340268 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpjbp\" (UniqueName: \"kubernetes.io/projected/c4f21c0b-a6a1-4b44-ae38-4a382569154e-kube-api-access-wpjbp\") pod \"auto-csr-approver-29556870-ndlzh\" (UID: \"c4f21c0b-a6a1-4b44-ae38-4a382569154e\") " pod="openshift-infra/auto-csr-approver-29556870-ndlzh"
Mar 13 14:30:00 crc kubenswrapper[4898]: I0313 14:30:00.340374 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rf6q\" (UniqueName: \"kubernetes.io/projected/924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19-kube-api-access-5rf6q\") pod \"collect-profiles-29556870-v7r2p\" (UID: \"924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556870-v7r2p"
Mar 13 14:30:00 crc kubenswrapper[4898]: I0313 14:30:00.340456 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19-secret-volume\") pod \"collect-profiles-29556870-v7r2p\" (UID: \"924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556870-v7r2p"
Mar 13 14:30:00 crc kubenswrapper[4898]: I0313 14:30:00.340483 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19-config-volume\") pod \"collect-profiles-29556870-v7r2p\" (UID: \"924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556870-v7r2p"
Mar 13 14:30:00 crc kubenswrapper[4898]: I0313 14:30:00.341386 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19-config-volume\") pod \"collect-profiles-29556870-v7r2p\" (UID: \"924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556870-v7r2p"
Mar 13 14:30:00 crc kubenswrapper[4898]: I0313 14:30:00.347500 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19-secret-volume\") pod \"collect-profiles-29556870-v7r2p\" (UID: \"924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556870-v7r2p"
Mar 13 14:30:00 crc kubenswrapper[4898]: I0313 14:30:00.370563 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rf6q\" (UniqueName: \"kubernetes.io/projected/924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19-kube-api-access-5rf6q\") pod \"collect-profiles-29556870-v7r2p\" (UID: \"924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556870-v7r2p"
Mar 13 14:30:00 crc kubenswrapper[4898]: I0313 14:30:00.444338 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpjbp\" (UniqueName: \"kubernetes.io/projected/c4f21c0b-a6a1-4b44-ae38-4a382569154e-kube-api-access-wpjbp\") pod \"auto-csr-approver-29556870-ndlzh\" (UID: \"c4f21c0b-a6a1-4b44-ae38-4a382569154e\") " pod="openshift-infra/auto-csr-approver-29556870-ndlzh"
Mar 13 14:30:00 crc kubenswrapper[4898]: I0313 14:30:00.485873 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpjbp\" (UniqueName: \"kubernetes.io/projected/c4f21c0b-a6a1-4b44-ae38-4a382569154e-kube-api-access-wpjbp\") pod \"auto-csr-approver-29556870-ndlzh\" (UID: \"c4f21c0b-a6a1-4b44-ae38-4a382569154e\") " pod="openshift-infra/auto-csr-approver-29556870-ndlzh"
Mar 13 14:30:00 crc kubenswrapper[4898]: I0313 14:30:00.534972 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556870-v7r2p"
Mar 13 14:30:00 crc kubenswrapper[4898]: I0313 14:30:00.551485 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556870-ndlzh"
Mar 13 14:30:01 crc kubenswrapper[4898]: I0313 14:30:01.097154 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556870-v7r2p"]
Mar 13 14:30:01 crc kubenswrapper[4898]: W0313 14:30:01.105021 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod924ed9fd_c9e5_4462_9b97_6d6cd1e8ea19.slice/crio-c035e8c8fdb2f9bf143370be15075f63ee2778c239b799bd630e23b7df793c32 WatchSource:0}: Error finding container c035e8c8fdb2f9bf143370be15075f63ee2778c239b799bd630e23b7df793c32: Status 404 returned error can't find the container with id c035e8c8fdb2f9bf143370be15075f63ee2778c239b799bd630e23b7df793c32
Mar 13 14:30:01 crc kubenswrapper[4898]: I0313 14:30:01.154789 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556870-ndlzh"]
Mar 13 14:30:01 crc kubenswrapper[4898]: W0313 14:30:01.163138 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc4f21c0b_a6a1_4b44_ae38_4a382569154e.slice/crio-1309792d992296d5da9b2b9feb7fd0fd0cb4a0e583dd0d5108ab2a673b6d9c0f WatchSource:0}: Error finding container 1309792d992296d5da9b2b9feb7fd0fd0cb4a0e583dd0d5108ab2a673b6d9c0f: Status 404 returned error can't find the container with id 1309792d992296d5da9b2b9feb7fd0fd0cb4a0e583dd0d5108ab2a673b6d9c0f
Mar 13 14:30:01 crc kubenswrapper[4898]: I0313 14:30:01.626159 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556870-ndlzh" event={"ID":"c4f21c0b-a6a1-4b44-ae38-4a382569154e","Type":"ContainerStarted","Data":"1309792d992296d5da9b2b9feb7fd0fd0cb4a0e583dd0d5108ab2a673b6d9c0f"}
Mar 13 14:30:01 crc kubenswrapper[4898]: I0313 14:30:01.628694 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556870-v7r2p" event={"ID":"924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19","Type":"ContainerStarted","Data":"64e5f9a40c7865406038ca1466eaa2acb4b0149c1cbab8a03ef694cc78da5ccc"}
Mar 13 14:30:01 crc kubenswrapper[4898]: I0313 14:30:01.628750 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556870-v7r2p" event={"ID":"924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19","Type":"ContainerStarted","Data":"c035e8c8fdb2f9bf143370be15075f63ee2778c239b799bd630e23b7df793c32"}
Mar 13 14:30:01 crc kubenswrapper[4898]: I0313 14:30:01.665828 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29556870-v7r2p" podStartSLOduration=1.665798295 podStartE2EDuration="1.665798295s" podCreationTimestamp="2026-03-13 14:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:30:01.649976394 +0000 UTC m=+2036.651564663" watchObservedRunningTime="2026-03-13 14:30:01.665798295 +0000 UTC m=+2036.667386574"
Mar 13 14:30:02 crc kubenswrapper[4898]: I0313 14:30:02.648489 4898 generic.go:334] "Generic (PLEG): container finished" podID="924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19" containerID="64e5f9a40c7865406038ca1466eaa2acb4b0149c1cbab8a03ef694cc78da5ccc" exitCode=0
Mar 13 14:30:02 crc kubenswrapper[4898]: I0313 14:30:02.648840 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556870-v7r2p" event={"ID":"924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19","Type":"ContainerDied","Data":"64e5f9a40c7865406038ca1466eaa2acb4b0149c1cbab8a03ef694cc78da5ccc"}
Mar 13 14:30:03 crc kubenswrapper[4898]: I0313 14:30:03.662987 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556870-ndlzh" event={"ID":"c4f21c0b-a6a1-4b44-ae38-4a382569154e","Type":"ContainerStarted","Data":"b1aa895f0022b3e7758a22e19e58e18bbca3560c415ba389688c7c2191911abd"}
Mar 13 14:30:03 crc kubenswrapper[4898]: I0313 14:30:03.682822 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556870-ndlzh" podStartSLOduration=2.218463383 podStartE2EDuration="3.682795319s" podCreationTimestamp="2026-03-13 14:30:00 +0000 UTC" firstStartedPulling="2026-03-13 14:30:01.165091079 +0000 UTC m=+2036.166679338" lastFinishedPulling="2026-03-13 14:30:02.629423015 +0000 UTC m=+2037.631011274" observedRunningTime="2026-03-13 14:30:03.680074368 +0000 UTC m=+2038.681662607" watchObservedRunningTime="2026-03-13 14:30:03.682795319 +0000 UTC m=+2038.684383598"
Mar 13 14:30:04 crc kubenswrapper[4898]: I0313 14:30:04.005803 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556870-v7r2p" Mar 13 14:30:04 crc kubenswrapper[4898]: I0313 14:30:04.146308 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19-secret-volume\") pod \"924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19\" (UID: \"924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19\") " Mar 13 14:30:04 crc kubenswrapper[4898]: I0313 14:30:04.146689 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rf6q\" (UniqueName: \"kubernetes.io/projected/924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19-kube-api-access-5rf6q\") pod \"924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19\" (UID: \"924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19\") " Mar 13 14:30:04 crc kubenswrapper[4898]: I0313 14:30:04.146758 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19-config-volume\") pod \"924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19\" (UID: \"924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19\") " Mar 13 14:30:04 crc kubenswrapper[4898]: I0313 14:30:04.147966 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19-config-volume" (OuterVolumeSpecName: "config-volume") pod "924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19" (UID: "924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:30:04 crc kubenswrapper[4898]: I0313 14:30:04.152357 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19" (UID: "924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:30:04 crc kubenswrapper[4898]: I0313 14:30:04.152508 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19-kube-api-access-5rf6q" (OuterVolumeSpecName: "kube-api-access-5rf6q") pod "924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19" (UID: "924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19"). InnerVolumeSpecName "kube-api-access-5rf6q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:30:04 crc kubenswrapper[4898]: I0313 14:30:04.250157 4898 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 13 14:30:04 crc kubenswrapper[4898]: I0313 14:30:04.250202 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rf6q\" (UniqueName: \"kubernetes.io/projected/924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19-kube-api-access-5rf6q\") on node \"crc\" DevicePath \"\"" Mar 13 14:30:04 crc kubenswrapper[4898]: I0313 14:30:04.250215 4898 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19-config-volume\") on node \"crc\" DevicePath \"\"" Mar 13 14:30:04 crc kubenswrapper[4898]: I0313 14:30:04.687505 4898 generic.go:334] "Generic (PLEG): container finished" podID="c4f21c0b-a6a1-4b44-ae38-4a382569154e" containerID="b1aa895f0022b3e7758a22e19e58e18bbca3560c415ba389688c7c2191911abd" exitCode=0 Mar 13 14:30:04 crc kubenswrapper[4898]: I0313 14:30:04.688120 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556870-ndlzh" event={"ID":"c4f21c0b-a6a1-4b44-ae38-4a382569154e","Type":"ContainerDied","Data":"b1aa895f0022b3e7758a22e19e58e18bbca3560c415ba389688c7c2191911abd"} Mar 13 14:30:04 crc kubenswrapper[4898]: I0313 14:30:04.693239 4898 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556870-v7r2p" event={"ID":"924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19","Type":"ContainerDied","Data":"c035e8c8fdb2f9bf143370be15075f63ee2778c239b799bd630e23b7df793c32"} Mar 13 14:30:04 crc kubenswrapper[4898]: I0313 14:30:04.693303 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c035e8c8fdb2f9bf143370be15075f63ee2778c239b799bd630e23b7df793c32" Mar 13 14:30:04 crc kubenswrapper[4898]: I0313 14:30:04.693413 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556870-v7r2p" Mar 13 14:30:06 crc kubenswrapper[4898]: I0313 14:30:06.109508 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556870-ndlzh" Mar 13 14:30:06 crc kubenswrapper[4898]: I0313 14:30:06.204886 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wpjbp\" (UniqueName: \"kubernetes.io/projected/c4f21c0b-a6a1-4b44-ae38-4a382569154e-kube-api-access-wpjbp\") pod \"c4f21c0b-a6a1-4b44-ae38-4a382569154e\" (UID: \"c4f21c0b-a6a1-4b44-ae38-4a382569154e\") " Mar 13 14:30:06 crc kubenswrapper[4898]: I0313 14:30:06.210826 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4f21c0b-a6a1-4b44-ae38-4a382569154e-kube-api-access-wpjbp" (OuterVolumeSpecName: "kube-api-access-wpjbp") pod "c4f21c0b-a6a1-4b44-ae38-4a382569154e" (UID: "c4f21c0b-a6a1-4b44-ae38-4a382569154e"). InnerVolumeSpecName "kube-api-access-wpjbp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:30:06 crc kubenswrapper[4898]: I0313 14:30:06.308023 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wpjbp\" (UniqueName: \"kubernetes.io/projected/c4f21c0b-a6a1-4b44-ae38-4a382569154e-kube-api-access-wpjbp\") on node \"crc\" DevicePath \"\"" Mar 13 14:30:06 crc kubenswrapper[4898]: I0313 14:30:06.717810 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556870-ndlzh" event={"ID":"c4f21c0b-a6a1-4b44-ae38-4a382569154e","Type":"ContainerDied","Data":"1309792d992296d5da9b2b9feb7fd0fd0cb4a0e583dd0d5108ab2a673b6d9c0f"} Mar 13 14:30:06 crc kubenswrapper[4898]: I0313 14:30:06.717856 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1309792d992296d5da9b2b9feb7fd0fd0cb4a0e583dd0d5108ab2a673b6d9c0f" Mar 13 14:30:06 crc kubenswrapper[4898]: I0313 14:30:06.717943 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556870-ndlzh" Mar 13 14:30:06 crc kubenswrapper[4898]: I0313 14:30:06.759226 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556864-bvpnd"] Mar 13 14:30:06 crc kubenswrapper[4898]: I0313 14:30:06.774167 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556864-bvpnd"] Mar 13 14:30:07 crc kubenswrapper[4898]: I0313 14:30:07.756179 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e5f381c-bbd8-40d9-8c76-efee5fb7023a" path="/var/lib/kubelet/pods/4e5f381c-bbd8-40d9-8c76-efee5fb7023a/volumes" Mar 13 14:30:14 crc kubenswrapper[4898]: I0313 14:30:14.032170 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-x75zk"] Mar 13 14:30:14 crc kubenswrapper[4898]: I0313 14:30:14.042783 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-x75zk"] Mar 13 
14:30:15 crc kubenswrapper[4898]: I0313 14:30:15.756393 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74d4aeca-15ec-4f63-87e0-20daa6f3e70f" path="/var/lib/kubelet/pods/74d4aeca-15ec-4f63-87e0-20daa6f3e70f/volumes" Mar 13 14:30:19 crc kubenswrapper[4898]: I0313 14:30:19.135213 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 14:30:19 crc kubenswrapper[4898]: I0313 14:30:19.135888 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 14:30:25 crc kubenswrapper[4898]: I0313 14:30:25.070064 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-4e00-account-create-update-92bgz"] Mar 13 14:30:25 crc kubenswrapper[4898]: I0313 14:30:25.094971 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-88gdv"] Mar 13 14:30:25 crc kubenswrapper[4898]: I0313 14:30:25.106974 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-275nk"] Mar 13 14:30:25 crc kubenswrapper[4898]: I0313 14:30:25.126054 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-88gdv"] Mar 13 14:30:25 crc kubenswrapper[4898]: I0313 14:30:25.141103 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-4e00-account-create-update-92bgz"] Mar 13 14:30:25 crc kubenswrapper[4898]: I0313 14:30:25.163726 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-275nk"] Mar 13 14:30:25 crc 
kubenswrapper[4898]: I0313 14:30:25.764802 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4fe3416e-f08a-43c9-8e12-a89c1e849208" path="/var/lib/kubelet/pods/4fe3416e-f08a-43c9-8e12-a89c1e849208/volumes" Mar 13 14:30:25 crc kubenswrapper[4898]: I0313 14:30:25.767705 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b04d3edd-a550-465a-9ef2-2cbea4126ceb" path="/var/lib/kubelet/pods/b04d3edd-a550-465a-9ef2-2cbea4126ceb/volumes" Mar 13 14:30:25 crc kubenswrapper[4898]: I0313 14:30:25.771410 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8a8516c-5aee-4eae-a59b-498f97c1b92b" path="/var/lib/kubelet/pods/f8a8516c-5aee-4eae-a59b-498f97c1b92b/volumes" Mar 13 14:30:29 crc kubenswrapper[4898]: I0313 14:30:29.065241 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-23f7-account-create-update-z479t"] Mar 13 14:30:29 crc kubenswrapper[4898]: I0313 14:30:29.083194 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-23f7-account-create-update-z479t"] Mar 13 14:30:29 crc kubenswrapper[4898]: I0313 14:30:29.095668 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-9d98-account-create-update-t77sf"] Mar 13 14:30:29 crc kubenswrapper[4898]: I0313 14:30:29.104938 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-9d98-account-create-update-t77sf"] Mar 13 14:30:29 crc kubenswrapper[4898]: I0313 14:30:29.115827 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-452kj"] Mar 13 14:30:29 crc kubenswrapper[4898]: I0313 14:30:29.124825 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-1082-account-create-update-2jjkd"] Mar 13 14:30:29 crc kubenswrapper[4898]: I0313 14:30:29.133734 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-95vbj"] Mar 13 14:30:29 crc kubenswrapper[4898]: I0313 14:30:29.143504 4898 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-95vbj"] Mar 13 14:30:29 crc kubenswrapper[4898]: I0313 14:30:29.153847 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-1082-account-create-update-2jjkd"] Mar 13 14:30:29 crc kubenswrapper[4898]: I0313 14:30:29.163248 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-452kj"] Mar 13 14:30:29 crc kubenswrapper[4898]: I0313 14:30:29.757480 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1aa06f21-2d35-4d03-86b9-01d9354826da" path="/var/lib/kubelet/pods/1aa06f21-2d35-4d03-86b9-01d9354826da/volumes" Mar 13 14:30:29 crc kubenswrapper[4898]: I0313 14:30:29.760650 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71459d1c-2acb-4e15-a30d-09dd0f7f7951" path="/var/lib/kubelet/pods/71459d1c-2acb-4e15-a30d-09dd0f7f7951/volumes" Mar 13 14:30:29 crc kubenswrapper[4898]: I0313 14:30:29.763688 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b83b860f-ed6c-46b2-862a-fbda9af7dc89" path="/var/lib/kubelet/pods/b83b860f-ed6c-46b2-862a-fbda9af7dc89/volumes" Mar 13 14:30:29 crc kubenswrapper[4898]: I0313 14:30:29.767591 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bea88065-1eff-42e2-809a-443c15bda0ac" path="/var/lib/kubelet/pods/bea88065-1eff-42e2-809a-443c15bda0ac/volumes" Mar 13 14:30:29 crc kubenswrapper[4898]: I0313 14:30:29.770847 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9e4fe10-07d1-4f4e-bb04-5ab11b8ad56d" path="/var/lib/kubelet/pods/f9e4fe10-07d1-4f4e-bb04-5ab11b8ad56d/volumes" Mar 13 14:30:32 crc kubenswrapper[4898]: I0313 14:30:32.035703 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-pxtss"] Mar 13 14:30:32 crc kubenswrapper[4898]: I0313 14:30:32.048037 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/root-account-create-update-pxtss"] Mar 13 14:30:33 crc kubenswrapper[4898]: I0313 14:30:33.762484 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74eb351d-364c-4564-8f8b-67ac844a6abc" path="/var/lib/kubelet/pods/74eb351d-364c-4564-8f8b-67ac844a6abc/volumes" Mar 13 14:30:34 crc kubenswrapper[4898]: I0313 14:30:34.046697 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-dxpl9"] Mar 13 14:30:34 crc kubenswrapper[4898]: I0313 14:30:34.066729 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-dxpl9"] Mar 13 14:30:35 crc kubenswrapper[4898]: I0313 14:30:35.767553 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fdab36c-41db-4a9c-9cbe-47e1761c6df5" path="/var/lib/kubelet/pods/8fdab36c-41db-4a9c-9cbe-47e1761c6df5/volumes" Mar 13 14:30:37 crc kubenswrapper[4898]: I0313 14:30:37.160394 4898 scope.go:117] "RemoveContainer" containerID="f8a3e423bebf88995b6d32dd30a81abd86b4e9eab9359f63d75400abe5906505" Mar 13 14:30:37 crc kubenswrapper[4898]: I0313 14:30:37.242057 4898 scope.go:117] "RemoveContainer" containerID="66d662834083b3a8826084dc54618cf384de1f9336d6d06012f43689e8e15545" Mar 13 14:30:37 crc kubenswrapper[4898]: I0313 14:30:37.286623 4898 scope.go:117] "RemoveContainer" containerID="860f247abf99986a680fa9cbd71b3ddb7e0e1a4bc671f3a1ca2277312ff69005" Mar 13 14:30:37 crc kubenswrapper[4898]: I0313 14:30:37.347367 4898 scope.go:117] "RemoveContainer" containerID="4e72b0c7dc05dd72f43622aacb47475aacbf01f3e30ece85d27e66c47011712e" Mar 13 14:30:37 crc kubenswrapper[4898]: I0313 14:30:37.409671 4898 scope.go:117] "RemoveContainer" containerID="190741a9e70699bd53ad4219ca7d5f504afce181f13fef8f81fc17d9e1a70095" Mar 13 14:30:37 crc kubenswrapper[4898]: I0313 14:30:37.470070 4898 scope.go:117] "RemoveContainer" containerID="58affd3294e9aec78373844bf6912651079de0e76c0d060a1cf7a048a7bc787d" Mar 13 14:30:37 crc kubenswrapper[4898]: I0313 
14:30:37.517410 4898 scope.go:117] "RemoveContainer" containerID="7890927e1d3da3f2b5ae266b74631a44cf3eea829b7cc9b79f5ffe9476b7f6a0" Mar 13 14:30:37 crc kubenswrapper[4898]: I0313 14:30:37.539382 4898 scope.go:117] "RemoveContainer" containerID="80f19b54532d031429e005ad3c0cf3c32cf378c8ddab795769c6087368772b42" Mar 13 14:30:37 crc kubenswrapper[4898]: I0313 14:30:37.562278 4898 scope.go:117] "RemoveContainer" containerID="df7deb06b863e0990f2d81a2c25739f29fd7b5122c0d32c2314963f2204b89f3" Mar 13 14:30:37 crc kubenswrapper[4898]: I0313 14:30:37.591401 4898 scope.go:117] "RemoveContainer" containerID="89083a9a998b87f99f34c5645b2e669a886f2f7b20e68795829134c5acb24ac6" Mar 13 14:30:37 crc kubenswrapper[4898]: I0313 14:30:37.620840 4898 scope.go:117] "RemoveContainer" containerID="f93c4099691f41a073e964731ace4e3b38e62bd41c90cb8a30591394175252ce" Mar 13 14:30:37 crc kubenswrapper[4898]: I0313 14:30:37.665202 4898 scope.go:117] "RemoveContainer" containerID="f7ee82c0bf1917642e2854f8aafb4c28fc338ae0b377633b5ccd89fd1a0294f4" Mar 13 14:30:37 crc kubenswrapper[4898]: I0313 14:30:37.695608 4898 scope.go:117] "RemoveContainer" containerID="9199e9b5bfad44aa55bebdeb17820a815d88bcb2d174de10793bf2e6e2845fc2" Mar 13 14:30:37 crc kubenswrapper[4898]: I0313 14:30:37.727762 4898 scope.go:117] "RemoveContainer" containerID="f40862770a13b51268c0d1e7b9c2896a6335c6f3b3801074579c313c3d13577e" Mar 13 14:30:37 crc kubenswrapper[4898]: I0313 14:30:37.752046 4898 scope.go:117] "RemoveContainer" containerID="592e2145b8382848d105acb5e5275b8dd688df9ab9d3d5caa34a389dd2742086" Mar 13 14:30:37 crc kubenswrapper[4898]: I0313 14:30:37.794843 4898 scope.go:117] "RemoveContainer" containerID="81635cb2b5569daf242649d4672547e48f7dd06f334c45c3f33ff7ef44a8ec5f" Mar 13 14:30:37 crc kubenswrapper[4898]: I0313 14:30:37.825014 4898 scope.go:117] "RemoveContainer" containerID="e22a5e1114b923439d9f143c3b64b35dbca324bd3cf616ce49fd0caf5c66d873" Mar 13 14:30:37 crc kubenswrapper[4898]: I0313 14:30:37.848604 4898 
scope.go:117] "RemoveContainer" containerID="e6b8ff442a61a4fcf1565b308b6597fe09fb7264763f211d7540bb2a249e6c54" Mar 13 14:30:37 crc kubenswrapper[4898]: I0313 14:30:37.900018 4898 scope.go:117] "RemoveContainer" containerID="a88dfb7f27e61d13c4de942fec09301aec287d6967f6ca991e690f9c9c77a8e1" Mar 13 14:30:37 crc kubenswrapper[4898]: I0313 14:30:37.920795 4898 scope.go:117] "RemoveContainer" containerID="4839c26bbb3360becfb71db51ead56738498d1508d530ccce036f032e975f9b4" Mar 13 14:30:37 crc kubenswrapper[4898]: I0313 14:30:37.965153 4898 scope.go:117] "RemoveContainer" containerID="3c9e54e03326bbf4751dd95fa7e9b1825e9af82b1eefdf09759081304dd57de9" Mar 13 14:30:37 crc kubenswrapper[4898]: I0313 14:30:37.994331 4898 scope.go:117] "RemoveContainer" containerID="9d4412724b9aeab2fb38e3b120d6b80b5959d7a8a33631247f92a023f5b56a70" Mar 13 14:30:38 crc kubenswrapper[4898]: I0313 14:30:38.022303 4898 scope.go:117] "RemoveContainer" containerID="b9cd470c8d031dbf9ec18b998e1bb21c765853e9617c7315b08f8356dad9258f" Mar 13 14:30:38 crc kubenswrapper[4898]: I0313 14:30:38.050319 4898 scope.go:117] "RemoveContainer" containerID="1b7f5b79b9cbc006ae8ac33cf3d709c72ff92b310ff2867c772246e7be6d5aff" Mar 13 14:30:38 crc kubenswrapper[4898]: I0313 14:30:38.075041 4898 scope.go:117] "RemoveContainer" containerID="5f41cf46e25c4bfb4c3cb12738ce8eaa95c275888249cc79416494629ec3b64b" Mar 13 14:30:38 crc kubenswrapper[4898]: I0313 14:30:38.102562 4898 scope.go:117] "RemoveContainer" containerID="c12b1a7614a831f1604c6feaf4bf42c52496fffc128594a26f7d1a8c71f58636" Mar 13 14:30:38 crc kubenswrapper[4898]: I0313 14:30:38.144862 4898 scope.go:117] "RemoveContainer" containerID="72a159d4ac7d8e712d9bffb75add791bdb6f8bee323ccb72d06431a42be68d77" Mar 13 14:30:49 crc kubenswrapper[4898]: I0313 14:30:49.135124 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 14:30:49 crc kubenswrapper[4898]: I0313 14:30:49.135700 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 14:30:49 crc kubenswrapper[4898]: I0313 14:30:49.135765 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" Mar 13 14:30:49 crc kubenswrapper[4898]: I0313 14:30:49.136745 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"544b53b5cdd0293005863b343628de53b83869ce5cd2c798b19c01abba2b5bc8"} pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 14:30:49 crc kubenswrapper[4898]: I0313 14:30:49.136831 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" containerID="cri-o://544b53b5cdd0293005863b343628de53b83869ce5cd2c798b19c01abba2b5bc8" gracePeriod=600 Mar 13 14:30:49 crc kubenswrapper[4898]: I0313 14:30:49.412374 4898 generic.go:334] "Generic (PLEG): container finished" podID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerID="544b53b5cdd0293005863b343628de53b83869ce5cd2c798b19c01abba2b5bc8" exitCode=0 Mar 13 14:30:49 crc kubenswrapper[4898]: I0313 14:30:49.412572 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" 
event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerDied","Data":"544b53b5cdd0293005863b343628de53b83869ce5cd2c798b19c01abba2b5bc8"} Mar 13 14:30:49 crc kubenswrapper[4898]: I0313 14:30:49.412666 4898 scope.go:117] "RemoveContainer" containerID="31ae8593afc62c01205d22940ebe7136407da3a6c051010917ba949bb52866cc" Mar 13 14:30:50 crc kubenswrapper[4898]: I0313 14:30:50.448126 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerStarted","Data":"8cc019eb997b8fb12b1a6d732269f56fd31f05139eb4098b4fb342e4364fe0db"} Mar 13 14:31:06 crc kubenswrapper[4898]: I0313 14:31:06.051008 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-hm77q"] Mar 13 14:31:06 crc kubenswrapper[4898]: I0313 14:31:06.064262 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-hm77q"] Mar 13 14:31:07 crc kubenswrapper[4898]: I0313 14:31:07.759197 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="664deedc-3946-4205-98ad-21759d35d952" path="/var/lib/kubelet/pods/664deedc-3946-4205-98ad-21759d35d952/volumes" Mar 13 14:31:13 crc kubenswrapper[4898]: I0313 14:31:13.041464 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-dddqm"] Mar 13 14:31:13 crc kubenswrapper[4898]: I0313 14:31:13.059487 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-dddqm"] Mar 13 14:31:13 crc kubenswrapper[4898]: I0313 14:31:13.758943 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51a3e0c5-0084-4216-a162-3614eafcc162" path="/var/lib/kubelet/pods/51a3e0c5-0084-4216-a162-3614eafcc162/volumes" Mar 13 14:31:16 crc kubenswrapper[4898]: I0313 14:31:16.055231 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-ljct7"] Mar 13 14:31:16 crc kubenswrapper[4898]: 
I0313 14:31:16.070417 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-ljct7"] Mar 13 14:31:17 crc kubenswrapper[4898]: I0313 14:31:17.066946 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-xq6ss"] Mar 13 14:31:17 crc kubenswrapper[4898]: I0313 14:31:17.080592 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-xq6ss"] Mar 13 14:31:17 crc kubenswrapper[4898]: I0313 14:31:17.759611 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f68a4dd-fec8-4e60-a89c-69ce09fc5700" path="/var/lib/kubelet/pods/0f68a4dd-fec8-4e60-a89c-69ce09fc5700/volumes" Mar 13 14:31:17 crc kubenswrapper[4898]: I0313 14:31:17.761725 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac704482-c7a4-471c-b3c1-d1fdd7e0eb83" path="/var/lib/kubelet/pods/ac704482-c7a4-471c-b3c1-d1fdd7e0eb83/volumes" Mar 13 14:31:30 crc kubenswrapper[4898]: I0313 14:31:30.202233 4898 generic.go:334] "Generic (PLEG): container finished" podID="05e315eb-34b1-4099-b676-b0238f3cb5c5" containerID="19401614e37155924147d2b68ea4d922bec3bbc6d22b8dd77602f823a33552a5" exitCode=0 Mar 13 14:31:30 crc kubenswrapper[4898]: I0313 14:31:30.202264 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rbttg" event={"ID":"05e315eb-34b1-4099-b676-b0238f3cb5c5","Type":"ContainerDied","Data":"19401614e37155924147d2b68ea4d922bec3bbc6d22b8dd77602f823a33552a5"} Mar 13 14:31:31 crc kubenswrapper[4898]: I0313 14:31:31.876234 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rbttg" Mar 13 14:31:31 crc kubenswrapper[4898]: I0313 14:31:31.939504 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2v49j\" (UniqueName: \"kubernetes.io/projected/05e315eb-34b1-4099-b676-b0238f3cb5c5-kube-api-access-2v49j\") pod \"05e315eb-34b1-4099-b676-b0238f3cb5c5\" (UID: \"05e315eb-34b1-4099-b676-b0238f3cb5c5\") " Mar 13 14:31:31 crc kubenswrapper[4898]: I0313 14:31:31.939920 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/05e315eb-34b1-4099-b676-b0238f3cb5c5-inventory\") pod \"05e315eb-34b1-4099-b676-b0238f3cb5c5\" (UID: \"05e315eb-34b1-4099-b676-b0238f3cb5c5\") " Mar 13 14:31:31 crc kubenswrapper[4898]: I0313 14:31:31.940071 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/05e315eb-34b1-4099-b676-b0238f3cb5c5-ssh-key-openstack-edpm-ipam\") pod \"05e315eb-34b1-4099-b676-b0238f3cb5c5\" (UID: \"05e315eb-34b1-4099-b676-b0238f3cb5c5\") " Mar 13 14:31:31 crc kubenswrapper[4898]: I0313 14:31:31.957500 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05e315eb-34b1-4099-b676-b0238f3cb5c5-kube-api-access-2v49j" (OuterVolumeSpecName: "kube-api-access-2v49j") pod "05e315eb-34b1-4099-b676-b0238f3cb5c5" (UID: "05e315eb-34b1-4099-b676-b0238f3cb5c5"). InnerVolumeSpecName "kube-api-access-2v49j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:31:31 crc kubenswrapper[4898]: I0313 14:31:31.975195 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05e315eb-34b1-4099-b676-b0238f3cb5c5-inventory" (OuterVolumeSpecName: "inventory") pod "05e315eb-34b1-4099-b676-b0238f3cb5c5" (UID: "05e315eb-34b1-4099-b676-b0238f3cb5c5"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:31:31 crc kubenswrapper[4898]: I0313 14:31:31.986228 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05e315eb-34b1-4099-b676-b0238f3cb5c5-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "05e315eb-34b1-4099-b676-b0238f3cb5c5" (UID: "05e315eb-34b1-4099-b676-b0238f3cb5c5"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:31:32 crc kubenswrapper[4898]: I0313 14:31:32.043571 4898 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/05e315eb-34b1-4099-b676-b0238f3cb5c5-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 14:31:32 crc kubenswrapper[4898]: I0313 14:31:32.043677 4898 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/05e315eb-34b1-4099-b676-b0238f3cb5c5-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 14:31:32 crc kubenswrapper[4898]: I0313 14:31:32.047495 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2v49j\" (UniqueName: \"kubernetes.io/projected/05e315eb-34b1-4099-b676-b0238f3cb5c5-kube-api-access-2v49j\") on node \"crc\" DevicePath \"\"" Mar 13 14:31:32 crc kubenswrapper[4898]: I0313 14:31:32.228362 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rbttg" event={"ID":"05e315eb-34b1-4099-b676-b0238f3cb5c5","Type":"ContainerDied","Data":"56c75088356ea1f877eddaa52133cb1069449c5912814655f61681146f527468"} Mar 13 14:31:32 crc kubenswrapper[4898]: I0313 14:31:32.228405 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56c75088356ea1f877eddaa52133cb1069449c5912814655f61681146f527468" Mar 13 14:31:32 crc kubenswrapper[4898]: I0313 
14:31:32.228463 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rbttg" Mar 13 14:31:32 crc kubenswrapper[4898]: I0313 14:31:32.327977 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-57hwn"] Mar 13 14:31:32 crc kubenswrapper[4898]: E0313 14:31:32.328670 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05e315eb-34b1-4099-b676-b0238f3cb5c5" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 13 14:31:32 crc kubenswrapper[4898]: I0313 14:31:32.328700 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="05e315eb-34b1-4099-b676-b0238f3cb5c5" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 13 14:31:32 crc kubenswrapper[4898]: E0313 14:31:32.328721 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4f21c0b-a6a1-4b44-ae38-4a382569154e" containerName="oc" Mar 13 14:31:32 crc kubenswrapper[4898]: I0313 14:31:32.328732 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4f21c0b-a6a1-4b44-ae38-4a382569154e" containerName="oc" Mar 13 14:31:32 crc kubenswrapper[4898]: E0313 14:31:32.328811 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19" containerName="collect-profiles" Mar 13 14:31:32 crc kubenswrapper[4898]: I0313 14:31:32.328824 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19" containerName="collect-profiles" Mar 13 14:31:32 crc kubenswrapper[4898]: I0313 14:31:32.329204 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4f21c0b-a6a1-4b44-ae38-4a382569154e" containerName="oc" Mar 13 14:31:32 crc kubenswrapper[4898]: I0313 14:31:32.329241 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="05e315eb-34b1-4099-b676-b0238f3cb5c5" 
containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 13 14:31:32 crc kubenswrapper[4898]: I0313 14:31:32.329253 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19" containerName="collect-profiles" Mar 13 14:31:32 crc kubenswrapper[4898]: I0313 14:31:32.330387 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-57hwn" Mar 13 14:31:32 crc kubenswrapper[4898]: I0313 14:31:32.344681 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-57hwn"] Mar 13 14:31:32 crc kubenswrapper[4898]: I0313 14:31:32.344780 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 14:31:32 crc kubenswrapper[4898]: I0313 14:31:32.344804 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 14:31:32 crc kubenswrapper[4898]: I0313 14:31:32.345003 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 14:31:32 crc kubenswrapper[4898]: I0313 14:31:32.345007 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zsddr" Mar 13 14:31:32 crc kubenswrapper[4898]: I0313 14:31:32.456386 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/295e7c32-75f1-4eee-a126-2d4547c56f24-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-57hwn\" (UID: \"295e7c32-75f1-4eee-a126-2d4547c56f24\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-57hwn" Mar 13 14:31:32 crc kubenswrapper[4898]: I0313 14:31:32.456481 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pw4q2\" (UniqueName: \"kubernetes.io/projected/295e7c32-75f1-4eee-a126-2d4547c56f24-kube-api-access-pw4q2\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-57hwn\" (UID: \"295e7c32-75f1-4eee-a126-2d4547c56f24\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-57hwn" Mar 13 14:31:32 crc kubenswrapper[4898]: I0313 14:31:32.456543 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/295e7c32-75f1-4eee-a126-2d4547c56f24-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-57hwn\" (UID: \"295e7c32-75f1-4eee-a126-2d4547c56f24\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-57hwn" Mar 13 14:31:32 crc kubenswrapper[4898]: I0313 14:31:32.557995 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pw4q2\" (UniqueName: \"kubernetes.io/projected/295e7c32-75f1-4eee-a126-2d4547c56f24-kube-api-access-pw4q2\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-57hwn\" (UID: \"295e7c32-75f1-4eee-a126-2d4547c56f24\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-57hwn" Mar 13 14:31:32 crc kubenswrapper[4898]: I0313 14:31:32.558077 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/295e7c32-75f1-4eee-a126-2d4547c56f24-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-57hwn\" (UID: \"295e7c32-75f1-4eee-a126-2d4547c56f24\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-57hwn" Mar 13 14:31:32 crc kubenswrapper[4898]: I0313 14:31:32.558244 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/295e7c32-75f1-4eee-a126-2d4547c56f24-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-57hwn\" (UID: \"295e7c32-75f1-4eee-a126-2d4547c56f24\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-57hwn" Mar 13 14:31:32 crc kubenswrapper[4898]: I0313 14:31:32.563545 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/295e7c32-75f1-4eee-a126-2d4547c56f24-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-57hwn\" (UID: \"295e7c32-75f1-4eee-a126-2d4547c56f24\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-57hwn" Mar 13 14:31:32 crc kubenswrapper[4898]: I0313 14:31:32.573539 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/295e7c32-75f1-4eee-a126-2d4547c56f24-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-57hwn\" (UID: \"295e7c32-75f1-4eee-a126-2d4547c56f24\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-57hwn" Mar 13 14:31:32 crc kubenswrapper[4898]: I0313 14:31:32.573587 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pw4q2\" (UniqueName: \"kubernetes.io/projected/295e7c32-75f1-4eee-a126-2d4547c56f24-kube-api-access-pw4q2\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-57hwn\" (UID: \"295e7c32-75f1-4eee-a126-2d4547c56f24\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-57hwn" Mar 13 14:31:32 crc kubenswrapper[4898]: I0313 14:31:32.674082 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-57hwn" Mar 13 14:31:33 crc kubenswrapper[4898]: I0313 14:31:33.360221 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-57hwn"] Mar 13 14:31:33 crc kubenswrapper[4898]: I0313 14:31:33.365761 4898 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 14:31:34 crc kubenswrapper[4898]: I0313 14:31:34.250890 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-57hwn" event={"ID":"295e7c32-75f1-4eee-a126-2d4547c56f24","Type":"ContainerStarted","Data":"dab0dfbc55bfc6f1df629f5c9477ac3d48d09670fceb17a5d6b9747c56387330"} Mar 13 14:31:34 crc kubenswrapper[4898]: I0313 14:31:34.251246 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-57hwn" event={"ID":"295e7c32-75f1-4eee-a126-2d4547c56f24","Type":"ContainerStarted","Data":"8f16b5e15ed9fbaac5ba0b0d72bb41f2c1ee0e47dc35b9e1544ffaf57639e658"} Mar 13 14:31:34 crc kubenswrapper[4898]: I0313 14:31:34.292569 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-57hwn" podStartSLOduration=1.868369886 podStartE2EDuration="2.292552014s" podCreationTimestamp="2026-03-13 14:31:32 +0000 UTC" firstStartedPulling="2026-03-13 14:31:33.365485973 +0000 UTC m=+2128.367074222" lastFinishedPulling="2026-03-13 14:31:33.789668111 +0000 UTC m=+2128.791256350" observedRunningTime="2026-03-13 14:31:34.285306285 +0000 UTC m=+2129.286894514" watchObservedRunningTime="2026-03-13 14:31:34.292552014 +0000 UTC m=+2129.294140253" Mar 13 14:31:38 crc kubenswrapper[4898]: I0313 14:31:38.737199 4898 scope.go:117] "RemoveContainer" containerID="dbe55f5873c440c467d8748cfaa995fee6ccd0abf9441bf1f70ed0dda90073d3" Mar 13 
14:31:38 crc kubenswrapper[4898]: I0313 14:31:38.769908 4898 scope.go:117] "RemoveContainer" containerID="c66d5e607033edc265c5c4c3b44b5d453515d5500b6db940b367950853043279" Mar 13 14:31:38 crc kubenswrapper[4898]: I0313 14:31:38.795628 4898 scope.go:117] "RemoveContainer" containerID="27760265b5d44dc57e3a3eecff9d010cc5fc5af8472653848b227f366d4e7a49" Mar 13 14:31:38 crc kubenswrapper[4898]: I0313 14:31:38.848190 4898 scope.go:117] "RemoveContainer" containerID="d2f87292a607e1fb76b72fbf0fd5fba62057ee2d194e12f77b3db9510fddedf2" Mar 13 14:31:38 crc kubenswrapper[4898]: I0313 14:31:38.887610 4898 scope.go:117] "RemoveContainer" containerID="d1d45dfadc513f7c3e24c556ad6a45d1d7e050a0da971045c5c1181fc58bc3d2" Mar 13 14:31:38 crc kubenswrapper[4898]: I0313 14:31:38.935747 4898 scope.go:117] "RemoveContainer" containerID="c0655b2adb5618887bc26f0a3bb0d551a636cca41a03c1baf5cc0685920b55bb" Mar 13 14:31:38 crc kubenswrapper[4898]: I0313 14:31:38.981953 4898 scope.go:117] "RemoveContainer" containerID="e195371c387c0ec69bbadee68addfd45715bbfc83433b2ba3c47b307af7325bd" Mar 13 14:31:39 crc kubenswrapper[4898]: I0313 14:31:39.012279 4898 scope.go:117] "RemoveContainer" containerID="ededb2c682deb7693ab4f5295aba59c30d96d3df7afffb98859fdbc9fc5b9e13" Mar 13 14:31:43 crc kubenswrapper[4898]: I0313 14:31:43.059388 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-ztp6c"] Mar 13 14:31:43 crc kubenswrapper[4898]: I0313 14:31:43.077734 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-ztp6c"] Mar 13 14:31:43 crc kubenswrapper[4898]: I0313 14:31:43.760258 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="193b05da-acb9-4512-a2ae-6c03450e6f05" path="/var/lib/kubelet/pods/193b05da-acb9-4512-a2ae-6c03450e6f05/volumes" Mar 13 14:32:00 crc kubenswrapper[4898]: I0313 14:32:00.139738 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556872-2tqn4"] Mar 13 14:32:00 crc 
kubenswrapper[4898]: I0313 14:32:00.142758 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556872-2tqn4" Mar 13 14:32:00 crc kubenswrapper[4898]: I0313 14:32:00.158557 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556872-2tqn4"] Mar 13 14:32:00 crc kubenswrapper[4898]: I0313 14:32:00.184022 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 14:32:00 crc kubenswrapper[4898]: I0313 14:32:00.184288 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 14:32:00 crc kubenswrapper[4898]: I0313 14:32:00.184483 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps" Mar 13 14:32:00 crc kubenswrapper[4898]: I0313 14:32:00.327738 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qq98k\" (UniqueName: \"kubernetes.io/projected/8627002c-751e-4168-b294-4a324890a996-kube-api-access-qq98k\") pod \"auto-csr-approver-29556872-2tqn4\" (UID: \"8627002c-751e-4168-b294-4a324890a996\") " pod="openshift-infra/auto-csr-approver-29556872-2tqn4" Mar 13 14:32:00 crc kubenswrapper[4898]: I0313 14:32:00.430982 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qq98k\" (UniqueName: \"kubernetes.io/projected/8627002c-751e-4168-b294-4a324890a996-kube-api-access-qq98k\") pod \"auto-csr-approver-29556872-2tqn4\" (UID: \"8627002c-751e-4168-b294-4a324890a996\") " pod="openshift-infra/auto-csr-approver-29556872-2tqn4" Mar 13 14:32:00 crc kubenswrapper[4898]: I0313 14:32:00.465119 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qq98k\" (UniqueName: \"kubernetes.io/projected/8627002c-751e-4168-b294-4a324890a996-kube-api-access-qq98k\") pod 
\"auto-csr-approver-29556872-2tqn4\" (UID: \"8627002c-751e-4168-b294-4a324890a996\") " pod="openshift-infra/auto-csr-approver-29556872-2tqn4" Mar 13 14:32:00 crc kubenswrapper[4898]: I0313 14:32:00.511953 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556872-2tqn4" Mar 13 14:32:01 crc kubenswrapper[4898]: I0313 14:32:01.051547 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556872-2tqn4"] Mar 13 14:32:01 crc kubenswrapper[4898]: W0313 14:32:01.058914 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8627002c_751e_4168_b294_4a324890a996.slice/crio-3e6c8b225663f1d3b600ffeac1d814f793ea23ad97ff301d21f7aa59626ffb92 WatchSource:0}: Error finding container 3e6c8b225663f1d3b600ffeac1d814f793ea23ad97ff301d21f7aa59626ffb92: Status 404 returned error can't find the container with id 3e6c8b225663f1d3b600ffeac1d814f793ea23ad97ff301d21f7aa59626ffb92 Mar 13 14:32:01 crc kubenswrapper[4898]: I0313 14:32:01.619071 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556872-2tqn4" event={"ID":"8627002c-751e-4168-b294-4a324890a996","Type":"ContainerStarted","Data":"3e6c8b225663f1d3b600ffeac1d814f793ea23ad97ff301d21f7aa59626ffb92"} Mar 13 14:32:02 crc kubenswrapper[4898]: I0313 14:32:02.635067 4898 generic.go:334] "Generic (PLEG): container finished" podID="8627002c-751e-4168-b294-4a324890a996" containerID="9918c054f17d0c467592f1c4b30fc11e333ea544aa38b84c1aab31d2beff7c97" exitCode=0 Mar 13 14:32:02 crc kubenswrapper[4898]: I0313 14:32:02.635130 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556872-2tqn4" event={"ID":"8627002c-751e-4168-b294-4a324890a996","Type":"ContainerDied","Data":"9918c054f17d0c467592f1c4b30fc11e333ea544aa38b84c1aab31d2beff7c97"} Mar 13 14:32:04 crc kubenswrapper[4898]: 
I0313 14:32:04.146134 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556872-2tqn4" Mar 13 14:32:04 crc kubenswrapper[4898]: I0313 14:32:04.341796 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qq98k\" (UniqueName: \"kubernetes.io/projected/8627002c-751e-4168-b294-4a324890a996-kube-api-access-qq98k\") pod \"8627002c-751e-4168-b294-4a324890a996\" (UID: \"8627002c-751e-4168-b294-4a324890a996\") " Mar 13 14:32:04 crc kubenswrapper[4898]: I0313 14:32:04.351255 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8627002c-751e-4168-b294-4a324890a996-kube-api-access-qq98k" (OuterVolumeSpecName: "kube-api-access-qq98k") pod "8627002c-751e-4168-b294-4a324890a996" (UID: "8627002c-751e-4168-b294-4a324890a996"). InnerVolumeSpecName "kube-api-access-qq98k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:32:04 crc kubenswrapper[4898]: I0313 14:32:04.445279 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qq98k\" (UniqueName: \"kubernetes.io/projected/8627002c-751e-4168-b294-4a324890a996-kube-api-access-qq98k\") on node \"crc\" DevicePath \"\"" Mar 13 14:32:04 crc kubenswrapper[4898]: I0313 14:32:04.675869 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556872-2tqn4" event={"ID":"8627002c-751e-4168-b294-4a324890a996","Type":"ContainerDied","Data":"3e6c8b225663f1d3b600ffeac1d814f793ea23ad97ff301d21f7aa59626ffb92"} Mar 13 14:32:04 crc kubenswrapper[4898]: I0313 14:32:04.676328 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e6c8b225663f1d3b600ffeac1d814f793ea23ad97ff301d21f7aa59626ffb92" Mar 13 14:32:04 crc kubenswrapper[4898]: I0313 14:32:04.675975 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556872-2tqn4" Mar 13 14:32:05 crc kubenswrapper[4898]: I0313 14:32:05.264176 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556866-wnmcp"] Mar 13 14:32:05 crc kubenswrapper[4898]: I0313 14:32:05.291484 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556866-wnmcp"] Mar 13 14:32:05 crc kubenswrapper[4898]: I0313 14:32:05.759845 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45988deb-1057-4d89-a977-35978404b407" path="/var/lib/kubelet/pods/45988deb-1057-4d89-a977-35978404b407/volumes" Mar 13 14:32:24 crc kubenswrapper[4898]: I0313 14:32:24.088056 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-b9c7-account-create-update-l6h97"] Mar 13 14:32:24 crc kubenswrapper[4898]: I0313 14:32:24.100703 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-978d-account-create-update-tkc22"] Mar 13 14:32:24 crc kubenswrapper[4898]: I0313 14:32:24.117281 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-qlgbl"] Mar 13 14:32:24 crc kubenswrapper[4898]: I0313 14:32:24.126036 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-sqtw8"] Mar 13 14:32:24 crc kubenswrapper[4898]: I0313 14:32:24.134873 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-b9c7-account-create-update-l6h97"] Mar 13 14:32:24 crc kubenswrapper[4898]: I0313 14:32:24.143946 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-978d-account-create-update-tkc22"] Mar 13 14:32:24 crc kubenswrapper[4898]: I0313 14:32:24.153286 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-qlgbl"] Mar 13 14:32:24 crc kubenswrapper[4898]: I0313 14:32:24.161769 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-api-db-create-sqtw8"] Mar 13 14:32:24 crc kubenswrapper[4898]: I0313 14:32:24.593098 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9wh62"] Mar 13 14:32:24 crc kubenswrapper[4898]: E0313 14:32:24.593639 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8627002c-751e-4168-b294-4a324890a996" containerName="oc" Mar 13 14:32:24 crc kubenswrapper[4898]: I0313 14:32:24.593659 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="8627002c-751e-4168-b294-4a324890a996" containerName="oc" Mar 13 14:32:24 crc kubenswrapper[4898]: I0313 14:32:24.593953 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="8627002c-751e-4168-b294-4a324890a996" containerName="oc" Mar 13 14:32:24 crc kubenswrapper[4898]: I0313 14:32:24.596019 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9wh62" Mar 13 14:32:24 crc kubenswrapper[4898]: I0313 14:32:24.614655 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9wh62"] Mar 13 14:32:24 crc kubenswrapper[4898]: I0313 14:32:24.742957 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92zng\" (UniqueName: \"kubernetes.io/projected/3d80d972-f659-45e5-9f39-015848cc4031-kube-api-access-92zng\") pod \"community-operators-9wh62\" (UID: \"3d80d972-f659-45e5-9f39-015848cc4031\") " pod="openshift-marketplace/community-operators-9wh62" Mar 13 14:32:24 crc kubenswrapper[4898]: I0313 14:32:24.743030 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d80d972-f659-45e5-9f39-015848cc4031-utilities\") pod \"community-operators-9wh62\" (UID: \"3d80d972-f659-45e5-9f39-015848cc4031\") " pod="openshift-marketplace/community-operators-9wh62" Mar 13 14:32:24 crc 
kubenswrapper[4898]: I0313 14:32:24.743302 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d80d972-f659-45e5-9f39-015848cc4031-catalog-content\") pod \"community-operators-9wh62\" (UID: \"3d80d972-f659-45e5-9f39-015848cc4031\") " pod="openshift-marketplace/community-operators-9wh62" Mar 13 14:32:24 crc kubenswrapper[4898]: I0313 14:32:24.867545 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d80d972-f659-45e5-9f39-015848cc4031-catalog-content\") pod \"community-operators-9wh62\" (UID: \"3d80d972-f659-45e5-9f39-015848cc4031\") " pod="openshift-marketplace/community-operators-9wh62" Mar 13 14:32:24 crc kubenswrapper[4898]: I0313 14:32:24.867642 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92zng\" (UniqueName: \"kubernetes.io/projected/3d80d972-f659-45e5-9f39-015848cc4031-kube-api-access-92zng\") pod \"community-operators-9wh62\" (UID: \"3d80d972-f659-45e5-9f39-015848cc4031\") " pod="openshift-marketplace/community-operators-9wh62" Mar 13 14:32:24 crc kubenswrapper[4898]: I0313 14:32:24.867678 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d80d972-f659-45e5-9f39-015848cc4031-utilities\") pod \"community-operators-9wh62\" (UID: \"3d80d972-f659-45e5-9f39-015848cc4031\") " pod="openshift-marketplace/community-operators-9wh62" Mar 13 14:32:24 crc kubenswrapper[4898]: I0313 14:32:24.868197 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d80d972-f659-45e5-9f39-015848cc4031-catalog-content\") pod \"community-operators-9wh62\" (UID: \"3d80d972-f659-45e5-9f39-015848cc4031\") " pod="openshift-marketplace/community-operators-9wh62" Mar 13 14:32:24 crc 
kubenswrapper[4898]: I0313 14:32:24.868214 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d80d972-f659-45e5-9f39-015848cc4031-utilities\") pod \"community-operators-9wh62\" (UID: \"3d80d972-f659-45e5-9f39-015848cc4031\") " pod="openshift-marketplace/community-operators-9wh62" Mar 13 14:32:24 crc kubenswrapper[4898]: I0313 14:32:24.901247 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92zng\" (UniqueName: \"kubernetes.io/projected/3d80d972-f659-45e5-9f39-015848cc4031-kube-api-access-92zng\") pod \"community-operators-9wh62\" (UID: \"3d80d972-f659-45e5-9f39-015848cc4031\") " pod="openshift-marketplace/community-operators-9wh62" Mar 13 14:32:24 crc kubenswrapper[4898]: I0313 14:32:24.915846 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9wh62" Mar 13 14:32:25 crc kubenswrapper[4898]: I0313 14:32:25.049830 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-zhx84"] Mar 13 14:32:25 crc kubenswrapper[4898]: I0313 14:32:25.066952 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-zhx84"] Mar 13 14:32:25 crc kubenswrapper[4898]: I0313 14:32:25.086673 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-96c4-account-create-update-zsh5t"] Mar 13 14:32:25 crc kubenswrapper[4898]: I0313 14:32:25.095914 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-96c4-account-create-update-zsh5t"] Mar 13 14:32:25 crc kubenswrapper[4898]: I0313 14:32:25.515326 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9wh62"] Mar 13 14:32:25 crc kubenswrapper[4898]: I0313 14:32:25.766879 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="068b0856-126d-487c-9c1d-50299bf90d3a" 
path="/var/lib/kubelet/pods/068b0856-126d-487c-9c1d-50299bf90d3a/volumes" Mar 13 14:32:25 crc kubenswrapper[4898]: I0313 14:32:25.769070 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1abedb18-bf27-42d9-b809-f7226b603a0d" path="/var/lib/kubelet/pods/1abedb18-bf27-42d9-b809-f7226b603a0d/volumes" Mar 13 14:32:25 crc kubenswrapper[4898]: I0313 14:32:25.770758 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29dbeb8a-611d-4513-a063-06d8f865ea93" path="/var/lib/kubelet/pods/29dbeb8a-611d-4513-a063-06d8f865ea93/volumes" Mar 13 14:32:25 crc kubenswrapper[4898]: I0313 14:32:25.779718 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44f1f531-99d1-4b97-bd08-6bf94a7afd92" path="/var/lib/kubelet/pods/44f1f531-99d1-4b97-bd08-6bf94a7afd92/volumes" Mar 13 14:32:25 crc kubenswrapper[4898]: I0313 14:32:25.781773 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="485200a5-cd75-45ac-b93a-b003158132c4" path="/var/lib/kubelet/pods/485200a5-cd75-45ac-b93a-b003158132c4/volumes" Mar 13 14:32:25 crc kubenswrapper[4898]: I0313 14:32:25.783583 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e516311e-fb5c-4901-aaf7-67793ffb5fa2" path="/var/lib/kubelet/pods/e516311e-fb5c-4901-aaf7-67793ffb5fa2/volumes" Mar 13 14:32:25 crc kubenswrapper[4898]: I0313 14:32:25.974649 4898 generic.go:334] "Generic (PLEG): container finished" podID="3d80d972-f659-45e5-9f39-015848cc4031" containerID="3da8a8d11477b1924c1b6562d9af682ad56c935f3f8bbb3b3c39c2fc17706caa" exitCode=0 Mar 13 14:32:25 crc kubenswrapper[4898]: I0313 14:32:25.974708 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9wh62" event={"ID":"3d80d972-f659-45e5-9f39-015848cc4031","Type":"ContainerDied","Data":"3da8a8d11477b1924c1b6562d9af682ad56c935f3f8bbb3b3c39c2fc17706caa"} Mar 13 14:32:25 crc kubenswrapper[4898]: I0313 14:32:25.974756 4898 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/community-operators-9wh62" event={"ID":"3d80d972-f659-45e5-9f39-015848cc4031","Type":"ContainerStarted","Data":"5b1d9d4e27dd0c848be0c41de4a99a6de77e56cfa4171f17318b70a2ad74984c"} Mar 13 14:32:26 crc kubenswrapper[4898]: I0313 14:32:26.989117 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9wh62" event={"ID":"3d80d972-f659-45e5-9f39-015848cc4031","Type":"ContainerStarted","Data":"ddb4d6866d3c013ad2c724465ffc1a52eb5ee01c575b64ad3b6b88024322b110"} Mar 13 14:32:29 crc kubenswrapper[4898]: I0313 14:32:29.016791 4898 generic.go:334] "Generic (PLEG): container finished" podID="3d80d972-f659-45e5-9f39-015848cc4031" containerID="ddb4d6866d3c013ad2c724465ffc1a52eb5ee01c575b64ad3b6b88024322b110" exitCode=0 Mar 13 14:32:29 crc kubenswrapper[4898]: I0313 14:32:29.016928 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9wh62" event={"ID":"3d80d972-f659-45e5-9f39-015848cc4031","Type":"ContainerDied","Data":"ddb4d6866d3c013ad2c724465ffc1a52eb5ee01c575b64ad3b6b88024322b110"} Mar 13 14:32:30 crc kubenswrapper[4898]: I0313 14:32:30.029355 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9wh62" event={"ID":"3d80d972-f659-45e5-9f39-015848cc4031","Type":"ContainerStarted","Data":"30db5e1021782902f185770668492e533f882be7ceee4ca587e31e93aa387af1"} Mar 13 14:32:30 crc kubenswrapper[4898]: I0313 14:32:30.059660 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9wh62" podStartSLOduration=2.575778135 podStartE2EDuration="6.059628511s" podCreationTimestamp="2026-03-13 14:32:24 +0000 UTC" firstStartedPulling="2026-03-13 14:32:25.977295276 +0000 UTC m=+2180.978883515" lastFinishedPulling="2026-03-13 14:32:29.461145642 +0000 UTC m=+2184.462733891" observedRunningTime="2026-03-13 14:32:30.051849259 +0000 UTC 
m=+2185.053437528" watchObservedRunningTime="2026-03-13 14:32:30.059628511 +0000 UTC m=+2185.061216770" Mar 13 14:32:34 crc kubenswrapper[4898]: I0313 14:32:34.916401 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9wh62" Mar 13 14:32:34 crc kubenswrapper[4898]: I0313 14:32:34.916982 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9wh62" Mar 13 14:32:35 crc kubenswrapper[4898]: I0313 14:32:35.985865 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-9wh62" podUID="3d80d972-f659-45e5-9f39-015848cc4031" containerName="registry-server" probeResult="failure" output=< Mar 13 14:32:35 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 14:32:35 crc kubenswrapper[4898]: > Mar 13 14:32:39 crc kubenswrapper[4898]: I0313 14:32:39.172960 4898 scope.go:117] "RemoveContainer" containerID="8dc6d76d86edf4ca1fbe4589c8cd285e45e3e3a632208ca7817a0099916d678e" Mar 13 14:32:39 crc kubenswrapper[4898]: I0313 14:32:39.203993 4898 scope.go:117] "RemoveContainer" containerID="7b608b8ed7e8f0c428a761cc552755850cceca5a0b694b7beffed50aa397bd7c" Mar 13 14:32:39 crc kubenswrapper[4898]: I0313 14:32:39.287716 4898 scope.go:117] "RemoveContainer" containerID="bf99d1df7658057773b414b4c2a04b114eb5afc6efdb7e3669f974780f737076" Mar 13 14:32:39 crc kubenswrapper[4898]: I0313 14:32:39.339121 4898 scope.go:117] "RemoveContainer" containerID="81721ca65e97448cfc7621215d98c3ea95987d960df48d9fdaef1b644754dcaf" Mar 13 14:32:39 crc kubenswrapper[4898]: I0313 14:32:39.431193 4898 scope.go:117] "RemoveContainer" containerID="388808695f9ab04c5365cf0b1e925d316f5fa781ee2d5e7df5edc4ae9b9090f3" Mar 13 14:32:39 crc kubenswrapper[4898]: I0313 14:32:39.458695 4898 scope.go:117] "RemoveContainer" containerID="990c87abba6cb97ddcc9ff9fc6fc009eeddad1ca5f265916404c317707eccb6f" Mar 13 14:32:39 crc 
kubenswrapper[4898]: I0313 14:32:39.527070 4898 scope.go:117] "RemoveContainer" containerID="c911be7c2d6d8f32598481ced3a29ce9fc65efe653b0696b937918e79b814d51" Mar 13 14:32:39 crc kubenswrapper[4898]: I0313 14:32:39.550252 4898 scope.go:117] "RemoveContainer" containerID="bb0c3bbe6081c4840c08e83355257eccaf898a1f20509cda6e941396919762d8" Mar 13 14:32:40 crc kubenswrapper[4898]: I0313 14:32:40.141960 4898 generic.go:334] "Generic (PLEG): container finished" podID="295e7c32-75f1-4eee-a126-2d4547c56f24" containerID="dab0dfbc55bfc6f1df629f5c9477ac3d48d09670fceb17a5d6b9747c56387330" exitCode=0 Mar 13 14:32:40 crc kubenswrapper[4898]: I0313 14:32:40.142008 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-57hwn" event={"ID":"295e7c32-75f1-4eee-a126-2d4547c56f24","Type":"ContainerDied","Data":"dab0dfbc55bfc6f1df629f5c9477ac3d48d09670fceb17a5d6b9747c56387330"} Mar 13 14:32:41 crc kubenswrapper[4898]: I0313 14:32:41.721869 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-57hwn" Mar 13 14:32:41 crc kubenswrapper[4898]: I0313 14:32:41.839570 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pw4q2\" (UniqueName: \"kubernetes.io/projected/295e7c32-75f1-4eee-a126-2d4547c56f24-kube-api-access-pw4q2\") pod \"295e7c32-75f1-4eee-a126-2d4547c56f24\" (UID: \"295e7c32-75f1-4eee-a126-2d4547c56f24\") " Mar 13 14:32:41 crc kubenswrapper[4898]: I0313 14:32:41.839911 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/295e7c32-75f1-4eee-a126-2d4547c56f24-inventory\") pod \"295e7c32-75f1-4eee-a126-2d4547c56f24\" (UID: \"295e7c32-75f1-4eee-a126-2d4547c56f24\") " Mar 13 14:32:41 crc kubenswrapper[4898]: I0313 14:32:41.840011 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/295e7c32-75f1-4eee-a126-2d4547c56f24-ssh-key-openstack-edpm-ipam\") pod \"295e7c32-75f1-4eee-a126-2d4547c56f24\" (UID: \"295e7c32-75f1-4eee-a126-2d4547c56f24\") " Mar 13 14:32:41 crc kubenswrapper[4898]: I0313 14:32:41.846714 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/295e7c32-75f1-4eee-a126-2d4547c56f24-kube-api-access-pw4q2" (OuterVolumeSpecName: "kube-api-access-pw4q2") pod "295e7c32-75f1-4eee-a126-2d4547c56f24" (UID: "295e7c32-75f1-4eee-a126-2d4547c56f24"). InnerVolumeSpecName "kube-api-access-pw4q2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:32:41 crc kubenswrapper[4898]: I0313 14:32:41.873125 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/295e7c32-75f1-4eee-a126-2d4547c56f24-inventory" (OuterVolumeSpecName: "inventory") pod "295e7c32-75f1-4eee-a126-2d4547c56f24" (UID: "295e7c32-75f1-4eee-a126-2d4547c56f24"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:32:41 crc kubenswrapper[4898]: I0313 14:32:41.881308 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/295e7c32-75f1-4eee-a126-2d4547c56f24-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "295e7c32-75f1-4eee-a126-2d4547c56f24" (UID: "295e7c32-75f1-4eee-a126-2d4547c56f24"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:32:41 crc kubenswrapper[4898]: I0313 14:32:41.943451 4898 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/295e7c32-75f1-4eee-a126-2d4547c56f24-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 14:32:41 crc kubenswrapper[4898]: I0313 14:32:41.943488 4898 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/295e7c32-75f1-4eee-a126-2d4547c56f24-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 14:32:41 crc kubenswrapper[4898]: I0313 14:32:41.943499 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pw4q2\" (UniqueName: \"kubernetes.io/projected/295e7c32-75f1-4eee-a126-2d4547c56f24-kube-api-access-pw4q2\") on node \"crc\" DevicePath \"\"" Mar 13 14:32:42 crc kubenswrapper[4898]: I0313 14:32:42.169733 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-57hwn" event={"ID":"295e7c32-75f1-4eee-a126-2d4547c56f24","Type":"ContainerDied","Data":"8f16b5e15ed9fbaac5ba0b0d72bb41f2c1ee0e47dc35b9e1544ffaf57639e658"} Mar 13 14:32:42 crc kubenswrapper[4898]: I0313 14:32:42.169782 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f16b5e15ed9fbaac5ba0b0d72bb41f2c1ee0e47dc35b9e1544ffaf57639e658" Mar 13 14:32:42 crc kubenswrapper[4898]: I0313 
14:32:42.169807 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-57hwn" Mar 13 14:32:42 crc kubenswrapper[4898]: I0313 14:32:42.329586 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wxr7c"] Mar 13 14:32:42 crc kubenswrapper[4898]: E0313 14:32:42.330717 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="295e7c32-75f1-4eee-a126-2d4547c56f24" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 13 14:32:42 crc kubenswrapper[4898]: I0313 14:32:42.330832 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="295e7c32-75f1-4eee-a126-2d4547c56f24" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 13 14:32:42 crc kubenswrapper[4898]: I0313 14:32:42.331216 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="295e7c32-75f1-4eee-a126-2d4547c56f24" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 13 14:32:42 crc kubenswrapper[4898]: I0313 14:32:42.332344 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wxr7c" Mar 13 14:32:42 crc kubenswrapper[4898]: I0313 14:32:42.334363 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 14:32:42 crc kubenswrapper[4898]: I0313 14:32:42.336421 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 14:32:42 crc kubenswrapper[4898]: I0313 14:32:42.336470 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zsddr" Mar 13 14:32:42 crc kubenswrapper[4898]: I0313 14:32:42.337270 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 14:32:42 crc kubenswrapper[4898]: I0313 14:32:42.348109 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wxr7c"] Mar 13 14:32:42 crc kubenswrapper[4898]: I0313 14:32:42.458102 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/271e9163-4e9c-4c79-a0b4-be373e97956c-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-wxr7c\" (UID: \"271e9163-4e9c-4c79-a0b4-be373e97956c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wxr7c" Mar 13 14:32:42 crc kubenswrapper[4898]: I0313 14:32:42.458148 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wj8f\" (UniqueName: \"kubernetes.io/projected/271e9163-4e9c-4c79-a0b4-be373e97956c-kube-api-access-4wj8f\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-wxr7c\" (UID: \"271e9163-4e9c-4c79-a0b4-be373e97956c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wxr7c" Mar 13 
14:32:42 crc kubenswrapper[4898]: I0313 14:32:42.459037 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/271e9163-4e9c-4c79-a0b4-be373e97956c-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-wxr7c\" (UID: \"271e9163-4e9c-4c79-a0b4-be373e97956c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wxr7c" Mar 13 14:32:42 crc kubenswrapper[4898]: I0313 14:32:42.561655 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/271e9163-4e9c-4c79-a0b4-be373e97956c-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-wxr7c\" (UID: \"271e9163-4e9c-4c79-a0b4-be373e97956c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wxr7c" Mar 13 14:32:42 crc kubenswrapper[4898]: I0313 14:32:42.561730 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/271e9163-4e9c-4c79-a0b4-be373e97956c-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-wxr7c\" (UID: \"271e9163-4e9c-4c79-a0b4-be373e97956c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wxr7c" Mar 13 14:32:42 crc kubenswrapper[4898]: I0313 14:32:42.561759 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wj8f\" (UniqueName: \"kubernetes.io/projected/271e9163-4e9c-4c79-a0b4-be373e97956c-kube-api-access-4wj8f\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-wxr7c\" (UID: \"271e9163-4e9c-4c79-a0b4-be373e97956c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wxr7c" Mar 13 14:32:42 crc kubenswrapper[4898]: I0313 14:32:42.567720 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/271e9163-4e9c-4c79-a0b4-be373e97956c-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-wxr7c\" (UID: \"271e9163-4e9c-4c79-a0b4-be373e97956c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wxr7c" Mar 13 14:32:42 crc kubenswrapper[4898]: I0313 14:32:42.568130 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/271e9163-4e9c-4c79-a0b4-be373e97956c-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-wxr7c\" (UID: \"271e9163-4e9c-4c79-a0b4-be373e97956c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wxr7c" Mar 13 14:32:42 crc kubenswrapper[4898]: I0313 14:32:42.579689 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wj8f\" (UniqueName: \"kubernetes.io/projected/271e9163-4e9c-4c79-a0b4-be373e97956c-kube-api-access-4wj8f\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-wxr7c\" (UID: \"271e9163-4e9c-4c79-a0b4-be373e97956c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wxr7c" Mar 13 14:32:42 crc kubenswrapper[4898]: I0313 14:32:42.660116 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wxr7c" Mar 13 14:32:43 crc kubenswrapper[4898]: W0313 14:32:43.359785 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod271e9163_4e9c_4c79_a0b4_be373e97956c.slice/crio-4462e1eeacdaca47ef14f2208ad82e41bbd408e210fbc71a917708ba15842dcc WatchSource:0}: Error finding container 4462e1eeacdaca47ef14f2208ad82e41bbd408e210fbc71a917708ba15842dcc: Status 404 returned error can't find the container with id 4462e1eeacdaca47ef14f2208ad82e41bbd408e210fbc71a917708ba15842dcc Mar 13 14:32:43 crc kubenswrapper[4898]: I0313 14:32:43.363788 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wxr7c"] Mar 13 14:32:44 crc kubenswrapper[4898]: I0313 14:32:44.194698 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wxr7c" event={"ID":"271e9163-4e9c-4c79-a0b4-be373e97956c","Type":"ContainerStarted","Data":"9a2d91dd6fb576c0d414bdb1108d49b85beeded77fb65ea17ffcfe9b08112c4c"} Mar 13 14:32:44 crc kubenswrapper[4898]: I0313 14:32:44.195315 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wxr7c" event={"ID":"271e9163-4e9c-4c79-a0b4-be373e97956c","Type":"ContainerStarted","Data":"4462e1eeacdaca47ef14f2208ad82e41bbd408e210fbc71a917708ba15842dcc"} Mar 13 14:32:44 crc kubenswrapper[4898]: I0313 14:32:44.228070 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wxr7c" podStartSLOduration=1.764817643 podStartE2EDuration="2.228040034s" podCreationTimestamp="2026-03-13 14:32:42 +0000 UTC" firstStartedPulling="2026-03-13 14:32:43.36269529 +0000 UTC m=+2198.364283529" lastFinishedPulling="2026-03-13 14:32:43.825917641 +0000 UTC 
m=+2198.827505920" observedRunningTime="2026-03-13 14:32:44.211055513 +0000 UTC m=+2199.212643792" watchObservedRunningTime="2026-03-13 14:32:44.228040034 +0000 UTC m=+2199.229628313" Mar 13 14:32:44 crc kubenswrapper[4898]: I0313 14:32:44.982502 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9wh62" Mar 13 14:32:45 crc kubenswrapper[4898]: I0313 14:32:45.055159 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9wh62" Mar 13 14:32:45 crc kubenswrapper[4898]: I0313 14:32:45.226226 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9wh62"] Mar 13 14:32:46 crc kubenswrapper[4898]: I0313 14:32:46.216218 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9wh62" podUID="3d80d972-f659-45e5-9f39-015848cc4031" containerName="registry-server" containerID="cri-o://30db5e1021782902f185770668492e533f882be7ceee4ca587e31e93aa387af1" gracePeriod=2 Mar 13 14:32:46 crc kubenswrapper[4898]: I0313 14:32:46.828717 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9wh62" Mar 13 14:32:46 crc kubenswrapper[4898]: I0313 14:32:46.976441 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92zng\" (UniqueName: \"kubernetes.io/projected/3d80d972-f659-45e5-9f39-015848cc4031-kube-api-access-92zng\") pod \"3d80d972-f659-45e5-9f39-015848cc4031\" (UID: \"3d80d972-f659-45e5-9f39-015848cc4031\") " Mar 13 14:32:46 crc kubenswrapper[4898]: I0313 14:32:46.976552 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d80d972-f659-45e5-9f39-015848cc4031-catalog-content\") pod \"3d80d972-f659-45e5-9f39-015848cc4031\" (UID: \"3d80d972-f659-45e5-9f39-015848cc4031\") " Mar 13 14:32:46 crc kubenswrapper[4898]: I0313 14:32:46.976595 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d80d972-f659-45e5-9f39-015848cc4031-utilities\") pod \"3d80d972-f659-45e5-9f39-015848cc4031\" (UID: \"3d80d972-f659-45e5-9f39-015848cc4031\") " Mar 13 14:32:46 crc kubenswrapper[4898]: I0313 14:32:46.977521 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d80d972-f659-45e5-9f39-015848cc4031-utilities" (OuterVolumeSpecName: "utilities") pod "3d80d972-f659-45e5-9f39-015848cc4031" (UID: "3d80d972-f659-45e5-9f39-015848cc4031"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:32:46 crc kubenswrapper[4898]: I0313 14:32:46.982605 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d80d972-f659-45e5-9f39-015848cc4031-kube-api-access-92zng" (OuterVolumeSpecName: "kube-api-access-92zng") pod "3d80d972-f659-45e5-9f39-015848cc4031" (UID: "3d80d972-f659-45e5-9f39-015848cc4031"). InnerVolumeSpecName "kube-api-access-92zng". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:32:47 crc kubenswrapper[4898]: I0313 14:32:47.041620 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d80d972-f659-45e5-9f39-015848cc4031-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3d80d972-f659-45e5-9f39-015848cc4031" (UID: "3d80d972-f659-45e5-9f39-015848cc4031"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:32:47 crc kubenswrapper[4898]: I0313 14:32:47.079153 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-92zng\" (UniqueName: \"kubernetes.io/projected/3d80d972-f659-45e5-9f39-015848cc4031-kube-api-access-92zng\") on node \"crc\" DevicePath \"\"" Mar 13 14:32:47 crc kubenswrapper[4898]: I0313 14:32:47.079190 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d80d972-f659-45e5-9f39-015848cc4031-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 14:32:47 crc kubenswrapper[4898]: I0313 14:32:47.079199 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d80d972-f659-45e5-9f39-015848cc4031-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 14:32:47 crc kubenswrapper[4898]: I0313 14:32:47.226930 4898 generic.go:334] "Generic (PLEG): container finished" podID="3d80d972-f659-45e5-9f39-015848cc4031" containerID="30db5e1021782902f185770668492e533f882be7ceee4ca587e31e93aa387af1" exitCode=0 Mar 13 14:32:47 crc kubenswrapper[4898]: I0313 14:32:47.226995 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9wh62" Mar 13 14:32:47 crc kubenswrapper[4898]: I0313 14:32:47.227014 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9wh62" event={"ID":"3d80d972-f659-45e5-9f39-015848cc4031","Type":"ContainerDied","Data":"30db5e1021782902f185770668492e533f882be7ceee4ca587e31e93aa387af1"} Mar 13 14:32:47 crc kubenswrapper[4898]: I0313 14:32:47.228203 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9wh62" event={"ID":"3d80d972-f659-45e5-9f39-015848cc4031","Type":"ContainerDied","Data":"5b1d9d4e27dd0c848be0c41de4a99a6de77e56cfa4171f17318b70a2ad74984c"} Mar 13 14:32:47 crc kubenswrapper[4898]: I0313 14:32:47.228236 4898 scope.go:117] "RemoveContainer" containerID="30db5e1021782902f185770668492e533f882be7ceee4ca587e31e93aa387af1" Mar 13 14:32:47 crc kubenswrapper[4898]: I0313 14:32:47.265846 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9wh62"] Mar 13 14:32:47 crc kubenswrapper[4898]: I0313 14:32:47.275652 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9wh62"] Mar 13 14:32:47 crc kubenswrapper[4898]: I0313 14:32:47.278570 4898 scope.go:117] "RemoveContainer" containerID="ddb4d6866d3c013ad2c724465ffc1a52eb5ee01c575b64ad3b6b88024322b110" Mar 13 14:32:47 crc kubenswrapper[4898]: I0313 14:32:47.304117 4898 scope.go:117] "RemoveContainer" containerID="3da8a8d11477b1924c1b6562d9af682ad56c935f3f8bbb3b3c39c2fc17706caa" Mar 13 14:32:47 crc kubenswrapper[4898]: I0313 14:32:47.365302 4898 scope.go:117] "RemoveContainer" containerID="30db5e1021782902f185770668492e533f882be7ceee4ca587e31e93aa387af1" Mar 13 14:32:47 crc kubenswrapper[4898]: E0313 14:32:47.365830 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"30db5e1021782902f185770668492e533f882be7ceee4ca587e31e93aa387af1\": container with ID starting with 30db5e1021782902f185770668492e533f882be7ceee4ca587e31e93aa387af1 not found: ID does not exist" containerID="30db5e1021782902f185770668492e533f882be7ceee4ca587e31e93aa387af1" Mar 13 14:32:47 crc kubenswrapper[4898]: I0313 14:32:47.365862 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30db5e1021782902f185770668492e533f882be7ceee4ca587e31e93aa387af1"} err="failed to get container status \"30db5e1021782902f185770668492e533f882be7ceee4ca587e31e93aa387af1\": rpc error: code = NotFound desc = could not find container \"30db5e1021782902f185770668492e533f882be7ceee4ca587e31e93aa387af1\": container with ID starting with 30db5e1021782902f185770668492e533f882be7ceee4ca587e31e93aa387af1 not found: ID does not exist" Mar 13 14:32:47 crc kubenswrapper[4898]: I0313 14:32:47.365883 4898 scope.go:117] "RemoveContainer" containerID="ddb4d6866d3c013ad2c724465ffc1a52eb5ee01c575b64ad3b6b88024322b110" Mar 13 14:32:47 crc kubenswrapper[4898]: E0313 14:32:47.366510 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddb4d6866d3c013ad2c724465ffc1a52eb5ee01c575b64ad3b6b88024322b110\": container with ID starting with ddb4d6866d3c013ad2c724465ffc1a52eb5ee01c575b64ad3b6b88024322b110 not found: ID does not exist" containerID="ddb4d6866d3c013ad2c724465ffc1a52eb5ee01c575b64ad3b6b88024322b110" Mar 13 14:32:47 crc kubenswrapper[4898]: I0313 14:32:47.366557 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddb4d6866d3c013ad2c724465ffc1a52eb5ee01c575b64ad3b6b88024322b110"} err="failed to get container status \"ddb4d6866d3c013ad2c724465ffc1a52eb5ee01c575b64ad3b6b88024322b110\": rpc error: code = NotFound desc = could not find container \"ddb4d6866d3c013ad2c724465ffc1a52eb5ee01c575b64ad3b6b88024322b110\": container with ID 
starting with ddb4d6866d3c013ad2c724465ffc1a52eb5ee01c575b64ad3b6b88024322b110 not found: ID does not exist" Mar 13 14:32:47 crc kubenswrapper[4898]: I0313 14:32:47.366589 4898 scope.go:117] "RemoveContainer" containerID="3da8a8d11477b1924c1b6562d9af682ad56c935f3f8bbb3b3c39c2fc17706caa" Mar 13 14:32:47 crc kubenswrapper[4898]: E0313 14:32:47.367047 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3da8a8d11477b1924c1b6562d9af682ad56c935f3f8bbb3b3c39c2fc17706caa\": container with ID starting with 3da8a8d11477b1924c1b6562d9af682ad56c935f3f8bbb3b3c39c2fc17706caa not found: ID does not exist" containerID="3da8a8d11477b1924c1b6562d9af682ad56c935f3f8bbb3b3c39c2fc17706caa" Mar 13 14:32:47 crc kubenswrapper[4898]: I0313 14:32:47.367134 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3da8a8d11477b1924c1b6562d9af682ad56c935f3f8bbb3b3c39c2fc17706caa"} err="failed to get container status \"3da8a8d11477b1924c1b6562d9af682ad56c935f3f8bbb3b3c39c2fc17706caa\": rpc error: code = NotFound desc = could not find container \"3da8a8d11477b1924c1b6562d9af682ad56c935f3f8bbb3b3c39c2fc17706caa\": container with ID starting with 3da8a8d11477b1924c1b6562d9af682ad56c935f3f8bbb3b3c39c2fc17706caa not found: ID does not exist" Mar 13 14:32:47 crc kubenswrapper[4898]: I0313 14:32:47.752228 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d80d972-f659-45e5-9f39-015848cc4031" path="/var/lib/kubelet/pods/3d80d972-f659-45e5-9f39-015848cc4031/volumes" Mar 13 14:32:49 crc kubenswrapper[4898]: I0313 14:32:49.134310 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 14:32:49 crc kubenswrapper[4898]: I0313 
14:32:49.134674 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 14:32:49 crc kubenswrapper[4898]: I0313 14:32:49.254795 4898 generic.go:334] "Generic (PLEG): container finished" podID="271e9163-4e9c-4c79-a0b4-be373e97956c" containerID="9a2d91dd6fb576c0d414bdb1108d49b85beeded77fb65ea17ffcfe9b08112c4c" exitCode=0 Mar 13 14:32:49 crc kubenswrapper[4898]: I0313 14:32:49.254839 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wxr7c" event={"ID":"271e9163-4e9c-4c79-a0b4-be373e97956c","Type":"ContainerDied","Data":"9a2d91dd6fb576c0d414bdb1108d49b85beeded77fb65ea17ffcfe9b08112c4c"} Mar 13 14:32:50 crc kubenswrapper[4898]: I0313 14:32:50.851124 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wxr7c" Mar 13 14:32:50 crc kubenswrapper[4898]: I0313 14:32:50.993304 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/271e9163-4e9c-4c79-a0b4-be373e97956c-ssh-key-openstack-edpm-ipam\") pod \"271e9163-4e9c-4c79-a0b4-be373e97956c\" (UID: \"271e9163-4e9c-4c79-a0b4-be373e97956c\") " Mar 13 14:32:50 crc kubenswrapper[4898]: I0313 14:32:50.993357 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/271e9163-4e9c-4c79-a0b4-be373e97956c-inventory\") pod \"271e9163-4e9c-4c79-a0b4-be373e97956c\" (UID: \"271e9163-4e9c-4c79-a0b4-be373e97956c\") " Mar 13 14:32:50 crc kubenswrapper[4898]: I0313 14:32:50.993427 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wj8f\" (UniqueName: \"kubernetes.io/projected/271e9163-4e9c-4c79-a0b4-be373e97956c-kube-api-access-4wj8f\") pod \"271e9163-4e9c-4c79-a0b4-be373e97956c\" (UID: \"271e9163-4e9c-4c79-a0b4-be373e97956c\") " Mar 13 14:32:51 crc kubenswrapper[4898]: I0313 14:32:51.003214 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/271e9163-4e9c-4c79-a0b4-be373e97956c-kube-api-access-4wj8f" (OuterVolumeSpecName: "kube-api-access-4wj8f") pod "271e9163-4e9c-4c79-a0b4-be373e97956c" (UID: "271e9163-4e9c-4c79-a0b4-be373e97956c"). InnerVolumeSpecName "kube-api-access-4wj8f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:32:51 crc kubenswrapper[4898]: I0313 14:32:51.032357 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/271e9163-4e9c-4c79-a0b4-be373e97956c-inventory" (OuterVolumeSpecName: "inventory") pod "271e9163-4e9c-4c79-a0b4-be373e97956c" (UID: "271e9163-4e9c-4c79-a0b4-be373e97956c"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:32:51 crc kubenswrapper[4898]: I0313 14:32:51.038651 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/271e9163-4e9c-4c79-a0b4-be373e97956c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "271e9163-4e9c-4c79-a0b4-be373e97956c" (UID: "271e9163-4e9c-4c79-a0b4-be373e97956c"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:32:51 crc kubenswrapper[4898]: I0313 14:32:51.096601 4898 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/271e9163-4e9c-4c79-a0b4-be373e97956c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 14:32:51 crc kubenswrapper[4898]: I0313 14:32:51.096635 4898 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/271e9163-4e9c-4c79-a0b4-be373e97956c-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 14:32:51 crc kubenswrapper[4898]: I0313 14:32:51.096644 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wj8f\" (UniqueName: \"kubernetes.io/projected/271e9163-4e9c-4c79-a0b4-be373e97956c-kube-api-access-4wj8f\") on node \"crc\" DevicePath \"\"" Mar 13 14:32:51 crc kubenswrapper[4898]: I0313 14:32:51.281267 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wxr7c" event={"ID":"271e9163-4e9c-4c79-a0b4-be373e97956c","Type":"ContainerDied","Data":"4462e1eeacdaca47ef14f2208ad82e41bbd408e210fbc71a917708ba15842dcc"} Mar 13 14:32:51 crc kubenswrapper[4898]: I0313 14:32:51.281320 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4462e1eeacdaca47ef14f2208ad82e41bbd408e210fbc71a917708ba15842dcc" Mar 13 14:32:51 crc kubenswrapper[4898]: I0313 
14:32:51.281335 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wxr7c" Mar 13 14:32:51 crc kubenswrapper[4898]: I0313 14:32:51.362479 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-tr644"] Mar 13 14:32:51 crc kubenswrapper[4898]: E0313 14:32:51.363128 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d80d972-f659-45e5-9f39-015848cc4031" containerName="registry-server" Mar 13 14:32:51 crc kubenswrapper[4898]: I0313 14:32:51.363149 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d80d972-f659-45e5-9f39-015848cc4031" containerName="registry-server" Mar 13 14:32:51 crc kubenswrapper[4898]: E0313 14:32:51.363180 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d80d972-f659-45e5-9f39-015848cc4031" containerName="extract-content" Mar 13 14:32:51 crc kubenswrapper[4898]: I0313 14:32:51.363187 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d80d972-f659-45e5-9f39-015848cc4031" containerName="extract-content" Mar 13 14:32:51 crc kubenswrapper[4898]: E0313 14:32:51.363224 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d80d972-f659-45e5-9f39-015848cc4031" containerName="extract-utilities" Mar 13 14:32:51 crc kubenswrapper[4898]: I0313 14:32:51.363232 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d80d972-f659-45e5-9f39-015848cc4031" containerName="extract-utilities" Mar 13 14:32:51 crc kubenswrapper[4898]: E0313 14:32:51.363261 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="271e9163-4e9c-4c79-a0b4-be373e97956c" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 13 14:32:51 crc kubenswrapper[4898]: I0313 14:32:51.363270 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="271e9163-4e9c-4c79-a0b4-be373e97956c" 
containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 13 14:32:51 crc kubenswrapper[4898]: I0313 14:32:51.363575 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d80d972-f659-45e5-9f39-015848cc4031" containerName="registry-server" Mar 13 14:32:51 crc kubenswrapper[4898]: I0313 14:32:51.363622 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="271e9163-4e9c-4c79-a0b4-be373e97956c" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 13 14:32:51 crc kubenswrapper[4898]: I0313 14:32:51.364732 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tr644" Mar 13 14:32:51 crc kubenswrapper[4898]: I0313 14:32:51.366868 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 14:32:51 crc kubenswrapper[4898]: I0313 14:32:51.367049 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 14:32:51 crc kubenswrapper[4898]: I0313 14:32:51.367259 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zsddr" Mar 13 14:32:51 crc kubenswrapper[4898]: I0313 14:32:51.368680 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 14:32:51 crc kubenswrapper[4898]: I0313 14:32:51.374792 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-tr644"] Mar 13 14:32:51 crc kubenswrapper[4898]: I0313 14:32:51.508993 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szptc\" (UniqueName: \"kubernetes.io/projected/8b4abb6a-5797-47be-96a0-69173649e5fa-kube-api-access-szptc\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-tr644\" (UID: 
\"8b4abb6a-5797-47be-96a0-69173649e5fa\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tr644" Mar 13 14:32:51 crc kubenswrapper[4898]: I0313 14:32:51.509438 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8b4abb6a-5797-47be-96a0-69173649e5fa-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-tr644\" (UID: \"8b4abb6a-5797-47be-96a0-69173649e5fa\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tr644" Mar 13 14:32:51 crc kubenswrapper[4898]: I0313 14:32:51.509837 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8b4abb6a-5797-47be-96a0-69173649e5fa-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-tr644\" (UID: \"8b4abb6a-5797-47be-96a0-69173649e5fa\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tr644" Mar 13 14:32:51 crc kubenswrapper[4898]: I0313 14:32:51.612661 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8b4abb6a-5797-47be-96a0-69173649e5fa-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-tr644\" (UID: \"8b4abb6a-5797-47be-96a0-69173649e5fa\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tr644" Mar 13 14:32:51 crc kubenswrapper[4898]: I0313 14:32:51.612727 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szptc\" (UniqueName: \"kubernetes.io/projected/8b4abb6a-5797-47be-96a0-69173649e5fa-kube-api-access-szptc\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-tr644\" (UID: \"8b4abb6a-5797-47be-96a0-69173649e5fa\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tr644" Mar 13 14:32:51 crc kubenswrapper[4898]: I0313 
14:32:51.612789 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8b4abb6a-5797-47be-96a0-69173649e5fa-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-tr644\" (UID: \"8b4abb6a-5797-47be-96a0-69173649e5fa\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tr644" Mar 13 14:32:51 crc kubenswrapper[4898]: I0313 14:32:51.616608 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8b4abb6a-5797-47be-96a0-69173649e5fa-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-tr644\" (UID: \"8b4abb6a-5797-47be-96a0-69173649e5fa\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tr644" Mar 13 14:32:51 crc kubenswrapper[4898]: I0313 14:32:51.616915 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8b4abb6a-5797-47be-96a0-69173649e5fa-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-tr644\" (UID: \"8b4abb6a-5797-47be-96a0-69173649e5fa\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tr644" Mar 13 14:32:51 crc kubenswrapper[4898]: I0313 14:32:51.631036 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szptc\" (UniqueName: \"kubernetes.io/projected/8b4abb6a-5797-47be-96a0-69173649e5fa-kube-api-access-szptc\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-tr644\" (UID: \"8b4abb6a-5797-47be-96a0-69173649e5fa\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tr644" Mar 13 14:32:51 crc kubenswrapper[4898]: I0313 14:32:51.703797 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tr644" Mar 13 14:32:52 crc kubenswrapper[4898]: I0313 14:32:52.447573 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-tr644"] Mar 13 14:32:53 crc kubenswrapper[4898]: I0313 14:32:53.303602 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tr644" event={"ID":"8b4abb6a-5797-47be-96a0-69173649e5fa","Type":"ContainerStarted","Data":"4c566f53698dc8ae77e0ae8b01f4afc7b8880694d922ac5eedd84e461b0b86f6"} Mar 13 14:32:53 crc kubenswrapper[4898]: I0313 14:32:53.304059 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tr644" event={"ID":"8b4abb6a-5797-47be-96a0-69173649e5fa","Type":"ContainerStarted","Data":"f1afaa6de53cd09356160f6667e20e603147bc41c6c165a1557006dc66536ae1"} Mar 13 14:32:53 crc kubenswrapper[4898]: I0313 14:32:53.332504 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tr644" podStartSLOduration=1.8176499019999999 podStartE2EDuration="2.332477537s" podCreationTimestamp="2026-03-13 14:32:51 +0000 UTC" firstStartedPulling="2026-03-13 14:32:52.468415668 +0000 UTC m=+2207.470003927" lastFinishedPulling="2026-03-13 14:32:52.983243283 +0000 UTC m=+2207.984831562" observedRunningTime="2026-03-13 14:32:53.323033799 +0000 UTC m=+2208.324622028" watchObservedRunningTime="2026-03-13 14:32:53.332477537 +0000 UTC m=+2208.334065786" Mar 13 14:32:58 crc kubenswrapper[4898]: I0313 14:32:58.052144 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qps2v"] Mar 13 14:32:58 crc kubenswrapper[4898]: I0313 14:32:58.064281 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qps2v"] Mar 13 14:32:59 crc kubenswrapper[4898]: 
I0313 14:32:59.752796 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7eba407c-68a5-45e9-ab51-e8cba05d8559" path="/var/lib/kubelet/pods/7eba407c-68a5-45e9-ab51-e8cba05d8559/volumes" Mar 13 14:33:19 crc kubenswrapper[4898]: I0313 14:33:19.135163 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 14:33:19 crc kubenswrapper[4898]: I0313 14:33:19.135840 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 14:33:20 crc kubenswrapper[4898]: I0313 14:33:20.060464 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-76lrv"] Mar 13 14:33:20 crc kubenswrapper[4898]: I0313 14:33:20.083867 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-llbn5"] Mar 13 14:33:20 crc kubenswrapper[4898]: I0313 14:33:20.096887 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-76lrv"] Mar 13 14:33:20 crc kubenswrapper[4898]: I0313 14:33:20.110380 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-llbn5"] Mar 13 14:33:21 crc kubenswrapper[4898]: I0313 14:33:21.757868 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04183e35-79b0-4c76-b538-b5b71299cd92" path="/var/lib/kubelet/pods/04183e35-79b0-4c76-b538-b5b71299cd92/volumes" Mar 13 14:33:21 crc kubenswrapper[4898]: I0313 14:33:21.759479 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="bff908e4-09f4-490b-9b9c-ef65c6224eeb" path="/var/lib/kubelet/pods/bff908e4-09f4-490b-9b9c-ef65c6224eeb/volumes" Mar 13 14:33:28 crc kubenswrapper[4898]: I0313 14:33:28.432096 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mt6t6"] Mar 13 14:33:28 crc kubenswrapper[4898]: I0313 14:33:28.436466 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mt6t6" Mar 13 14:33:28 crc kubenswrapper[4898]: I0313 14:33:28.449486 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mt6t6"] Mar 13 14:33:28 crc kubenswrapper[4898]: I0313 14:33:28.517798 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a1116c6-c423-4585-af50-c9ecdca3720e-catalog-content\") pod \"redhat-operators-mt6t6\" (UID: \"5a1116c6-c423-4585-af50-c9ecdca3720e\") " pod="openshift-marketplace/redhat-operators-mt6t6" Mar 13 14:33:28 crc kubenswrapper[4898]: I0313 14:33:28.517871 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68xrp\" (UniqueName: \"kubernetes.io/projected/5a1116c6-c423-4585-af50-c9ecdca3720e-kube-api-access-68xrp\") pod \"redhat-operators-mt6t6\" (UID: \"5a1116c6-c423-4585-af50-c9ecdca3720e\") " pod="openshift-marketplace/redhat-operators-mt6t6" Mar 13 14:33:28 crc kubenswrapper[4898]: I0313 14:33:28.518458 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a1116c6-c423-4585-af50-c9ecdca3720e-utilities\") pod \"redhat-operators-mt6t6\" (UID: \"5a1116c6-c423-4585-af50-c9ecdca3720e\") " pod="openshift-marketplace/redhat-operators-mt6t6" Mar 13 14:33:28 crc kubenswrapper[4898]: I0313 14:33:28.623038 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a1116c6-c423-4585-af50-c9ecdca3720e-utilities\") pod \"redhat-operators-mt6t6\" (UID: \"5a1116c6-c423-4585-af50-c9ecdca3720e\") " pod="openshift-marketplace/redhat-operators-mt6t6" Mar 13 14:33:28 crc kubenswrapper[4898]: I0313 14:33:28.623630 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a1116c6-c423-4585-af50-c9ecdca3720e-catalog-content\") pod \"redhat-operators-mt6t6\" (UID: \"5a1116c6-c423-4585-af50-c9ecdca3720e\") " pod="openshift-marketplace/redhat-operators-mt6t6" Mar 13 14:33:28 crc kubenswrapper[4898]: I0313 14:33:28.623686 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68xrp\" (UniqueName: \"kubernetes.io/projected/5a1116c6-c423-4585-af50-c9ecdca3720e-kube-api-access-68xrp\") pod \"redhat-operators-mt6t6\" (UID: \"5a1116c6-c423-4585-af50-c9ecdca3720e\") " pod="openshift-marketplace/redhat-operators-mt6t6" Mar 13 14:33:28 crc kubenswrapper[4898]: I0313 14:33:28.624891 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a1116c6-c423-4585-af50-c9ecdca3720e-utilities\") pod \"redhat-operators-mt6t6\" (UID: \"5a1116c6-c423-4585-af50-c9ecdca3720e\") " pod="openshift-marketplace/redhat-operators-mt6t6" Mar 13 14:33:28 crc kubenswrapper[4898]: I0313 14:33:28.625232 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a1116c6-c423-4585-af50-c9ecdca3720e-catalog-content\") pod \"redhat-operators-mt6t6\" (UID: \"5a1116c6-c423-4585-af50-c9ecdca3720e\") " pod="openshift-marketplace/redhat-operators-mt6t6" Mar 13 14:33:28 crc kubenswrapper[4898]: I0313 14:33:28.646417 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68xrp\" 
(UniqueName: \"kubernetes.io/projected/5a1116c6-c423-4585-af50-c9ecdca3720e-kube-api-access-68xrp\") pod \"redhat-operators-mt6t6\" (UID: \"5a1116c6-c423-4585-af50-c9ecdca3720e\") " pod="openshift-marketplace/redhat-operators-mt6t6" Mar 13 14:33:28 crc kubenswrapper[4898]: I0313 14:33:28.779887 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mt6t6" Mar 13 14:33:29 crc kubenswrapper[4898]: I0313 14:33:29.327350 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mt6t6"] Mar 13 14:33:30 crc kubenswrapper[4898]: I0313 14:33:30.019206 4898 generic.go:334] "Generic (PLEG): container finished" podID="5a1116c6-c423-4585-af50-c9ecdca3720e" containerID="affc0f5381f8e5277306c781cff3a6263e539776e352fbab467b081645fab210" exitCode=0 Mar 13 14:33:30 crc kubenswrapper[4898]: I0313 14:33:30.019260 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mt6t6" event={"ID":"5a1116c6-c423-4585-af50-c9ecdca3720e","Type":"ContainerDied","Data":"affc0f5381f8e5277306c781cff3a6263e539776e352fbab467b081645fab210"} Mar 13 14:33:30 crc kubenswrapper[4898]: I0313 14:33:30.019445 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mt6t6" event={"ID":"5a1116c6-c423-4585-af50-c9ecdca3720e","Type":"ContainerStarted","Data":"b30638434c0fe439393ecdc839cda22c240c59580f70c0f1734ebb6f4ce66486"} Mar 13 14:33:31 crc kubenswrapper[4898]: I0313 14:33:31.038596 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mt6t6" event={"ID":"5a1116c6-c423-4585-af50-c9ecdca3720e","Type":"ContainerStarted","Data":"7db8dd08296576b12cfca554683282ed44b44eb4b878ac17306195ad12f12916"} Mar 13 14:33:32 crc kubenswrapper[4898]: I0313 14:33:32.052762 4898 generic.go:334] "Generic (PLEG): container finished" podID="8b4abb6a-5797-47be-96a0-69173649e5fa" 
containerID="4c566f53698dc8ae77e0ae8b01f4afc7b8880694d922ac5eedd84e461b0b86f6" exitCode=0 Mar 13 14:33:32 crc kubenswrapper[4898]: I0313 14:33:32.052816 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tr644" event={"ID":"8b4abb6a-5797-47be-96a0-69173649e5fa","Type":"ContainerDied","Data":"4c566f53698dc8ae77e0ae8b01f4afc7b8880694d922ac5eedd84e461b0b86f6"} Mar 13 14:33:33 crc kubenswrapper[4898]: I0313 14:33:33.676677 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tr644" Mar 13 14:33:33 crc kubenswrapper[4898]: I0313 14:33:33.769423 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8b4abb6a-5797-47be-96a0-69173649e5fa-ssh-key-openstack-edpm-ipam\") pod \"8b4abb6a-5797-47be-96a0-69173649e5fa\" (UID: \"8b4abb6a-5797-47be-96a0-69173649e5fa\") " Mar 13 14:33:33 crc kubenswrapper[4898]: I0313 14:33:33.769669 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-szptc\" (UniqueName: \"kubernetes.io/projected/8b4abb6a-5797-47be-96a0-69173649e5fa-kube-api-access-szptc\") pod \"8b4abb6a-5797-47be-96a0-69173649e5fa\" (UID: \"8b4abb6a-5797-47be-96a0-69173649e5fa\") " Mar 13 14:33:33 crc kubenswrapper[4898]: I0313 14:33:33.769797 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8b4abb6a-5797-47be-96a0-69173649e5fa-inventory\") pod \"8b4abb6a-5797-47be-96a0-69173649e5fa\" (UID: \"8b4abb6a-5797-47be-96a0-69173649e5fa\") " Mar 13 14:33:33 crc kubenswrapper[4898]: I0313 14:33:33.776178 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b4abb6a-5797-47be-96a0-69173649e5fa-kube-api-access-szptc" (OuterVolumeSpecName: "kube-api-access-szptc") 
pod "8b4abb6a-5797-47be-96a0-69173649e5fa" (UID: "8b4abb6a-5797-47be-96a0-69173649e5fa"). InnerVolumeSpecName "kube-api-access-szptc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:33:33 crc kubenswrapper[4898]: I0313 14:33:33.807762 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b4abb6a-5797-47be-96a0-69173649e5fa-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "8b4abb6a-5797-47be-96a0-69173649e5fa" (UID: "8b4abb6a-5797-47be-96a0-69173649e5fa"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:33:33 crc kubenswrapper[4898]: I0313 14:33:33.812972 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b4abb6a-5797-47be-96a0-69173649e5fa-inventory" (OuterVolumeSpecName: "inventory") pod "8b4abb6a-5797-47be-96a0-69173649e5fa" (UID: "8b4abb6a-5797-47be-96a0-69173649e5fa"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:33:33 crc kubenswrapper[4898]: I0313 14:33:33.873794 4898 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8b4abb6a-5797-47be-96a0-69173649e5fa-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 14:33:33 crc kubenswrapper[4898]: I0313 14:33:33.874913 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-szptc\" (UniqueName: \"kubernetes.io/projected/8b4abb6a-5797-47be-96a0-69173649e5fa-kube-api-access-szptc\") on node \"crc\" DevicePath \"\"" Mar 13 14:33:33 crc kubenswrapper[4898]: I0313 14:33:33.874932 4898 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8b4abb6a-5797-47be-96a0-69173649e5fa-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 14:33:34 crc kubenswrapper[4898]: I0313 14:33:34.075709 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tr644" event={"ID":"8b4abb6a-5797-47be-96a0-69173649e5fa","Type":"ContainerDied","Data":"f1afaa6de53cd09356160f6667e20e603147bc41c6c165a1557006dc66536ae1"} Mar 13 14:33:34 crc kubenswrapper[4898]: I0313 14:33:34.075756 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1afaa6de53cd09356160f6667e20e603147bc41c6c165a1557006dc66536ae1" Mar 13 14:33:34 crc kubenswrapper[4898]: I0313 14:33:34.075856 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tr644" Mar 13 14:33:34 crc kubenswrapper[4898]: I0313 14:33:34.210697 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5s7mz"] Mar 13 14:33:34 crc kubenswrapper[4898]: E0313 14:33:34.211433 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b4abb6a-5797-47be-96a0-69173649e5fa" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 13 14:33:34 crc kubenswrapper[4898]: I0313 14:33:34.211470 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b4abb6a-5797-47be-96a0-69173649e5fa" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 13 14:33:34 crc kubenswrapper[4898]: I0313 14:33:34.211787 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b4abb6a-5797-47be-96a0-69173649e5fa" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 13 14:33:34 crc kubenswrapper[4898]: I0313 14:33:34.212946 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5s7mz" Mar 13 14:33:34 crc kubenswrapper[4898]: I0313 14:33:34.215922 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 14:33:34 crc kubenswrapper[4898]: I0313 14:33:34.215971 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 14:33:34 crc kubenswrapper[4898]: I0313 14:33:34.216002 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zsddr" Mar 13 14:33:34 crc kubenswrapper[4898]: I0313 14:33:34.215926 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 14:33:34 crc kubenswrapper[4898]: I0313 14:33:34.226844 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5s7mz"] Mar 13 14:33:34 crc kubenswrapper[4898]: I0313 14:33:34.286114 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ac094822-6272-4730-ab0b-16f0116426b5-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5s7mz\" (UID: \"ac094822-6272-4730-ab0b-16f0116426b5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5s7mz" Mar 13 14:33:34 crc kubenswrapper[4898]: I0313 14:33:34.286215 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ac094822-6272-4730-ab0b-16f0116426b5-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5s7mz\" (UID: \"ac094822-6272-4730-ab0b-16f0116426b5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5s7mz" Mar 13 14:33:34 crc kubenswrapper[4898]: I0313 14:33:34.286249 
4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsqr5\" (UniqueName: \"kubernetes.io/projected/ac094822-6272-4730-ab0b-16f0116426b5-kube-api-access-zsqr5\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5s7mz\" (UID: \"ac094822-6272-4730-ab0b-16f0116426b5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5s7mz" Mar 13 14:33:34 crc kubenswrapper[4898]: I0313 14:33:34.389786 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ac094822-6272-4730-ab0b-16f0116426b5-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5s7mz\" (UID: \"ac094822-6272-4730-ab0b-16f0116426b5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5s7mz" Mar 13 14:33:34 crc kubenswrapper[4898]: I0313 14:33:34.389933 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ac094822-6272-4730-ab0b-16f0116426b5-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5s7mz\" (UID: \"ac094822-6272-4730-ab0b-16f0116426b5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5s7mz" Mar 13 14:33:34 crc kubenswrapper[4898]: I0313 14:33:34.389978 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsqr5\" (UniqueName: \"kubernetes.io/projected/ac094822-6272-4730-ab0b-16f0116426b5-kube-api-access-zsqr5\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5s7mz\" (UID: \"ac094822-6272-4730-ab0b-16f0116426b5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5s7mz" Mar 13 14:33:34 crc kubenswrapper[4898]: I0313 14:33:34.396130 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/ac094822-6272-4730-ab0b-16f0116426b5-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5s7mz\" (UID: \"ac094822-6272-4730-ab0b-16f0116426b5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5s7mz" Mar 13 14:33:34 crc kubenswrapper[4898]: I0313 14:33:34.398334 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ac094822-6272-4730-ab0b-16f0116426b5-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5s7mz\" (UID: \"ac094822-6272-4730-ab0b-16f0116426b5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5s7mz" Mar 13 14:33:34 crc kubenswrapper[4898]: I0313 14:33:34.414243 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsqr5\" (UniqueName: \"kubernetes.io/projected/ac094822-6272-4730-ab0b-16f0116426b5-kube-api-access-zsqr5\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5s7mz\" (UID: \"ac094822-6272-4730-ab0b-16f0116426b5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5s7mz" Mar 13 14:33:34 crc kubenswrapper[4898]: I0313 14:33:34.538528 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5s7mz" Mar 13 14:33:35 crc kubenswrapper[4898]: I0313 14:33:35.068796 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5s7mz"] Mar 13 14:33:35 crc kubenswrapper[4898]: W0313 14:33:35.076759 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac094822_6272_4730_ab0b_16f0116426b5.slice/crio-704f436dc38746438fb453814824552f6f864e1b290a23b87caa2e264e34538d WatchSource:0}: Error finding container 704f436dc38746438fb453814824552f6f864e1b290a23b87caa2e264e34538d: Status 404 returned error can't find the container with id 704f436dc38746438fb453814824552f6f864e1b290a23b87caa2e264e34538d Mar 13 14:33:36 crc kubenswrapper[4898]: I0313 14:33:36.033884 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-crnr4"] Mar 13 14:33:36 crc kubenswrapper[4898]: I0313 14:33:36.047366 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-crnr4"] Mar 13 14:33:36 crc kubenswrapper[4898]: I0313 14:33:36.101050 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5s7mz" event={"ID":"ac094822-6272-4730-ab0b-16f0116426b5","Type":"ContainerStarted","Data":"75351a338e412d11bdcadf255cb40c23212372a818b23c31d819578fbd7526fe"} Mar 13 14:33:36 crc kubenswrapper[4898]: I0313 14:33:36.101121 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5s7mz" event={"ID":"ac094822-6272-4730-ab0b-16f0116426b5","Type":"ContainerStarted","Data":"704f436dc38746438fb453814824552f6f864e1b290a23b87caa2e264e34538d"} Mar 13 14:33:36 crc kubenswrapper[4898]: I0313 14:33:36.125883 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5s7mz" podStartSLOduration=1.581065421 podStartE2EDuration="2.125857591s" podCreationTimestamp="2026-03-13 14:33:34 +0000 UTC" firstStartedPulling="2026-03-13 14:33:35.085177975 +0000 UTC m=+2250.086766224" lastFinishedPulling="2026-03-13 14:33:35.629970145 +0000 UTC m=+2250.631558394" observedRunningTime="2026-03-13 14:33:36.121334367 +0000 UTC m=+2251.122922616" watchObservedRunningTime="2026-03-13 14:33:36.125857591 +0000 UTC m=+2251.127445840" Mar 13 14:33:37 crc kubenswrapper[4898]: I0313 14:33:37.037876 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-fc6a-account-create-update-cfl2q"] Mar 13 14:33:37 crc kubenswrapper[4898]: I0313 14:33:37.060014 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-fc6a-account-create-update-cfl2q"] Mar 13 14:33:37 crc kubenswrapper[4898]: I0313 14:33:37.115685 4898 generic.go:334] "Generic (PLEG): container finished" podID="5a1116c6-c423-4585-af50-c9ecdca3720e" containerID="7db8dd08296576b12cfca554683282ed44b44eb4b878ac17306195ad12f12916" exitCode=0 Mar 13 14:33:37 crc kubenswrapper[4898]: I0313 14:33:37.115759 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mt6t6" event={"ID":"5a1116c6-c423-4585-af50-c9ecdca3720e","Type":"ContainerDied","Data":"7db8dd08296576b12cfca554683282ed44b44eb4b878ac17306195ad12f12916"} Mar 13 14:33:37 crc kubenswrapper[4898]: I0313 14:33:37.765019 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d29378e-424d-4831-baf4-b59a75072097" path="/var/lib/kubelet/pods/4d29378e-424d-4831-baf4-b59a75072097/volumes" Mar 13 14:33:37 crc kubenswrapper[4898]: I0313 14:33:37.766242 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e50eec10-99ce-4611-8cf4-8f4999146339" path="/var/lib/kubelet/pods/e50eec10-99ce-4611-8cf4-8f4999146339/volumes" Mar 13 14:33:38 crc kubenswrapper[4898]: I0313 14:33:38.131186 
4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mt6t6" event={"ID":"5a1116c6-c423-4585-af50-c9ecdca3720e","Type":"ContainerStarted","Data":"14c1d4733085fbbb2a211aef675271290bff0540a83ab98d97529e0f6e0ef44f"} Mar 13 14:33:38 crc kubenswrapper[4898]: I0313 14:33:38.170548 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mt6t6" podStartSLOduration=2.519290723 podStartE2EDuration="10.170520487s" podCreationTimestamp="2026-03-13 14:33:28 +0000 UTC" firstStartedPulling="2026-03-13 14:33:30.022206457 +0000 UTC m=+2245.023794696" lastFinishedPulling="2026-03-13 14:33:37.673436181 +0000 UTC m=+2252.675024460" observedRunningTime="2026-03-13 14:33:38.158832872 +0000 UTC m=+2253.160421131" watchObservedRunningTime="2026-03-13 14:33:38.170520487 +0000 UTC m=+2253.172108736" Mar 13 14:33:38 crc kubenswrapper[4898]: I0313 14:33:38.781461 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mt6t6" Mar 13 14:33:38 crc kubenswrapper[4898]: I0313 14:33:38.781860 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mt6t6" Mar 13 14:33:38 crc kubenswrapper[4898]: I0313 14:33:38.816181 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-88s69"] Mar 13 14:33:38 crc kubenswrapper[4898]: I0313 14:33:38.822689 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-88s69" Mar 13 14:33:38 crc kubenswrapper[4898]: I0313 14:33:38.837027 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-88s69"] Mar 13 14:33:38 crc kubenswrapper[4898]: I0313 14:33:38.921617 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0528f01c-62c6-4665-9b64-b20182ed6aad-utilities\") pod \"redhat-marketplace-88s69\" (UID: \"0528f01c-62c6-4665-9b64-b20182ed6aad\") " pod="openshift-marketplace/redhat-marketplace-88s69" Mar 13 14:33:38 crc kubenswrapper[4898]: I0313 14:33:38.921722 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncrlc\" (UniqueName: \"kubernetes.io/projected/0528f01c-62c6-4665-9b64-b20182ed6aad-kube-api-access-ncrlc\") pod \"redhat-marketplace-88s69\" (UID: \"0528f01c-62c6-4665-9b64-b20182ed6aad\") " pod="openshift-marketplace/redhat-marketplace-88s69" Mar 13 14:33:38 crc kubenswrapper[4898]: I0313 14:33:38.921784 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0528f01c-62c6-4665-9b64-b20182ed6aad-catalog-content\") pod \"redhat-marketplace-88s69\" (UID: \"0528f01c-62c6-4665-9b64-b20182ed6aad\") " pod="openshift-marketplace/redhat-marketplace-88s69" Mar 13 14:33:39 crc kubenswrapper[4898]: I0313 14:33:39.024513 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0528f01c-62c6-4665-9b64-b20182ed6aad-catalog-content\") pod \"redhat-marketplace-88s69\" (UID: \"0528f01c-62c6-4665-9b64-b20182ed6aad\") " pod="openshift-marketplace/redhat-marketplace-88s69" Mar 13 14:33:39 crc kubenswrapper[4898]: I0313 14:33:39.024703 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0528f01c-62c6-4665-9b64-b20182ed6aad-utilities\") pod \"redhat-marketplace-88s69\" (UID: \"0528f01c-62c6-4665-9b64-b20182ed6aad\") " pod="openshift-marketplace/redhat-marketplace-88s69" Mar 13 14:33:39 crc kubenswrapper[4898]: I0313 14:33:39.024754 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncrlc\" (UniqueName: \"kubernetes.io/projected/0528f01c-62c6-4665-9b64-b20182ed6aad-kube-api-access-ncrlc\") pod \"redhat-marketplace-88s69\" (UID: \"0528f01c-62c6-4665-9b64-b20182ed6aad\") " pod="openshift-marketplace/redhat-marketplace-88s69" Mar 13 14:33:39 crc kubenswrapper[4898]: I0313 14:33:39.025088 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0528f01c-62c6-4665-9b64-b20182ed6aad-catalog-content\") pod \"redhat-marketplace-88s69\" (UID: \"0528f01c-62c6-4665-9b64-b20182ed6aad\") " pod="openshift-marketplace/redhat-marketplace-88s69" Mar 13 14:33:39 crc kubenswrapper[4898]: I0313 14:33:39.025769 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0528f01c-62c6-4665-9b64-b20182ed6aad-utilities\") pod \"redhat-marketplace-88s69\" (UID: \"0528f01c-62c6-4665-9b64-b20182ed6aad\") " pod="openshift-marketplace/redhat-marketplace-88s69" Mar 13 14:33:39 crc kubenswrapper[4898]: I0313 14:33:39.053642 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncrlc\" (UniqueName: \"kubernetes.io/projected/0528f01c-62c6-4665-9b64-b20182ed6aad-kube-api-access-ncrlc\") pod \"redhat-marketplace-88s69\" (UID: \"0528f01c-62c6-4665-9b64-b20182ed6aad\") " pod="openshift-marketplace/redhat-marketplace-88s69" Mar 13 14:33:39 crc kubenswrapper[4898]: I0313 14:33:39.167159 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-88s69" Mar 13 14:33:39 crc kubenswrapper[4898]: I0313 14:33:39.674418 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-88s69"] Mar 13 14:33:39 crc kubenswrapper[4898]: I0313 14:33:39.745702 4898 scope.go:117] "RemoveContainer" containerID="4c3017aeb6114b905e5e159773ee55011a6eea4d13e889167f127e1524ada66d" Mar 13 14:33:39 crc kubenswrapper[4898]: I0313 14:33:39.814672 4898 scope.go:117] "RemoveContainer" containerID="74e75aa197a91664f89edd48f01ff5c813660e7743cf922315f4b6a18e19c506" Mar 13 14:33:39 crc kubenswrapper[4898]: I0313 14:33:39.839799 4898 scope.go:117] "RemoveContainer" containerID="903eebacfbe4709488e6b56c6ba47deec7c9d806d35c4763b770e46f79ef165a" Mar 13 14:33:39 crc kubenswrapper[4898]: I0313 14:33:39.883491 4898 scope.go:117] "RemoveContainer" containerID="d1ff8d0ca102a074d68ee12cd37ffd04a070a172037a36f9afafa4bd84128371" Mar 13 14:33:39 crc kubenswrapper[4898]: I0313 14:33:39.886579 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-mt6t6" podUID="5a1116c6-c423-4585-af50-c9ecdca3720e" containerName="registry-server" probeResult="failure" output=< Mar 13 14:33:39 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 14:33:39 crc kubenswrapper[4898]: > Mar 13 14:33:39 crc kubenswrapper[4898]: I0313 14:33:39.938805 4898 scope.go:117] "RemoveContainer" containerID="7e9f2307a91699c726a3f93d044663fb844450acd6da5dd38c51549451b97bc8" Mar 13 14:33:40 crc kubenswrapper[4898]: I0313 14:33:40.153138 4898 generic.go:334] "Generic (PLEG): container finished" podID="0528f01c-62c6-4665-9b64-b20182ed6aad" containerID="1a058ddfeeccde23c6d339f5541c08affddacfb101ea296c75f47ab9c087ad81" exitCode=0 Mar 13 14:33:40 crc kubenswrapper[4898]: I0313 14:33:40.153257 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-88s69" 
event={"ID":"0528f01c-62c6-4665-9b64-b20182ed6aad","Type":"ContainerDied","Data":"1a058ddfeeccde23c6d339f5541c08affddacfb101ea296c75f47ab9c087ad81"} Mar 13 14:33:40 crc kubenswrapper[4898]: I0313 14:33:40.153588 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-88s69" event={"ID":"0528f01c-62c6-4665-9b64-b20182ed6aad","Type":"ContainerStarted","Data":"3e76bb401490d4fcb76c9945e01e87306f712102e853a1d3e262b2dbb4c6cc18"} Mar 13 14:33:41 crc kubenswrapper[4898]: I0313 14:33:41.172016 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-88s69" event={"ID":"0528f01c-62c6-4665-9b64-b20182ed6aad","Type":"ContainerStarted","Data":"68cc17e658b3ea1ee0994d975dfdcde55ffedb9efd2ab7cfbcb04e5d5f03ee36"} Mar 13 14:33:43 crc kubenswrapper[4898]: I0313 14:33:43.205435 4898 generic.go:334] "Generic (PLEG): container finished" podID="0528f01c-62c6-4665-9b64-b20182ed6aad" containerID="68cc17e658b3ea1ee0994d975dfdcde55ffedb9efd2ab7cfbcb04e5d5f03ee36" exitCode=0 Mar 13 14:33:43 crc kubenswrapper[4898]: I0313 14:33:43.205498 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-88s69" event={"ID":"0528f01c-62c6-4665-9b64-b20182ed6aad","Type":"ContainerDied","Data":"68cc17e658b3ea1ee0994d975dfdcde55ffedb9efd2ab7cfbcb04e5d5f03ee36"} Mar 13 14:33:44 crc kubenswrapper[4898]: I0313 14:33:44.220415 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-88s69" event={"ID":"0528f01c-62c6-4665-9b64-b20182ed6aad","Type":"ContainerStarted","Data":"77a925a30178dab323a899e7263be15fe8380315d4020f9d292c755934dc7a15"} Mar 13 14:33:44 crc kubenswrapper[4898]: I0313 14:33:44.246786 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-88s69" podStartSLOduration=2.523557377 podStartE2EDuration="6.246767749s" podCreationTimestamp="2026-03-13 14:33:38 +0000 
UTC" firstStartedPulling="2026-03-13 14:33:40.154957014 +0000 UTC m=+2255.156545253" lastFinishedPulling="2026-03-13 14:33:43.878167376 +0000 UTC m=+2258.879755625" observedRunningTime="2026-03-13 14:33:44.240133202 +0000 UTC m=+2259.241721461" watchObservedRunningTime="2026-03-13 14:33:44.246767749 +0000 UTC m=+2259.248355998" Mar 13 14:33:49 crc kubenswrapper[4898]: I0313 14:33:49.134177 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 14:33:49 crc kubenswrapper[4898]: I0313 14:33:49.134752 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 14:33:49 crc kubenswrapper[4898]: I0313 14:33:49.134802 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" Mar 13 14:33:49 crc kubenswrapper[4898]: I0313 14:33:49.135745 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8cc019eb997b8fb12b1a6d732269f56fd31f05139eb4098b4fb342e4364fe0db"} pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 14:33:49 crc kubenswrapper[4898]: I0313 14:33:49.135804 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" 
containerName="machine-config-daemon" containerID="cri-o://8cc019eb997b8fb12b1a6d732269f56fd31f05139eb4098b4fb342e4364fe0db" gracePeriod=600 Mar 13 14:33:49 crc kubenswrapper[4898]: I0313 14:33:49.168316 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-88s69" Mar 13 14:33:49 crc kubenswrapper[4898]: I0313 14:33:49.168491 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-88s69" Mar 13 14:33:49 crc kubenswrapper[4898]: E0313 14:33:49.270534 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:33:49 crc kubenswrapper[4898]: I0313 14:33:49.282347 4898 generic.go:334] "Generic (PLEG): container finished" podID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerID="8cc019eb997b8fb12b1a6d732269f56fd31f05139eb4098b4fb342e4364fe0db" exitCode=0 Mar 13 14:33:49 crc kubenswrapper[4898]: I0313 14:33:49.282338 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerDied","Data":"8cc019eb997b8fb12b1a6d732269f56fd31f05139eb4098b4fb342e4364fe0db"} Mar 13 14:33:49 crc kubenswrapper[4898]: I0313 14:33:49.282432 4898 scope.go:117] "RemoveContainer" containerID="544b53b5cdd0293005863b343628de53b83869ce5cd2c798b19c01abba2b5bc8" Mar 13 14:33:49 crc kubenswrapper[4898]: I0313 14:33:49.283255 4898 scope.go:117] "RemoveContainer" containerID="8cc019eb997b8fb12b1a6d732269f56fd31f05139eb4098b4fb342e4364fe0db" Mar 13 14:33:49 crc 
kubenswrapper[4898]: E0313 14:33:49.283684 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:33:49 crc kubenswrapper[4898]: I0313 14:33:49.863700 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-mt6t6" podUID="5a1116c6-c423-4585-af50-c9ecdca3720e" containerName="registry-server" probeResult="failure" output=< Mar 13 14:33:49 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 14:33:49 crc kubenswrapper[4898]: > Mar 13 14:33:50 crc kubenswrapper[4898]: I0313 14:33:50.234473 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-88s69" podUID="0528f01c-62c6-4665-9b64-b20182ed6aad" containerName="registry-server" probeResult="failure" output=< Mar 13 14:33:50 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 14:33:50 crc kubenswrapper[4898]: > Mar 13 14:33:59 crc kubenswrapper[4898]: I0313 14:33:59.243654 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-88s69" Mar 13 14:33:59 crc kubenswrapper[4898]: I0313 14:33:59.307784 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-88s69" Mar 13 14:33:59 crc kubenswrapper[4898]: I0313 14:33:59.640803 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-88s69"] Mar 13 14:33:59 crc kubenswrapper[4898]: I0313 14:33:59.828297 4898 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-mt6t6" podUID="5a1116c6-c423-4585-af50-c9ecdca3720e" containerName="registry-server" probeResult="failure" output=< Mar 13 14:33:59 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 14:33:59 crc kubenswrapper[4898]: > Mar 13 14:34:00 crc kubenswrapper[4898]: I0313 14:34:00.139671 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556874-5w2n9"] Mar 13 14:34:00 crc kubenswrapper[4898]: I0313 14:34:00.141853 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556874-5w2n9" Mar 13 14:34:00 crc kubenswrapper[4898]: I0313 14:34:00.144128 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps" Mar 13 14:34:00 crc kubenswrapper[4898]: I0313 14:34:00.145303 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 14:34:00 crc kubenswrapper[4898]: I0313 14:34:00.154890 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 14:34:00 crc kubenswrapper[4898]: I0313 14:34:00.164842 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556874-5w2n9"] Mar 13 14:34:00 crc kubenswrapper[4898]: I0313 14:34:00.221724 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9g5h\" (UniqueName: \"kubernetes.io/projected/eb068c44-8492-4ed4-973b-f1233d9db645-kube-api-access-w9g5h\") pod \"auto-csr-approver-29556874-5w2n9\" (UID: \"eb068c44-8492-4ed4-973b-f1233d9db645\") " pod="openshift-infra/auto-csr-approver-29556874-5w2n9" Mar 13 14:34:00 crc kubenswrapper[4898]: I0313 14:34:00.323929 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9g5h\" (UniqueName: 
\"kubernetes.io/projected/eb068c44-8492-4ed4-973b-f1233d9db645-kube-api-access-w9g5h\") pod \"auto-csr-approver-29556874-5w2n9\" (UID: \"eb068c44-8492-4ed4-973b-f1233d9db645\") " pod="openshift-infra/auto-csr-approver-29556874-5w2n9" Mar 13 14:34:00 crc kubenswrapper[4898]: I0313 14:34:00.353789 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9g5h\" (UniqueName: \"kubernetes.io/projected/eb068c44-8492-4ed4-973b-f1233d9db645-kube-api-access-w9g5h\") pod \"auto-csr-approver-29556874-5w2n9\" (UID: \"eb068c44-8492-4ed4-973b-f1233d9db645\") " pod="openshift-infra/auto-csr-approver-29556874-5w2n9" Mar 13 14:34:00 crc kubenswrapper[4898]: I0313 14:34:00.421106 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-88s69" podUID="0528f01c-62c6-4665-9b64-b20182ed6aad" containerName="registry-server" containerID="cri-o://77a925a30178dab323a899e7263be15fe8380315d4020f9d292c755934dc7a15" gracePeriod=2 Mar 13 14:34:00 crc kubenswrapper[4898]: I0313 14:34:00.467824 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556874-5w2n9" Mar 13 14:34:00 crc kubenswrapper[4898]: I0313 14:34:00.739410 4898 scope.go:117] "RemoveContainer" containerID="8cc019eb997b8fb12b1a6d732269f56fd31f05139eb4098b4fb342e4364fe0db" Mar 13 14:34:00 crc kubenswrapper[4898]: E0313 14:34:00.739999 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:34:01 crc kubenswrapper[4898]: I0313 14:34:01.026217 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-88s69" Mar 13 14:34:01 crc kubenswrapper[4898]: W0313 14:34:01.034040 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb068c44_8492_4ed4_973b_f1233d9db645.slice/crio-800944d00480e2fc3b6f3f4e98fb3f11048a2d231376dd91c90d61f1dde663db WatchSource:0}: Error finding container 800944d00480e2fc3b6f3f4e98fb3f11048a2d231376dd91c90d61f1dde663db: Status 404 returned error can't find the container with id 800944d00480e2fc3b6f3f4e98fb3f11048a2d231376dd91c90d61f1dde663db Mar 13 14:34:01 crc kubenswrapper[4898]: I0313 14:34:01.041082 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556874-5w2n9"] Mar 13 14:34:01 crc kubenswrapper[4898]: I0313 14:34:01.146593 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0528f01c-62c6-4665-9b64-b20182ed6aad-utilities\") pod \"0528f01c-62c6-4665-9b64-b20182ed6aad\" (UID: \"0528f01c-62c6-4665-9b64-b20182ed6aad\") " Mar 13 14:34:01 crc kubenswrapper[4898]: I0313 14:34:01.146880 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0528f01c-62c6-4665-9b64-b20182ed6aad-catalog-content\") pod \"0528f01c-62c6-4665-9b64-b20182ed6aad\" (UID: \"0528f01c-62c6-4665-9b64-b20182ed6aad\") " Mar 13 14:34:01 crc kubenswrapper[4898]: I0313 14:34:01.146962 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ncrlc\" (UniqueName: \"kubernetes.io/projected/0528f01c-62c6-4665-9b64-b20182ed6aad-kube-api-access-ncrlc\") pod \"0528f01c-62c6-4665-9b64-b20182ed6aad\" (UID: \"0528f01c-62c6-4665-9b64-b20182ed6aad\") " Mar 13 14:34:01 crc kubenswrapper[4898]: I0313 14:34:01.149165 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/0528f01c-62c6-4665-9b64-b20182ed6aad-utilities" (OuterVolumeSpecName: "utilities") pod "0528f01c-62c6-4665-9b64-b20182ed6aad" (UID: "0528f01c-62c6-4665-9b64-b20182ed6aad"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:34:01 crc kubenswrapper[4898]: I0313 14:34:01.154865 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0528f01c-62c6-4665-9b64-b20182ed6aad-kube-api-access-ncrlc" (OuterVolumeSpecName: "kube-api-access-ncrlc") pod "0528f01c-62c6-4665-9b64-b20182ed6aad" (UID: "0528f01c-62c6-4665-9b64-b20182ed6aad"). InnerVolumeSpecName "kube-api-access-ncrlc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:34:01 crc kubenswrapper[4898]: I0313 14:34:01.175256 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0528f01c-62c6-4665-9b64-b20182ed6aad-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0528f01c-62c6-4665-9b64-b20182ed6aad" (UID: "0528f01c-62c6-4665-9b64-b20182ed6aad"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:34:01 crc kubenswrapper[4898]: I0313 14:34:01.249971 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0528f01c-62c6-4665-9b64-b20182ed6aad-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 14:34:01 crc kubenswrapper[4898]: I0313 14:34:01.250025 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ncrlc\" (UniqueName: \"kubernetes.io/projected/0528f01c-62c6-4665-9b64-b20182ed6aad-kube-api-access-ncrlc\") on node \"crc\" DevicePath \"\"" Mar 13 14:34:01 crc kubenswrapper[4898]: I0313 14:34:01.250049 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0528f01c-62c6-4665-9b64-b20182ed6aad-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 14:34:01 crc kubenswrapper[4898]: I0313 14:34:01.435226 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556874-5w2n9" event={"ID":"eb068c44-8492-4ed4-973b-f1233d9db645","Type":"ContainerStarted","Data":"800944d00480e2fc3b6f3f4e98fb3f11048a2d231376dd91c90d61f1dde663db"} Mar 13 14:34:01 crc kubenswrapper[4898]: I0313 14:34:01.441496 4898 generic.go:334] "Generic (PLEG): container finished" podID="0528f01c-62c6-4665-9b64-b20182ed6aad" containerID="77a925a30178dab323a899e7263be15fe8380315d4020f9d292c755934dc7a15" exitCode=0 Mar 13 14:34:01 crc kubenswrapper[4898]: I0313 14:34:01.441627 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-88s69" event={"ID":"0528f01c-62c6-4665-9b64-b20182ed6aad","Type":"ContainerDied","Data":"77a925a30178dab323a899e7263be15fe8380315d4020f9d292c755934dc7a15"} Mar 13 14:34:01 crc kubenswrapper[4898]: I0313 14:34:01.441657 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-88s69" Mar 13 14:34:01 crc kubenswrapper[4898]: I0313 14:34:01.441687 4898 scope.go:117] "RemoveContainer" containerID="77a925a30178dab323a899e7263be15fe8380315d4020f9d292c755934dc7a15" Mar 13 14:34:01 crc kubenswrapper[4898]: I0313 14:34:01.441673 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-88s69" event={"ID":"0528f01c-62c6-4665-9b64-b20182ed6aad","Type":"ContainerDied","Data":"3e76bb401490d4fcb76c9945e01e87306f712102e853a1d3e262b2dbb4c6cc18"} Mar 13 14:34:01 crc kubenswrapper[4898]: I0313 14:34:01.471641 4898 scope.go:117] "RemoveContainer" containerID="68cc17e658b3ea1ee0994d975dfdcde55ffedb9efd2ab7cfbcb04e5d5f03ee36" Mar 13 14:34:01 crc kubenswrapper[4898]: I0313 14:34:01.501620 4898 scope.go:117] "RemoveContainer" containerID="1a058ddfeeccde23c6d339f5541c08affddacfb101ea296c75f47ab9c087ad81" Mar 13 14:34:01 crc kubenswrapper[4898]: I0313 14:34:01.501851 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-88s69"] Mar 13 14:34:01 crc kubenswrapper[4898]: I0313 14:34:01.524495 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-88s69"] Mar 13 14:34:01 crc kubenswrapper[4898]: I0313 14:34:01.567545 4898 scope.go:117] "RemoveContainer" containerID="77a925a30178dab323a899e7263be15fe8380315d4020f9d292c755934dc7a15" Mar 13 14:34:01 crc kubenswrapper[4898]: E0313 14:34:01.568228 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77a925a30178dab323a899e7263be15fe8380315d4020f9d292c755934dc7a15\": container with ID starting with 77a925a30178dab323a899e7263be15fe8380315d4020f9d292c755934dc7a15 not found: ID does not exist" containerID="77a925a30178dab323a899e7263be15fe8380315d4020f9d292c755934dc7a15" Mar 13 14:34:01 crc kubenswrapper[4898]: I0313 14:34:01.568364 4898 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77a925a30178dab323a899e7263be15fe8380315d4020f9d292c755934dc7a15"} err="failed to get container status \"77a925a30178dab323a899e7263be15fe8380315d4020f9d292c755934dc7a15\": rpc error: code = NotFound desc = could not find container \"77a925a30178dab323a899e7263be15fe8380315d4020f9d292c755934dc7a15\": container with ID starting with 77a925a30178dab323a899e7263be15fe8380315d4020f9d292c755934dc7a15 not found: ID does not exist" Mar 13 14:34:01 crc kubenswrapper[4898]: I0313 14:34:01.568408 4898 scope.go:117] "RemoveContainer" containerID="68cc17e658b3ea1ee0994d975dfdcde55ffedb9efd2ab7cfbcb04e5d5f03ee36" Mar 13 14:34:01 crc kubenswrapper[4898]: E0313 14:34:01.568755 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68cc17e658b3ea1ee0994d975dfdcde55ffedb9efd2ab7cfbcb04e5d5f03ee36\": container with ID starting with 68cc17e658b3ea1ee0994d975dfdcde55ffedb9efd2ab7cfbcb04e5d5f03ee36 not found: ID does not exist" containerID="68cc17e658b3ea1ee0994d975dfdcde55ffedb9efd2ab7cfbcb04e5d5f03ee36" Mar 13 14:34:01 crc kubenswrapper[4898]: I0313 14:34:01.568803 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68cc17e658b3ea1ee0994d975dfdcde55ffedb9efd2ab7cfbcb04e5d5f03ee36"} err="failed to get container status \"68cc17e658b3ea1ee0994d975dfdcde55ffedb9efd2ab7cfbcb04e5d5f03ee36\": rpc error: code = NotFound desc = could not find container \"68cc17e658b3ea1ee0994d975dfdcde55ffedb9efd2ab7cfbcb04e5d5f03ee36\": container with ID starting with 68cc17e658b3ea1ee0994d975dfdcde55ffedb9efd2ab7cfbcb04e5d5f03ee36 not found: ID does not exist" Mar 13 14:34:01 crc kubenswrapper[4898]: I0313 14:34:01.568830 4898 scope.go:117] "RemoveContainer" containerID="1a058ddfeeccde23c6d339f5541c08affddacfb101ea296c75f47ab9c087ad81" Mar 13 14:34:01 crc kubenswrapper[4898]: E0313 
14:34:01.569160 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a058ddfeeccde23c6d339f5541c08affddacfb101ea296c75f47ab9c087ad81\": container with ID starting with 1a058ddfeeccde23c6d339f5541c08affddacfb101ea296c75f47ab9c087ad81 not found: ID does not exist" containerID="1a058ddfeeccde23c6d339f5541c08affddacfb101ea296c75f47ab9c087ad81" Mar 13 14:34:01 crc kubenswrapper[4898]: I0313 14:34:01.569204 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a058ddfeeccde23c6d339f5541c08affddacfb101ea296c75f47ab9c087ad81"} err="failed to get container status \"1a058ddfeeccde23c6d339f5541c08affddacfb101ea296c75f47ab9c087ad81\": rpc error: code = NotFound desc = could not find container \"1a058ddfeeccde23c6d339f5541c08affddacfb101ea296c75f47ab9c087ad81\": container with ID starting with 1a058ddfeeccde23c6d339f5541c08affddacfb101ea296c75f47ab9c087ad81 not found: ID does not exist" Mar 13 14:34:01 crc kubenswrapper[4898]: I0313 14:34:01.754272 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0528f01c-62c6-4665-9b64-b20182ed6aad" path="/var/lib/kubelet/pods/0528f01c-62c6-4665-9b64-b20182ed6aad/volumes" Mar 13 14:34:02 crc kubenswrapper[4898]: I0313 14:34:02.453928 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556874-5w2n9" event={"ID":"eb068c44-8492-4ed4-973b-f1233d9db645","Type":"ContainerStarted","Data":"408d12ea5e972e9b868ac62eb57c8cf1a207c59938c5862250b310c3d0d4947f"} Mar 13 14:34:02 crc kubenswrapper[4898]: I0313 14:34:02.475716 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556874-5w2n9" podStartSLOduration=1.441785961 podStartE2EDuration="2.475691867s" podCreationTimestamp="2026-03-13 14:34:00 +0000 UTC" firstStartedPulling="2026-03-13 14:34:01.036457092 +0000 UTC m=+2276.038045331" 
lastFinishedPulling="2026-03-13 14:34:02.070362998 +0000 UTC m=+2277.071951237" observedRunningTime="2026-03-13 14:34:02.471810099 +0000 UTC m=+2277.473398378" watchObservedRunningTime="2026-03-13 14:34:02.475691867 +0000 UTC m=+2277.477280136" Mar 13 14:34:03 crc kubenswrapper[4898]: I0313 14:34:03.469576 4898 generic.go:334] "Generic (PLEG): container finished" podID="eb068c44-8492-4ed4-973b-f1233d9db645" containerID="408d12ea5e972e9b868ac62eb57c8cf1a207c59938c5862250b310c3d0d4947f" exitCode=0 Mar 13 14:34:03 crc kubenswrapper[4898]: I0313 14:34:03.470167 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556874-5w2n9" event={"ID":"eb068c44-8492-4ed4-973b-f1233d9db645","Type":"ContainerDied","Data":"408d12ea5e972e9b868ac62eb57c8cf1a207c59938c5862250b310c3d0d4947f"} Mar 13 14:34:04 crc kubenswrapper[4898]: I0313 14:34:04.961083 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556874-5w2n9" Mar 13 14:34:05 crc kubenswrapper[4898]: I0313 14:34:05.051134 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9g5h\" (UniqueName: \"kubernetes.io/projected/eb068c44-8492-4ed4-973b-f1233d9db645-kube-api-access-w9g5h\") pod \"eb068c44-8492-4ed4-973b-f1233d9db645\" (UID: \"eb068c44-8492-4ed4-973b-f1233d9db645\") " Mar 13 14:34:05 crc kubenswrapper[4898]: I0313 14:34:05.077063 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb068c44-8492-4ed4-973b-f1233d9db645-kube-api-access-w9g5h" (OuterVolumeSpecName: "kube-api-access-w9g5h") pod "eb068c44-8492-4ed4-973b-f1233d9db645" (UID: "eb068c44-8492-4ed4-973b-f1233d9db645"). InnerVolumeSpecName "kube-api-access-w9g5h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:34:05 crc kubenswrapper[4898]: I0313 14:34:05.086364 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-tqzkv"] Mar 13 14:34:05 crc kubenswrapper[4898]: I0313 14:34:05.105466 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-tqzkv"] Mar 13 14:34:05 crc kubenswrapper[4898]: I0313 14:34:05.158173 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9g5h\" (UniqueName: \"kubernetes.io/projected/eb068c44-8492-4ed4-973b-f1233d9db645-kube-api-access-w9g5h\") on node \"crc\" DevicePath \"\"" Mar 13 14:34:05 crc kubenswrapper[4898]: I0313 14:34:05.500194 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556874-5w2n9" event={"ID":"eb068c44-8492-4ed4-973b-f1233d9db645","Type":"ContainerDied","Data":"800944d00480e2fc3b6f3f4e98fb3f11048a2d231376dd91c90d61f1dde663db"} Mar 13 14:34:05 crc kubenswrapper[4898]: I0313 14:34:05.500232 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="800944d00480e2fc3b6f3f4e98fb3f11048a2d231376dd91c90d61f1dde663db" Mar 13 14:34:05 crc kubenswrapper[4898]: I0313 14:34:05.500309 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556874-5w2n9"
Mar 13 14:34:05 crc kubenswrapper[4898]: I0313 14:34:05.757545 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53" path="/var/lib/kubelet/pods/bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53/volumes"
Mar 13 14:34:05 crc kubenswrapper[4898]: I0313 14:34:05.836126 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556868-9gt46"]
Mar 13 14:34:05 crc kubenswrapper[4898]: I0313 14:34:05.847735 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556868-9gt46"]
Mar 13 14:34:07 crc kubenswrapper[4898]: I0313 14:34:07.766353 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9565fbbb-2765-4ffb-a934-e5ddf9be1d17" path="/var/lib/kubelet/pods/9565fbbb-2765-4ffb-a934-e5ddf9be1d17/volumes"
Mar 13 14:34:08 crc kubenswrapper[4898]: I0313 14:34:08.831907 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mt6t6"
Mar 13 14:34:08 crc kubenswrapper[4898]: I0313 14:34:08.904068 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mt6t6"
Mar 13 14:34:10 crc kubenswrapper[4898]: I0313 14:34:10.010238 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mt6t6"]
Mar 13 14:34:10 crc kubenswrapper[4898]: I0313 14:34:10.563081 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-mt6t6" podUID="5a1116c6-c423-4585-af50-c9ecdca3720e" containerName="registry-server" containerID="cri-o://14c1d4733085fbbb2a211aef675271290bff0540a83ab98d97529e0f6e0ef44f" gracePeriod=2
Mar 13 14:34:11 crc kubenswrapper[4898]: I0313 14:34:11.122215 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mt6t6"
Mar 13 14:34:11 crc kubenswrapper[4898]: I0313 14:34:11.225064 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a1116c6-c423-4585-af50-c9ecdca3720e-utilities\") pod \"5a1116c6-c423-4585-af50-c9ecdca3720e\" (UID: \"5a1116c6-c423-4585-af50-c9ecdca3720e\") "
Mar 13 14:34:11 crc kubenswrapper[4898]: I0313 14:34:11.225114 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68xrp\" (UniqueName: \"kubernetes.io/projected/5a1116c6-c423-4585-af50-c9ecdca3720e-kube-api-access-68xrp\") pod \"5a1116c6-c423-4585-af50-c9ecdca3720e\" (UID: \"5a1116c6-c423-4585-af50-c9ecdca3720e\") "
Mar 13 14:34:11 crc kubenswrapper[4898]: I0313 14:34:11.225198 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a1116c6-c423-4585-af50-c9ecdca3720e-catalog-content\") pod \"5a1116c6-c423-4585-af50-c9ecdca3720e\" (UID: \"5a1116c6-c423-4585-af50-c9ecdca3720e\") "
Mar 13 14:34:11 crc kubenswrapper[4898]: I0313 14:34:11.226229 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a1116c6-c423-4585-af50-c9ecdca3720e-utilities" (OuterVolumeSpecName: "utilities") pod "5a1116c6-c423-4585-af50-c9ecdca3720e" (UID: "5a1116c6-c423-4585-af50-c9ecdca3720e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 14:34:11 crc kubenswrapper[4898]: I0313 14:34:11.234041 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a1116c6-c423-4585-af50-c9ecdca3720e-kube-api-access-68xrp" (OuterVolumeSpecName: "kube-api-access-68xrp") pod "5a1116c6-c423-4585-af50-c9ecdca3720e" (UID: "5a1116c6-c423-4585-af50-c9ecdca3720e"). InnerVolumeSpecName "kube-api-access-68xrp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:34:11 crc kubenswrapper[4898]: I0313 14:34:11.329082 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a1116c6-c423-4585-af50-c9ecdca3720e-utilities\") on node \"crc\" DevicePath \"\""
Mar 13 14:34:11 crc kubenswrapper[4898]: I0313 14:34:11.329139 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68xrp\" (UniqueName: \"kubernetes.io/projected/5a1116c6-c423-4585-af50-c9ecdca3720e-kube-api-access-68xrp\") on node \"crc\" DevicePath \"\""
Mar 13 14:34:11 crc kubenswrapper[4898]: I0313 14:34:11.353133 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a1116c6-c423-4585-af50-c9ecdca3720e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5a1116c6-c423-4585-af50-c9ecdca3720e" (UID: "5a1116c6-c423-4585-af50-c9ecdca3720e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 14:34:11 crc kubenswrapper[4898]: I0313 14:34:11.431653 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a1116c6-c423-4585-af50-c9ecdca3720e-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 13 14:34:11 crc kubenswrapper[4898]: I0313 14:34:11.575126 4898 generic.go:334] "Generic (PLEG): container finished" podID="5a1116c6-c423-4585-af50-c9ecdca3720e" containerID="14c1d4733085fbbb2a211aef675271290bff0540a83ab98d97529e0f6e0ef44f" exitCode=0
Mar 13 14:34:11 crc kubenswrapper[4898]: I0313 14:34:11.575182 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mt6t6"
Mar 13 14:34:11 crc kubenswrapper[4898]: I0313 14:34:11.575186 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mt6t6" event={"ID":"5a1116c6-c423-4585-af50-c9ecdca3720e","Type":"ContainerDied","Data":"14c1d4733085fbbb2a211aef675271290bff0540a83ab98d97529e0f6e0ef44f"}
Mar 13 14:34:11 crc kubenswrapper[4898]: I0313 14:34:11.575268 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mt6t6" event={"ID":"5a1116c6-c423-4585-af50-c9ecdca3720e","Type":"ContainerDied","Data":"b30638434c0fe439393ecdc839cda22c240c59580f70c0f1734ebb6f4ce66486"}
Mar 13 14:34:11 crc kubenswrapper[4898]: I0313 14:34:11.575300 4898 scope.go:117] "RemoveContainer" containerID="14c1d4733085fbbb2a211aef675271290bff0540a83ab98d97529e0f6e0ef44f"
Mar 13 14:34:11 crc kubenswrapper[4898]: I0313 14:34:11.603411 4898 scope.go:117] "RemoveContainer" containerID="7db8dd08296576b12cfca554683282ed44b44eb4b878ac17306195ad12f12916"
Mar 13 14:34:11 crc kubenswrapper[4898]: I0313 14:34:11.655177 4898 scope.go:117] "RemoveContainer" containerID="affc0f5381f8e5277306c781cff3a6263e539776e352fbab467b081645fab210"
Mar 13 14:34:11 crc kubenswrapper[4898]: I0313 14:34:11.661393 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mt6t6"]
Mar 13 14:34:11 crc kubenswrapper[4898]: I0313 14:34:11.671726 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-mt6t6"]
Mar 13 14:34:11 crc kubenswrapper[4898]: I0313 14:34:11.713718 4898 scope.go:117] "RemoveContainer" containerID="14c1d4733085fbbb2a211aef675271290bff0540a83ab98d97529e0f6e0ef44f"
Mar 13 14:34:11 crc kubenswrapper[4898]: E0313 14:34:11.714226 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14c1d4733085fbbb2a211aef675271290bff0540a83ab98d97529e0f6e0ef44f\": container with ID starting with 14c1d4733085fbbb2a211aef675271290bff0540a83ab98d97529e0f6e0ef44f not found: ID does not exist" containerID="14c1d4733085fbbb2a211aef675271290bff0540a83ab98d97529e0f6e0ef44f"
Mar 13 14:34:11 crc kubenswrapper[4898]: I0313 14:34:11.714256 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14c1d4733085fbbb2a211aef675271290bff0540a83ab98d97529e0f6e0ef44f"} err="failed to get container status \"14c1d4733085fbbb2a211aef675271290bff0540a83ab98d97529e0f6e0ef44f\": rpc error: code = NotFound desc = could not find container \"14c1d4733085fbbb2a211aef675271290bff0540a83ab98d97529e0f6e0ef44f\": container with ID starting with 14c1d4733085fbbb2a211aef675271290bff0540a83ab98d97529e0f6e0ef44f not found: ID does not exist"
Mar 13 14:34:11 crc kubenswrapper[4898]: I0313 14:34:11.714287 4898 scope.go:117] "RemoveContainer" containerID="7db8dd08296576b12cfca554683282ed44b44eb4b878ac17306195ad12f12916"
Mar 13 14:34:11 crc kubenswrapper[4898]: E0313 14:34:11.714614 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7db8dd08296576b12cfca554683282ed44b44eb4b878ac17306195ad12f12916\": container with ID starting with 7db8dd08296576b12cfca554683282ed44b44eb4b878ac17306195ad12f12916 not found: ID does not exist" containerID="7db8dd08296576b12cfca554683282ed44b44eb4b878ac17306195ad12f12916"
Mar 13 14:34:11 crc kubenswrapper[4898]: I0313 14:34:11.714667 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7db8dd08296576b12cfca554683282ed44b44eb4b878ac17306195ad12f12916"} err="failed to get container status \"7db8dd08296576b12cfca554683282ed44b44eb4b878ac17306195ad12f12916\": rpc error: code = NotFound desc = could not find container \"7db8dd08296576b12cfca554683282ed44b44eb4b878ac17306195ad12f12916\": container with ID starting with 7db8dd08296576b12cfca554683282ed44b44eb4b878ac17306195ad12f12916 not found: ID does not exist"
Mar 13 14:34:11 crc kubenswrapper[4898]: I0313 14:34:11.714705 4898 scope.go:117] "RemoveContainer" containerID="affc0f5381f8e5277306c781cff3a6263e539776e352fbab467b081645fab210"
Mar 13 14:34:11 crc kubenswrapper[4898]: E0313 14:34:11.715116 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"affc0f5381f8e5277306c781cff3a6263e539776e352fbab467b081645fab210\": container with ID starting with affc0f5381f8e5277306c781cff3a6263e539776e352fbab467b081645fab210 not found: ID does not exist" containerID="affc0f5381f8e5277306c781cff3a6263e539776e352fbab467b081645fab210"
Mar 13 14:34:11 crc kubenswrapper[4898]: I0313 14:34:11.715176 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"affc0f5381f8e5277306c781cff3a6263e539776e352fbab467b081645fab210"} err="failed to get container status \"affc0f5381f8e5277306c781cff3a6263e539776e352fbab467b081645fab210\": rpc error: code = NotFound desc = could not find container \"affc0f5381f8e5277306c781cff3a6263e539776e352fbab467b081645fab210\": container with ID starting with affc0f5381f8e5277306c781cff3a6263e539776e352fbab467b081645fab210 not found: ID does not exist"
Mar 13 14:34:11 crc kubenswrapper[4898]: I0313 14:34:11.751270 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a1116c6-c423-4585-af50-c9ecdca3720e" path="/var/lib/kubelet/pods/5a1116c6-c423-4585-af50-c9ecdca3720e/volumes"
Mar 13 14:34:12 crc kubenswrapper[4898]: I0313 14:34:12.740427 4898 scope.go:117] "RemoveContainer" containerID="8cc019eb997b8fb12b1a6d732269f56fd31f05139eb4098b4fb342e4364fe0db"
Mar 13 14:34:12 crc kubenswrapper[4898]: E0313 14:34:12.741134 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d"
Mar 13 14:34:24 crc kubenswrapper[4898]: I0313 14:34:24.764262 4898 generic.go:334] "Generic (PLEG): container finished" podID="ac094822-6272-4730-ab0b-16f0116426b5" containerID="75351a338e412d11bdcadf255cb40c23212372a818b23c31d819578fbd7526fe" exitCode=0
Mar 13 14:34:24 crc kubenswrapper[4898]: I0313 14:34:24.765219 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5s7mz" event={"ID":"ac094822-6272-4730-ab0b-16f0116426b5","Type":"ContainerDied","Data":"75351a338e412d11bdcadf255cb40c23212372a818b23c31d819578fbd7526fe"}
Mar 13 14:34:26 crc kubenswrapper[4898]: I0313 14:34:26.386982 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5s7mz"
Mar 13 14:34:26 crc kubenswrapper[4898]: I0313 14:34:26.479810 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ac094822-6272-4730-ab0b-16f0116426b5-inventory\") pod \"ac094822-6272-4730-ab0b-16f0116426b5\" (UID: \"ac094822-6272-4730-ab0b-16f0116426b5\") "
Mar 13 14:34:26 crc kubenswrapper[4898]: I0313 14:34:26.479992 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ac094822-6272-4730-ab0b-16f0116426b5-ssh-key-openstack-edpm-ipam\") pod \"ac094822-6272-4730-ab0b-16f0116426b5\" (UID: \"ac094822-6272-4730-ab0b-16f0116426b5\") "
Mar 13 14:34:26 crc kubenswrapper[4898]: I0313 14:34:26.480156 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zsqr5\" (UniqueName: \"kubernetes.io/projected/ac094822-6272-4730-ab0b-16f0116426b5-kube-api-access-zsqr5\") pod \"ac094822-6272-4730-ab0b-16f0116426b5\" (UID: \"ac094822-6272-4730-ab0b-16f0116426b5\") "
Mar 13 14:34:26 crc kubenswrapper[4898]: I0313 14:34:26.487163 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac094822-6272-4730-ab0b-16f0116426b5-kube-api-access-zsqr5" (OuterVolumeSpecName: "kube-api-access-zsqr5") pod "ac094822-6272-4730-ab0b-16f0116426b5" (UID: "ac094822-6272-4730-ab0b-16f0116426b5"). InnerVolumeSpecName "kube-api-access-zsqr5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:34:26 crc kubenswrapper[4898]: I0313 14:34:26.519476 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac094822-6272-4730-ab0b-16f0116426b5-inventory" (OuterVolumeSpecName: "inventory") pod "ac094822-6272-4730-ab0b-16f0116426b5" (UID: "ac094822-6272-4730-ab0b-16f0116426b5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:34:26 crc kubenswrapper[4898]: I0313 14:34:26.522761 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac094822-6272-4730-ab0b-16f0116426b5-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ac094822-6272-4730-ab0b-16f0116426b5" (UID: "ac094822-6272-4730-ab0b-16f0116426b5"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:34:26 crc kubenswrapper[4898]: I0313 14:34:26.583346 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zsqr5\" (UniqueName: \"kubernetes.io/projected/ac094822-6272-4730-ab0b-16f0116426b5-kube-api-access-zsqr5\") on node \"crc\" DevicePath \"\""
Mar 13 14:34:26 crc kubenswrapper[4898]: I0313 14:34:26.583397 4898 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ac094822-6272-4730-ab0b-16f0116426b5-inventory\") on node \"crc\" DevicePath \"\""
Mar 13 14:34:26 crc kubenswrapper[4898]: I0313 14:34:26.583410 4898 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ac094822-6272-4730-ab0b-16f0116426b5-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 13 14:34:26 crc kubenswrapper[4898]: I0313 14:34:26.740317 4898 scope.go:117] "RemoveContainer" containerID="8cc019eb997b8fb12b1a6d732269f56fd31f05139eb4098b4fb342e4364fe0db"
Mar 13 14:34:26 crc kubenswrapper[4898]: E0313 14:34:26.740651 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d"
Mar 13 14:34:26 crc kubenswrapper[4898]: I0313 14:34:26.794090 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5s7mz" event={"ID":"ac094822-6272-4730-ab0b-16f0116426b5","Type":"ContainerDied","Data":"704f436dc38746438fb453814824552f6f864e1b290a23b87caa2e264e34538d"}
Mar 13 14:34:26 crc kubenswrapper[4898]: I0313 14:34:26.794150 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="704f436dc38746438fb453814824552f6f864e1b290a23b87caa2e264e34538d"
Mar 13 14:34:26 crc kubenswrapper[4898]: I0313 14:34:26.794295 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5s7mz"
Mar 13 14:34:27 crc kubenswrapper[4898]: I0313 14:34:27.020955 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-x8wvs"]
Mar 13 14:34:27 crc kubenswrapper[4898]: E0313 14:34:27.021600 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a1116c6-c423-4585-af50-c9ecdca3720e" containerName="extract-content"
Mar 13 14:34:27 crc kubenswrapper[4898]: I0313 14:34:27.021623 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a1116c6-c423-4585-af50-c9ecdca3720e" containerName="extract-content"
Mar 13 14:34:27 crc kubenswrapper[4898]: E0313 14:34:27.021654 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0528f01c-62c6-4665-9b64-b20182ed6aad" containerName="registry-server"
Mar 13 14:34:27 crc kubenswrapper[4898]: I0313 14:34:27.021662 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="0528f01c-62c6-4665-9b64-b20182ed6aad" containerName="registry-server"
Mar 13 14:34:27 crc kubenswrapper[4898]: E0313 14:34:27.021679 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0528f01c-62c6-4665-9b64-b20182ed6aad" containerName="extract-utilities"
Mar 13 14:34:27 crc kubenswrapper[4898]: I0313 14:34:27.021687 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="0528f01c-62c6-4665-9b64-b20182ed6aad" containerName="extract-utilities"
Mar 13 14:34:27 crc kubenswrapper[4898]: E0313 14:34:27.021712 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0528f01c-62c6-4665-9b64-b20182ed6aad" containerName="extract-content"
Mar 13 14:34:27 crc kubenswrapper[4898]: I0313 14:34:27.021720 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="0528f01c-62c6-4665-9b64-b20182ed6aad" containerName="extract-content"
Mar 13 14:34:27 crc kubenswrapper[4898]: E0313 14:34:27.021737 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb068c44-8492-4ed4-973b-f1233d9db645" containerName="oc"
Mar 13 14:34:27 crc kubenswrapper[4898]: I0313 14:34:27.021744 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb068c44-8492-4ed4-973b-f1233d9db645" containerName="oc"
Mar 13 14:34:27 crc kubenswrapper[4898]: E0313 14:34:27.021766 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a1116c6-c423-4585-af50-c9ecdca3720e" containerName="registry-server"
Mar 13 14:34:27 crc kubenswrapper[4898]: I0313 14:34:27.021773 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a1116c6-c423-4585-af50-c9ecdca3720e" containerName="registry-server"
Mar 13 14:34:27 crc kubenswrapper[4898]: E0313 14:34:27.021793 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac094822-6272-4730-ab0b-16f0116426b5" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Mar 13 14:34:27 crc kubenswrapper[4898]: I0313 14:34:27.021802 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac094822-6272-4730-ab0b-16f0116426b5" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Mar 13 14:34:27 crc kubenswrapper[4898]: E0313 14:34:27.021828 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a1116c6-c423-4585-af50-c9ecdca3720e" containerName="extract-utilities"
Mar 13 14:34:27 crc kubenswrapper[4898]: I0313 14:34:27.021836 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a1116c6-c423-4585-af50-c9ecdca3720e" containerName="extract-utilities"
Mar 13 14:34:27 crc kubenswrapper[4898]: I0313 14:34:27.022108 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a1116c6-c423-4585-af50-c9ecdca3720e" containerName="registry-server"
Mar 13 14:34:27 crc kubenswrapper[4898]: I0313 14:34:27.022127 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb068c44-8492-4ed4-973b-f1233d9db645" containerName="oc"
Mar 13 14:34:27 crc kubenswrapper[4898]: I0313 14:34:27.022141 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac094822-6272-4730-ab0b-16f0116426b5" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Mar 13 14:34:27 crc kubenswrapper[4898]: I0313 14:34:27.022163 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="0528f01c-62c6-4665-9b64-b20182ed6aad" containerName="registry-server"
Mar 13 14:34:27 crc kubenswrapper[4898]: I0313 14:34:27.023261 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-x8wvs"
Mar 13 14:34:27 crc kubenswrapper[4898]: I0313 14:34:27.025467 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 13 14:34:27 crc kubenswrapper[4898]: I0313 14:34:27.027211 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zsddr"
Mar 13 14:34:27 crc kubenswrapper[4898]: I0313 14:34:27.028307 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 13 14:34:27 crc kubenswrapper[4898]: I0313 14:34:27.029447 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 13 14:34:27 crc kubenswrapper[4898]: I0313 14:34:27.038991 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-x8wvs"]
Mar 13 14:34:27 crc kubenswrapper[4898]: I0313 14:34:27.097482 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/e7c70549-1fc7-42c2-8c81-075c611671ae-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-x8wvs\" (UID: \"e7c70549-1fc7-42c2-8c81-075c611671ae\") " pod="openstack/ssh-known-hosts-edpm-deployment-x8wvs"
Mar 13 14:34:27 crc kubenswrapper[4898]: I0313 14:34:27.098230 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkklg\" (UniqueName: \"kubernetes.io/projected/e7c70549-1fc7-42c2-8c81-075c611671ae-kube-api-access-nkklg\") pod \"ssh-known-hosts-edpm-deployment-x8wvs\" (UID: \"e7c70549-1fc7-42c2-8c81-075c611671ae\") " pod="openstack/ssh-known-hosts-edpm-deployment-x8wvs"
Mar 13 14:34:27 crc kubenswrapper[4898]: I0313 14:34:27.098582 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e7c70549-1fc7-42c2-8c81-075c611671ae-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-x8wvs\" (UID: \"e7c70549-1fc7-42c2-8c81-075c611671ae\") " pod="openstack/ssh-known-hosts-edpm-deployment-x8wvs"
Mar 13 14:34:27 crc kubenswrapper[4898]: I0313 14:34:27.200969 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e7c70549-1fc7-42c2-8c81-075c611671ae-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-x8wvs\" (UID: \"e7c70549-1fc7-42c2-8c81-075c611671ae\") " pod="openstack/ssh-known-hosts-edpm-deployment-x8wvs"
Mar 13 14:34:27 crc kubenswrapper[4898]: I0313 14:34:27.201174 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/e7c70549-1fc7-42c2-8c81-075c611671ae-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-x8wvs\" (UID: \"e7c70549-1fc7-42c2-8c81-075c611671ae\") " pod="openstack/ssh-known-hosts-edpm-deployment-x8wvs"
Mar 13 14:34:27 crc kubenswrapper[4898]: I0313 14:34:27.201359 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkklg\" (UniqueName: \"kubernetes.io/projected/e7c70549-1fc7-42c2-8c81-075c611671ae-kube-api-access-nkklg\") pod \"ssh-known-hosts-edpm-deployment-x8wvs\" (UID: \"e7c70549-1fc7-42c2-8c81-075c611671ae\") " pod="openstack/ssh-known-hosts-edpm-deployment-x8wvs"
Mar 13 14:34:27 crc kubenswrapper[4898]: I0313 14:34:27.206503 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/e7c70549-1fc7-42c2-8c81-075c611671ae-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-x8wvs\" (UID: \"e7c70549-1fc7-42c2-8c81-075c611671ae\") " pod="openstack/ssh-known-hosts-edpm-deployment-x8wvs"
Mar 13 14:34:27 crc kubenswrapper[4898]: I0313 14:34:27.207399 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e7c70549-1fc7-42c2-8c81-075c611671ae-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-x8wvs\" (UID: \"e7c70549-1fc7-42c2-8c81-075c611671ae\") " pod="openstack/ssh-known-hosts-edpm-deployment-x8wvs"
Mar 13 14:34:27 crc kubenswrapper[4898]: I0313 14:34:27.225630 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkklg\" (UniqueName: \"kubernetes.io/projected/e7c70549-1fc7-42c2-8c81-075c611671ae-kube-api-access-nkklg\") pod \"ssh-known-hosts-edpm-deployment-x8wvs\" (UID: \"e7c70549-1fc7-42c2-8c81-075c611671ae\") " pod="openstack/ssh-known-hosts-edpm-deployment-x8wvs"
Mar 13 14:34:27 crc kubenswrapper[4898]: I0313 14:34:27.362668 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-x8wvs"
Mar 13 14:34:28 crc kubenswrapper[4898]: I0313 14:34:28.014130 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-x8wvs"]
Mar 13 14:34:28 crc kubenswrapper[4898]: I0313 14:34:28.820372 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-x8wvs" event={"ID":"e7c70549-1fc7-42c2-8c81-075c611671ae","Type":"ContainerStarted","Data":"e0e9da157b1c32a35018510698432124395ad27481524dc7ff26e350853d5be5"}
Mar 13 14:34:29 crc kubenswrapper[4898]: I0313 14:34:29.860843 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-x8wvs" event={"ID":"e7c70549-1fc7-42c2-8c81-075c611671ae","Type":"ContainerStarted","Data":"b3c420da5c21ae8a4f33923e8d7d1d38ba37569fd225b705660bb76182968d0f"}
Mar 13 14:34:29 crc kubenswrapper[4898]: I0313 14:34:29.899158 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-x8wvs" podStartSLOduration=3.339618398 podStartE2EDuration="3.89912204s" podCreationTimestamp="2026-03-13 14:34:26 +0000 UTC" firstStartedPulling="2026-03-13 14:34:28.031112841 +0000 UTC m=+2303.032701070" lastFinishedPulling="2026-03-13 14:34:28.590616433 +0000 UTC m=+2303.592204712" observedRunningTime="2026-03-13 14:34:29.883480895 +0000 UTC m=+2304.885069174" watchObservedRunningTime="2026-03-13 14:34:29.89912204 +0000 UTC m=+2304.900710319"
Mar 13 14:34:35 crc kubenswrapper[4898]: I0313 14:34:35.946121 4898 generic.go:334] "Generic (PLEG): container finished" podID="e7c70549-1fc7-42c2-8c81-075c611671ae" containerID="b3c420da5c21ae8a4f33923e8d7d1d38ba37569fd225b705660bb76182968d0f" exitCode=0
Mar 13 14:34:35 crc kubenswrapper[4898]: I0313 14:34:35.946231 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-x8wvs" event={"ID":"e7c70549-1fc7-42c2-8c81-075c611671ae","Type":"ContainerDied","Data":"b3c420da5c21ae8a4f33923e8d7d1d38ba37569fd225b705660bb76182968d0f"}
Mar 13 14:34:37 crc kubenswrapper[4898]: I0313 14:34:37.572679 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-x8wvs"
Mar 13 14:34:37 crc kubenswrapper[4898]: I0313 14:34:37.723371 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/e7c70549-1fc7-42c2-8c81-075c611671ae-inventory-0\") pod \"e7c70549-1fc7-42c2-8c81-075c611671ae\" (UID: \"e7c70549-1fc7-42c2-8c81-075c611671ae\") "
Mar 13 14:34:37 crc kubenswrapper[4898]: I0313 14:34:37.723525 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nkklg\" (UniqueName: \"kubernetes.io/projected/e7c70549-1fc7-42c2-8c81-075c611671ae-kube-api-access-nkklg\") pod \"e7c70549-1fc7-42c2-8c81-075c611671ae\" (UID: \"e7c70549-1fc7-42c2-8c81-075c611671ae\") "
Mar 13 14:34:37 crc kubenswrapper[4898]: I0313 14:34:37.723702 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e7c70549-1fc7-42c2-8c81-075c611671ae-ssh-key-openstack-edpm-ipam\") pod \"e7c70549-1fc7-42c2-8c81-075c611671ae\" (UID: \"e7c70549-1fc7-42c2-8c81-075c611671ae\") "
Mar 13 14:34:37 crc kubenswrapper[4898]: I0313 14:34:37.729659 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7c70549-1fc7-42c2-8c81-075c611671ae-kube-api-access-nkklg" (OuterVolumeSpecName: "kube-api-access-nkklg") pod "e7c70549-1fc7-42c2-8c81-075c611671ae" (UID: "e7c70549-1fc7-42c2-8c81-075c611671ae"). InnerVolumeSpecName "kube-api-access-nkklg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:34:37 crc kubenswrapper[4898]: I0313 14:34:37.743827 4898 scope.go:117] "RemoveContainer" containerID="8cc019eb997b8fb12b1a6d732269f56fd31f05139eb4098b4fb342e4364fe0db"
Mar 13 14:34:37 crc kubenswrapper[4898]: E0313 14:34:37.745002 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d"
Mar 13 14:34:37 crc kubenswrapper[4898]: I0313 14:34:37.757238 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7c70549-1fc7-42c2-8c81-075c611671ae-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e7c70549-1fc7-42c2-8c81-075c611671ae" (UID: "e7c70549-1fc7-42c2-8c81-075c611671ae"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:34:37 crc kubenswrapper[4898]: I0313 14:34:37.783215 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7c70549-1fc7-42c2-8c81-075c611671ae-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "e7c70549-1fc7-42c2-8c81-075c611671ae" (UID: "e7c70549-1fc7-42c2-8c81-075c611671ae"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:34:37 crc kubenswrapper[4898]: I0313 14:34:37.826610 4898 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/e7c70549-1fc7-42c2-8c81-075c611671ae-inventory-0\") on node \"crc\" DevicePath \"\""
Mar 13 14:34:37 crc kubenswrapper[4898]: I0313 14:34:37.826645 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nkklg\" (UniqueName: \"kubernetes.io/projected/e7c70549-1fc7-42c2-8c81-075c611671ae-kube-api-access-nkklg\") on node \"crc\" DevicePath \"\""
Mar 13 14:34:37 crc kubenswrapper[4898]: I0313 14:34:37.826658 4898 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e7c70549-1fc7-42c2-8c81-075c611671ae-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 13 14:34:37 crc kubenswrapper[4898]: I0313 14:34:37.972459 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-x8wvs" event={"ID":"e7c70549-1fc7-42c2-8c81-075c611671ae","Type":"ContainerDied","Data":"e0e9da157b1c32a35018510698432124395ad27481524dc7ff26e350853d5be5"}
Mar 13 14:34:37 crc kubenswrapper[4898]: I0313 14:34:37.972497 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0e9da157b1c32a35018510698432124395ad27481524dc7ff26e350853d5be5"
Mar 13 14:34:37 crc kubenswrapper[4898]: I0313 14:34:37.972556 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-x8wvs"
Mar 13 14:34:38 crc kubenswrapper[4898]: I0313 14:34:38.081470 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-sjx4b"]
Mar 13 14:34:38 crc kubenswrapper[4898]: E0313 14:34:38.082448 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7c70549-1fc7-42c2-8c81-075c611671ae" containerName="ssh-known-hosts-edpm-deployment"
Mar 13 14:34:38 crc kubenswrapper[4898]: I0313 14:34:38.082491 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7c70549-1fc7-42c2-8c81-075c611671ae" containerName="ssh-known-hosts-edpm-deployment"
Mar 13 14:34:38 crc kubenswrapper[4898]: I0313 14:34:38.083092 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7c70549-1fc7-42c2-8c81-075c611671ae" containerName="ssh-known-hosts-edpm-deployment"
Mar 13 14:34:38 crc kubenswrapper[4898]: I0313 14:34:38.085047 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sjx4b"
Mar 13 14:34:38 crc kubenswrapper[4898]: I0313 14:34:38.087685 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zsddr"
Mar 13 14:34:38 crc kubenswrapper[4898]: I0313 14:34:38.087684 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 13 14:34:38 crc kubenswrapper[4898]: I0313 14:34:38.087807 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 13 14:34:38 crc kubenswrapper[4898]: I0313 14:34:38.088824 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 13 14:34:38 crc kubenswrapper[4898]: I0313 14:34:38.113846 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-sjx4b"]
Mar 13 14:34:38 crc kubenswrapper[4898]: I0313 14:34:38.235703 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bh28\" (UniqueName: \"kubernetes.io/projected/05d3f0e4-c029-4e2f-a3c1-471faa671767-kube-api-access-7bh28\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-sjx4b\" (UID: \"05d3f0e4-c029-4e2f-a3c1-471faa671767\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sjx4b"
Mar 13 14:34:38 crc kubenswrapper[4898]: I0313 14:34:38.235869 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/05d3f0e4-c029-4e2f-a3c1-471faa671767-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-sjx4b\" (UID: \"05d3f0e4-c029-4e2f-a3c1-471faa671767\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sjx4b"
Mar 13 14:34:38 crc kubenswrapper[4898]: I0313 14:34:38.235922 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/05d3f0e4-c029-4e2f-a3c1-471faa671767-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-sjx4b\" (UID: \"05d3f0e4-c029-4e2f-a3c1-471faa671767\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sjx4b"
Mar 13 14:34:38 crc kubenswrapper[4898]: I0313 14:34:38.338795 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/05d3f0e4-c029-4e2f-a3c1-471faa671767-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-sjx4b\" (UID: \"05d3f0e4-c029-4e2f-a3c1-471faa671767\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sjx4b"
Mar 13 14:34:38 crc kubenswrapper[4898]: I0313 14:34:38.338843 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/05d3f0e4-c029-4e2f-a3c1-471faa671767-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-sjx4b\" (UID: \"05d3f0e4-c029-4e2f-a3c1-471faa671767\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sjx4b"
Mar 13 14:34:38 crc kubenswrapper[4898]: I0313 14:34:38.339027 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bh28\" (UniqueName: \"kubernetes.io/projected/05d3f0e4-c029-4e2f-a3c1-471faa671767-kube-api-access-7bh28\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-sjx4b\" (UID: \"05d3f0e4-c029-4e2f-a3c1-471faa671767\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sjx4b"
Mar 13 14:34:38 crc kubenswrapper[4898]: I0313 14:34:38.346666 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/05d3f0e4-c029-4e2f-a3c1-471faa671767-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-sjx4b\" (UID: \"05d3f0e4-c029-4e2f-a3c1-471faa671767\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sjx4b"
Mar 13 14:34:38 crc kubenswrapper[4898]: I0313 14:34:38.353870 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/05d3f0e4-c029-4e2f-a3c1-471faa671767-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-sjx4b\" (UID: \"05d3f0e4-c029-4e2f-a3c1-471faa671767\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sjx4b"
Mar 13 14:34:38 crc kubenswrapper[4898]: I0313 14:34:38.369638 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bh28\" (UniqueName: \"kubernetes.io/projected/05d3f0e4-c029-4e2f-a3c1-471faa671767-kube-api-access-7bh28\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-sjx4b\" (UID: \"05d3f0e4-c029-4e2f-a3c1-471faa671767\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sjx4b"
Mar 13 14:34:38 crc kubenswrapper[4898]: I0313 14:34:38.417478 4898 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sjx4b" Mar 13 14:34:39 crc kubenswrapper[4898]: I0313 14:34:39.032016 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-sjx4b"] Mar 13 14:34:39 crc kubenswrapper[4898]: W0313 14:34:39.037035 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod05d3f0e4_c029_4e2f_a3c1_471faa671767.slice/crio-c31aab1886b7f3a89065d0dc4d79b7428ae41e2dc20698018084260c555bce99 WatchSource:0}: Error finding container c31aab1886b7f3a89065d0dc4d79b7428ae41e2dc20698018084260c555bce99: Status 404 returned error can't find the container with id c31aab1886b7f3a89065d0dc4d79b7428ae41e2dc20698018084260c555bce99 Mar 13 14:34:40 crc kubenswrapper[4898]: I0313 14:34:40.000320 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sjx4b" event={"ID":"05d3f0e4-c029-4e2f-a3c1-471faa671767","Type":"ContainerStarted","Data":"ed700be5cb65d6fed4e57657125463e8261e0c0244d561f570e062a4cc5f6b21"} Mar 13 14:34:40 crc kubenswrapper[4898]: I0313 14:34:40.000789 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sjx4b" event={"ID":"05d3f0e4-c029-4e2f-a3c1-471faa671767","Type":"ContainerStarted","Data":"c31aab1886b7f3a89065d0dc4d79b7428ae41e2dc20698018084260c555bce99"} Mar 13 14:34:40 crc kubenswrapper[4898]: I0313 14:34:40.029643 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sjx4b" podStartSLOduration=1.492955453 podStartE2EDuration="2.029625198s" podCreationTimestamp="2026-03-13 14:34:38 +0000 UTC" firstStartedPulling="2026-03-13 14:34:39.041093748 +0000 UTC m=+2314.042681997" lastFinishedPulling="2026-03-13 14:34:39.577763473 +0000 UTC m=+2314.579351742" observedRunningTime="2026-03-13 
14:34:40.019330218 +0000 UTC m=+2315.020918497" watchObservedRunningTime="2026-03-13 14:34:40.029625198 +0000 UTC m=+2315.031213447" Mar 13 14:34:40 crc kubenswrapper[4898]: I0313 14:34:40.115151 4898 scope.go:117] "RemoveContainer" containerID="390b3ed23857cd84f527a8c8b365a228b6c0b2caebb2767f64baba810ca56690" Mar 13 14:34:40 crc kubenswrapper[4898]: I0313 14:34:40.163120 4898 scope.go:117] "RemoveContainer" containerID="23e456f4a6227ca0f6e6f99f4c35a21b09d57519ec2a733d94a113420fb1a340" Mar 13 14:34:48 crc kubenswrapper[4898]: E0313 14:34:48.293282 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod05d3f0e4_c029_4e2f_a3c1_471faa671767.slice/crio-conmon-ed700be5cb65d6fed4e57657125463e8261e0c0244d561f570e062a4cc5f6b21.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod05d3f0e4_c029_4e2f_a3c1_471faa671767.slice/crio-ed700be5cb65d6fed4e57657125463e8261e0c0244d561f570e062a4cc5f6b21.scope\": RecentStats: unable to find data in memory cache]" Mar 13 14:34:48 crc kubenswrapper[4898]: E0313 14:34:48.293743 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod05d3f0e4_c029_4e2f_a3c1_471faa671767.slice/crio-ed700be5cb65d6fed4e57657125463e8261e0c0244d561f570e062a4cc5f6b21.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod05d3f0e4_c029_4e2f_a3c1_471faa671767.slice/crio-conmon-ed700be5cb65d6fed4e57657125463e8261e0c0244d561f570e062a4cc5f6b21.scope\": RecentStats: unable to find data in memory cache]" Mar 13 14:34:49 crc kubenswrapper[4898]: I0313 14:34:49.181602 4898 generic.go:334] "Generic (PLEG): container finished" podID="05d3f0e4-c029-4e2f-a3c1-471faa671767" 
containerID="ed700be5cb65d6fed4e57657125463e8261e0c0244d561f570e062a4cc5f6b21" exitCode=0 Mar 13 14:34:49 crc kubenswrapper[4898]: I0313 14:34:49.181658 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sjx4b" event={"ID":"05d3f0e4-c029-4e2f-a3c1-471faa671767","Type":"ContainerDied","Data":"ed700be5cb65d6fed4e57657125463e8261e0c0244d561f570e062a4cc5f6b21"} Mar 13 14:34:50 crc kubenswrapper[4898]: I0313 14:34:50.820005 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sjx4b" Mar 13 14:34:50 crc kubenswrapper[4898]: I0313 14:34:50.934708 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/05d3f0e4-c029-4e2f-a3c1-471faa671767-ssh-key-openstack-edpm-ipam\") pod \"05d3f0e4-c029-4e2f-a3c1-471faa671767\" (UID: \"05d3f0e4-c029-4e2f-a3c1-471faa671767\") " Mar 13 14:34:50 crc kubenswrapper[4898]: I0313 14:34:50.934934 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7bh28\" (UniqueName: \"kubernetes.io/projected/05d3f0e4-c029-4e2f-a3c1-471faa671767-kube-api-access-7bh28\") pod \"05d3f0e4-c029-4e2f-a3c1-471faa671767\" (UID: \"05d3f0e4-c029-4e2f-a3c1-471faa671767\") " Mar 13 14:34:50 crc kubenswrapper[4898]: I0313 14:34:50.935235 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/05d3f0e4-c029-4e2f-a3c1-471faa671767-inventory\") pod \"05d3f0e4-c029-4e2f-a3c1-471faa671767\" (UID: \"05d3f0e4-c029-4e2f-a3c1-471faa671767\") " Mar 13 14:34:50 crc kubenswrapper[4898]: I0313 14:34:50.942554 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05d3f0e4-c029-4e2f-a3c1-471faa671767-kube-api-access-7bh28" (OuterVolumeSpecName: "kube-api-access-7bh28") pod 
"05d3f0e4-c029-4e2f-a3c1-471faa671767" (UID: "05d3f0e4-c029-4e2f-a3c1-471faa671767"). InnerVolumeSpecName "kube-api-access-7bh28". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:34:50 crc kubenswrapper[4898]: I0313 14:34:50.983123 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05d3f0e4-c029-4e2f-a3c1-471faa671767-inventory" (OuterVolumeSpecName: "inventory") pod "05d3f0e4-c029-4e2f-a3c1-471faa671767" (UID: "05d3f0e4-c029-4e2f-a3c1-471faa671767"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:34:50 crc kubenswrapper[4898]: I0313 14:34:50.996370 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05d3f0e4-c029-4e2f-a3c1-471faa671767-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "05d3f0e4-c029-4e2f-a3c1-471faa671767" (UID: "05d3f0e4-c029-4e2f-a3c1-471faa671767"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:34:51 crc kubenswrapper[4898]: I0313 14:34:51.041020 4898 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/05d3f0e4-c029-4e2f-a3c1-471faa671767-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 14:34:51 crc kubenswrapper[4898]: I0313 14:34:51.041453 4898 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/05d3f0e4-c029-4e2f-a3c1-471faa671767-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 14:34:51 crc kubenswrapper[4898]: I0313 14:34:51.041481 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7bh28\" (UniqueName: \"kubernetes.io/projected/05d3f0e4-c029-4e2f-a3c1-471faa671767-kube-api-access-7bh28\") on node \"crc\" DevicePath \"\"" Mar 13 14:34:51 crc kubenswrapper[4898]: I0313 14:34:51.214342 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sjx4b" event={"ID":"05d3f0e4-c029-4e2f-a3c1-471faa671767","Type":"ContainerDied","Data":"c31aab1886b7f3a89065d0dc4d79b7428ae41e2dc20698018084260c555bce99"} Mar 13 14:34:51 crc kubenswrapper[4898]: I0313 14:34:51.214406 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c31aab1886b7f3a89065d0dc4d79b7428ae41e2dc20698018084260c555bce99" Mar 13 14:34:51 crc kubenswrapper[4898]: I0313 14:34:51.214437 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sjx4b" Mar 13 14:34:51 crc kubenswrapper[4898]: I0313 14:34:51.306774 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-r7rs2"] Mar 13 14:34:51 crc kubenswrapper[4898]: E0313 14:34:51.307758 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05d3f0e4-c029-4e2f-a3c1-471faa671767" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 13 14:34:51 crc kubenswrapper[4898]: I0313 14:34:51.307790 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="05d3f0e4-c029-4e2f-a3c1-471faa671767" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 13 14:34:51 crc kubenswrapper[4898]: I0313 14:34:51.308233 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="05d3f0e4-c029-4e2f-a3c1-471faa671767" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 13 14:34:51 crc kubenswrapper[4898]: I0313 14:34:51.310101 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-r7rs2" Mar 13 14:34:51 crc kubenswrapper[4898]: I0313 14:34:51.312587 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 14:34:51 crc kubenswrapper[4898]: I0313 14:34:51.312742 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 14:34:51 crc kubenswrapper[4898]: I0313 14:34:51.312871 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 14:34:51 crc kubenswrapper[4898]: I0313 14:34:51.313435 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zsddr" Mar 13 14:34:51 crc kubenswrapper[4898]: I0313 14:34:51.335404 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-r7rs2"] Mar 13 14:34:51 crc kubenswrapper[4898]: I0313 14:34:51.455992 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8a674c4a-b209-4ea0-83b0-c46f820a81ef-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-r7rs2\" (UID: \"8a674c4a-b209-4ea0-83b0-c46f820a81ef\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-r7rs2" Mar 13 14:34:51 crc kubenswrapper[4898]: I0313 14:34:51.456120 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbvxh\" (UniqueName: \"kubernetes.io/projected/8a674c4a-b209-4ea0-83b0-c46f820a81ef-kube-api-access-lbvxh\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-r7rs2\" (UID: \"8a674c4a-b209-4ea0-83b0-c46f820a81ef\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-r7rs2" Mar 13 14:34:51 crc kubenswrapper[4898]: I0313 14:34:51.456260 4898 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8a674c4a-b209-4ea0-83b0-c46f820a81ef-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-r7rs2\" (UID: \"8a674c4a-b209-4ea0-83b0-c46f820a81ef\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-r7rs2" Mar 13 14:34:51 crc kubenswrapper[4898]: I0313 14:34:51.558594 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8a674c4a-b209-4ea0-83b0-c46f820a81ef-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-r7rs2\" (UID: \"8a674c4a-b209-4ea0-83b0-c46f820a81ef\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-r7rs2" Mar 13 14:34:51 crc kubenswrapper[4898]: I0313 14:34:51.558755 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8a674c4a-b209-4ea0-83b0-c46f820a81ef-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-r7rs2\" (UID: \"8a674c4a-b209-4ea0-83b0-c46f820a81ef\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-r7rs2" Mar 13 14:34:51 crc kubenswrapper[4898]: I0313 14:34:51.558821 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbvxh\" (UniqueName: \"kubernetes.io/projected/8a674c4a-b209-4ea0-83b0-c46f820a81ef-kube-api-access-lbvxh\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-r7rs2\" (UID: \"8a674c4a-b209-4ea0-83b0-c46f820a81ef\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-r7rs2" Mar 13 14:34:51 crc kubenswrapper[4898]: I0313 14:34:51.565321 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/8a674c4a-b209-4ea0-83b0-c46f820a81ef-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-r7rs2\" (UID: \"8a674c4a-b209-4ea0-83b0-c46f820a81ef\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-r7rs2" Mar 13 14:34:51 crc kubenswrapper[4898]: I0313 14:34:51.566325 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8a674c4a-b209-4ea0-83b0-c46f820a81ef-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-r7rs2\" (UID: \"8a674c4a-b209-4ea0-83b0-c46f820a81ef\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-r7rs2" Mar 13 14:34:51 crc kubenswrapper[4898]: I0313 14:34:51.587644 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbvxh\" (UniqueName: \"kubernetes.io/projected/8a674c4a-b209-4ea0-83b0-c46f820a81ef-kube-api-access-lbvxh\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-r7rs2\" (UID: \"8a674c4a-b209-4ea0-83b0-c46f820a81ef\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-r7rs2" Mar 13 14:34:51 crc kubenswrapper[4898]: I0313 14:34:51.634590 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-r7rs2" Mar 13 14:34:51 crc kubenswrapper[4898]: I0313 14:34:51.740074 4898 scope.go:117] "RemoveContainer" containerID="8cc019eb997b8fb12b1a6d732269f56fd31f05139eb4098b4fb342e4364fe0db" Mar 13 14:34:51 crc kubenswrapper[4898]: E0313 14:34:51.740644 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:34:52 crc kubenswrapper[4898]: I0313 14:34:52.340353 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-r7rs2"] Mar 13 14:34:52 crc kubenswrapper[4898]: W0313 14:34:52.346031 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a674c4a_b209_4ea0_83b0_c46f820a81ef.slice/crio-f675076d96f0d80dec8b437957588d6a2cd269730bfb83e736cbc1f9dc1093c1 WatchSource:0}: Error finding container f675076d96f0d80dec8b437957588d6a2cd269730bfb83e736cbc1f9dc1093c1: Status 404 returned error can't find the container with id f675076d96f0d80dec8b437957588d6a2cd269730bfb83e736cbc1f9dc1093c1 Mar 13 14:34:53 crc kubenswrapper[4898]: I0313 14:34:53.240008 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-r7rs2" event={"ID":"8a674c4a-b209-4ea0-83b0-c46f820a81ef","Type":"ContainerStarted","Data":"a6752dd3dce237a9a2a6569325a7b2fac6103bbdc3acf7b6791bf41cab42bec9"} Mar 13 14:34:53 crc kubenswrapper[4898]: I0313 14:34:53.240733 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-r7rs2" event={"ID":"8a674c4a-b209-4ea0-83b0-c46f820a81ef","Type":"ContainerStarted","Data":"f675076d96f0d80dec8b437957588d6a2cd269730bfb83e736cbc1f9dc1093c1"} Mar 13 14:34:53 crc kubenswrapper[4898]: I0313 14:34:53.267939 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-r7rs2" podStartSLOduration=1.75544904 podStartE2EDuration="2.267917875s" podCreationTimestamp="2026-03-13 14:34:51 +0000 UTC" firstStartedPulling="2026-03-13 14:34:52.351233258 +0000 UTC m=+2327.352821507" lastFinishedPulling="2026-03-13 14:34:52.863702073 +0000 UTC m=+2327.865290342" observedRunningTime="2026-03-13 14:34:53.258273872 +0000 UTC m=+2328.259862121" watchObservedRunningTime="2026-03-13 14:34:53.267917875 +0000 UTC m=+2328.269506114" Mar 13 14:35:02 crc kubenswrapper[4898]: I0313 14:35:02.771153 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-pn2m4"] Mar 13 14:35:02 crc kubenswrapper[4898]: I0313 14:35:02.786370 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pn2m4" Mar 13 14:35:02 crc kubenswrapper[4898]: I0313 14:35:02.800366 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pn2m4"] Mar 13 14:35:02 crc kubenswrapper[4898]: I0313 14:35:02.906800 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kx6lb\" (UniqueName: \"kubernetes.io/projected/f6bb9c39-7999-48d1-9223-d7408aa31f47-kube-api-access-kx6lb\") pod \"certified-operators-pn2m4\" (UID: \"f6bb9c39-7999-48d1-9223-d7408aa31f47\") " pod="openshift-marketplace/certified-operators-pn2m4" Mar 13 14:35:02 crc kubenswrapper[4898]: I0313 14:35:02.910721 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6bb9c39-7999-48d1-9223-d7408aa31f47-utilities\") pod \"certified-operators-pn2m4\" (UID: \"f6bb9c39-7999-48d1-9223-d7408aa31f47\") " pod="openshift-marketplace/certified-operators-pn2m4" Mar 13 14:35:02 crc kubenswrapper[4898]: I0313 14:35:02.911550 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6bb9c39-7999-48d1-9223-d7408aa31f47-catalog-content\") pod \"certified-operators-pn2m4\" (UID: \"f6bb9c39-7999-48d1-9223-d7408aa31f47\") " pod="openshift-marketplace/certified-operators-pn2m4" Mar 13 14:35:03 crc kubenswrapper[4898]: I0313 14:35:03.015973 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6bb9c39-7999-48d1-9223-d7408aa31f47-catalog-content\") pod \"certified-operators-pn2m4\" (UID: \"f6bb9c39-7999-48d1-9223-d7408aa31f47\") " pod="openshift-marketplace/certified-operators-pn2m4" Mar 13 14:35:03 crc kubenswrapper[4898]: I0313 14:35:03.016327 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-kx6lb\" (UniqueName: \"kubernetes.io/projected/f6bb9c39-7999-48d1-9223-d7408aa31f47-kube-api-access-kx6lb\") pod \"certified-operators-pn2m4\" (UID: \"f6bb9c39-7999-48d1-9223-d7408aa31f47\") " pod="openshift-marketplace/certified-operators-pn2m4" Mar 13 14:35:03 crc kubenswrapper[4898]: I0313 14:35:03.016642 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6bb9c39-7999-48d1-9223-d7408aa31f47-utilities\") pod \"certified-operators-pn2m4\" (UID: \"f6bb9c39-7999-48d1-9223-d7408aa31f47\") " pod="openshift-marketplace/certified-operators-pn2m4" Mar 13 14:35:03 crc kubenswrapper[4898]: I0313 14:35:03.016508 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6bb9c39-7999-48d1-9223-d7408aa31f47-catalog-content\") pod \"certified-operators-pn2m4\" (UID: \"f6bb9c39-7999-48d1-9223-d7408aa31f47\") " pod="openshift-marketplace/certified-operators-pn2m4" Mar 13 14:35:03 crc kubenswrapper[4898]: I0313 14:35:03.017213 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6bb9c39-7999-48d1-9223-d7408aa31f47-utilities\") pod \"certified-operators-pn2m4\" (UID: \"f6bb9c39-7999-48d1-9223-d7408aa31f47\") " pod="openshift-marketplace/certified-operators-pn2m4" Mar 13 14:35:03 crc kubenswrapper[4898]: I0313 14:35:03.047725 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kx6lb\" (UniqueName: \"kubernetes.io/projected/f6bb9c39-7999-48d1-9223-d7408aa31f47-kube-api-access-kx6lb\") pod \"certified-operators-pn2m4\" (UID: \"f6bb9c39-7999-48d1-9223-d7408aa31f47\") " pod="openshift-marketplace/certified-operators-pn2m4" Mar 13 14:35:03 crc kubenswrapper[4898]: I0313 14:35:03.125249 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pn2m4" Mar 13 14:35:03 crc kubenswrapper[4898]: I0313 14:35:03.406062 4898 generic.go:334] "Generic (PLEG): container finished" podID="8a674c4a-b209-4ea0-83b0-c46f820a81ef" containerID="a6752dd3dce237a9a2a6569325a7b2fac6103bbdc3acf7b6791bf41cab42bec9" exitCode=0 Mar 13 14:35:03 crc kubenswrapper[4898]: I0313 14:35:03.406246 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-r7rs2" event={"ID":"8a674c4a-b209-4ea0-83b0-c46f820a81ef","Type":"ContainerDied","Data":"a6752dd3dce237a9a2a6569325a7b2fac6103bbdc3acf7b6791bf41cab42bec9"} Mar 13 14:35:03 crc kubenswrapper[4898]: I0313 14:35:03.692950 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pn2m4"] Mar 13 14:35:04 crc kubenswrapper[4898]: I0313 14:35:04.418955 4898 generic.go:334] "Generic (PLEG): container finished" podID="f6bb9c39-7999-48d1-9223-d7408aa31f47" containerID="6747046a2f3b217dc89e876d5e6f55535fc4c9de9dd5242237e2cb0cede37a73" exitCode=0 Mar 13 14:35:04 crc kubenswrapper[4898]: I0313 14:35:04.419412 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pn2m4" event={"ID":"f6bb9c39-7999-48d1-9223-d7408aa31f47","Type":"ContainerDied","Data":"6747046a2f3b217dc89e876d5e6f55535fc4c9de9dd5242237e2cb0cede37a73"} Mar 13 14:35:04 crc kubenswrapper[4898]: I0313 14:35:04.419444 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pn2m4" event={"ID":"f6bb9c39-7999-48d1-9223-d7408aa31f47","Type":"ContainerStarted","Data":"bfcd0705440e9b0ea6531f51eb75c71426d7ce4348588ec527180e05d3f093f4"} Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.161749 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-r7rs2" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.197854 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbvxh\" (UniqueName: \"kubernetes.io/projected/8a674c4a-b209-4ea0-83b0-c46f820a81ef-kube-api-access-lbvxh\") pod \"8a674c4a-b209-4ea0-83b0-c46f820a81ef\" (UID: \"8a674c4a-b209-4ea0-83b0-c46f820a81ef\") " Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.198098 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8a674c4a-b209-4ea0-83b0-c46f820a81ef-inventory\") pod \"8a674c4a-b209-4ea0-83b0-c46f820a81ef\" (UID: \"8a674c4a-b209-4ea0-83b0-c46f820a81ef\") " Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.198318 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8a674c4a-b209-4ea0-83b0-c46f820a81ef-ssh-key-openstack-edpm-ipam\") pod \"8a674c4a-b209-4ea0-83b0-c46f820a81ef\" (UID: \"8a674c4a-b209-4ea0-83b0-c46f820a81ef\") " Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.229413 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a674c4a-b209-4ea0-83b0-c46f820a81ef-kube-api-access-lbvxh" (OuterVolumeSpecName: "kube-api-access-lbvxh") pod "8a674c4a-b209-4ea0-83b0-c46f820a81ef" (UID: "8a674c4a-b209-4ea0-83b0-c46f820a81ef"). InnerVolumeSpecName "kube-api-access-lbvxh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.260408 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a674c4a-b209-4ea0-83b0-c46f820a81ef-inventory" (OuterVolumeSpecName: "inventory") pod "8a674c4a-b209-4ea0-83b0-c46f820a81ef" (UID: "8a674c4a-b209-4ea0-83b0-c46f820a81ef"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.304325 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbvxh\" (UniqueName: \"kubernetes.io/projected/8a674c4a-b209-4ea0-83b0-c46f820a81ef-kube-api-access-lbvxh\") on node \"crc\" DevicePath \"\"" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.304359 4898 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8a674c4a-b209-4ea0-83b0-c46f820a81ef-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.348008 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a674c4a-b209-4ea0-83b0-c46f820a81ef-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "8a674c4a-b209-4ea0-83b0-c46f820a81ef" (UID: "8a674c4a-b209-4ea0-83b0-c46f820a81ef"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.407070 4898 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8a674c4a-b209-4ea0-83b0-c46f820a81ef-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.441174 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-r7rs2" event={"ID":"8a674c4a-b209-4ea0-83b0-c46f820a81ef","Type":"ContainerDied","Data":"f675076d96f0d80dec8b437957588d6a2cd269730bfb83e736cbc1f9dc1093c1"} Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.441230 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f675076d96f0d80dec8b437957588d6a2cd269730bfb83e736cbc1f9dc1093c1" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.441290 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-r7rs2"
Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.542616 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4"]
Mar 13 14:35:05 crc kubenswrapper[4898]: E0313 14:35:05.543194 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a674c4a-b209-4ea0-83b0-c46f820a81ef" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.543211 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a674c4a-b209-4ea0-83b0-c46f820a81ef" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.543445 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a674c4a-b209-4ea0-83b0-c46f820a81ef" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.544356 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4"
Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.548381 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0"
Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.548526 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0"
Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.548757 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.548979 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.549125 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.549249 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0"
Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.549355 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0"
Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.549468 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0"
Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.549573 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zsddr"
Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.554628 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4"]
Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.715346 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4"
Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.715403 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4"
Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.715441 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4"
Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.715489 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4"
Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.716137 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4"
Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.716250 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4"
Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.716313 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4"
Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.716400 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4"
Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.716473 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4"
Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.716651 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4"
Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.716697 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-telemetry-power-monitoring-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4"
Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.716793 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4"
Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.716869 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4"
Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.716981 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4"
Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.717139 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4"
Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.717346 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8wdb\" (UniqueName: \"kubernetes.io/projected/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-kube-api-access-j8wdb\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4"
Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.749379 4898 scope.go:117] "RemoveContainer" containerID="8cc019eb997b8fb12b1a6d732269f56fd31f05139eb4098b4fb342e4364fe0db"
Mar 13 14:35:05 crc kubenswrapper[4898]: E0313 14:35:05.749796 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d"
Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.819377 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8wdb\" (UniqueName: \"kubernetes.io/projected/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-kube-api-access-j8wdb\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4"
Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.819445 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4"
Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.819466 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4"
Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.819485 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4"
Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.819519 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4"
Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.819567 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4"
Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.819608 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4"
Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.819628 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4"
Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.819653 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4"
Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.819679 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4"
Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.819737 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4"
Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.819757 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-telemetry-power-monitoring-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4"
Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.819784 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4"
Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.819816 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4"
Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.819849 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4"
Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.819884 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4"
Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.825159 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0"
Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.825190 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.825198 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0"
Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.825380 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0"
Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.825496 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0"
Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.825606 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0"
Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.825702 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4"
Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.825777 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.826048 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4"
Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.827151 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4"
Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.828358 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4"
Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.828380 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-telemetry-power-monitoring-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4"
Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.828434 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4"
Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.830859 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4"
Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.831715 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4"
Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.833354 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4"
Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.835054 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4"
Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.835841 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4"
Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.836494 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4"
Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.837308 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4"
Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.838947 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4"
Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.840025 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4"
Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.847748 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8wdb\" (UniqueName: \"kubernetes.io/projected/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-kube-api-access-j8wdb\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4"
Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.862931 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zsddr"
Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.870391 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4"
Mar 13 14:35:06 crc kubenswrapper[4898]: I0313 14:35:06.465392 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pn2m4" event={"ID":"f6bb9c39-7999-48d1-9223-d7408aa31f47","Type":"ContainerStarted","Data":"f831d93e0380c3eac2c80a54bbf98f48dd5744a6bf12651a02a2962fc6ee89ff"}
Mar 13 14:35:06 crc kubenswrapper[4898]: I0313 14:35:06.480206 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4"]
Mar 13 14:35:06 crc kubenswrapper[4898]: I0313 14:35:06.983041 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 13 14:35:07 crc kubenswrapper[4898]: I0313 14:35:07.490674 4898 generic.go:334] "Generic (PLEG): container finished" podID="f6bb9c39-7999-48d1-9223-d7408aa31f47" containerID="f831d93e0380c3eac2c80a54bbf98f48dd5744a6bf12651a02a2962fc6ee89ff" exitCode=0
Mar 13 14:35:07 crc kubenswrapper[4898]: I0313 14:35:07.490815 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pn2m4" event={"ID":"f6bb9c39-7999-48d1-9223-d7408aa31f47","Type":"ContainerDied","Data":"f831d93e0380c3eac2c80a54bbf98f48dd5744a6bf12651a02a2962fc6ee89ff"}
Mar 13 14:35:07 crc kubenswrapper[4898]: I0313 14:35:07.494439 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4" event={"ID":"8c6bec5a-faac-4793-8c18-9f5b2faf2c95","Type":"ContainerStarted","Data":"4d4d0c6bb15a7ffb8076e9638afa814524f427eb62b0c041389d30a596cbe573"}
Mar 13 14:35:07 crc kubenswrapper[4898]: I0313 14:35:07.494495 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4" event={"ID":"8c6bec5a-faac-4793-8c18-9f5b2faf2c95","Type":"ContainerStarted","Data":"0f88e54431807638c0cb8ce49f8e1de1c35418597acc9f02e4aaae31eeb717ac"}
Mar 13 14:35:07 crc kubenswrapper[4898]: I0313 14:35:07.571506 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4" podStartSLOduration=2.070900767 podStartE2EDuration="2.57147057s" podCreationTimestamp="2026-03-13 14:35:05 +0000 UTC" firstStartedPulling="2026-03-13 14:35:06.47951496 +0000 UTC m=+2341.481103209" lastFinishedPulling="2026-03-13 14:35:06.980084763 +0000 UTC m=+2341.981673012" observedRunningTime="2026-03-13 14:35:07.55643724 +0000 UTC m=+2342.558025499" watchObservedRunningTime="2026-03-13 14:35:07.57147057 +0000 UTC m=+2342.573058849"
Mar 13 14:35:08 crc kubenswrapper[4898]: I0313 14:35:08.512824 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pn2m4" event={"ID":"f6bb9c39-7999-48d1-9223-d7408aa31f47","Type":"ContainerStarted","Data":"3a9399a0e65bba8a088956e07dc644ccc3d505a7a24e307c444048c5dcc01493"}
Mar 13 14:35:08 crc kubenswrapper[4898]: I0313 14:35:08.541018 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-pn2m4" podStartSLOduration=3.019966021 podStartE2EDuration="6.54099266s" podCreationTimestamp="2026-03-13 14:35:02 +0000 UTC" firstStartedPulling="2026-03-13 14:35:04.422202034 +0000 UTC m=+2339.423790273" lastFinishedPulling="2026-03-13 14:35:07.943228653 +0000 UTC m=+2342.944816912" observedRunningTime="2026-03-13 14:35:08.538332073 +0000 UTC m=+2343.539920352" watchObservedRunningTime="2026-03-13 14:35:08.54099266 +0000 UTC m=+2343.542580929"
Mar 13 14:35:13 crc kubenswrapper[4898]: I0313 14:35:13.126446 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-pn2m4"
Mar 13 14:35:13 crc kubenswrapper[4898]: I0313 14:35:13.128445 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-pn2m4"
Mar 13 14:35:13 crc kubenswrapper[4898]: I0313 14:35:13.215281 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-pn2m4"
Mar 13 14:35:13 crc kubenswrapper[4898]: I0313 14:35:13.642072 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-pn2m4"
Mar 13 14:35:13 crc kubenswrapper[4898]: I0313 14:35:13.698968 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pn2m4"]
Mar 13 14:35:15 crc kubenswrapper[4898]: I0313 14:35:15.600600 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-pn2m4" podUID="f6bb9c39-7999-48d1-9223-d7408aa31f47" containerName="registry-server" containerID="cri-o://3a9399a0e65bba8a088956e07dc644ccc3d505a7a24e307c444048c5dcc01493" gracePeriod=2
Mar 13 14:35:16 crc kubenswrapper[4898]: I0313 14:35:16.189832 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pn2m4"
Mar 13 14:35:16 crc kubenswrapper[4898]: I0313 14:35:16.350197 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kx6lb\" (UniqueName: \"kubernetes.io/projected/f6bb9c39-7999-48d1-9223-d7408aa31f47-kube-api-access-kx6lb\") pod \"f6bb9c39-7999-48d1-9223-d7408aa31f47\" (UID: \"f6bb9c39-7999-48d1-9223-d7408aa31f47\") "
Mar 13 14:35:16 crc kubenswrapper[4898]: I0313 14:35:16.351428 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6bb9c39-7999-48d1-9223-d7408aa31f47-utilities\") pod \"f6bb9c39-7999-48d1-9223-d7408aa31f47\" (UID: \"f6bb9c39-7999-48d1-9223-d7408aa31f47\") "
Mar 13 14:35:16 crc kubenswrapper[4898]: I0313 14:35:16.351458 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6bb9c39-7999-48d1-9223-d7408aa31f47-catalog-content\") pod \"f6bb9c39-7999-48d1-9223-d7408aa31f47\" (UID: \"f6bb9c39-7999-48d1-9223-d7408aa31f47\") "
Mar 13 14:35:16 crc kubenswrapper[4898]: I0313 14:35:16.352586 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6bb9c39-7999-48d1-9223-d7408aa31f47-utilities" (OuterVolumeSpecName: "utilities") pod "f6bb9c39-7999-48d1-9223-d7408aa31f47" (UID: "f6bb9c39-7999-48d1-9223-d7408aa31f47"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 14:35:16 crc kubenswrapper[4898]: I0313 14:35:16.359853 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6bb9c39-7999-48d1-9223-d7408aa31f47-kube-api-access-kx6lb" (OuterVolumeSpecName: "kube-api-access-kx6lb") pod "f6bb9c39-7999-48d1-9223-d7408aa31f47" (UID: "f6bb9c39-7999-48d1-9223-d7408aa31f47"). InnerVolumeSpecName "kube-api-access-kx6lb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:35:16 crc kubenswrapper[4898]: I0313 14:35:16.423578 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6bb9c39-7999-48d1-9223-d7408aa31f47-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f6bb9c39-7999-48d1-9223-d7408aa31f47" (UID: "f6bb9c39-7999-48d1-9223-d7408aa31f47"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 14:35:16 crc kubenswrapper[4898]: I0313 14:35:16.454477 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kx6lb\" (UniqueName: \"kubernetes.io/projected/f6bb9c39-7999-48d1-9223-d7408aa31f47-kube-api-access-kx6lb\") on node \"crc\" DevicePath \"\""
Mar 13 14:35:16 crc kubenswrapper[4898]: I0313 14:35:16.454506 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6bb9c39-7999-48d1-9223-d7408aa31f47-utilities\") on node \"crc\" DevicePath \"\""
Mar 13 14:35:16 crc kubenswrapper[4898]: I0313 14:35:16.454516 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6bb9c39-7999-48d1-9223-d7408aa31f47-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 13 14:35:16 crc kubenswrapper[4898]: I0313 14:35:16.613306 4898 generic.go:334] "Generic (PLEG): container finished" podID="f6bb9c39-7999-48d1-9223-d7408aa31f47" containerID="3a9399a0e65bba8a088956e07dc644ccc3d505a7a24e307c444048c5dcc01493" exitCode=0
Mar 13 14:35:16 crc kubenswrapper[4898]: I0313 14:35:16.613346 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pn2m4" event={"ID":"f6bb9c39-7999-48d1-9223-d7408aa31f47","Type":"ContainerDied","Data":"3a9399a0e65bba8a088956e07dc644ccc3d505a7a24e307c444048c5dcc01493"}
Mar 13 14:35:16 crc kubenswrapper[4898]: I0313 14:35:16.613372 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pn2m4" event={"ID":"f6bb9c39-7999-48d1-9223-d7408aa31f47","Type":"ContainerDied","Data":"bfcd0705440e9b0ea6531f51eb75c71426d7ce4348588ec527180e05d3f093f4"}
Mar 13 14:35:16 crc kubenswrapper[4898]: I0313 14:35:16.613388 4898 scope.go:117] "RemoveContainer" containerID="3a9399a0e65bba8a088956e07dc644ccc3d505a7a24e307c444048c5dcc01493"
Mar 13 14:35:16 crc kubenswrapper[4898]: I0313 14:35:16.613510 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pn2m4"
Mar 13 14:35:16 crc kubenswrapper[4898]: I0313 14:35:16.658269 4898 scope.go:117] "RemoveContainer" containerID="f831d93e0380c3eac2c80a54bbf98f48dd5744a6bf12651a02a2962fc6ee89ff"
Mar 13 14:35:16 crc kubenswrapper[4898]: I0313 14:35:16.660104 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pn2m4"]
Mar 13 14:35:16 crc kubenswrapper[4898]: I0313 14:35:16.671106 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-pn2m4"]
Mar 13 14:35:16 crc kubenswrapper[4898]: I0313 14:35:16.692524 4898 scope.go:117] "RemoveContainer" containerID="6747046a2f3b217dc89e876d5e6f55535fc4c9de9dd5242237e2cb0cede37a73"
Mar 13 14:35:16 crc kubenswrapper[4898]: I0313 14:35:16.763480 4898 scope.go:117] "RemoveContainer" containerID="3a9399a0e65bba8a088956e07dc644ccc3d505a7a24e307c444048c5dcc01493"
Mar 13 14:35:16 crc kubenswrapper[4898]: E0313 14:35:16.765102 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a9399a0e65bba8a088956e07dc644ccc3d505a7a24e307c444048c5dcc01493\": container with ID starting with 3a9399a0e65bba8a088956e07dc644ccc3d505a7a24e307c444048c5dcc01493 not found: ID does not exist" containerID="3a9399a0e65bba8a088956e07dc644ccc3d505a7a24e307c444048c5dcc01493"
Mar 13 14:35:16 crc kubenswrapper[4898]: I0313 
14:35:16.765141 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a9399a0e65bba8a088956e07dc644ccc3d505a7a24e307c444048c5dcc01493"} err="failed to get container status \"3a9399a0e65bba8a088956e07dc644ccc3d505a7a24e307c444048c5dcc01493\": rpc error: code = NotFound desc = could not find container \"3a9399a0e65bba8a088956e07dc644ccc3d505a7a24e307c444048c5dcc01493\": container with ID starting with 3a9399a0e65bba8a088956e07dc644ccc3d505a7a24e307c444048c5dcc01493 not found: ID does not exist" Mar 13 14:35:16 crc kubenswrapper[4898]: I0313 14:35:16.765165 4898 scope.go:117] "RemoveContainer" containerID="f831d93e0380c3eac2c80a54bbf98f48dd5744a6bf12651a02a2962fc6ee89ff" Mar 13 14:35:16 crc kubenswrapper[4898]: E0313 14:35:16.765551 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f831d93e0380c3eac2c80a54bbf98f48dd5744a6bf12651a02a2962fc6ee89ff\": container with ID starting with f831d93e0380c3eac2c80a54bbf98f48dd5744a6bf12651a02a2962fc6ee89ff not found: ID does not exist" containerID="f831d93e0380c3eac2c80a54bbf98f48dd5744a6bf12651a02a2962fc6ee89ff" Mar 13 14:35:16 crc kubenswrapper[4898]: I0313 14:35:16.765573 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f831d93e0380c3eac2c80a54bbf98f48dd5744a6bf12651a02a2962fc6ee89ff"} err="failed to get container status \"f831d93e0380c3eac2c80a54bbf98f48dd5744a6bf12651a02a2962fc6ee89ff\": rpc error: code = NotFound desc = could not find container \"f831d93e0380c3eac2c80a54bbf98f48dd5744a6bf12651a02a2962fc6ee89ff\": container with ID starting with f831d93e0380c3eac2c80a54bbf98f48dd5744a6bf12651a02a2962fc6ee89ff not found: ID does not exist" Mar 13 14:35:16 crc kubenswrapper[4898]: I0313 14:35:16.765585 4898 scope.go:117] "RemoveContainer" containerID="6747046a2f3b217dc89e876d5e6f55535fc4c9de9dd5242237e2cb0cede37a73" Mar 13 14:35:16 crc 
kubenswrapper[4898]: E0313 14:35:16.766159 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6747046a2f3b217dc89e876d5e6f55535fc4c9de9dd5242237e2cb0cede37a73\": container with ID starting with 6747046a2f3b217dc89e876d5e6f55535fc4c9de9dd5242237e2cb0cede37a73 not found: ID does not exist" containerID="6747046a2f3b217dc89e876d5e6f55535fc4c9de9dd5242237e2cb0cede37a73" Mar 13 14:35:16 crc kubenswrapper[4898]: I0313 14:35:16.766181 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6747046a2f3b217dc89e876d5e6f55535fc4c9de9dd5242237e2cb0cede37a73"} err="failed to get container status \"6747046a2f3b217dc89e876d5e6f55535fc4c9de9dd5242237e2cb0cede37a73\": rpc error: code = NotFound desc = could not find container \"6747046a2f3b217dc89e876d5e6f55535fc4c9de9dd5242237e2cb0cede37a73\": container with ID starting with 6747046a2f3b217dc89e876d5e6f55535fc4c9de9dd5242237e2cb0cede37a73 not found: ID does not exist" Mar 13 14:35:17 crc kubenswrapper[4898]: I0313 14:35:17.762021 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6bb9c39-7999-48d1-9223-d7408aa31f47" path="/var/lib/kubelet/pods/f6bb9c39-7999-48d1-9223-d7408aa31f47/volumes" Mar 13 14:35:20 crc kubenswrapper[4898]: I0313 14:35:20.742065 4898 scope.go:117] "RemoveContainer" containerID="8cc019eb997b8fb12b1a6d732269f56fd31f05139eb4098b4fb342e4364fe0db" Mar 13 14:35:20 crc kubenswrapper[4898]: E0313 14:35:20.742593 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:35:34 crc 
kubenswrapper[4898]: I0313 14:35:34.741254 4898 scope.go:117] "RemoveContainer" containerID="8cc019eb997b8fb12b1a6d732269f56fd31f05139eb4098b4fb342e4364fe0db" Mar 13 14:35:34 crc kubenswrapper[4898]: E0313 14:35:34.742263 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:35:41 crc kubenswrapper[4898]: I0313 14:35:41.101450 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-kxtcf"] Mar 13 14:35:41 crc kubenswrapper[4898]: I0313 14:35:41.117967 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-kxtcf"] Mar 13 14:35:41 crc kubenswrapper[4898]: I0313 14:35:41.759201 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cd78a2a-1bb4-461a-92cd-d705080b087a" path="/var/lib/kubelet/pods/2cd78a2a-1bb4-461a-92cd-d705080b087a/volumes" Mar 13 14:35:49 crc kubenswrapper[4898]: I0313 14:35:49.739998 4898 scope.go:117] "RemoveContainer" containerID="8cc019eb997b8fb12b1a6d732269f56fd31f05139eb4098b4fb342e4364fe0db" Mar 13 14:35:49 crc kubenswrapper[4898]: E0313 14:35:49.740776 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:35:52 crc kubenswrapper[4898]: I0313 14:35:52.072388 4898 generic.go:334] "Generic (PLEG): container 
finished" podID="8c6bec5a-faac-4793-8c18-9f5b2faf2c95" containerID="4d4d0c6bb15a7ffb8076e9638afa814524f427eb62b0c041389d30a596cbe573" exitCode=0 Mar 13 14:35:52 crc kubenswrapper[4898]: I0313 14:35:52.072447 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4" event={"ID":"8c6bec5a-faac-4793-8c18-9f5b2faf2c95","Type":"ContainerDied","Data":"4d4d0c6bb15a7ffb8076e9638afa814524f427eb62b0c041389d30a596cbe573"} Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.618384 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4" Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.760684 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-telemetry-combined-ca-bundle\") pod \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.760766 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8wdb\" (UniqueName: \"kubernetes.io/projected/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-kube-api-access-j8wdb\") pod \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.760819 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.762805 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-nova-combined-ca-bundle\") pod \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.762882 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-inventory\") pod \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.762946 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-ovn-combined-ca-bundle\") pod \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.762993 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-telemetry-power-monitoring-combined-ca-bundle\") pod \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.763041 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-repo-setup-combined-ca-bundle\") pod \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.763070 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.763156 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-neutron-metadata-combined-ca-bundle\") pod \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.763192 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-openstack-edpm-ipam-ovn-default-certs-0\") pod \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.763254 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.763302 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-bootstrap-combined-ca-bundle\") pod \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.763353 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.763440 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-libvirt-combined-ca-bundle\") pod \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.763466 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-ssh-key-openstack-edpm-ipam\") pod \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.770040 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-kube-api-access-j8wdb" (OuterVolumeSpecName: "kube-api-access-j8wdb") pod "8c6bec5a-faac-4793-8c18-9f5b2faf2c95" (UID: "8c6bec5a-faac-4793-8c18-9f5b2faf2c95"). InnerVolumeSpecName "kube-api-access-j8wdb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.770519 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0") pod "8c6bec5a-faac-4793-8c18-9f5b2faf2c95" (UID: "8c6bec5a-faac-4793-8c18-9f5b2faf2c95"). 
InnerVolumeSpecName "openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.771714 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "8c6bec5a-faac-4793-8c18-9f5b2faf2c95" (UID: "8c6bec5a-faac-4793-8c18-9f5b2faf2c95"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.773385 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "8c6bec5a-faac-4793-8c18-9f5b2faf2c95" (UID: "8c6bec5a-faac-4793-8c18-9f5b2faf2c95"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.773926 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "8c6bec5a-faac-4793-8c18-9f5b2faf2c95" (UID: "8c6bec5a-faac-4793-8c18-9f5b2faf2c95"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.774016 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "8c6bec5a-faac-4793-8c18-9f5b2faf2c95" (UID: "8c6bec5a-faac-4793-8c18-9f5b2faf2c95"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.774077 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "8c6bec5a-faac-4793-8c18-9f5b2faf2c95" (UID: "8c6bec5a-faac-4793-8c18-9f5b2faf2c95"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.775564 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "8c6bec5a-faac-4793-8c18-9f5b2faf2c95" (UID: "8c6bec5a-faac-4793-8c18-9f5b2faf2c95"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.775716 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-telemetry-power-monitoring-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-power-monitoring-combined-ca-bundle") pod "8c6bec5a-faac-4793-8c18-9f5b2faf2c95" (UID: "8c6bec5a-faac-4793-8c18-9f5b2faf2c95"). InnerVolumeSpecName "telemetry-power-monitoring-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.776755 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "8c6bec5a-faac-4793-8c18-9f5b2faf2c95" (UID: "8c6bec5a-faac-4793-8c18-9f5b2faf2c95"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.777181 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "8c6bec5a-faac-4793-8c18-9f5b2faf2c95" (UID: "8c6bec5a-faac-4793-8c18-9f5b2faf2c95"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.777436 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "8c6bec5a-faac-4793-8c18-9f5b2faf2c95" (UID: "8c6bec5a-faac-4793-8c18-9f5b2faf2c95"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.778143 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "8c6bec5a-faac-4793-8c18-9f5b2faf2c95" (UID: "8c6bec5a-faac-4793-8c18-9f5b2faf2c95"). 
InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.784919 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "8c6bec5a-faac-4793-8c18-9f5b2faf2c95" (UID: "8c6bec5a-faac-4793-8c18-9f5b2faf2c95"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.805185 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-inventory" (OuterVolumeSpecName: "inventory") pod "8c6bec5a-faac-4793-8c18-9f5b2faf2c95" (UID: "8c6bec5a-faac-4793-8c18-9f5b2faf2c95"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.806367 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "8c6bec5a-faac-4793-8c18-9f5b2faf2c95" (UID: "8c6bec5a-faac-4793-8c18-9f5b2faf2c95"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.868508 4898 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.868564 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j8wdb\" (UniqueName: \"kubernetes.io/projected/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-kube-api-access-j8wdb\") on node \"crc\" DevicePath \"\"" Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.868585 4898 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.868604 4898 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.868621 4898 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.868634 4898 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.868649 4898 reconciler_common.go:293] "Volume detached for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-telemetry-power-monitoring-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.868667 4898 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.868682 4898 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.868699 4898 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.868712 4898 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.868728 4898 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.868742 4898 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.868755 4898 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.868766 4898 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.868780 4898 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 14:35:54 crc kubenswrapper[4898]: I0313 14:35:54.102862 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4" event={"ID":"8c6bec5a-faac-4793-8c18-9f5b2faf2c95","Type":"ContainerDied","Data":"0f88e54431807638c0cb8ce49f8e1de1c35418597acc9f02e4aaae31eeb717ac"} Mar 13 14:35:54 crc kubenswrapper[4898]: I0313 14:35:54.102942 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f88e54431807638c0cb8ce49f8e1de1c35418597acc9f02e4aaae31eeb717ac" Mar 13 14:35:54 crc kubenswrapper[4898]: I0313 14:35:54.103080 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4" Mar 13 14:35:54 crc kubenswrapper[4898]: I0313 14:35:54.242981 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-6xdgx"] Mar 13 14:35:54 crc kubenswrapper[4898]: E0313 14:35:54.244091 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6bb9c39-7999-48d1-9223-d7408aa31f47" containerName="extract-utilities" Mar 13 14:35:54 crc kubenswrapper[4898]: I0313 14:35:54.244116 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6bb9c39-7999-48d1-9223-d7408aa31f47" containerName="extract-utilities" Mar 13 14:35:54 crc kubenswrapper[4898]: E0313 14:35:54.244128 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6bb9c39-7999-48d1-9223-d7408aa31f47" containerName="extract-content" Mar 13 14:35:54 crc kubenswrapper[4898]: I0313 14:35:54.244135 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6bb9c39-7999-48d1-9223-d7408aa31f47" containerName="extract-content" Mar 13 14:35:54 crc kubenswrapper[4898]: E0313 14:35:54.244153 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6bb9c39-7999-48d1-9223-d7408aa31f47" containerName="registry-server" Mar 13 14:35:54 crc kubenswrapper[4898]: I0313 14:35:54.244161 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6bb9c39-7999-48d1-9223-d7408aa31f47" containerName="registry-server" Mar 13 14:35:54 crc kubenswrapper[4898]: E0313 14:35:54.244183 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c6bec5a-faac-4793-8c18-9f5b2faf2c95" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 13 14:35:54 crc kubenswrapper[4898]: I0313 14:35:54.244195 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c6bec5a-faac-4793-8c18-9f5b2faf2c95" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 13 14:35:54 crc kubenswrapper[4898]: I0313 14:35:54.244512 
4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6bb9c39-7999-48d1-9223-d7408aa31f47" containerName="registry-server" Mar 13 14:35:54 crc kubenswrapper[4898]: I0313 14:35:54.244531 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c6bec5a-faac-4793-8c18-9f5b2faf2c95" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 13 14:35:54 crc kubenswrapper[4898]: I0313 14:35:54.245820 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6xdgx" Mar 13 14:35:54 crc kubenswrapper[4898]: I0313 14:35:54.248978 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 14:35:54 crc kubenswrapper[4898]: I0313 14:35:54.249323 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zsddr" Mar 13 14:35:54 crc kubenswrapper[4898]: I0313 14:35:54.249585 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 14:35:54 crc kubenswrapper[4898]: I0313 14:35:54.249866 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 14:35:54 crc kubenswrapper[4898]: I0313 14:35:54.251302 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Mar 13 14:35:54 crc kubenswrapper[4898]: I0313 14:35:54.253572 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-6xdgx"] Mar 13 14:35:54 crc kubenswrapper[4898]: I0313 14:35:54.387370 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9f7be15-746c-45be-92a1-2fa2a961f636-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6xdgx\" (UID: 
\"a9f7be15-746c-45be-92a1-2fa2a961f636\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6xdgx" Mar 13 14:35:54 crc kubenswrapper[4898]: I0313 14:35:54.387451 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a9f7be15-746c-45be-92a1-2fa2a961f636-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6xdgx\" (UID: \"a9f7be15-746c-45be-92a1-2fa2a961f636\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6xdgx" Mar 13 14:35:54 crc kubenswrapper[4898]: I0313 14:35:54.387540 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26rhl\" (UniqueName: \"kubernetes.io/projected/a9f7be15-746c-45be-92a1-2fa2a961f636-kube-api-access-26rhl\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6xdgx\" (UID: \"a9f7be15-746c-45be-92a1-2fa2a961f636\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6xdgx" Mar 13 14:35:54 crc kubenswrapper[4898]: I0313 14:35:54.387681 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a9f7be15-746c-45be-92a1-2fa2a961f636-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6xdgx\" (UID: \"a9f7be15-746c-45be-92a1-2fa2a961f636\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6xdgx" Mar 13 14:35:54 crc kubenswrapper[4898]: I0313 14:35:54.387796 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a9f7be15-746c-45be-92a1-2fa2a961f636-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6xdgx\" (UID: \"a9f7be15-746c-45be-92a1-2fa2a961f636\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6xdgx" Mar 13 14:35:54 crc kubenswrapper[4898]: I0313 14:35:54.491216 4898 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9f7be15-746c-45be-92a1-2fa2a961f636-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6xdgx\" (UID: \"a9f7be15-746c-45be-92a1-2fa2a961f636\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6xdgx" Mar 13 14:35:54 crc kubenswrapper[4898]: I0313 14:35:54.491418 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a9f7be15-746c-45be-92a1-2fa2a961f636-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6xdgx\" (UID: \"a9f7be15-746c-45be-92a1-2fa2a961f636\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6xdgx" Mar 13 14:35:54 crc kubenswrapper[4898]: I0313 14:35:54.491548 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26rhl\" (UniqueName: \"kubernetes.io/projected/a9f7be15-746c-45be-92a1-2fa2a961f636-kube-api-access-26rhl\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6xdgx\" (UID: \"a9f7be15-746c-45be-92a1-2fa2a961f636\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6xdgx" Mar 13 14:35:54 crc kubenswrapper[4898]: I0313 14:35:54.491760 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a9f7be15-746c-45be-92a1-2fa2a961f636-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6xdgx\" (UID: \"a9f7be15-746c-45be-92a1-2fa2a961f636\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6xdgx" Mar 13 14:35:54 crc kubenswrapper[4898]: I0313 14:35:54.491871 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a9f7be15-746c-45be-92a1-2fa2a961f636-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6xdgx\" (UID: 
\"a9f7be15-746c-45be-92a1-2fa2a961f636\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6xdgx" Mar 13 14:35:54 crc kubenswrapper[4898]: I0313 14:35:54.493550 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a9f7be15-746c-45be-92a1-2fa2a961f636-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6xdgx\" (UID: \"a9f7be15-746c-45be-92a1-2fa2a961f636\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6xdgx" Mar 13 14:35:54 crc kubenswrapper[4898]: I0313 14:35:54.497731 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9f7be15-746c-45be-92a1-2fa2a961f636-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6xdgx\" (UID: \"a9f7be15-746c-45be-92a1-2fa2a961f636\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6xdgx" Mar 13 14:35:54 crc kubenswrapper[4898]: I0313 14:35:54.498208 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a9f7be15-746c-45be-92a1-2fa2a961f636-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6xdgx\" (UID: \"a9f7be15-746c-45be-92a1-2fa2a961f636\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6xdgx" Mar 13 14:35:54 crc kubenswrapper[4898]: I0313 14:35:54.503330 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a9f7be15-746c-45be-92a1-2fa2a961f636-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6xdgx\" (UID: \"a9f7be15-746c-45be-92a1-2fa2a961f636\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6xdgx" Mar 13 14:35:54 crc kubenswrapper[4898]: I0313 14:35:54.518674 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26rhl\" (UniqueName: 
\"kubernetes.io/projected/a9f7be15-746c-45be-92a1-2fa2a961f636-kube-api-access-26rhl\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6xdgx\" (UID: \"a9f7be15-746c-45be-92a1-2fa2a961f636\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6xdgx" Mar 13 14:35:54 crc kubenswrapper[4898]: I0313 14:35:54.568823 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6xdgx" Mar 13 14:35:55 crc kubenswrapper[4898]: I0313 14:35:55.213938 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-6xdgx"] Mar 13 14:35:55 crc kubenswrapper[4898]: W0313 14:35:55.226818 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda9f7be15_746c_45be_92a1_2fa2a961f636.slice/crio-9b1eedb3314bba2694120e0c082f648d290dda603ff79d45db17c947105d1cd4 WatchSource:0}: Error finding container 9b1eedb3314bba2694120e0c082f648d290dda603ff79d45db17c947105d1cd4: Status 404 returned error can't find the container with id 9b1eedb3314bba2694120e0c082f648d290dda603ff79d45db17c947105d1cd4 Mar 13 14:35:56 crc kubenswrapper[4898]: I0313 14:35:56.137018 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6xdgx" event={"ID":"a9f7be15-746c-45be-92a1-2fa2a961f636","Type":"ContainerStarted","Data":"0b3a8f0331d4cc53c2a293a3d7861b92f3ef327313fba53309af365c126af301"} Mar 13 14:35:56 crc kubenswrapper[4898]: I0313 14:35:56.137318 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6xdgx" event={"ID":"a9f7be15-746c-45be-92a1-2fa2a961f636","Type":"ContainerStarted","Data":"9b1eedb3314bba2694120e0c082f648d290dda603ff79d45db17c947105d1cd4"} Mar 13 14:35:56 crc kubenswrapper[4898]: I0313 14:35:56.167199 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6xdgx" podStartSLOduration=1.66787519 podStartE2EDuration="2.167174862s" podCreationTimestamp="2026-03-13 14:35:54 +0000 UTC" firstStartedPulling="2026-03-13 14:35:55.230632574 +0000 UTC m=+2390.232220823" lastFinishedPulling="2026-03-13 14:35:55.729932246 +0000 UTC m=+2390.731520495" observedRunningTime="2026-03-13 14:35:56.153937268 +0000 UTC m=+2391.155525517" watchObservedRunningTime="2026-03-13 14:35:56.167174862 +0000 UTC m=+2391.168763111" Mar 13 14:36:00 crc kubenswrapper[4898]: I0313 14:36:00.140460 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556876-j2sld"] Mar 13 14:36:00 crc kubenswrapper[4898]: I0313 14:36:00.142888 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556876-j2sld" Mar 13 14:36:00 crc kubenswrapper[4898]: I0313 14:36:00.147200 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 14:36:00 crc kubenswrapper[4898]: I0313 14:36:00.147371 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 14:36:00 crc kubenswrapper[4898]: I0313 14:36:00.147526 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps" Mar 13 14:36:00 crc kubenswrapper[4898]: I0313 14:36:00.155137 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556876-j2sld"] Mar 13 14:36:00 crc kubenswrapper[4898]: I0313 14:36:00.184229 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxm4p\" (UniqueName: \"kubernetes.io/projected/e83b21e9-13bc-4f80-a228-126fbc98c8f6-kube-api-access-nxm4p\") pod \"auto-csr-approver-29556876-j2sld\" (UID: \"e83b21e9-13bc-4f80-a228-126fbc98c8f6\") " 
pod="openshift-infra/auto-csr-approver-29556876-j2sld" Mar 13 14:36:00 crc kubenswrapper[4898]: I0313 14:36:00.286654 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxm4p\" (UniqueName: \"kubernetes.io/projected/e83b21e9-13bc-4f80-a228-126fbc98c8f6-kube-api-access-nxm4p\") pod \"auto-csr-approver-29556876-j2sld\" (UID: \"e83b21e9-13bc-4f80-a228-126fbc98c8f6\") " pod="openshift-infra/auto-csr-approver-29556876-j2sld" Mar 13 14:36:00 crc kubenswrapper[4898]: I0313 14:36:00.311855 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxm4p\" (UniqueName: \"kubernetes.io/projected/e83b21e9-13bc-4f80-a228-126fbc98c8f6-kube-api-access-nxm4p\") pod \"auto-csr-approver-29556876-j2sld\" (UID: \"e83b21e9-13bc-4f80-a228-126fbc98c8f6\") " pod="openshift-infra/auto-csr-approver-29556876-j2sld" Mar 13 14:36:00 crc kubenswrapper[4898]: I0313 14:36:00.485063 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556876-j2sld" Mar 13 14:36:01 crc kubenswrapper[4898]: I0313 14:36:01.010296 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556876-j2sld"] Mar 13 14:36:01 crc kubenswrapper[4898]: W0313 14:36:01.018085 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode83b21e9_13bc_4f80_a228_126fbc98c8f6.slice/crio-f6d583d9ca13af4b6dadd70e08a52a1d4fd61e75828f367de6d9ea530df49b9c WatchSource:0}: Error finding container f6d583d9ca13af4b6dadd70e08a52a1d4fd61e75828f367de6d9ea530df49b9c: Status 404 returned error can't find the container with id f6d583d9ca13af4b6dadd70e08a52a1d4fd61e75828f367de6d9ea530df49b9c Mar 13 14:36:01 crc kubenswrapper[4898]: I0313 14:36:01.221515 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556876-j2sld" 
event={"ID":"e83b21e9-13bc-4f80-a228-126fbc98c8f6","Type":"ContainerStarted","Data":"f6d583d9ca13af4b6dadd70e08a52a1d4fd61e75828f367de6d9ea530df49b9c"} Mar 13 14:36:01 crc kubenswrapper[4898]: I0313 14:36:01.752117 4898 scope.go:117] "RemoveContainer" containerID="8cc019eb997b8fb12b1a6d732269f56fd31f05139eb4098b4fb342e4364fe0db" Mar 13 14:36:01 crc kubenswrapper[4898]: E0313 14:36:01.755431 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:36:03 crc kubenswrapper[4898]: I0313 14:36:03.252227 4898 generic.go:334] "Generic (PLEG): container finished" podID="e83b21e9-13bc-4f80-a228-126fbc98c8f6" containerID="880ec0d7753626dc3ced87b5a1086a85612fac87d9f534c6f11457452f7a1041" exitCode=0 Mar 13 14:36:03 crc kubenswrapper[4898]: I0313 14:36:03.252597 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556876-j2sld" event={"ID":"e83b21e9-13bc-4f80-a228-126fbc98c8f6","Type":"ContainerDied","Data":"880ec0d7753626dc3ced87b5a1086a85612fac87d9f534c6f11457452f7a1041"} Mar 13 14:36:04 crc kubenswrapper[4898]: I0313 14:36:04.831197 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556876-j2sld" Mar 13 14:36:05 crc kubenswrapper[4898]: I0313 14:36:05.014269 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxm4p\" (UniqueName: \"kubernetes.io/projected/e83b21e9-13bc-4f80-a228-126fbc98c8f6-kube-api-access-nxm4p\") pod \"e83b21e9-13bc-4f80-a228-126fbc98c8f6\" (UID: \"e83b21e9-13bc-4f80-a228-126fbc98c8f6\") " Mar 13 14:36:05 crc kubenswrapper[4898]: I0313 14:36:05.021793 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e83b21e9-13bc-4f80-a228-126fbc98c8f6-kube-api-access-nxm4p" (OuterVolumeSpecName: "kube-api-access-nxm4p") pod "e83b21e9-13bc-4f80-a228-126fbc98c8f6" (UID: "e83b21e9-13bc-4f80-a228-126fbc98c8f6"). InnerVolumeSpecName "kube-api-access-nxm4p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:36:05 crc kubenswrapper[4898]: I0313 14:36:05.117881 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxm4p\" (UniqueName: \"kubernetes.io/projected/e83b21e9-13bc-4f80-a228-126fbc98c8f6-kube-api-access-nxm4p\") on node \"crc\" DevicePath \"\"" Mar 13 14:36:05 crc kubenswrapper[4898]: I0313 14:36:05.279275 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556876-j2sld" event={"ID":"e83b21e9-13bc-4f80-a228-126fbc98c8f6","Type":"ContainerDied","Data":"f6d583d9ca13af4b6dadd70e08a52a1d4fd61e75828f367de6d9ea530df49b9c"} Mar 13 14:36:05 crc kubenswrapper[4898]: I0313 14:36:05.279595 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6d583d9ca13af4b6dadd70e08a52a1d4fd61e75828f367de6d9ea530df49b9c" Mar 13 14:36:05 crc kubenswrapper[4898]: I0313 14:36:05.279346 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556876-j2sld" Mar 13 14:36:05 crc kubenswrapper[4898]: I0313 14:36:05.922197 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556870-ndlzh"] Mar 13 14:36:05 crc kubenswrapper[4898]: I0313 14:36:05.932199 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556870-ndlzh"] Mar 13 14:36:07 crc kubenswrapper[4898]: I0313 14:36:07.766701 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4f21c0b-a6a1-4b44-ae38-4a382569154e" path="/var/lib/kubelet/pods/c4f21c0b-a6a1-4b44-ae38-4a382569154e/volumes" Mar 13 14:36:13 crc kubenswrapper[4898]: I0313 14:36:13.740495 4898 scope.go:117] "RemoveContainer" containerID="8cc019eb997b8fb12b1a6d732269f56fd31f05139eb4098b4fb342e4364fe0db" Mar 13 14:36:13 crc kubenswrapper[4898]: E0313 14:36:13.741399 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:36:21 crc kubenswrapper[4898]: I0313 14:36:21.067275 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-rzjjp"] Mar 13 14:36:21 crc kubenswrapper[4898]: I0313 14:36:21.079653 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-rzjjp"] Mar 13 14:36:21 crc kubenswrapper[4898]: I0313 14:36:21.750283 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe" path="/var/lib/kubelet/pods/c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe/volumes" Mar 13 14:36:24 crc kubenswrapper[4898]: I0313 14:36:24.740568 4898 scope.go:117] 
"RemoveContainer" containerID="8cc019eb997b8fb12b1a6d732269f56fd31f05139eb4098b4fb342e4364fe0db" Mar 13 14:36:24 crc kubenswrapper[4898]: E0313 14:36:24.742516 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:36:37 crc kubenswrapper[4898]: I0313 14:36:37.740834 4898 scope.go:117] "RemoveContainer" containerID="8cc019eb997b8fb12b1a6d732269f56fd31f05139eb4098b4fb342e4364fe0db" Mar 13 14:36:37 crc kubenswrapper[4898]: E0313 14:36:37.742457 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:36:40 crc kubenswrapper[4898]: I0313 14:36:40.479451 4898 scope.go:117] "RemoveContainer" containerID="797b7877d34540edd204a0c5f49e93f47ceac114fc9a4ba968964ef3cae04ffa" Mar 13 14:36:40 crc kubenswrapper[4898]: I0313 14:36:40.525957 4898 scope.go:117] "RemoveContainer" containerID="b1aa895f0022b3e7758a22e19e58e18bbca3560c415ba389688c7c2191911abd" Mar 13 14:36:40 crc kubenswrapper[4898]: I0313 14:36:40.594313 4898 scope.go:117] "RemoveContainer" containerID="86e66a360586b19f53ca12cefc2c560bd5016283db35bb1e56ee1d68892fd634" Mar 13 14:36:50 crc kubenswrapper[4898]: I0313 14:36:50.739746 4898 scope.go:117] "RemoveContainer" containerID="8cc019eb997b8fb12b1a6d732269f56fd31f05139eb4098b4fb342e4364fe0db" Mar 13 14:36:50 crc 
kubenswrapper[4898]: E0313 14:36:50.742309 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:37:01 crc kubenswrapper[4898]: I0313 14:37:01.990429 4898 generic.go:334] "Generic (PLEG): container finished" podID="a9f7be15-746c-45be-92a1-2fa2a961f636" containerID="0b3a8f0331d4cc53c2a293a3d7861b92f3ef327313fba53309af365c126af301" exitCode=0 Mar 13 14:37:01 crc kubenswrapper[4898]: I0313 14:37:01.990544 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6xdgx" event={"ID":"a9f7be15-746c-45be-92a1-2fa2a961f636","Type":"ContainerDied","Data":"0b3a8f0331d4cc53c2a293a3d7861b92f3ef327313fba53309af365c126af301"} Mar 13 14:37:03 crc kubenswrapper[4898]: I0313 14:37:03.519348 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6xdgx" Mar 13 14:37:03 crc kubenswrapper[4898]: I0313 14:37:03.693867 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a9f7be15-746c-45be-92a1-2fa2a961f636-inventory\") pod \"a9f7be15-746c-45be-92a1-2fa2a961f636\" (UID: \"a9f7be15-746c-45be-92a1-2fa2a961f636\") " Mar 13 14:37:03 crc kubenswrapper[4898]: I0313 14:37:03.694043 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9f7be15-746c-45be-92a1-2fa2a961f636-ovn-combined-ca-bundle\") pod \"a9f7be15-746c-45be-92a1-2fa2a961f636\" (UID: \"a9f7be15-746c-45be-92a1-2fa2a961f636\") " Mar 13 14:37:03 crc kubenswrapper[4898]: I0313 14:37:03.694096 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a9f7be15-746c-45be-92a1-2fa2a961f636-ovncontroller-config-0\") pod \"a9f7be15-746c-45be-92a1-2fa2a961f636\" (UID: \"a9f7be15-746c-45be-92a1-2fa2a961f636\") " Mar 13 14:37:03 crc kubenswrapper[4898]: I0313 14:37:03.694144 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26rhl\" (UniqueName: \"kubernetes.io/projected/a9f7be15-746c-45be-92a1-2fa2a961f636-kube-api-access-26rhl\") pod \"a9f7be15-746c-45be-92a1-2fa2a961f636\" (UID: \"a9f7be15-746c-45be-92a1-2fa2a961f636\") " Mar 13 14:37:03 crc kubenswrapper[4898]: I0313 14:37:03.694262 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a9f7be15-746c-45be-92a1-2fa2a961f636-ssh-key-openstack-edpm-ipam\") pod \"a9f7be15-746c-45be-92a1-2fa2a961f636\" (UID: \"a9f7be15-746c-45be-92a1-2fa2a961f636\") " Mar 13 14:37:03 crc kubenswrapper[4898]: I0313 14:37:03.701024 4898 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9f7be15-746c-45be-92a1-2fa2a961f636-kube-api-access-26rhl" (OuterVolumeSpecName: "kube-api-access-26rhl") pod "a9f7be15-746c-45be-92a1-2fa2a961f636" (UID: "a9f7be15-746c-45be-92a1-2fa2a961f636"). InnerVolumeSpecName "kube-api-access-26rhl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:37:03 crc kubenswrapper[4898]: I0313 14:37:03.704283 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9f7be15-746c-45be-92a1-2fa2a961f636-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "a9f7be15-746c-45be-92a1-2fa2a961f636" (UID: "a9f7be15-746c-45be-92a1-2fa2a961f636"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:37:03 crc kubenswrapper[4898]: I0313 14:37:03.726746 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9f7be15-746c-45be-92a1-2fa2a961f636-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "a9f7be15-746c-45be-92a1-2fa2a961f636" (UID: "a9f7be15-746c-45be-92a1-2fa2a961f636"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:37:03 crc kubenswrapper[4898]: I0313 14:37:03.733504 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9f7be15-746c-45be-92a1-2fa2a961f636-inventory" (OuterVolumeSpecName: "inventory") pod "a9f7be15-746c-45be-92a1-2fa2a961f636" (UID: "a9f7be15-746c-45be-92a1-2fa2a961f636"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:37:03 crc kubenswrapper[4898]: I0313 14:37:03.753025 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9f7be15-746c-45be-92a1-2fa2a961f636-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "a9f7be15-746c-45be-92a1-2fa2a961f636" (UID: "a9f7be15-746c-45be-92a1-2fa2a961f636"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:37:03 crc kubenswrapper[4898]: I0313 14:37:03.798507 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26rhl\" (UniqueName: \"kubernetes.io/projected/a9f7be15-746c-45be-92a1-2fa2a961f636-kube-api-access-26rhl\") on node \"crc\" DevicePath \"\"" Mar 13 14:37:03 crc kubenswrapper[4898]: I0313 14:37:03.798594 4898 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a9f7be15-746c-45be-92a1-2fa2a961f636-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 14:37:03 crc kubenswrapper[4898]: I0313 14:37:03.798625 4898 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a9f7be15-746c-45be-92a1-2fa2a961f636-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 14:37:03 crc kubenswrapper[4898]: I0313 14:37:03.798713 4898 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9f7be15-746c-45be-92a1-2fa2a961f636-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:37:03 crc kubenswrapper[4898]: I0313 14:37:03.798749 4898 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a9f7be15-746c-45be-92a1-2fa2a961f636-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Mar 13 14:37:04 crc kubenswrapper[4898]: I0313 14:37:04.015621 4898 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6xdgx" event={"ID":"a9f7be15-746c-45be-92a1-2fa2a961f636","Type":"ContainerDied","Data":"9b1eedb3314bba2694120e0c082f648d290dda603ff79d45db17c947105d1cd4"} Mar 13 14:37:04 crc kubenswrapper[4898]: I0313 14:37:04.015662 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b1eedb3314bba2694120e0c082f648d290dda603ff79d45db17c947105d1cd4" Mar 13 14:37:04 crc kubenswrapper[4898]: I0313 14:37:04.016306 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6xdgx" Mar 13 14:37:04 crc kubenswrapper[4898]: I0313 14:37:04.252741 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2"] Mar 13 14:37:04 crc kubenswrapper[4898]: E0313 14:37:04.254057 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e83b21e9-13bc-4f80-a228-126fbc98c8f6" containerName="oc" Mar 13 14:37:04 crc kubenswrapper[4898]: I0313 14:37:04.254084 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="e83b21e9-13bc-4f80-a228-126fbc98c8f6" containerName="oc" Mar 13 14:37:04 crc kubenswrapper[4898]: E0313 14:37:04.254127 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9f7be15-746c-45be-92a1-2fa2a961f636" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 13 14:37:04 crc kubenswrapper[4898]: I0313 14:37:04.254135 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9f7be15-746c-45be-92a1-2fa2a961f636" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 13 14:37:04 crc kubenswrapper[4898]: I0313 14:37:04.254382 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="e83b21e9-13bc-4f80-a228-126fbc98c8f6" containerName="oc" Mar 13 14:37:04 crc kubenswrapper[4898]: I0313 14:37:04.254407 4898 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="a9f7be15-746c-45be-92a1-2fa2a961f636" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 13 14:37:04 crc kubenswrapper[4898]: I0313 14:37:04.255248 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2" Mar 13 14:37:04 crc kubenswrapper[4898]: I0313 14:37:04.258531 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zsddr" Mar 13 14:37:04 crc kubenswrapper[4898]: I0313 14:37:04.258633 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Mar 13 14:37:04 crc kubenswrapper[4898]: I0313 14:37:04.259033 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 14:37:04 crc kubenswrapper[4898]: I0313 14:37:04.259200 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 14:37:04 crc kubenswrapper[4898]: I0313 14:37:04.259205 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 14:37:04 crc kubenswrapper[4898]: I0313 14:37:04.262470 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Mar 13 14:37:04 crc kubenswrapper[4898]: I0313 14:37:04.298720 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2"] Mar 13 14:37:04 crc kubenswrapper[4898]: I0313 14:37:04.413964 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/abb37cb2-ec06-4c96-882f-7781fbe053e0-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2\" (UID: \"abb37cb2-ec06-4c96-882f-7781fbe053e0\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2" Mar 13 14:37:04 crc kubenswrapper[4898]: I0313 14:37:04.414415 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkwxx\" (UniqueName: \"kubernetes.io/projected/abb37cb2-ec06-4c96-882f-7781fbe053e0-kube-api-access-zkwxx\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2\" (UID: \"abb37cb2-ec06-4c96-882f-7781fbe053e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2" Mar 13 14:37:04 crc kubenswrapper[4898]: I0313 14:37:04.414464 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/abb37cb2-ec06-4c96-882f-7781fbe053e0-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2\" (UID: \"abb37cb2-ec06-4c96-882f-7781fbe053e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2" Mar 13 14:37:04 crc kubenswrapper[4898]: I0313 14:37:04.414548 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/abb37cb2-ec06-4c96-882f-7781fbe053e0-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2\" (UID: \"abb37cb2-ec06-4c96-882f-7781fbe053e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2" Mar 13 14:37:04 crc kubenswrapper[4898]: I0313 14:37:04.414579 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/abb37cb2-ec06-4c96-882f-7781fbe053e0-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2\" (UID: \"abb37cb2-ec06-4c96-882f-7781fbe053e0\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2" Mar 13 14:37:04 crc kubenswrapper[4898]: I0313 14:37:04.414606 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abb37cb2-ec06-4c96-882f-7781fbe053e0-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2\" (UID: \"abb37cb2-ec06-4c96-882f-7781fbe053e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2" Mar 13 14:37:04 crc kubenswrapper[4898]: I0313 14:37:04.516707 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/abb37cb2-ec06-4c96-882f-7781fbe053e0-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2\" (UID: \"abb37cb2-ec06-4c96-882f-7781fbe053e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2" Mar 13 14:37:04 crc kubenswrapper[4898]: I0313 14:37:04.516842 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkwxx\" (UniqueName: \"kubernetes.io/projected/abb37cb2-ec06-4c96-882f-7781fbe053e0-kube-api-access-zkwxx\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2\" (UID: \"abb37cb2-ec06-4c96-882f-7781fbe053e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2" Mar 13 14:37:04 crc kubenswrapper[4898]: I0313 14:37:04.516887 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/abb37cb2-ec06-4c96-882f-7781fbe053e0-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2\" (UID: \"abb37cb2-ec06-4c96-882f-7781fbe053e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2" Mar 13 
14:37:04 crc kubenswrapper[4898]: I0313 14:37:04.516959 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/abb37cb2-ec06-4c96-882f-7781fbe053e0-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2\" (UID: \"abb37cb2-ec06-4c96-882f-7781fbe053e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2" Mar 13 14:37:04 crc kubenswrapper[4898]: I0313 14:37:04.516987 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/abb37cb2-ec06-4c96-882f-7781fbe053e0-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2\" (UID: \"abb37cb2-ec06-4c96-882f-7781fbe053e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2" Mar 13 14:37:04 crc kubenswrapper[4898]: I0313 14:37:04.517015 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abb37cb2-ec06-4c96-882f-7781fbe053e0-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2\" (UID: \"abb37cb2-ec06-4c96-882f-7781fbe053e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2" Mar 13 14:37:04 crc kubenswrapper[4898]: I0313 14:37:04.521164 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/abb37cb2-ec06-4c96-882f-7781fbe053e0-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2\" (UID: \"abb37cb2-ec06-4c96-882f-7781fbe053e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2" Mar 13 14:37:04 crc kubenswrapper[4898]: I0313 14:37:04.522525 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/abb37cb2-ec06-4c96-882f-7781fbe053e0-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2\" (UID: \"abb37cb2-ec06-4c96-882f-7781fbe053e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2" Mar 13 14:37:04 crc kubenswrapper[4898]: I0313 14:37:04.522727 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/abb37cb2-ec06-4c96-882f-7781fbe053e0-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2\" (UID: \"abb37cb2-ec06-4c96-882f-7781fbe053e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2" Mar 13 14:37:04 crc kubenswrapper[4898]: I0313 14:37:04.523224 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abb37cb2-ec06-4c96-882f-7781fbe053e0-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2\" (UID: \"abb37cb2-ec06-4c96-882f-7781fbe053e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2" Mar 13 14:37:04 crc kubenswrapper[4898]: I0313 14:37:04.527706 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/abb37cb2-ec06-4c96-882f-7781fbe053e0-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2\" (UID: \"abb37cb2-ec06-4c96-882f-7781fbe053e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2" Mar 13 14:37:04 crc kubenswrapper[4898]: I0313 14:37:04.536691 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkwxx\" (UniqueName: 
\"kubernetes.io/projected/abb37cb2-ec06-4c96-882f-7781fbe053e0-kube-api-access-zkwxx\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2\" (UID: \"abb37cb2-ec06-4c96-882f-7781fbe053e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2" Mar 13 14:37:04 crc kubenswrapper[4898]: I0313 14:37:04.575233 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2" Mar 13 14:37:04 crc kubenswrapper[4898]: I0313 14:37:04.740375 4898 scope.go:117] "RemoveContainer" containerID="8cc019eb997b8fb12b1a6d732269f56fd31f05139eb4098b4fb342e4364fe0db" Mar 13 14:37:04 crc kubenswrapper[4898]: E0313 14:37:04.741165 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:37:05 crc kubenswrapper[4898]: I0313 14:37:05.124381 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2"] Mar 13 14:37:05 crc kubenswrapper[4898]: I0313 14:37:05.136247 4898 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 14:37:06 crc kubenswrapper[4898]: I0313 14:37:06.040613 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2" event={"ID":"abb37cb2-ec06-4c96-882f-7781fbe053e0","Type":"ContainerStarted","Data":"f6584ee1397c9a4b280cbe5d9477967c5edf6ee4ae6ac58ff2fdd8dbdfaa85a0"} Mar 13 14:37:06 crc kubenswrapper[4898]: I0313 14:37:06.041115 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2" event={"ID":"abb37cb2-ec06-4c96-882f-7781fbe053e0","Type":"ContainerStarted","Data":"cb1aca5bdad3fc291c97dcbe66d830c038286152338aad152dcc9c7f0a0c4841"} Mar 13 14:37:06 crc kubenswrapper[4898]: I0313 14:37:06.072452 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2" podStartSLOduration=1.6204208150000001 podStartE2EDuration="2.072431174s" podCreationTimestamp="2026-03-13 14:37:04 +0000 UTC" firstStartedPulling="2026-03-13 14:37:05.136011709 +0000 UTC m=+2460.137599948" lastFinishedPulling="2026-03-13 14:37:05.588022028 +0000 UTC m=+2460.589610307" observedRunningTime="2026-03-13 14:37:06.057506587 +0000 UTC m=+2461.059094826" watchObservedRunningTime="2026-03-13 14:37:06.072431174 +0000 UTC m=+2461.074019423" Mar 13 14:37:17 crc kubenswrapper[4898]: I0313 14:37:17.739789 4898 scope.go:117] "RemoveContainer" containerID="8cc019eb997b8fb12b1a6d732269f56fd31f05139eb4098b4fb342e4364fe0db" Mar 13 14:37:17 crc kubenswrapper[4898]: E0313 14:37:17.740983 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:37:31 crc kubenswrapper[4898]: I0313 14:37:31.741172 4898 scope.go:117] "RemoveContainer" containerID="8cc019eb997b8fb12b1a6d732269f56fd31f05139eb4098b4fb342e4364fe0db" Mar 13 14:37:31 crc kubenswrapper[4898]: E0313 14:37:31.743296 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:37:46 crc kubenswrapper[4898]: I0313 14:37:46.740224 4898 scope.go:117] "RemoveContainer" containerID="8cc019eb997b8fb12b1a6d732269f56fd31f05139eb4098b4fb342e4364fe0db" Mar 13 14:37:46 crc kubenswrapper[4898]: E0313 14:37:46.741275 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:37:56 crc kubenswrapper[4898]: I0313 14:37:56.714845 4898 generic.go:334] "Generic (PLEG): container finished" podID="abb37cb2-ec06-4c96-882f-7781fbe053e0" containerID="f6584ee1397c9a4b280cbe5d9477967c5edf6ee4ae6ac58ff2fdd8dbdfaa85a0" exitCode=0 Mar 13 14:37:56 crc kubenswrapper[4898]: I0313 14:37:56.714977 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2" event={"ID":"abb37cb2-ec06-4c96-882f-7781fbe053e0","Type":"ContainerDied","Data":"f6584ee1397c9a4b280cbe5d9477967c5edf6ee4ae6ac58ff2fdd8dbdfaa85a0"} Mar 13 14:37:58 crc kubenswrapper[4898]: I0313 14:37:58.305304 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2" Mar 13 14:37:58 crc kubenswrapper[4898]: I0313 14:37:58.407539 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkwxx\" (UniqueName: \"kubernetes.io/projected/abb37cb2-ec06-4c96-882f-7781fbe053e0-kube-api-access-zkwxx\") pod \"abb37cb2-ec06-4c96-882f-7781fbe053e0\" (UID: \"abb37cb2-ec06-4c96-882f-7781fbe053e0\") " Mar 13 14:37:58 crc kubenswrapper[4898]: I0313 14:37:58.408216 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/abb37cb2-ec06-4c96-882f-7781fbe053e0-neutron-ovn-metadata-agent-neutron-config-0\") pod \"abb37cb2-ec06-4c96-882f-7781fbe053e0\" (UID: \"abb37cb2-ec06-4c96-882f-7781fbe053e0\") " Mar 13 14:37:58 crc kubenswrapper[4898]: I0313 14:37:58.408709 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/abb37cb2-ec06-4c96-882f-7781fbe053e0-nova-metadata-neutron-config-0\") pod \"abb37cb2-ec06-4c96-882f-7781fbe053e0\" (UID: \"abb37cb2-ec06-4c96-882f-7781fbe053e0\") " Mar 13 14:37:58 crc kubenswrapper[4898]: I0313 14:37:58.408826 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/abb37cb2-ec06-4c96-882f-7781fbe053e0-inventory\") pod \"abb37cb2-ec06-4c96-882f-7781fbe053e0\" (UID: \"abb37cb2-ec06-4c96-882f-7781fbe053e0\") " Mar 13 14:37:58 crc kubenswrapper[4898]: I0313 14:37:58.409002 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abb37cb2-ec06-4c96-882f-7781fbe053e0-neutron-metadata-combined-ca-bundle\") pod \"abb37cb2-ec06-4c96-882f-7781fbe053e0\" (UID: \"abb37cb2-ec06-4c96-882f-7781fbe053e0\") " Mar 
13 14:37:58 crc kubenswrapper[4898]: I0313 14:37:58.409059 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/abb37cb2-ec06-4c96-882f-7781fbe053e0-ssh-key-openstack-edpm-ipam\") pod \"abb37cb2-ec06-4c96-882f-7781fbe053e0\" (UID: \"abb37cb2-ec06-4c96-882f-7781fbe053e0\") " Mar 13 14:37:58 crc kubenswrapper[4898]: I0313 14:37:58.413548 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abb37cb2-ec06-4c96-882f-7781fbe053e0-kube-api-access-zkwxx" (OuterVolumeSpecName: "kube-api-access-zkwxx") pod "abb37cb2-ec06-4c96-882f-7781fbe053e0" (UID: "abb37cb2-ec06-4c96-882f-7781fbe053e0"). InnerVolumeSpecName "kube-api-access-zkwxx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:37:58 crc kubenswrapper[4898]: I0313 14:37:58.417111 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abb37cb2-ec06-4c96-882f-7781fbe053e0-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "abb37cb2-ec06-4c96-882f-7781fbe053e0" (UID: "abb37cb2-ec06-4c96-882f-7781fbe053e0"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:37:58 crc kubenswrapper[4898]: I0313 14:37:58.447103 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abb37cb2-ec06-4c96-882f-7781fbe053e0-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "abb37cb2-ec06-4c96-882f-7781fbe053e0" (UID: "abb37cb2-ec06-4c96-882f-7781fbe053e0"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:37:58 crc kubenswrapper[4898]: I0313 14:37:58.455713 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abb37cb2-ec06-4c96-882f-7781fbe053e0-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "abb37cb2-ec06-4c96-882f-7781fbe053e0" (UID: "abb37cb2-ec06-4c96-882f-7781fbe053e0"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:37:58 crc kubenswrapper[4898]: I0313 14:37:58.464722 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abb37cb2-ec06-4c96-882f-7781fbe053e0-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "abb37cb2-ec06-4c96-882f-7781fbe053e0" (UID: "abb37cb2-ec06-4c96-882f-7781fbe053e0"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:37:58 crc kubenswrapper[4898]: I0313 14:37:58.482939 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abb37cb2-ec06-4c96-882f-7781fbe053e0-inventory" (OuterVolumeSpecName: "inventory") pod "abb37cb2-ec06-4c96-882f-7781fbe053e0" (UID: "abb37cb2-ec06-4c96-882f-7781fbe053e0"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:37:58 crc kubenswrapper[4898]: I0313 14:37:58.513217 4898 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/abb37cb2-ec06-4c96-882f-7781fbe053e0-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 14:37:58 crc kubenswrapper[4898]: I0313 14:37:58.513258 4898 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abb37cb2-ec06-4c96-882f-7781fbe053e0-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:37:58 crc kubenswrapper[4898]: I0313 14:37:58.513278 4898 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/abb37cb2-ec06-4c96-882f-7781fbe053e0-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 14:37:58 crc kubenswrapper[4898]: I0313 14:37:58.513293 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkwxx\" (UniqueName: \"kubernetes.io/projected/abb37cb2-ec06-4c96-882f-7781fbe053e0-kube-api-access-zkwxx\") on node \"crc\" DevicePath \"\"" Mar 13 14:37:58 crc kubenswrapper[4898]: I0313 14:37:58.513306 4898 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/abb37cb2-ec06-4c96-882f-7781fbe053e0-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 13 14:37:58 crc kubenswrapper[4898]: I0313 14:37:58.513319 4898 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/abb37cb2-ec06-4c96-882f-7781fbe053e0-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 13 14:37:58 crc kubenswrapper[4898]: I0313 14:37:58.741075 4898 scope.go:117] "RemoveContainer" containerID="8cc019eb997b8fb12b1a6d732269f56fd31f05139eb4098b4fb342e4364fe0db" Mar 13 
14:37:58 crc kubenswrapper[4898]: E0313 14:37:58.741935 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:37:58 crc kubenswrapper[4898]: I0313 14:37:58.743793 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2" event={"ID":"abb37cb2-ec06-4c96-882f-7781fbe053e0","Type":"ContainerDied","Data":"cb1aca5bdad3fc291c97dcbe66d830c038286152338aad152dcc9c7f0a0c4841"} Mar 13 14:37:58 crc kubenswrapper[4898]: I0313 14:37:58.743845 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2" Mar 13 14:37:58 crc kubenswrapper[4898]: I0313 14:37:58.743857 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb1aca5bdad3fc291c97dcbe66d830c038286152338aad152dcc9c7f0a0c4841" Mar 13 14:37:58 crc kubenswrapper[4898]: I0313 14:37:58.873927 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z"] Mar 13 14:37:58 crc kubenswrapper[4898]: E0313 14:37:58.874660 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abb37cb2-ec06-4c96-882f-7781fbe053e0" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 13 14:37:58 crc kubenswrapper[4898]: I0313 14:37:58.874694 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="abb37cb2-ec06-4c96-882f-7781fbe053e0" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 13 14:37:58 crc kubenswrapper[4898]: I0313 14:37:58.874960 4898 
memory_manager.go:354] "RemoveStaleState removing state" podUID="abb37cb2-ec06-4c96-882f-7781fbe053e0" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 13 14:37:58 crc kubenswrapper[4898]: I0313 14:37:58.875813 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z" Mar 13 14:37:58 crc kubenswrapper[4898]: I0313 14:37:58.882400 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 14:37:58 crc kubenswrapper[4898]: I0313 14:37:58.882677 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Mar 13 14:37:58 crc kubenswrapper[4898]: I0313 14:37:58.882729 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zsddr" Mar 13 14:37:58 crc kubenswrapper[4898]: I0313 14:37:58.882796 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 14:37:58 crc kubenswrapper[4898]: I0313 14:37:58.886497 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 14:37:58 crc kubenswrapper[4898]: I0313 14:37:58.887518 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z"] Mar 13 14:37:59 crc kubenswrapper[4898]: I0313 14:37:59.032180 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/226c01c4-d0f3-4784-8e93-36d1de6d593f-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z\" (UID: \"226c01c4-d0f3-4784-8e93-36d1de6d593f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z" Mar 13 14:37:59 crc kubenswrapper[4898]: I0313 14:37:59.032319 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plnfv\" (UniqueName: \"kubernetes.io/projected/226c01c4-d0f3-4784-8e93-36d1de6d593f-kube-api-access-plnfv\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z\" (UID: \"226c01c4-d0f3-4784-8e93-36d1de6d593f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z" Mar 13 14:37:59 crc kubenswrapper[4898]: I0313 14:37:59.032596 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/226c01c4-d0f3-4784-8e93-36d1de6d593f-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z\" (UID: \"226c01c4-d0f3-4784-8e93-36d1de6d593f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z" Mar 13 14:37:59 crc kubenswrapper[4898]: I0313 14:37:59.032657 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/226c01c4-d0f3-4784-8e93-36d1de6d593f-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z\" (UID: \"226c01c4-d0f3-4784-8e93-36d1de6d593f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z" Mar 13 14:37:59 crc kubenswrapper[4898]: I0313 14:37:59.032720 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/226c01c4-d0f3-4784-8e93-36d1de6d593f-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z\" (UID: \"226c01c4-d0f3-4784-8e93-36d1de6d593f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z" Mar 13 14:37:59 crc kubenswrapper[4898]: I0313 14:37:59.135828 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/226c01c4-d0f3-4784-8e93-36d1de6d593f-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z\" (UID: \"226c01c4-d0f3-4784-8e93-36d1de6d593f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z" Mar 13 14:37:59 crc kubenswrapper[4898]: I0313 14:37:59.135891 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/226c01c4-d0f3-4784-8e93-36d1de6d593f-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z\" (UID: \"226c01c4-d0f3-4784-8e93-36d1de6d593f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z" Mar 13 14:37:59 crc kubenswrapper[4898]: I0313 14:37:59.135951 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/226c01c4-d0f3-4784-8e93-36d1de6d593f-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z\" (UID: \"226c01c4-d0f3-4784-8e93-36d1de6d593f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z" Mar 13 14:37:59 crc kubenswrapper[4898]: I0313 14:37:59.136088 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/226c01c4-d0f3-4784-8e93-36d1de6d593f-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z\" (UID: \"226c01c4-d0f3-4784-8e93-36d1de6d593f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z" Mar 13 14:37:59 crc kubenswrapper[4898]: I0313 14:37:59.136135 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plnfv\" (UniqueName: \"kubernetes.io/projected/226c01c4-d0f3-4784-8e93-36d1de6d593f-kube-api-access-plnfv\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z\" (UID: \"226c01c4-d0f3-4784-8e93-36d1de6d593f\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z" Mar 13 14:37:59 crc kubenswrapper[4898]: I0313 14:37:59.140812 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/226c01c4-d0f3-4784-8e93-36d1de6d593f-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z\" (UID: \"226c01c4-d0f3-4784-8e93-36d1de6d593f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z" Mar 13 14:37:59 crc kubenswrapper[4898]: I0313 14:37:59.140869 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/226c01c4-d0f3-4784-8e93-36d1de6d593f-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z\" (UID: \"226c01c4-d0f3-4784-8e93-36d1de6d593f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z" Mar 13 14:37:59 crc kubenswrapper[4898]: I0313 14:37:59.142972 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/226c01c4-d0f3-4784-8e93-36d1de6d593f-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z\" (UID: \"226c01c4-d0f3-4784-8e93-36d1de6d593f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z" Mar 13 14:37:59 crc kubenswrapper[4898]: I0313 14:37:59.153019 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/226c01c4-d0f3-4784-8e93-36d1de6d593f-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z\" (UID: \"226c01c4-d0f3-4784-8e93-36d1de6d593f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z" Mar 13 14:37:59 crc kubenswrapper[4898]: I0313 14:37:59.155167 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plnfv\" (UniqueName: 
\"kubernetes.io/projected/226c01c4-d0f3-4784-8e93-36d1de6d593f-kube-api-access-plnfv\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z\" (UID: \"226c01c4-d0f3-4784-8e93-36d1de6d593f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z" Mar 13 14:37:59 crc kubenswrapper[4898]: I0313 14:37:59.201670 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z" Mar 13 14:37:59 crc kubenswrapper[4898]: W0313 14:37:59.858568 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod226c01c4_d0f3_4784_8e93_36d1de6d593f.slice/crio-c5ed48dbe017708bc1a827b7644836fd3073060755f704b693ca4fab0b30fb9b WatchSource:0}: Error finding container c5ed48dbe017708bc1a827b7644836fd3073060755f704b693ca4fab0b30fb9b: Status 404 returned error can't find the container with id c5ed48dbe017708bc1a827b7644836fd3073060755f704b693ca4fab0b30fb9b Mar 13 14:37:59 crc kubenswrapper[4898]: I0313 14:37:59.869368 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z"] Mar 13 14:38:00 crc kubenswrapper[4898]: I0313 14:38:00.153885 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556878-btqw2"] Mar 13 14:38:00 crc kubenswrapper[4898]: I0313 14:38:00.157167 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556878-btqw2" Mar 13 14:38:00 crc kubenswrapper[4898]: I0313 14:38:00.160852 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps" Mar 13 14:38:00 crc kubenswrapper[4898]: I0313 14:38:00.161116 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 14:38:00 crc kubenswrapper[4898]: I0313 14:38:00.162942 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 14:38:00 crc kubenswrapper[4898]: I0313 14:38:00.188928 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556878-btqw2"] Mar 13 14:38:00 crc kubenswrapper[4898]: I0313 14:38:00.271373 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9dd8\" (UniqueName: \"kubernetes.io/projected/39249464-ab82-4938-978e-2ffcbc637f4f-kube-api-access-b9dd8\") pod \"auto-csr-approver-29556878-btqw2\" (UID: \"39249464-ab82-4938-978e-2ffcbc637f4f\") " pod="openshift-infra/auto-csr-approver-29556878-btqw2" Mar 13 14:38:00 crc kubenswrapper[4898]: I0313 14:38:00.374290 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9dd8\" (UniqueName: \"kubernetes.io/projected/39249464-ab82-4938-978e-2ffcbc637f4f-kube-api-access-b9dd8\") pod \"auto-csr-approver-29556878-btqw2\" (UID: \"39249464-ab82-4938-978e-2ffcbc637f4f\") " pod="openshift-infra/auto-csr-approver-29556878-btqw2" Mar 13 14:38:00 crc kubenswrapper[4898]: I0313 14:38:00.402882 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9dd8\" (UniqueName: \"kubernetes.io/projected/39249464-ab82-4938-978e-2ffcbc637f4f-kube-api-access-b9dd8\") pod \"auto-csr-approver-29556878-btqw2\" (UID: \"39249464-ab82-4938-978e-2ffcbc637f4f\") " 
pod="openshift-infra/auto-csr-approver-29556878-btqw2" Mar 13 14:38:00 crc kubenswrapper[4898]: I0313 14:38:00.508141 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556878-btqw2" Mar 13 14:38:00 crc kubenswrapper[4898]: I0313 14:38:00.772841 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z" event={"ID":"226c01c4-d0f3-4784-8e93-36d1de6d593f","Type":"ContainerStarted","Data":"8d123c34cb14e74bcadc841aa33ca89cf1efe34ff10b94c6d0b5690f3c9b0353"} Mar 13 14:38:00 crc kubenswrapper[4898]: I0313 14:38:00.773226 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z" event={"ID":"226c01c4-d0f3-4784-8e93-36d1de6d593f","Type":"ContainerStarted","Data":"c5ed48dbe017708bc1a827b7644836fd3073060755f704b693ca4fab0b30fb9b"} Mar 13 14:38:00 crc kubenswrapper[4898]: I0313 14:38:00.793667 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z" podStartSLOduration=2.325259152 podStartE2EDuration="2.793648183s" podCreationTimestamp="2026-03-13 14:37:58 +0000 UTC" firstStartedPulling="2026-03-13 14:37:59.861634554 +0000 UTC m=+2514.863222823" lastFinishedPulling="2026-03-13 14:38:00.330023605 +0000 UTC m=+2515.331611854" observedRunningTime="2026-03-13 14:38:00.793224202 +0000 UTC m=+2515.794812461" watchObservedRunningTime="2026-03-13 14:38:00.793648183 +0000 UTC m=+2515.795236442" Mar 13 14:38:01 crc kubenswrapper[4898]: I0313 14:38:01.049882 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556878-btqw2"] Mar 13 14:38:01 crc kubenswrapper[4898]: W0313 14:38:01.054912 4898 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod39249464_ab82_4938_978e_2ffcbc637f4f.slice/crio-db760510b023c16dd3b8a56b9a79711f2e8f056c4dac065b95f3942e59ac8ca9 WatchSource:0}: Error finding container db760510b023c16dd3b8a56b9a79711f2e8f056c4dac065b95f3942e59ac8ca9: Status 404 returned error can't find the container with id db760510b023c16dd3b8a56b9a79711f2e8f056c4dac065b95f3942e59ac8ca9 Mar 13 14:38:01 crc kubenswrapper[4898]: I0313 14:38:01.790894 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556878-btqw2" event={"ID":"39249464-ab82-4938-978e-2ffcbc637f4f","Type":"ContainerStarted","Data":"db760510b023c16dd3b8a56b9a79711f2e8f056c4dac065b95f3942e59ac8ca9"} Mar 13 14:38:02 crc kubenswrapper[4898]: I0313 14:38:02.811040 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556878-btqw2" event={"ID":"39249464-ab82-4938-978e-2ffcbc637f4f","Type":"ContainerStarted","Data":"9b860152e27ff03f1039fc3f0a1f9cf3aa08903cd38847b8eba8da6dc52d2b6e"} Mar 13 14:38:02 crc kubenswrapper[4898]: I0313 14:38:02.841602 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556878-btqw2" podStartSLOduration=1.720811839 podStartE2EDuration="2.841578899s" podCreationTimestamp="2026-03-13 14:38:00 +0000 UTC" firstStartedPulling="2026-03-13 14:38:01.057891348 +0000 UTC m=+2516.059479597" lastFinishedPulling="2026-03-13 14:38:02.178658378 +0000 UTC m=+2517.180246657" observedRunningTime="2026-03-13 14:38:02.834796715 +0000 UTC m=+2517.836384984" watchObservedRunningTime="2026-03-13 14:38:02.841578899 +0000 UTC m=+2517.843167148" Mar 13 14:38:03 crc kubenswrapper[4898]: I0313 14:38:03.824701 4898 generic.go:334] "Generic (PLEG): container finished" podID="39249464-ab82-4938-978e-2ffcbc637f4f" containerID="9b860152e27ff03f1039fc3f0a1f9cf3aa08903cd38847b8eba8da6dc52d2b6e" exitCode=0 Mar 13 14:38:03 crc kubenswrapper[4898]: 
I0313 14:38:03.824815 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556878-btqw2" event={"ID":"39249464-ab82-4938-978e-2ffcbc637f4f","Type":"ContainerDied","Data":"9b860152e27ff03f1039fc3f0a1f9cf3aa08903cd38847b8eba8da6dc52d2b6e"} Mar 13 14:38:05 crc kubenswrapper[4898]: I0313 14:38:05.313472 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556878-btqw2" Mar 13 14:38:05 crc kubenswrapper[4898]: I0313 14:38:05.414119 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9dd8\" (UniqueName: \"kubernetes.io/projected/39249464-ab82-4938-978e-2ffcbc637f4f-kube-api-access-b9dd8\") pod \"39249464-ab82-4938-978e-2ffcbc637f4f\" (UID: \"39249464-ab82-4938-978e-2ffcbc637f4f\") " Mar 13 14:38:05 crc kubenswrapper[4898]: I0313 14:38:05.421212 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39249464-ab82-4938-978e-2ffcbc637f4f-kube-api-access-b9dd8" (OuterVolumeSpecName: "kube-api-access-b9dd8") pod "39249464-ab82-4938-978e-2ffcbc637f4f" (UID: "39249464-ab82-4938-978e-2ffcbc637f4f"). InnerVolumeSpecName "kube-api-access-b9dd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:38:05 crc kubenswrapper[4898]: I0313 14:38:05.518564 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b9dd8\" (UniqueName: \"kubernetes.io/projected/39249464-ab82-4938-978e-2ffcbc637f4f-kube-api-access-b9dd8\") on node \"crc\" DevicePath \"\"" Mar 13 14:38:05 crc kubenswrapper[4898]: I0313 14:38:05.887472 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556878-btqw2" event={"ID":"39249464-ab82-4938-978e-2ffcbc637f4f","Type":"ContainerDied","Data":"db760510b023c16dd3b8a56b9a79711f2e8f056c4dac065b95f3942e59ac8ca9"} Mar 13 14:38:05 crc kubenswrapper[4898]: I0313 14:38:05.887724 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db760510b023c16dd3b8a56b9a79711f2e8f056c4dac065b95f3942e59ac8ca9" Mar 13 14:38:05 crc kubenswrapper[4898]: I0313 14:38:05.887557 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556878-btqw2" Mar 13 14:38:05 crc kubenswrapper[4898]: I0313 14:38:05.916053 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556872-2tqn4"] Mar 13 14:38:05 crc kubenswrapper[4898]: I0313 14:38:05.929214 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556872-2tqn4"] Mar 13 14:38:07 crc kubenswrapper[4898]: I0313 14:38:07.761850 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8627002c-751e-4168-b294-4a324890a996" path="/var/lib/kubelet/pods/8627002c-751e-4168-b294-4a324890a996/volumes" Mar 13 14:38:13 crc kubenswrapper[4898]: I0313 14:38:13.739616 4898 scope.go:117] "RemoveContainer" containerID="8cc019eb997b8fb12b1a6d732269f56fd31f05139eb4098b4fb342e4364fe0db" Mar 13 14:38:13 crc kubenswrapper[4898]: E0313 14:38:13.740579 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:38:27 crc kubenswrapper[4898]: I0313 14:38:27.741517 4898 scope.go:117] "RemoveContainer" containerID="8cc019eb997b8fb12b1a6d732269f56fd31f05139eb4098b4fb342e4364fe0db" Mar 13 14:38:27 crc kubenswrapper[4898]: E0313 14:38:27.743549 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:38:39 crc kubenswrapper[4898]: I0313 14:38:39.740729 4898 scope.go:117] "RemoveContainer" containerID="8cc019eb997b8fb12b1a6d732269f56fd31f05139eb4098b4fb342e4364fe0db" Mar 13 14:38:39 crc kubenswrapper[4898]: E0313 14:38:39.742169 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:38:40 crc kubenswrapper[4898]: I0313 14:38:40.815140 4898 scope.go:117] "RemoveContainer" containerID="9918c054f17d0c467592f1c4b30fc11e333ea544aa38b84c1aab31d2beff7c97" Mar 13 14:38:50 crc kubenswrapper[4898]: I0313 14:38:50.740582 4898 scope.go:117] "RemoveContainer" 
containerID="8cc019eb997b8fb12b1a6d732269f56fd31f05139eb4098b4fb342e4364fe0db" Mar 13 14:38:51 crc kubenswrapper[4898]: I0313 14:38:51.525802 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerStarted","Data":"610faaa089f666c18409ed2be816ac54810449ed44b96f98a2303a40ac9c9836"} Mar 13 14:40:00 crc kubenswrapper[4898]: I0313 14:40:00.160599 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556880-2rlcl"] Mar 13 14:40:00 crc kubenswrapper[4898]: E0313 14:40:00.164812 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39249464-ab82-4938-978e-2ffcbc637f4f" containerName="oc" Mar 13 14:40:00 crc kubenswrapper[4898]: I0313 14:40:00.165335 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="39249464-ab82-4938-978e-2ffcbc637f4f" containerName="oc" Mar 13 14:40:00 crc kubenswrapper[4898]: I0313 14:40:00.166341 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="39249464-ab82-4938-978e-2ffcbc637f4f" containerName="oc" Mar 13 14:40:00 crc kubenswrapper[4898]: I0313 14:40:00.168011 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556880-2rlcl" Mar 13 14:40:00 crc kubenswrapper[4898]: I0313 14:40:00.171153 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 14:40:00 crc kubenswrapper[4898]: I0313 14:40:00.171541 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps" Mar 13 14:40:00 crc kubenswrapper[4898]: I0313 14:40:00.174118 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556880-2rlcl"] Mar 13 14:40:00 crc kubenswrapper[4898]: I0313 14:40:00.174362 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 14:40:00 crc kubenswrapper[4898]: I0313 14:40:00.189459 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7f2h\" (UniqueName: \"kubernetes.io/projected/23d4e2ed-7457-458a-9c76-dcf8f3aadd99-kube-api-access-v7f2h\") pod \"auto-csr-approver-29556880-2rlcl\" (UID: \"23d4e2ed-7457-458a-9c76-dcf8f3aadd99\") " pod="openshift-infra/auto-csr-approver-29556880-2rlcl" Mar 13 14:40:00 crc kubenswrapper[4898]: I0313 14:40:00.291618 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7f2h\" (UniqueName: \"kubernetes.io/projected/23d4e2ed-7457-458a-9c76-dcf8f3aadd99-kube-api-access-v7f2h\") pod \"auto-csr-approver-29556880-2rlcl\" (UID: \"23d4e2ed-7457-458a-9c76-dcf8f3aadd99\") " pod="openshift-infra/auto-csr-approver-29556880-2rlcl" Mar 13 14:40:00 crc kubenswrapper[4898]: I0313 14:40:00.333221 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7f2h\" (UniqueName: \"kubernetes.io/projected/23d4e2ed-7457-458a-9c76-dcf8f3aadd99-kube-api-access-v7f2h\") pod \"auto-csr-approver-29556880-2rlcl\" (UID: \"23d4e2ed-7457-458a-9c76-dcf8f3aadd99\") " 
pod="openshift-infra/auto-csr-approver-29556880-2rlcl" Mar 13 14:40:00 crc kubenswrapper[4898]: I0313 14:40:00.512974 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556880-2rlcl" Mar 13 14:40:01 crc kubenswrapper[4898]: I0313 14:40:01.034989 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556880-2rlcl"] Mar 13 14:40:01 crc kubenswrapper[4898]: W0313 14:40:01.036849 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod23d4e2ed_7457_458a_9c76_dcf8f3aadd99.slice/crio-330d3b30cd6ab021edffafbc6168be6383771cabc816664fe7fe8a93e3cf47c4 WatchSource:0}: Error finding container 330d3b30cd6ab021edffafbc6168be6383771cabc816664fe7fe8a93e3cf47c4: Status 404 returned error can't find the container with id 330d3b30cd6ab021edffafbc6168be6383771cabc816664fe7fe8a93e3cf47c4 Mar 13 14:40:01 crc kubenswrapper[4898]: I0313 14:40:01.459297 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556880-2rlcl" event={"ID":"23d4e2ed-7457-458a-9c76-dcf8f3aadd99","Type":"ContainerStarted","Data":"330d3b30cd6ab021edffafbc6168be6383771cabc816664fe7fe8a93e3cf47c4"} Mar 13 14:40:03 crc kubenswrapper[4898]: I0313 14:40:03.486481 4898 generic.go:334] "Generic (PLEG): container finished" podID="23d4e2ed-7457-458a-9c76-dcf8f3aadd99" containerID="a4b672dd5f62f7db5f72a2ba461417e4b17e1ad5affad388a08bfa992e5aa45e" exitCode=0 Mar 13 14:40:03 crc kubenswrapper[4898]: I0313 14:40:03.486556 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556880-2rlcl" event={"ID":"23d4e2ed-7457-458a-9c76-dcf8f3aadd99","Type":"ContainerDied","Data":"a4b672dd5f62f7db5f72a2ba461417e4b17e1ad5affad388a08bfa992e5aa45e"} Mar 13 14:40:04 crc kubenswrapper[4898]: I0313 14:40:04.965209 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556880-2rlcl" Mar 13 14:40:05 crc kubenswrapper[4898]: I0313 14:40:05.133959 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7f2h\" (UniqueName: \"kubernetes.io/projected/23d4e2ed-7457-458a-9c76-dcf8f3aadd99-kube-api-access-v7f2h\") pod \"23d4e2ed-7457-458a-9c76-dcf8f3aadd99\" (UID: \"23d4e2ed-7457-458a-9c76-dcf8f3aadd99\") " Mar 13 14:40:05 crc kubenswrapper[4898]: I0313 14:40:05.155295 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23d4e2ed-7457-458a-9c76-dcf8f3aadd99-kube-api-access-v7f2h" (OuterVolumeSpecName: "kube-api-access-v7f2h") pod "23d4e2ed-7457-458a-9c76-dcf8f3aadd99" (UID: "23d4e2ed-7457-458a-9c76-dcf8f3aadd99"). InnerVolumeSpecName "kube-api-access-v7f2h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:40:05 crc kubenswrapper[4898]: I0313 14:40:05.238210 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7f2h\" (UniqueName: \"kubernetes.io/projected/23d4e2ed-7457-458a-9c76-dcf8f3aadd99-kube-api-access-v7f2h\") on node \"crc\" DevicePath \"\"" Mar 13 14:40:05 crc kubenswrapper[4898]: I0313 14:40:05.520501 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556880-2rlcl" event={"ID":"23d4e2ed-7457-458a-9c76-dcf8f3aadd99","Type":"ContainerDied","Data":"330d3b30cd6ab021edffafbc6168be6383771cabc816664fe7fe8a93e3cf47c4"} Mar 13 14:40:05 crc kubenswrapper[4898]: I0313 14:40:05.520541 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="330d3b30cd6ab021edffafbc6168be6383771cabc816664fe7fe8a93e3cf47c4" Mar 13 14:40:05 crc kubenswrapper[4898]: I0313 14:40:05.520596 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556880-2rlcl" Mar 13 14:40:06 crc kubenswrapper[4898]: I0313 14:40:06.064364 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556874-5w2n9"] Mar 13 14:40:06 crc kubenswrapper[4898]: I0313 14:40:06.072938 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556874-5w2n9"] Mar 13 14:40:07 crc kubenswrapper[4898]: I0313 14:40:07.790257 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb068c44-8492-4ed4-973b-f1233d9db645" path="/var/lib/kubelet/pods/eb068c44-8492-4ed4-973b-f1233d9db645/volumes" Mar 13 14:40:40 crc kubenswrapper[4898]: I0313 14:40:40.953459 4898 scope.go:117] "RemoveContainer" containerID="408d12ea5e972e9b868ac62eb57c8cf1a207c59938c5862250b310c3d0d4947f" Mar 13 14:41:19 crc kubenswrapper[4898]: I0313 14:41:19.134754 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 14:41:19 crc kubenswrapper[4898]: I0313 14:41:19.135559 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 14:41:49 crc kubenswrapper[4898]: I0313 14:41:49.134713 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 14:41:49 crc kubenswrapper[4898]: 
I0313 14:41:49.135426 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 14:41:51 crc kubenswrapper[4898]: I0313 14:41:51.998717 4898 generic.go:334] "Generic (PLEG): container finished" podID="226c01c4-d0f3-4784-8e93-36d1de6d593f" containerID="8d123c34cb14e74bcadc841aa33ca89cf1efe34ff10b94c6d0b5690f3c9b0353" exitCode=0 Mar 13 14:41:51 crc kubenswrapper[4898]: I0313 14:41:51.998773 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z" event={"ID":"226c01c4-d0f3-4784-8e93-36d1de6d593f","Type":"ContainerDied","Data":"8d123c34cb14e74bcadc841aa33ca89cf1efe34ff10b94c6d0b5690f3c9b0353"} Mar 13 14:41:53 crc kubenswrapper[4898]: I0313 14:41:53.623123 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z" Mar 13 14:41:53 crc kubenswrapper[4898]: I0313 14:41:53.652841 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/226c01c4-d0f3-4784-8e93-36d1de6d593f-libvirt-secret-0\") pod \"226c01c4-d0f3-4784-8e93-36d1de6d593f\" (UID: \"226c01c4-d0f3-4784-8e93-36d1de6d593f\") " Mar 13 14:41:53 crc kubenswrapper[4898]: I0313 14:41:53.652974 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/226c01c4-d0f3-4784-8e93-36d1de6d593f-inventory\") pod \"226c01c4-d0f3-4784-8e93-36d1de6d593f\" (UID: \"226c01c4-d0f3-4784-8e93-36d1de6d593f\") " Mar 13 14:41:53 crc kubenswrapper[4898]: I0313 14:41:53.653146 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/226c01c4-d0f3-4784-8e93-36d1de6d593f-libvirt-combined-ca-bundle\") pod \"226c01c4-d0f3-4784-8e93-36d1de6d593f\" (UID: \"226c01c4-d0f3-4784-8e93-36d1de6d593f\") " Mar 13 14:41:53 crc kubenswrapper[4898]: I0313 14:41:53.653236 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plnfv\" (UniqueName: \"kubernetes.io/projected/226c01c4-d0f3-4784-8e93-36d1de6d593f-kube-api-access-plnfv\") pod \"226c01c4-d0f3-4784-8e93-36d1de6d593f\" (UID: \"226c01c4-d0f3-4784-8e93-36d1de6d593f\") " Mar 13 14:41:53 crc kubenswrapper[4898]: I0313 14:41:53.653322 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/226c01c4-d0f3-4784-8e93-36d1de6d593f-ssh-key-openstack-edpm-ipam\") pod \"226c01c4-d0f3-4784-8e93-36d1de6d593f\" (UID: \"226c01c4-d0f3-4784-8e93-36d1de6d593f\") " Mar 13 14:41:53 crc kubenswrapper[4898]: I0313 14:41:53.676059 4898 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/226c01c4-d0f3-4784-8e93-36d1de6d593f-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "226c01c4-d0f3-4784-8e93-36d1de6d593f" (UID: "226c01c4-d0f3-4784-8e93-36d1de6d593f"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:41:53 crc kubenswrapper[4898]: I0313 14:41:53.687177 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/226c01c4-d0f3-4784-8e93-36d1de6d593f-kube-api-access-plnfv" (OuterVolumeSpecName: "kube-api-access-plnfv") pod "226c01c4-d0f3-4784-8e93-36d1de6d593f" (UID: "226c01c4-d0f3-4784-8e93-36d1de6d593f"). InnerVolumeSpecName "kube-api-access-plnfv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:41:53 crc kubenswrapper[4898]: I0313 14:41:53.731763 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/226c01c4-d0f3-4784-8e93-36d1de6d593f-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "226c01c4-d0f3-4784-8e93-36d1de6d593f" (UID: "226c01c4-d0f3-4784-8e93-36d1de6d593f"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:41:53 crc kubenswrapper[4898]: E0313 14:41:53.737361 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/226c01c4-d0f3-4784-8e93-36d1de6d593f-inventory podName:226c01c4-d0f3-4784-8e93-36d1de6d593f nodeName:}" failed. No retries permitted until 2026-03-13 14:41:54.237327174 +0000 UTC m=+2749.238915433 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "inventory" (UniqueName: "kubernetes.io/secret/226c01c4-d0f3-4784-8e93-36d1de6d593f-inventory") pod "226c01c4-d0f3-4784-8e93-36d1de6d593f" (UID: "226c01c4-d0f3-4784-8e93-36d1de6d593f") : error deleting /var/lib/kubelet/pods/226c01c4-d0f3-4784-8e93-36d1de6d593f/volume-subpaths: remove /var/lib/kubelet/pods/226c01c4-d0f3-4784-8e93-36d1de6d593f/volume-subpaths: no such file or directory Mar 13 14:41:53 crc kubenswrapper[4898]: I0313 14:41:53.741648 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/226c01c4-d0f3-4784-8e93-36d1de6d593f-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "226c01c4-d0f3-4784-8e93-36d1de6d593f" (UID: "226c01c4-d0f3-4784-8e93-36d1de6d593f"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:41:53 crc kubenswrapper[4898]: I0313 14:41:53.756391 4898 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/226c01c4-d0f3-4784-8e93-36d1de6d593f-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Mar 13 14:41:53 crc kubenswrapper[4898]: I0313 14:41:53.756423 4898 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/226c01c4-d0f3-4784-8e93-36d1de6d593f-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:41:53 crc kubenswrapper[4898]: I0313 14:41:53.756441 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-plnfv\" (UniqueName: \"kubernetes.io/projected/226c01c4-d0f3-4784-8e93-36d1de6d593f-kube-api-access-plnfv\") on node \"crc\" DevicePath \"\"" Mar 13 14:41:53 crc kubenswrapper[4898]: I0313 14:41:53.756452 4898 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/226c01c4-d0f3-4784-8e93-36d1de6d593f-ssh-key-openstack-edpm-ipam\") on node \"crc\" 
DevicePath \"\"" Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.026494 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z" event={"ID":"226c01c4-d0f3-4784-8e93-36d1de6d593f","Type":"ContainerDied","Data":"c5ed48dbe017708bc1a827b7644836fd3073060755f704b693ca4fab0b30fb9b"} Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.026550 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c5ed48dbe017708bc1a827b7644836fd3073060755f704b693ca4fab0b30fb9b" Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.026630 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z" Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.133141 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg"] Mar 13 14:41:54 crc kubenswrapper[4898]: E0313 14:41:54.133585 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="226c01c4-d0f3-4784-8e93-36d1de6d593f" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.133602 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="226c01c4-d0f3-4784-8e93-36d1de6d593f" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 13 14:41:54 crc kubenswrapper[4898]: E0313 14:41:54.133637 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23d4e2ed-7457-458a-9c76-dcf8f3aadd99" containerName="oc" Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.133645 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="23d4e2ed-7457-458a-9c76-dcf8f3aadd99" containerName="oc" Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.133851 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="23d4e2ed-7457-458a-9c76-dcf8f3aadd99" containerName="oc" Mar 13 14:41:54 crc 
kubenswrapper[4898]: I0313 14:41:54.133866 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="226c01c4-d0f3-4784-8e93-36d1de6d593f" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.134628 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg" Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.137463 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.137471 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.137468 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.166996 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-28xpg\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg" Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.167155 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-28xpg\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg" Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.167215 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-28xpg\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg" Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.167284 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwz9x\" (UniqueName: \"kubernetes.io/projected/acaa3912-3e27-4272-8e4a-3ab67fd34b92-kube-api-access-kwz9x\") pod \"nova-edpm-deployment-openstack-edpm-ipam-28xpg\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg" Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.167323 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-28xpg\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg" Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.167367 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-28xpg\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg" Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.167463 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-migration-ssh-key-0\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-28xpg\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg" Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.167567 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-28xpg\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg" Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.167607 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-28xpg\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg" Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.167647 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-28xpg\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg" Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.167715 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-28xpg\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg" Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.182145 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg"] Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.269547 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/226c01c4-d0f3-4784-8e93-36d1de6d593f-inventory\") pod \"226c01c4-d0f3-4784-8e93-36d1de6d593f\" (UID: \"226c01c4-d0f3-4784-8e93-36d1de6d593f\") " Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.270003 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-28xpg\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg" Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.270058 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-28xpg\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg" Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.270080 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-28xpg\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg" Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.270099 4898 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwz9x\" (UniqueName: \"kubernetes.io/projected/acaa3912-3e27-4272-8e4a-3ab67fd34b92-kube-api-access-kwz9x\") pod \"nova-edpm-deployment-openstack-edpm-ipam-28xpg\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg" Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.270119 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-28xpg\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg" Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.270136 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-28xpg\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg" Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.270176 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-28xpg\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg" Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.270217 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-cell1-compute-config-2\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-28xpg\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg" Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.270233 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-28xpg\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg" Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.270251 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-28xpg\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg" Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.270284 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-28xpg\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg" Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.271538 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-28xpg\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg" Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.273314 4898 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-28xpg\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg" Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.273547 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-28xpg\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg" Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.273712 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-28xpg\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg" Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.275051 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-28xpg\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg" Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.275089 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-28xpg\" (UID: 
\"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg" Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.275105 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-28xpg\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg" Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.276264 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-28xpg\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg" Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.276334 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-28xpg\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg" Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.278381 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-28xpg\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg" Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.282950 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/226c01c4-d0f3-4784-8e93-36d1de6d593f-inventory" (OuterVolumeSpecName: "inventory") pod "226c01c4-d0f3-4784-8e93-36d1de6d593f" (UID: "226c01c4-d0f3-4784-8e93-36d1de6d593f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.286756 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwz9x\" (UniqueName: \"kubernetes.io/projected/acaa3912-3e27-4272-8e4a-3ab67fd34b92-kube-api-access-kwz9x\") pod \"nova-edpm-deployment-openstack-edpm-ipam-28xpg\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg" Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.372501 4898 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/226c01c4-d0f3-4784-8e93-36d1de6d593f-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.492660 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg" Mar 13 14:41:55 crc kubenswrapper[4898]: I0313 14:41:55.059036 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg"] Mar 13 14:41:56 crc kubenswrapper[4898]: I0313 14:41:56.049731 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg" event={"ID":"acaa3912-3e27-4272-8e4a-3ab67fd34b92","Type":"ContainerStarted","Data":"70859fbfeb0fb1276f0e4311e36fe620c68c8fac7970aae8138f23a5b9f896be"} Mar 13 14:41:56 crc kubenswrapper[4898]: I0313 14:41:56.050124 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg" event={"ID":"acaa3912-3e27-4272-8e4a-3ab67fd34b92","Type":"ContainerStarted","Data":"2be41a54a106745535a45aba88fa9b87d13a15b26a46e0adacc9d9b51e76bede"} Mar 13 14:41:56 crc kubenswrapper[4898]: I0313 14:41:56.086140 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg" podStartSLOduration=1.495427 podStartE2EDuration="2.086120322s" podCreationTimestamp="2026-03-13 14:41:54 +0000 UTC" firstStartedPulling="2026-03-13 14:41:55.066346506 +0000 UTC m=+2750.067934755" lastFinishedPulling="2026-03-13 14:41:55.657039838 +0000 UTC m=+2750.658628077" observedRunningTime="2026-03-13 14:41:56.08447156 +0000 UTC m=+2751.086059839" watchObservedRunningTime="2026-03-13 14:41:56.086120322 +0000 UTC m=+2751.087708571" Mar 13 14:42:00 crc kubenswrapper[4898]: I0313 14:42:00.141628 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556882-6vdgp"] Mar 13 14:42:00 crc kubenswrapper[4898]: I0313 14:42:00.143774 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556882-6vdgp" Mar 13 14:42:00 crc kubenswrapper[4898]: I0313 14:42:00.147877 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 14:42:00 crc kubenswrapper[4898]: I0313 14:42:00.147959 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 14:42:00 crc kubenswrapper[4898]: I0313 14:42:00.147909 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps" Mar 13 14:42:00 crc kubenswrapper[4898]: I0313 14:42:00.152021 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556882-6vdgp"] Mar 13 14:42:00 crc kubenswrapper[4898]: I0313 14:42:00.315146 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6ljs\" (UniqueName: \"kubernetes.io/projected/46dff21f-c9aa-443a-b1c7-988721788744-kube-api-access-g6ljs\") pod \"auto-csr-approver-29556882-6vdgp\" (UID: \"46dff21f-c9aa-443a-b1c7-988721788744\") " pod="openshift-infra/auto-csr-approver-29556882-6vdgp" Mar 13 14:42:00 crc kubenswrapper[4898]: I0313 14:42:00.417385 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6ljs\" (UniqueName: \"kubernetes.io/projected/46dff21f-c9aa-443a-b1c7-988721788744-kube-api-access-g6ljs\") pod \"auto-csr-approver-29556882-6vdgp\" (UID: \"46dff21f-c9aa-443a-b1c7-988721788744\") " pod="openshift-infra/auto-csr-approver-29556882-6vdgp" Mar 13 14:42:00 crc kubenswrapper[4898]: I0313 14:42:00.435522 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6ljs\" (UniqueName: \"kubernetes.io/projected/46dff21f-c9aa-443a-b1c7-988721788744-kube-api-access-g6ljs\") pod \"auto-csr-approver-29556882-6vdgp\" (UID: \"46dff21f-c9aa-443a-b1c7-988721788744\") " 
pod="openshift-infra/auto-csr-approver-29556882-6vdgp" Mar 13 14:42:00 crc kubenswrapper[4898]: I0313 14:42:00.467041 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556882-6vdgp" Mar 13 14:42:00 crc kubenswrapper[4898]: I0313 14:42:00.939225 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556882-6vdgp"] Mar 13 14:42:01 crc kubenswrapper[4898]: I0313 14:42:01.099824 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556882-6vdgp" event={"ID":"46dff21f-c9aa-443a-b1c7-988721788744","Type":"ContainerStarted","Data":"226ecfa378c63a7ac9d79f6bc968ebd3e31e06dd97440f7c704521eaec1c25eb"} Mar 13 14:42:03 crc kubenswrapper[4898]: I0313 14:42:03.135924 4898 generic.go:334] "Generic (PLEG): container finished" podID="46dff21f-c9aa-443a-b1c7-988721788744" containerID="5dfdf7dc37e2c03d23dcf11c2bda6721f5e0189a55bee1c413509ed3a8808306" exitCode=0 Mar 13 14:42:03 crc kubenswrapper[4898]: I0313 14:42:03.136310 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556882-6vdgp" event={"ID":"46dff21f-c9aa-443a-b1c7-988721788744","Type":"ContainerDied","Data":"5dfdf7dc37e2c03d23dcf11c2bda6721f5e0189a55bee1c413509ed3a8808306"} Mar 13 14:42:05 crc kubenswrapper[4898]: I0313 14:42:05.610294 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556882-6vdgp" Mar 13 14:42:05 crc kubenswrapper[4898]: I0313 14:42:05.678585 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6ljs\" (UniqueName: \"kubernetes.io/projected/46dff21f-c9aa-443a-b1c7-988721788744-kube-api-access-g6ljs\") pod \"46dff21f-c9aa-443a-b1c7-988721788744\" (UID: \"46dff21f-c9aa-443a-b1c7-988721788744\") " Mar 13 14:42:05 crc kubenswrapper[4898]: I0313 14:42:05.684545 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46dff21f-c9aa-443a-b1c7-988721788744-kube-api-access-g6ljs" (OuterVolumeSpecName: "kube-api-access-g6ljs") pod "46dff21f-c9aa-443a-b1c7-988721788744" (UID: "46dff21f-c9aa-443a-b1c7-988721788744"). InnerVolumeSpecName "kube-api-access-g6ljs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:42:05 crc kubenswrapper[4898]: I0313 14:42:05.781974 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6ljs\" (UniqueName: \"kubernetes.io/projected/46dff21f-c9aa-443a-b1c7-988721788744-kube-api-access-g6ljs\") on node \"crc\" DevicePath \"\"" Mar 13 14:42:06 crc kubenswrapper[4898]: I0313 14:42:06.225085 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556882-6vdgp" event={"ID":"46dff21f-c9aa-443a-b1c7-988721788744","Type":"ContainerDied","Data":"226ecfa378c63a7ac9d79f6bc968ebd3e31e06dd97440f7c704521eaec1c25eb"} Mar 13 14:42:06 crc kubenswrapper[4898]: I0313 14:42:06.225146 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="226ecfa378c63a7ac9d79f6bc968ebd3e31e06dd97440f7c704521eaec1c25eb" Mar 13 14:42:06 crc kubenswrapper[4898]: I0313 14:42:06.225221 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556882-6vdgp" Mar 13 14:42:06 crc kubenswrapper[4898]: I0313 14:42:06.691225 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556876-j2sld"] Mar 13 14:42:06 crc kubenswrapper[4898]: I0313 14:42:06.701637 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556876-j2sld"] Mar 13 14:42:07 crc kubenswrapper[4898]: I0313 14:42:07.760985 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e83b21e9-13bc-4f80-a228-126fbc98c8f6" path="/var/lib/kubelet/pods/e83b21e9-13bc-4f80-a228-126fbc98c8f6/volumes" Mar 13 14:42:08 crc kubenswrapper[4898]: E0313 14:42:08.906334 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46dff21f_c9aa_443a_b1c7_988721788744.slice\": RecentStats: unable to find data in memory cache]" Mar 13 14:42:13 crc kubenswrapper[4898]: E0313 14:42:13.604833 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46dff21f_c9aa_443a_b1c7_988721788744.slice\": RecentStats: unable to find data in memory cache]" Mar 13 14:42:19 crc kubenswrapper[4898]: I0313 14:42:19.134623 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 14:42:19 crc kubenswrapper[4898]: I0313 14:42:19.134963 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 14:42:19 crc kubenswrapper[4898]: I0313 14:42:19.135019 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" Mar 13 14:42:19 crc kubenswrapper[4898]: I0313 14:42:19.136043 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"610faaa089f666c18409ed2be816ac54810449ed44b96f98a2303a40ac9c9836"} pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 14:42:19 crc kubenswrapper[4898]: I0313 14:42:19.136203 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" containerID="cri-o://610faaa089f666c18409ed2be816ac54810449ed44b96f98a2303a40ac9c9836" gracePeriod=600 Mar 13 14:42:19 crc kubenswrapper[4898]: E0313 14:42:19.167025 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46dff21f_c9aa_443a_b1c7_988721788744.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod767eecef_3bc9_4db4_a0cb_5d9c8554c62d.slice/crio-610faaa089f666c18409ed2be816ac54810449ed44b96f98a2303a40ac9c9836.scope\": RecentStats: unable to find data in memory cache]" Mar 13 14:42:19 crc kubenswrapper[4898]: I0313 14:42:19.396674 4898 generic.go:334] "Generic (PLEG): container finished" podID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerID="610faaa089f666c18409ed2be816ac54810449ed44b96f98a2303a40ac9c9836" exitCode=0 Mar 13 
14:42:19 crc kubenswrapper[4898]: I0313 14:42:19.396923 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerDied","Data":"610faaa089f666c18409ed2be816ac54810449ed44b96f98a2303a40ac9c9836"} Mar 13 14:42:19 crc kubenswrapper[4898]: I0313 14:42:19.397249 4898 scope.go:117] "RemoveContainer" containerID="8cc019eb997b8fb12b1a6d732269f56fd31f05139eb4098b4fb342e4364fe0db" Mar 13 14:42:20 crc kubenswrapper[4898]: I0313 14:42:20.408984 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerStarted","Data":"3380a5581d362c5b3dc0ed7955d410611389b7b46492a3c8939f1604d630d7f5"} Mar 13 14:42:28 crc kubenswrapper[4898]: E0313 14:42:28.847246 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46dff21f_c9aa_443a_b1c7_988721788744.slice\": RecentStats: unable to find data in memory cache]" Mar 13 14:42:29 crc kubenswrapper[4898]: E0313 14:42:29.217259 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46dff21f_c9aa_443a_b1c7_988721788744.slice\": RecentStats: unable to find data in memory cache]" Mar 13 14:42:39 crc kubenswrapper[4898]: E0313 14:42:39.581959 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46dff21f_c9aa_443a_b1c7_988721788744.slice\": RecentStats: unable to find data in memory cache]" Mar 13 14:42:41 crc kubenswrapper[4898]: I0313 14:42:41.119850 4898 scope.go:117] "RemoveContainer" 
containerID="880ec0d7753626dc3ced87b5a1086a85612fac87d9f534c6f11457452f7a1041" Mar 13 14:42:43 crc kubenswrapper[4898]: E0313 14:42:43.602577 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46dff21f_c9aa_443a_b1c7_988721788744.slice\": RecentStats: unable to find data in memory cache]" Mar 13 14:42:48 crc kubenswrapper[4898]: E0313 14:42:48.254730 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46dff21f_c9aa_443a_b1c7_988721788744.slice\": RecentStats: unable to find data in memory cache]" Mar 13 14:42:48 crc kubenswrapper[4898]: E0313 14:42:48.254734 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46dff21f_c9aa_443a_b1c7_988721788744.slice\": RecentStats: unable to find data in memory cache]" Mar 13 14:42:49 crc kubenswrapper[4898]: E0313 14:42:49.628484 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46dff21f_c9aa_443a_b1c7_988721788744.slice\": RecentStats: unable to find data in memory cache]" Mar 13 14:42:58 crc kubenswrapper[4898]: E0313 14:42:58.946117 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46dff21f_c9aa_443a_b1c7_988721788744.slice\": RecentStats: unable to find data in memory cache]" Mar 13 14:42:59 crc kubenswrapper[4898]: E0313 14:42:59.687456 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46dff21f_c9aa_443a_b1c7_988721788744.slice\": RecentStats: unable to find data in memory cache]" Mar 13 14:43:50 crc kubenswrapper[4898]: I0313 14:43:50.193595 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fcfmz"] Mar 13 14:43:50 crc kubenswrapper[4898]: E0313 14:43:50.194603 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46dff21f-c9aa-443a-b1c7-988721788744" containerName="oc" Mar 13 14:43:50 crc kubenswrapper[4898]: I0313 14:43:50.194617 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="46dff21f-c9aa-443a-b1c7-988721788744" containerName="oc" Mar 13 14:43:50 crc kubenswrapper[4898]: I0313 14:43:50.194958 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="46dff21f-c9aa-443a-b1c7-988721788744" containerName="oc" Mar 13 14:43:50 crc kubenswrapper[4898]: I0313 14:43:50.197039 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fcfmz" Mar 13 14:43:50 crc kubenswrapper[4898]: I0313 14:43:50.211013 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fcfmz"] Mar 13 14:43:50 crc kubenswrapper[4898]: I0313 14:43:50.280750 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2cd5c4b0-1bfe-479d-b800-ad19c56f3c9e-utilities\") pod \"community-operators-fcfmz\" (UID: \"2cd5c4b0-1bfe-479d-b800-ad19c56f3c9e\") " pod="openshift-marketplace/community-operators-fcfmz" Mar 13 14:43:50 crc kubenswrapper[4898]: I0313 14:43:50.281238 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4szbp\" (UniqueName: \"kubernetes.io/projected/2cd5c4b0-1bfe-479d-b800-ad19c56f3c9e-kube-api-access-4szbp\") pod \"community-operators-fcfmz\" (UID: \"2cd5c4b0-1bfe-479d-b800-ad19c56f3c9e\") " pod="openshift-marketplace/community-operators-fcfmz" Mar 13 14:43:50 crc kubenswrapper[4898]: I0313 14:43:50.281533 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2cd5c4b0-1bfe-479d-b800-ad19c56f3c9e-catalog-content\") pod \"community-operators-fcfmz\" (UID: \"2cd5c4b0-1bfe-479d-b800-ad19c56f3c9e\") " pod="openshift-marketplace/community-operators-fcfmz" Mar 13 14:43:50 crc kubenswrapper[4898]: I0313 14:43:50.383297 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2cd5c4b0-1bfe-479d-b800-ad19c56f3c9e-catalog-content\") pod \"community-operators-fcfmz\" (UID: \"2cd5c4b0-1bfe-479d-b800-ad19c56f3c9e\") " pod="openshift-marketplace/community-operators-fcfmz" Mar 13 14:43:50 crc kubenswrapper[4898]: I0313 14:43:50.383390 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2cd5c4b0-1bfe-479d-b800-ad19c56f3c9e-utilities\") pod \"community-operators-fcfmz\" (UID: \"2cd5c4b0-1bfe-479d-b800-ad19c56f3c9e\") " pod="openshift-marketplace/community-operators-fcfmz" Mar 13 14:43:50 crc kubenswrapper[4898]: I0313 14:43:50.383890 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2cd5c4b0-1bfe-479d-b800-ad19c56f3c9e-utilities\") pod \"community-operators-fcfmz\" (UID: \"2cd5c4b0-1bfe-479d-b800-ad19c56f3c9e\") " pod="openshift-marketplace/community-operators-fcfmz" Mar 13 14:43:50 crc kubenswrapper[4898]: I0313 14:43:50.383927 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2cd5c4b0-1bfe-479d-b800-ad19c56f3c9e-catalog-content\") pod \"community-operators-fcfmz\" (UID: \"2cd5c4b0-1bfe-479d-b800-ad19c56f3c9e\") " pod="openshift-marketplace/community-operators-fcfmz" Mar 13 14:43:50 crc kubenswrapper[4898]: I0313 14:43:50.384801 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4szbp\" (UniqueName: \"kubernetes.io/projected/2cd5c4b0-1bfe-479d-b800-ad19c56f3c9e-kube-api-access-4szbp\") pod \"community-operators-fcfmz\" (UID: \"2cd5c4b0-1bfe-479d-b800-ad19c56f3c9e\") " pod="openshift-marketplace/community-operators-fcfmz" Mar 13 14:43:50 crc kubenswrapper[4898]: I0313 14:43:50.416966 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4szbp\" (UniqueName: \"kubernetes.io/projected/2cd5c4b0-1bfe-479d-b800-ad19c56f3c9e-kube-api-access-4szbp\") pod \"community-operators-fcfmz\" (UID: \"2cd5c4b0-1bfe-479d-b800-ad19c56f3c9e\") " pod="openshift-marketplace/community-operators-fcfmz" Mar 13 14:43:50 crc kubenswrapper[4898]: I0313 14:43:50.531020 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fcfmz" Mar 13 14:43:51 crc kubenswrapper[4898]: I0313 14:43:51.121603 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fcfmz"] Mar 13 14:43:51 crc kubenswrapper[4898]: I0313 14:43:51.952301 4898 generic.go:334] "Generic (PLEG): container finished" podID="2cd5c4b0-1bfe-479d-b800-ad19c56f3c9e" containerID="de57d30bfb39ff3d400ef8caa8ccece22bc11c6e01ef51fff685669ac60fb3cb" exitCode=0 Mar 13 14:43:51 crc kubenswrapper[4898]: I0313 14:43:51.952441 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fcfmz" event={"ID":"2cd5c4b0-1bfe-479d-b800-ad19c56f3c9e","Type":"ContainerDied","Data":"de57d30bfb39ff3d400ef8caa8ccece22bc11c6e01ef51fff685669ac60fb3cb"} Mar 13 14:43:51 crc kubenswrapper[4898]: I0313 14:43:51.953431 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fcfmz" event={"ID":"2cd5c4b0-1bfe-479d-b800-ad19c56f3c9e","Type":"ContainerStarted","Data":"bba0555b7b58e4030810787bca615b1597cf609758e6d666bfe9aeec2f91ba59"} Mar 13 14:43:51 crc kubenswrapper[4898]: I0313 14:43:51.954719 4898 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 14:43:57 crc kubenswrapper[4898]: I0313 14:43:57.018260 4898 generic.go:334] "Generic (PLEG): container finished" podID="2cd5c4b0-1bfe-479d-b800-ad19c56f3c9e" containerID="ff8823ec8283c190f689d06300aff1538109e7ec55797335db2fe37e571d0171" exitCode=0 Mar 13 14:43:57 crc kubenswrapper[4898]: I0313 14:43:57.018336 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fcfmz" event={"ID":"2cd5c4b0-1bfe-479d-b800-ad19c56f3c9e","Type":"ContainerDied","Data":"ff8823ec8283c190f689d06300aff1538109e7ec55797335db2fe37e571d0171"} Mar 13 14:43:59 crc kubenswrapper[4898]: I0313 14:43:59.048407 4898 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-fcfmz" event={"ID":"2cd5c4b0-1bfe-479d-b800-ad19c56f3c9e","Type":"ContainerStarted","Data":"1d82c05cde25f5d47134bce4e0b97b775c14d95d3b7690a9f75b30eaf4f13545"} Mar 13 14:43:59 crc kubenswrapper[4898]: I0313 14:43:59.099091 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fcfmz" podStartSLOduration=3.503307437 podStartE2EDuration="9.099050494s" podCreationTimestamp="2026-03-13 14:43:50 +0000 UTC" firstStartedPulling="2026-03-13 14:43:51.954528748 +0000 UTC m=+2866.956116987" lastFinishedPulling="2026-03-13 14:43:57.550271795 +0000 UTC m=+2872.551860044" observedRunningTime="2026-03-13 14:43:59.075149712 +0000 UTC m=+2874.076738001" watchObservedRunningTime="2026-03-13 14:43:59.099050494 +0000 UTC m=+2874.100638813" Mar 13 14:44:00 crc kubenswrapper[4898]: I0313 14:44:00.158698 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556884-wvc75"] Mar 13 14:44:00 crc kubenswrapper[4898]: I0313 14:44:00.160521 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556884-wvc75" Mar 13 14:44:00 crc kubenswrapper[4898]: I0313 14:44:00.164210 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps" Mar 13 14:44:00 crc kubenswrapper[4898]: I0313 14:44:00.164550 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 14:44:00 crc kubenswrapper[4898]: I0313 14:44:00.165439 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 14:44:00 crc kubenswrapper[4898]: I0313 14:44:00.206787 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556884-wvc75"] Mar 13 14:44:00 crc kubenswrapper[4898]: I0313 14:44:00.256869 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szgkr\" (UniqueName: \"kubernetes.io/projected/22d70d9e-a058-43a7-b692-19cd302d65ca-kube-api-access-szgkr\") pod \"auto-csr-approver-29556884-wvc75\" (UID: \"22d70d9e-a058-43a7-b692-19cd302d65ca\") " pod="openshift-infra/auto-csr-approver-29556884-wvc75" Mar 13 14:44:00 crc kubenswrapper[4898]: I0313 14:44:00.359009 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szgkr\" (UniqueName: \"kubernetes.io/projected/22d70d9e-a058-43a7-b692-19cd302d65ca-kube-api-access-szgkr\") pod \"auto-csr-approver-29556884-wvc75\" (UID: \"22d70d9e-a058-43a7-b692-19cd302d65ca\") " pod="openshift-infra/auto-csr-approver-29556884-wvc75" Mar 13 14:44:00 crc kubenswrapper[4898]: I0313 14:44:00.381866 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szgkr\" (UniqueName: \"kubernetes.io/projected/22d70d9e-a058-43a7-b692-19cd302d65ca-kube-api-access-szgkr\") pod \"auto-csr-approver-29556884-wvc75\" (UID: \"22d70d9e-a058-43a7-b692-19cd302d65ca\") " 
pod="openshift-infra/auto-csr-approver-29556884-wvc75" Mar 13 14:44:00 crc kubenswrapper[4898]: I0313 14:44:00.497374 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556884-wvc75" Mar 13 14:44:00 crc kubenswrapper[4898]: I0313 14:44:00.531573 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fcfmz" Mar 13 14:44:00 crc kubenswrapper[4898]: I0313 14:44:00.538129 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fcfmz" Mar 13 14:44:01 crc kubenswrapper[4898]: I0313 14:44:01.029014 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556884-wvc75"] Mar 13 14:44:01 crc kubenswrapper[4898]: I0313 14:44:01.070387 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556884-wvc75" event={"ID":"22d70d9e-a058-43a7-b692-19cd302d65ca","Type":"ContainerStarted","Data":"ba1e66524699b6445f436669f57528c41766bd573cf4969fd64f6902f9aa0242"} Mar 13 14:44:01 crc kubenswrapper[4898]: I0313 14:44:01.600294 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-fcfmz" podUID="2cd5c4b0-1bfe-479d-b800-ad19c56f3c9e" containerName="registry-server" probeResult="failure" output=< Mar 13 14:44:01 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 14:44:01 crc kubenswrapper[4898]: > Mar 13 14:44:04 crc kubenswrapper[4898]: I0313 14:44:04.112047 4898 generic.go:334] "Generic (PLEG): container finished" podID="22d70d9e-a058-43a7-b692-19cd302d65ca" containerID="668ec92e00de6e908be7a4b238e021ba041b8ee50f571d510eea90e125398f41" exitCode=0 Mar 13 14:44:04 crc kubenswrapper[4898]: I0313 14:44:04.112276 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556884-wvc75" 
event={"ID":"22d70d9e-a058-43a7-b692-19cd302d65ca","Type":"ContainerDied","Data":"668ec92e00de6e908be7a4b238e021ba041b8ee50f571d510eea90e125398f41"} Mar 13 14:44:05 crc kubenswrapper[4898]: I0313 14:44:05.857128 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556884-wvc75" Mar 13 14:44:06 crc kubenswrapper[4898]: I0313 14:44:06.013574 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-szgkr\" (UniqueName: \"kubernetes.io/projected/22d70d9e-a058-43a7-b692-19cd302d65ca-kube-api-access-szgkr\") pod \"22d70d9e-a058-43a7-b692-19cd302d65ca\" (UID: \"22d70d9e-a058-43a7-b692-19cd302d65ca\") " Mar 13 14:44:06 crc kubenswrapper[4898]: I0313 14:44:06.020429 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22d70d9e-a058-43a7-b692-19cd302d65ca-kube-api-access-szgkr" (OuterVolumeSpecName: "kube-api-access-szgkr") pod "22d70d9e-a058-43a7-b692-19cd302d65ca" (UID: "22d70d9e-a058-43a7-b692-19cd302d65ca"). InnerVolumeSpecName "kube-api-access-szgkr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:44:06 crc kubenswrapper[4898]: I0313 14:44:06.116467 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-szgkr\" (UniqueName: \"kubernetes.io/projected/22d70d9e-a058-43a7-b692-19cd302d65ca-kube-api-access-szgkr\") on node \"crc\" DevicePath \"\"" Mar 13 14:44:06 crc kubenswrapper[4898]: I0313 14:44:06.140505 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556884-wvc75" event={"ID":"22d70d9e-a058-43a7-b692-19cd302d65ca","Type":"ContainerDied","Data":"ba1e66524699b6445f436669f57528c41766bd573cf4969fd64f6902f9aa0242"} Mar 13 14:44:06 crc kubenswrapper[4898]: I0313 14:44:06.140796 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba1e66524699b6445f436669f57528c41766bd573cf4969fd64f6902f9aa0242" Mar 13 14:44:06 crc kubenswrapper[4898]: I0313 14:44:06.140566 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556884-wvc75" Mar 13 14:44:06 crc kubenswrapper[4898]: I0313 14:44:06.955615 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556878-btqw2"] Mar 13 14:44:06 crc kubenswrapper[4898]: I0313 14:44:06.979419 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556878-btqw2"] Mar 13 14:44:07 crc kubenswrapper[4898]: I0313 14:44:07.756238 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39249464-ab82-4938-978e-2ffcbc637f4f" path="/var/lib/kubelet/pods/39249464-ab82-4938-978e-2ffcbc637f4f/volumes" Mar 13 14:44:10 crc kubenswrapper[4898]: I0313 14:44:10.598030 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fcfmz" Mar 13 14:44:10 crc kubenswrapper[4898]: I0313 14:44:10.684556 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/community-operators-fcfmz" Mar 13 14:44:10 crc kubenswrapper[4898]: I0313 14:44:10.798646 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fcfmz"] Mar 13 14:44:10 crc kubenswrapper[4898]: I0313 14:44:10.847299 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nf9mj"] Mar 13 14:44:10 crc kubenswrapper[4898]: I0313 14:44:10.847522 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nf9mj" podUID="112ac477-caf1-4778-9161-737e393633b6" containerName="registry-server" containerID="cri-o://fc60b3fabeb85294acc6203495a9c722283813bdd45f914886621b87378fb993" gracePeriod=2 Mar 13 14:44:11 crc kubenswrapper[4898]: E0313 14:44:11.187417 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod112ac477_caf1_4778_9161_737e393633b6.slice/crio-conmon-fc60b3fabeb85294acc6203495a9c722283813bdd45f914886621b87378fb993.scope\": RecentStats: unable to find data in memory cache]" Mar 13 14:44:11 crc kubenswrapper[4898]: I0313 14:44:11.227606 4898 generic.go:334] "Generic (PLEG): container finished" podID="112ac477-caf1-4778-9161-737e393633b6" containerID="fc60b3fabeb85294acc6203495a9c722283813bdd45f914886621b87378fb993" exitCode=0 Mar 13 14:44:11 crc kubenswrapper[4898]: I0313 14:44:11.227698 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nf9mj" event={"ID":"112ac477-caf1-4778-9161-737e393633b6","Type":"ContainerDied","Data":"fc60b3fabeb85294acc6203495a9c722283813bdd45f914886621b87378fb993"} Mar 13 14:44:11 crc kubenswrapper[4898]: I0313 14:44:11.390764 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nf9mj" Mar 13 14:44:11 crc kubenswrapper[4898]: I0313 14:44:11.496393 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/112ac477-caf1-4778-9161-737e393633b6-utilities\") pod \"112ac477-caf1-4778-9161-737e393633b6\" (UID: \"112ac477-caf1-4778-9161-737e393633b6\") " Mar 13 14:44:11 crc kubenswrapper[4898]: I0313 14:44:11.496545 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z75w9\" (UniqueName: \"kubernetes.io/projected/112ac477-caf1-4778-9161-737e393633b6-kube-api-access-z75w9\") pod \"112ac477-caf1-4778-9161-737e393633b6\" (UID: \"112ac477-caf1-4778-9161-737e393633b6\") " Mar 13 14:44:11 crc kubenswrapper[4898]: I0313 14:44:11.496581 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/112ac477-caf1-4778-9161-737e393633b6-catalog-content\") pod \"112ac477-caf1-4778-9161-737e393633b6\" (UID: \"112ac477-caf1-4778-9161-737e393633b6\") " Mar 13 14:44:11 crc kubenswrapper[4898]: I0313 14:44:11.497498 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/112ac477-caf1-4778-9161-737e393633b6-utilities" (OuterVolumeSpecName: "utilities") pod "112ac477-caf1-4778-9161-737e393633b6" (UID: "112ac477-caf1-4778-9161-737e393633b6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:44:11 crc kubenswrapper[4898]: I0313 14:44:11.510776 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/112ac477-caf1-4778-9161-737e393633b6-kube-api-access-z75w9" (OuterVolumeSpecName: "kube-api-access-z75w9") pod "112ac477-caf1-4778-9161-737e393633b6" (UID: "112ac477-caf1-4778-9161-737e393633b6"). InnerVolumeSpecName "kube-api-access-z75w9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:44:11 crc kubenswrapper[4898]: I0313 14:44:11.599369 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/112ac477-caf1-4778-9161-737e393633b6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "112ac477-caf1-4778-9161-737e393633b6" (UID: "112ac477-caf1-4778-9161-737e393633b6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:44:11 crc kubenswrapper[4898]: I0313 14:44:11.601146 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/112ac477-caf1-4778-9161-737e393633b6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 14:44:11 crc kubenswrapper[4898]: I0313 14:44:11.601164 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/112ac477-caf1-4778-9161-737e393633b6-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 14:44:11 crc kubenswrapper[4898]: I0313 14:44:11.601173 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z75w9\" (UniqueName: \"kubernetes.io/projected/112ac477-caf1-4778-9161-737e393633b6-kube-api-access-z75w9\") on node \"crc\" DevicePath \"\"" Mar 13 14:44:12 crc kubenswrapper[4898]: I0313 14:44:12.240864 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nf9mj" event={"ID":"112ac477-caf1-4778-9161-737e393633b6","Type":"ContainerDied","Data":"cd6fb5a64e7e71f880fc652eb99feafe0af0462b20f85dff2e92b55a99558f6f"} Mar 13 14:44:12 crc kubenswrapper[4898]: I0313 14:44:12.240907 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nf9mj" Mar 13 14:44:12 crc kubenswrapper[4898]: I0313 14:44:12.241186 4898 scope.go:117] "RemoveContainer" containerID="fc60b3fabeb85294acc6203495a9c722283813bdd45f914886621b87378fb993" Mar 13 14:44:12 crc kubenswrapper[4898]: I0313 14:44:12.272637 4898 scope.go:117] "RemoveContainer" containerID="2e43cc9675fd8cbd58a911f4205da76807b5b902a7fc3bd0c4cb735298e250c5" Mar 13 14:44:12 crc kubenswrapper[4898]: I0313 14:44:12.275546 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nf9mj"] Mar 13 14:44:12 crc kubenswrapper[4898]: I0313 14:44:12.290242 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nf9mj"] Mar 13 14:44:12 crc kubenswrapper[4898]: I0313 14:44:12.303117 4898 scope.go:117] "RemoveContainer" containerID="2316a3e4d2fc964fff3bdea961936abc15de57cb9446d0c3ca366fa8840b5460" Mar 13 14:44:13 crc kubenswrapper[4898]: I0313 14:44:13.758996 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="112ac477-caf1-4778-9161-737e393633b6" path="/var/lib/kubelet/pods/112ac477-caf1-4778-9161-737e393633b6/volumes" Mar 13 14:44:19 crc kubenswrapper[4898]: I0313 14:44:19.135044 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 14:44:19 crc kubenswrapper[4898]: I0313 14:44:19.135961 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 14:44:22 crc kubenswrapper[4898]: 
I0313 14:44:22.102214 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jpqqv"] Mar 13 14:44:22 crc kubenswrapper[4898]: E0313 14:44:22.103694 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="112ac477-caf1-4778-9161-737e393633b6" containerName="registry-server" Mar 13 14:44:22 crc kubenswrapper[4898]: I0313 14:44:22.103716 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="112ac477-caf1-4778-9161-737e393633b6" containerName="registry-server" Mar 13 14:44:22 crc kubenswrapper[4898]: E0313 14:44:22.103740 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="112ac477-caf1-4778-9161-737e393633b6" containerName="extract-content" Mar 13 14:44:22 crc kubenswrapper[4898]: I0313 14:44:22.103751 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="112ac477-caf1-4778-9161-737e393633b6" containerName="extract-content" Mar 13 14:44:22 crc kubenswrapper[4898]: E0313 14:44:22.103779 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22d70d9e-a058-43a7-b692-19cd302d65ca" containerName="oc" Mar 13 14:44:22 crc kubenswrapper[4898]: I0313 14:44:22.103792 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="22d70d9e-a058-43a7-b692-19cd302d65ca" containerName="oc" Mar 13 14:44:22 crc kubenswrapper[4898]: E0313 14:44:22.103820 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="112ac477-caf1-4778-9161-737e393633b6" containerName="extract-utilities" Mar 13 14:44:22 crc kubenswrapper[4898]: I0313 14:44:22.103835 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="112ac477-caf1-4778-9161-737e393633b6" containerName="extract-utilities" Mar 13 14:44:22 crc kubenswrapper[4898]: I0313 14:44:22.104288 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="22d70d9e-a058-43a7-b692-19cd302d65ca" containerName="oc" Mar 13 14:44:22 crc kubenswrapper[4898]: I0313 14:44:22.104308 4898 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="112ac477-caf1-4778-9161-737e393633b6" containerName="registry-server" Mar 13 14:44:22 crc kubenswrapper[4898]: I0313 14:44:22.107236 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jpqqv" Mar 13 14:44:22 crc kubenswrapper[4898]: I0313 14:44:22.124426 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jpqqv"] Mar 13 14:44:22 crc kubenswrapper[4898]: I0313 14:44:22.181951 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff59b74b-017c-4c31-8171-8e2f6ee07a75-catalog-content\") pod \"redhat-operators-jpqqv\" (UID: \"ff59b74b-017c-4c31-8171-8e2f6ee07a75\") " pod="openshift-marketplace/redhat-operators-jpqqv" Mar 13 14:44:22 crc kubenswrapper[4898]: I0313 14:44:22.182120 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff59b74b-017c-4c31-8171-8e2f6ee07a75-utilities\") pod \"redhat-operators-jpqqv\" (UID: \"ff59b74b-017c-4c31-8171-8e2f6ee07a75\") " pod="openshift-marketplace/redhat-operators-jpqqv" Mar 13 14:44:22 crc kubenswrapper[4898]: I0313 14:44:22.182193 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnxp7\" (UniqueName: \"kubernetes.io/projected/ff59b74b-017c-4c31-8171-8e2f6ee07a75-kube-api-access-pnxp7\") pod \"redhat-operators-jpqqv\" (UID: \"ff59b74b-017c-4c31-8171-8e2f6ee07a75\") " pod="openshift-marketplace/redhat-operators-jpqqv" Mar 13 14:44:22 crc kubenswrapper[4898]: I0313 14:44:22.303318 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff59b74b-017c-4c31-8171-8e2f6ee07a75-catalog-content\") pod \"redhat-operators-jpqqv\" (UID: 
\"ff59b74b-017c-4c31-8171-8e2f6ee07a75\") " pod="openshift-marketplace/redhat-operators-jpqqv" Mar 13 14:44:22 crc kubenswrapper[4898]: I0313 14:44:22.303563 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff59b74b-017c-4c31-8171-8e2f6ee07a75-utilities\") pod \"redhat-operators-jpqqv\" (UID: \"ff59b74b-017c-4c31-8171-8e2f6ee07a75\") " pod="openshift-marketplace/redhat-operators-jpqqv" Mar 13 14:44:22 crc kubenswrapper[4898]: I0313 14:44:22.303678 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnxp7\" (UniqueName: \"kubernetes.io/projected/ff59b74b-017c-4c31-8171-8e2f6ee07a75-kube-api-access-pnxp7\") pod \"redhat-operators-jpqqv\" (UID: \"ff59b74b-017c-4c31-8171-8e2f6ee07a75\") " pod="openshift-marketplace/redhat-operators-jpqqv" Mar 13 14:44:22 crc kubenswrapper[4898]: I0313 14:44:22.303992 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff59b74b-017c-4c31-8171-8e2f6ee07a75-catalog-content\") pod \"redhat-operators-jpqqv\" (UID: \"ff59b74b-017c-4c31-8171-8e2f6ee07a75\") " pod="openshift-marketplace/redhat-operators-jpqqv" Mar 13 14:44:22 crc kubenswrapper[4898]: I0313 14:44:22.304124 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff59b74b-017c-4c31-8171-8e2f6ee07a75-utilities\") pod \"redhat-operators-jpqqv\" (UID: \"ff59b74b-017c-4c31-8171-8e2f6ee07a75\") " pod="openshift-marketplace/redhat-operators-jpqqv" Mar 13 14:44:22 crc kubenswrapper[4898]: I0313 14:44:22.330369 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnxp7\" (UniqueName: \"kubernetes.io/projected/ff59b74b-017c-4c31-8171-8e2f6ee07a75-kube-api-access-pnxp7\") pod \"redhat-operators-jpqqv\" (UID: \"ff59b74b-017c-4c31-8171-8e2f6ee07a75\") " 
pod="openshift-marketplace/redhat-operators-jpqqv" Mar 13 14:44:22 crc kubenswrapper[4898]: I0313 14:44:22.434374 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jpqqv" Mar 13 14:44:22 crc kubenswrapper[4898]: I0313 14:44:22.991676 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jpqqv"] Mar 13 14:44:23 crc kubenswrapper[4898]: I0313 14:44:23.383973 4898 generic.go:334] "Generic (PLEG): container finished" podID="acaa3912-3e27-4272-8e4a-3ab67fd34b92" containerID="70859fbfeb0fb1276f0e4311e36fe620c68c8fac7970aae8138f23a5b9f896be" exitCode=0 Mar 13 14:44:23 crc kubenswrapper[4898]: I0313 14:44:23.384040 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg" event={"ID":"acaa3912-3e27-4272-8e4a-3ab67fd34b92","Type":"ContainerDied","Data":"70859fbfeb0fb1276f0e4311e36fe620c68c8fac7970aae8138f23a5b9f896be"} Mar 13 14:44:23 crc kubenswrapper[4898]: I0313 14:44:23.385789 4898 generic.go:334] "Generic (PLEG): container finished" podID="ff59b74b-017c-4c31-8171-8e2f6ee07a75" containerID="49661d26cda8c7fc0772102e8672bfdce259e32cdf6e7240a670328588f113ec" exitCode=0 Mar 13 14:44:23 crc kubenswrapper[4898]: I0313 14:44:23.385840 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jpqqv" event={"ID":"ff59b74b-017c-4c31-8171-8e2f6ee07a75","Type":"ContainerDied","Data":"49661d26cda8c7fc0772102e8672bfdce259e32cdf6e7240a670328588f113ec"} Mar 13 14:44:23 crc kubenswrapper[4898]: I0313 14:44:23.385890 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jpqqv" event={"ID":"ff59b74b-017c-4c31-8171-8e2f6ee07a75","Type":"ContainerStarted","Data":"934bdccbb64d24f7c4d0c481aec18c3675c14a5604b4ef1370a465435fcca680"} Mar 13 14:44:24 crc kubenswrapper[4898]: I0313 14:44:24.401940 4898 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-operators-jpqqv" event={"ID":"ff59b74b-017c-4c31-8171-8e2f6ee07a75","Type":"ContainerStarted","Data":"33a77b3f72c6613c240378bb6bf90de33fa04718f5308ea5a9802a4dbf9b904e"} Mar 13 14:44:24 crc kubenswrapper[4898]: I0313 14:44:24.927654 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg" Mar 13 14:44:24 crc kubenswrapper[4898]: I0313 14:44:24.991835 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwz9x\" (UniqueName: \"kubernetes.io/projected/acaa3912-3e27-4272-8e4a-3ab67fd34b92-kube-api-access-kwz9x\") pod \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " Mar 13 14:44:24 crc kubenswrapper[4898]: I0313 14:44:24.992808 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-cell1-compute-config-2\") pod \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " Mar 13 14:44:24 crc kubenswrapper[4898]: I0313 14:44:24.993092 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-cell1-compute-config-3\") pod \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " Mar 13 14:44:24 crc kubenswrapper[4898]: I0313 14:44:24.993128 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-combined-ca-bundle\") pod \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " Mar 13 14:44:24 crc kubenswrapper[4898]: I0313 14:44:24.993262 4898 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-cell1-compute-config-1\") pod \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " Mar 13 14:44:24 crc kubenswrapper[4898]: I0313 14:44:24.993299 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-migration-ssh-key-0\") pod \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " Mar 13 14:44:24 crc kubenswrapper[4898]: I0313 14:44:24.993324 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-extra-config-0\") pod \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " Mar 13 14:44:24 crc kubenswrapper[4898]: I0313 14:44:24.993457 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-cell1-compute-config-0\") pod \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " Mar 13 14:44:24 crc kubenswrapper[4898]: I0313 14:44:24.993520 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-inventory\") pod \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " Mar 13 14:44:24 crc kubenswrapper[4898]: I0313 14:44:24.993549 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: 
\"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-migration-ssh-key-1\") pod \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " Mar 13 14:44:24 crc kubenswrapper[4898]: I0313 14:44:24.993612 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-ssh-key-openstack-edpm-ipam\") pod \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " Mar 13 14:44:24 crc kubenswrapper[4898]: I0313 14:44:24.999829 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "acaa3912-3e27-4272-8e4a-3ab67fd34b92" (UID: "acaa3912-3e27-4272-8e4a-3ab67fd34b92"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.000460 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acaa3912-3e27-4272-8e4a-3ab67fd34b92-kube-api-access-kwz9x" (OuterVolumeSpecName: "kube-api-access-kwz9x") pod "acaa3912-3e27-4272-8e4a-3ab67fd34b92" (UID: "acaa3912-3e27-4272-8e4a-3ab67fd34b92"). InnerVolumeSpecName "kube-api-access-kwz9x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.030797 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-inventory" (OuterVolumeSpecName: "inventory") pod "acaa3912-3e27-4272-8e4a-3ab67fd34b92" (UID: "acaa3912-3e27-4272-8e4a-3ab67fd34b92"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.041746 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "acaa3912-3e27-4272-8e4a-3ab67fd34b92" (UID: "acaa3912-3e27-4272-8e4a-3ab67fd34b92"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.043516 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "acaa3912-3e27-4272-8e4a-3ab67fd34b92" (UID: "acaa3912-3e27-4272-8e4a-3ab67fd34b92"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.054614 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "acaa3912-3e27-4272-8e4a-3ab67fd34b92" (UID: "acaa3912-3e27-4272-8e4a-3ab67fd34b92"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.056417 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "acaa3912-3e27-4272-8e4a-3ab67fd34b92" (UID: "acaa3912-3e27-4272-8e4a-3ab67fd34b92"). InnerVolumeSpecName "nova-migration-ssh-key-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.063986 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "acaa3912-3e27-4272-8e4a-3ab67fd34b92" (UID: "acaa3912-3e27-4272-8e4a-3ab67fd34b92"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.067435 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "acaa3912-3e27-4272-8e4a-3ab67fd34b92" (UID: "acaa3912-3e27-4272-8e4a-3ab67fd34b92"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.069178 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "acaa3912-3e27-4272-8e4a-3ab67fd34b92" (UID: "acaa3912-3e27-4272-8e4a-3ab67fd34b92"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.075217 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "acaa3912-3e27-4272-8e4a-3ab67fd34b92" (UID: "acaa3912-3e27-4272-8e4a-3ab67fd34b92"). InnerVolumeSpecName "nova-cell1-compute-config-3". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.100715 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwz9x\" (UniqueName: \"kubernetes.io/projected/acaa3912-3e27-4272-8e4a-3ab67fd34b92-kube-api-access-kwz9x\") on node \"crc\" DevicePath \"\"" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.100746 4898 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.100757 4898 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.100766 4898 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.100775 4898 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.100784 4898 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.100792 4898 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: 
\"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.100802 4898 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.100810 4898 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.100819 4898 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.100827 4898 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.412190 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg" event={"ID":"acaa3912-3e27-4272-8e4a-3ab67fd34b92","Type":"ContainerDied","Data":"2be41a54a106745535a45aba88fa9b87d13a15b26a46e0adacc9d9b51e76bede"} Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.412243 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2be41a54a106745535a45aba88fa9b87d13a15b26a46e0adacc9d9b51e76bede" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.412404 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.529941 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk"] Mar 13 14:44:25 crc kubenswrapper[4898]: E0313 14:44:25.530417 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acaa3912-3e27-4272-8e4a-3ab67fd34b92" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.530435 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="acaa3912-3e27-4272-8e4a-3ab67fd34b92" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.530681 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="acaa3912-3e27-4272-8e4a-3ab67fd34b92" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.531532 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.535715 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.535846 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.537096 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.538400 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.538420 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zsddr" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.555291 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk"] Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.611061 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a62fd58-a586-4473-abfe-4e227cad9900-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk\" (UID: \"9a62fd58-a586-4473-abfe-4e227cad9900\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.611109 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/9a62fd58-a586-4473-abfe-4e227cad9900-ceilometer-compute-config-data-1\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk\" (UID: \"9a62fd58-a586-4473-abfe-4e227cad9900\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.611130 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/9a62fd58-a586-4473-abfe-4e227cad9900-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk\" (UID: \"9a62fd58-a586-4473-abfe-4e227cad9900\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.611647 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/9a62fd58-a586-4473-abfe-4e227cad9900-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk\" (UID: \"9a62fd58-a586-4473-abfe-4e227cad9900\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.611967 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmwct\" (UniqueName: \"kubernetes.io/projected/9a62fd58-a586-4473-abfe-4e227cad9900-kube-api-access-vmwct\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk\" (UID: \"9a62fd58-a586-4473-abfe-4e227cad9900\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.612074 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9a62fd58-a586-4473-abfe-4e227cad9900-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk\" (UID: \"9a62fd58-a586-4473-abfe-4e227cad9900\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.612120 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9a62fd58-a586-4473-abfe-4e227cad9900-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk\" (UID: \"9a62fd58-a586-4473-abfe-4e227cad9900\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.714780 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/9a62fd58-a586-4473-abfe-4e227cad9900-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk\" (UID: \"9a62fd58-a586-4473-abfe-4e227cad9900\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.714932 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmwct\" (UniqueName: \"kubernetes.io/projected/9a62fd58-a586-4473-abfe-4e227cad9900-kube-api-access-vmwct\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk\" (UID: \"9a62fd58-a586-4473-abfe-4e227cad9900\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.715038 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9a62fd58-a586-4473-abfe-4e227cad9900-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk\" (UID: \"9a62fd58-a586-4473-abfe-4e227cad9900\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.715104 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9a62fd58-a586-4473-abfe-4e227cad9900-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk\" (UID: \"9a62fd58-a586-4473-abfe-4e227cad9900\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.715259 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a62fd58-a586-4473-abfe-4e227cad9900-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk\" (UID: \"9a62fd58-a586-4473-abfe-4e227cad9900\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.715312 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/9a62fd58-a586-4473-abfe-4e227cad9900-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk\" (UID: \"9a62fd58-a586-4473-abfe-4e227cad9900\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.715346 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/9a62fd58-a586-4473-abfe-4e227cad9900-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk\" (UID: \"9a62fd58-a586-4473-abfe-4e227cad9900\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.719420 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9a62fd58-a586-4473-abfe-4e227cad9900-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk\" (UID: \"9a62fd58-a586-4473-abfe-4e227cad9900\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.719461 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/9a62fd58-a586-4473-abfe-4e227cad9900-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk\" (UID: \"9a62fd58-a586-4473-abfe-4e227cad9900\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.719636 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9a62fd58-a586-4473-abfe-4e227cad9900-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk\" (UID: \"9a62fd58-a586-4473-abfe-4e227cad9900\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.720140 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/9a62fd58-a586-4473-abfe-4e227cad9900-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk\" (UID: \"9a62fd58-a586-4473-abfe-4e227cad9900\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.720533 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9a62fd58-a586-4473-abfe-4e227cad9900-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk\" (UID: \"9a62fd58-a586-4473-abfe-4e227cad9900\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.722781 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/9a62fd58-a586-4473-abfe-4e227cad9900-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk\" (UID: \"9a62fd58-a586-4473-abfe-4e227cad9900\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.733640 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmwct\" (UniqueName: \"kubernetes.io/projected/9a62fd58-a586-4473-abfe-4e227cad9900-kube-api-access-vmwct\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk\" (UID: \"9a62fd58-a586-4473-abfe-4e227cad9900\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.853649 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk" Mar 13 14:44:26 crc kubenswrapper[4898]: I0313 14:44:26.417854 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk"] Mar 13 14:44:27 crc kubenswrapper[4898]: I0313 14:44:27.450233 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk" event={"ID":"9a62fd58-a586-4473-abfe-4e227cad9900","Type":"ContainerStarted","Data":"eb475c12bddce2c97f158483e36ec6344049764d71c6a03112569a01e066938c"} Mar 13 14:44:28 crc kubenswrapper[4898]: I0313 14:44:28.473033 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk" event={"ID":"9a62fd58-a586-4473-abfe-4e227cad9900","Type":"ContainerStarted","Data":"b68526abfbbd65229d0fa636d4985f81e76151523259cd0cea1b6513d33ed080"} Mar 13 14:44:28 crc kubenswrapper[4898]: I0313 14:44:28.513447 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk" podStartSLOduration=2.838635402 podStartE2EDuration="3.513420596s" podCreationTimestamp="2026-03-13 14:44:25 +0000 UTC" firstStartedPulling="2026-03-13 14:44:26.426361908 +0000 UTC m=+2901.427950157" lastFinishedPulling="2026-03-13 14:44:27.101147092 +0000 UTC m=+2902.102735351" observedRunningTime="2026-03-13 14:44:28.507686359 +0000 UTC m=+2903.509274618" watchObservedRunningTime="2026-03-13 14:44:28.513420596 +0000 UTC m=+2903.515008845" Mar 13 14:44:29 crc kubenswrapper[4898]: I0313 14:44:29.498665 4898 generic.go:334] "Generic (PLEG): container finished" podID="ff59b74b-017c-4c31-8171-8e2f6ee07a75" containerID="33a77b3f72c6613c240378bb6bf90de33fa04718f5308ea5a9802a4dbf9b904e" exitCode=0 Mar 13 14:44:29 crc kubenswrapper[4898]: I0313 14:44:29.498734 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-jpqqv" event={"ID":"ff59b74b-017c-4c31-8171-8e2f6ee07a75","Type":"ContainerDied","Data":"33a77b3f72c6613c240378bb6bf90de33fa04718f5308ea5a9802a4dbf9b904e"} Mar 13 14:44:30 crc kubenswrapper[4898]: I0313 14:44:30.517032 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jpqqv" event={"ID":"ff59b74b-017c-4c31-8171-8e2f6ee07a75","Type":"ContainerStarted","Data":"5683c6073bc83bbe06ca94de9a1e4a7c969eadb55ff8f2c0ff39a667913c86af"} Mar 13 14:44:31 crc kubenswrapper[4898]: I0313 14:44:31.574359 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jpqqv" podStartSLOduration=3.02834831 podStartE2EDuration="9.574328953s" podCreationTimestamp="2026-03-13 14:44:22 +0000 UTC" firstStartedPulling="2026-03-13 14:44:23.387948016 +0000 UTC m=+2898.389536265" lastFinishedPulling="2026-03-13 14:44:29.933928649 +0000 UTC m=+2904.935516908" observedRunningTime="2026-03-13 14:44:31.558506678 +0000 UTC m=+2906.560094937" watchObservedRunningTime="2026-03-13 14:44:31.574328953 +0000 UTC m=+2906.575917262" Mar 13 14:44:32 crc kubenswrapper[4898]: I0313 14:44:32.436110 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jpqqv" Mar 13 14:44:32 crc kubenswrapper[4898]: I0313 14:44:32.436395 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jpqqv" Mar 13 14:44:33 crc kubenswrapper[4898]: I0313 14:44:33.490862 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jpqqv" podUID="ff59b74b-017c-4c31-8171-8e2f6ee07a75" containerName="registry-server" probeResult="failure" output=< Mar 13 14:44:33 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 14:44:33 crc kubenswrapper[4898]: > Mar 13 14:44:41 crc kubenswrapper[4898]: I0313 
14:44:41.263836 4898 scope.go:117] "RemoveContainer" containerID="9b860152e27ff03f1039fc3f0a1f9cf3aa08903cd38847b8eba8da6dc52d2b6e" Mar 13 14:44:42 crc kubenswrapper[4898]: I0313 14:44:42.535729 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jpqqv" Mar 13 14:44:42 crc kubenswrapper[4898]: I0313 14:44:42.632506 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jpqqv" Mar 13 14:44:42 crc kubenswrapper[4898]: I0313 14:44:42.787391 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jpqqv"] Mar 13 14:44:43 crc kubenswrapper[4898]: I0313 14:44:43.685405 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jpqqv" podUID="ff59b74b-017c-4c31-8171-8e2f6ee07a75" containerName="registry-server" containerID="cri-o://5683c6073bc83bbe06ca94de9a1e4a7c969eadb55ff8f2c0ff39a667913c86af" gracePeriod=2 Mar 13 14:44:44 crc kubenswrapper[4898]: I0313 14:44:44.267966 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jpqqv" Mar 13 14:44:44 crc kubenswrapper[4898]: I0313 14:44:44.376456 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pnxp7\" (UniqueName: \"kubernetes.io/projected/ff59b74b-017c-4c31-8171-8e2f6ee07a75-kube-api-access-pnxp7\") pod \"ff59b74b-017c-4c31-8171-8e2f6ee07a75\" (UID: \"ff59b74b-017c-4c31-8171-8e2f6ee07a75\") " Mar 13 14:44:44 crc kubenswrapper[4898]: I0313 14:44:44.376577 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff59b74b-017c-4c31-8171-8e2f6ee07a75-catalog-content\") pod \"ff59b74b-017c-4c31-8171-8e2f6ee07a75\" (UID: \"ff59b74b-017c-4c31-8171-8e2f6ee07a75\") " Mar 13 14:44:44 crc kubenswrapper[4898]: I0313 14:44:44.376621 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff59b74b-017c-4c31-8171-8e2f6ee07a75-utilities\") pod \"ff59b74b-017c-4c31-8171-8e2f6ee07a75\" (UID: \"ff59b74b-017c-4c31-8171-8e2f6ee07a75\") " Mar 13 14:44:44 crc kubenswrapper[4898]: I0313 14:44:44.377735 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff59b74b-017c-4c31-8171-8e2f6ee07a75-utilities" (OuterVolumeSpecName: "utilities") pod "ff59b74b-017c-4c31-8171-8e2f6ee07a75" (UID: "ff59b74b-017c-4c31-8171-8e2f6ee07a75"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:44:44 crc kubenswrapper[4898]: I0313 14:44:44.394155 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff59b74b-017c-4c31-8171-8e2f6ee07a75-kube-api-access-pnxp7" (OuterVolumeSpecName: "kube-api-access-pnxp7") pod "ff59b74b-017c-4c31-8171-8e2f6ee07a75" (UID: "ff59b74b-017c-4c31-8171-8e2f6ee07a75"). InnerVolumeSpecName "kube-api-access-pnxp7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:44:44 crc kubenswrapper[4898]: I0313 14:44:44.479887 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff59b74b-017c-4c31-8171-8e2f6ee07a75-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 14:44:44 crc kubenswrapper[4898]: I0313 14:44:44.479941 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pnxp7\" (UniqueName: \"kubernetes.io/projected/ff59b74b-017c-4c31-8171-8e2f6ee07a75-kube-api-access-pnxp7\") on node \"crc\" DevicePath \"\"" Mar 13 14:44:44 crc kubenswrapper[4898]: I0313 14:44:44.548743 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff59b74b-017c-4c31-8171-8e2f6ee07a75-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ff59b74b-017c-4c31-8171-8e2f6ee07a75" (UID: "ff59b74b-017c-4c31-8171-8e2f6ee07a75"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:44:44 crc kubenswrapper[4898]: I0313 14:44:44.582165 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff59b74b-017c-4c31-8171-8e2f6ee07a75-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 14:44:44 crc kubenswrapper[4898]: I0313 14:44:44.700830 4898 generic.go:334] "Generic (PLEG): container finished" podID="ff59b74b-017c-4c31-8171-8e2f6ee07a75" containerID="5683c6073bc83bbe06ca94de9a1e4a7c969eadb55ff8f2c0ff39a667913c86af" exitCode=0 Mar 13 14:44:44 crc kubenswrapper[4898]: I0313 14:44:44.700878 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jpqqv" event={"ID":"ff59b74b-017c-4c31-8171-8e2f6ee07a75","Type":"ContainerDied","Data":"5683c6073bc83bbe06ca94de9a1e4a7c969eadb55ff8f2c0ff39a667913c86af"} Mar 13 14:44:44 crc kubenswrapper[4898]: I0313 14:44:44.700997 4898 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-jpqqv" event={"ID":"ff59b74b-017c-4c31-8171-8e2f6ee07a75","Type":"ContainerDied","Data":"934bdccbb64d24f7c4d0c481aec18c3675c14a5604b4ef1370a465435fcca680"} Mar 13 14:44:44 crc kubenswrapper[4898]: I0313 14:44:44.701031 4898 scope.go:117] "RemoveContainer" containerID="5683c6073bc83bbe06ca94de9a1e4a7c969eadb55ff8f2c0ff39a667913c86af" Mar 13 14:44:44 crc kubenswrapper[4898]: I0313 14:44:44.701031 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jpqqv" Mar 13 14:44:44 crc kubenswrapper[4898]: I0313 14:44:44.734972 4898 scope.go:117] "RemoveContainer" containerID="33a77b3f72c6613c240378bb6bf90de33fa04718f5308ea5a9802a4dbf9b904e" Mar 13 14:44:44 crc kubenswrapper[4898]: I0313 14:44:44.763277 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jpqqv"] Mar 13 14:44:44 crc kubenswrapper[4898]: I0313 14:44:44.771699 4898 scope.go:117] "RemoveContainer" containerID="49661d26cda8c7fc0772102e8672bfdce259e32cdf6e7240a670328588f113ec" Mar 13 14:44:44 crc kubenswrapper[4898]: I0313 14:44:44.772092 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jpqqv"] Mar 13 14:44:44 crc kubenswrapper[4898]: I0313 14:44:44.859733 4898 scope.go:117] "RemoveContainer" containerID="5683c6073bc83bbe06ca94de9a1e4a7c969eadb55ff8f2c0ff39a667913c86af" Mar 13 14:44:44 crc kubenswrapper[4898]: E0313 14:44:44.860345 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5683c6073bc83bbe06ca94de9a1e4a7c969eadb55ff8f2c0ff39a667913c86af\": container with ID starting with 5683c6073bc83bbe06ca94de9a1e4a7c969eadb55ff8f2c0ff39a667913c86af not found: ID does not exist" containerID="5683c6073bc83bbe06ca94de9a1e4a7c969eadb55ff8f2c0ff39a667913c86af" Mar 13 14:44:44 crc kubenswrapper[4898]: I0313 14:44:44.860446 4898 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5683c6073bc83bbe06ca94de9a1e4a7c969eadb55ff8f2c0ff39a667913c86af"} err="failed to get container status \"5683c6073bc83bbe06ca94de9a1e4a7c969eadb55ff8f2c0ff39a667913c86af\": rpc error: code = NotFound desc = could not find container \"5683c6073bc83bbe06ca94de9a1e4a7c969eadb55ff8f2c0ff39a667913c86af\": container with ID starting with 5683c6073bc83bbe06ca94de9a1e4a7c969eadb55ff8f2c0ff39a667913c86af not found: ID does not exist" Mar 13 14:44:44 crc kubenswrapper[4898]: I0313 14:44:44.860522 4898 scope.go:117] "RemoveContainer" containerID="33a77b3f72c6613c240378bb6bf90de33fa04718f5308ea5a9802a4dbf9b904e" Mar 13 14:44:44 crc kubenswrapper[4898]: E0313 14:44:44.861202 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33a77b3f72c6613c240378bb6bf90de33fa04718f5308ea5a9802a4dbf9b904e\": container with ID starting with 33a77b3f72c6613c240378bb6bf90de33fa04718f5308ea5a9802a4dbf9b904e not found: ID does not exist" containerID="33a77b3f72c6613c240378bb6bf90de33fa04718f5308ea5a9802a4dbf9b904e" Mar 13 14:44:44 crc kubenswrapper[4898]: I0313 14:44:44.861295 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33a77b3f72c6613c240378bb6bf90de33fa04718f5308ea5a9802a4dbf9b904e"} err="failed to get container status \"33a77b3f72c6613c240378bb6bf90de33fa04718f5308ea5a9802a4dbf9b904e\": rpc error: code = NotFound desc = could not find container \"33a77b3f72c6613c240378bb6bf90de33fa04718f5308ea5a9802a4dbf9b904e\": container with ID starting with 33a77b3f72c6613c240378bb6bf90de33fa04718f5308ea5a9802a4dbf9b904e not found: ID does not exist" Mar 13 14:44:44 crc kubenswrapper[4898]: I0313 14:44:44.861354 4898 scope.go:117] "RemoveContainer" containerID="49661d26cda8c7fc0772102e8672bfdce259e32cdf6e7240a670328588f113ec" Mar 13 14:44:44 crc kubenswrapper[4898]: E0313 
14:44:44.861828 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49661d26cda8c7fc0772102e8672bfdce259e32cdf6e7240a670328588f113ec\": container with ID starting with 49661d26cda8c7fc0772102e8672bfdce259e32cdf6e7240a670328588f113ec not found: ID does not exist" containerID="49661d26cda8c7fc0772102e8672bfdce259e32cdf6e7240a670328588f113ec"
Mar 13 14:44:44 crc kubenswrapper[4898]: I0313 14:44:44.861858 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49661d26cda8c7fc0772102e8672bfdce259e32cdf6e7240a670328588f113ec"} err="failed to get container status \"49661d26cda8c7fc0772102e8672bfdce259e32cdf6e7240a670328588f113ec\": rpc error: code = NotFound desc = could not find container \"49661d26cda8c7fc0772102e8672bfdce259e32cdf6e7240a670328588f113ec\": container with ID starting with 49661d26cda8c7fc0772102e8672bfdce259e32cdf6e7240a670328588f113ec not found: ID does not exist"
Mar 13 14:44:45 crc kubenswrapper[4898]: I0313 14:44:45.760646 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff59b74b-017c-4c31-8171-8e2f6ee07a75" path="/var/lib/kubelet/pods/ff59b74b-017c-4c31-8171-8e2f6ee07a75/volumes"
Mar 13 14:44:49 crc kubenswrapper[4898]: I0313 14:44:49.134582 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 14:44:49 crc kubenswrapper[4898]: I0313 14:44:49.135404 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 14:44:51 crc kubenswrapper[4898]: I0313 14:44:51.946384 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ncdbz"]
Mar 13 14:44:51 crc kubenswrapper[4898]: E0313 14:44:51.948092 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff59b74b-017c-4c31-8171-8e2f6ee07a75" containerName="registry-server"
Mar 13 14:44:51 crc kubenswrapper[4898]: I0313 14:44:51.948120 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff59b74b-017c-4c31-8171-8e2f6ee07a75" containerName="registry-server"
Mar 13 14:44:51 crc kubenswrapper[4898]: E0313 14:44:51.948183 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff59b74b-017c-4c31-8171-8e2f6ee07a75" containerName="extract-content"
Mar 13 14:44:51 crc kubenswrapper[4898]: I0313 14:44:51.948196 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff59b74b-017c-4c31-8171-8e2f6ee07a75" containerName="extract-content"
Mar 13 14:44:51 crc kubenswrapper[4898]: E0313 14:44:51.948236 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff59b74b-017c-4c31-8171-8e2f6ee07a75" containerName="extract-utilities"
Mar 13 14:44:51 crc kubenswrapper[4898]: I0313 14:44:51.948252 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff59b74b-017c-4c31-8171-8e2f6ee07a75" containerName="extract-utilities"
Mar 13 14:44:51 crc kubenswrapper[4898]: I0313 14:44:51.948680 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff59b74b-017c-4c31-8171-8e2f6ee07a75" containerName="registry-server"
Mar 13 14:44:51 crc kubenswrapper[4898]: I0313 14:44:51.952045 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ncdbz"
Mar 13 14:44:51 crc kubenswrapper[4898]: I0313 14:44:51.965893 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ncdbz"]
Mar 13 14:44:52 crc kubenswrapper[4898]: I0313 14:44:52.007537 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4f5x\" (UniqueName: \"kubernetes.io/projected/3a91e6fb-a1ff-4dec-a854-024ff312a9b6-kube-api-access-q4f5x\") pod \"redhat-marketplace-ncdbz\" (UID: \"3a91e6fb-a1ff-4dec-a854-024ff312a9b6\") " pod="openshift-marketplace/redhat-marketplace-ncdbz"
Mar 13 14:44:52 crc kubenswrapper[4898]: I0313 14:44:52.007817 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a91e6fb-a1ff-4dec-a854-024ff312a9b6-utilities\") pod \"redhat-marketplace-ncdbz\" (UID: \"3a91e6fb-a1ff-4dec-a854-024ff312a9b6\") " pod="openshift-marketplace/redhat-marketplace-ncdbz"
Mar 13 14:44:52 crc kubenswrapper[4898]: I0313 14:44:52.008296 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a91e6fb-a1ff-4dec-a854-024ff312a9b6-catalog-content\") pod \"redhat-marketplace-ncdbz\" (UID: \"3a91e6fb-a1ff-4dec-a854-024ff312a9b6\") " pod="openshift-marketplace/redhat-marketplace-ncdbz"
Mar 13 14:44:52 crc kubenswrapper[4898]: I0313 14:44:52.110925 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a91e6fb-a1ff-4dec-a854-024ff312a9b6-catalog-content\") pod \"redhat-marketplace-ncdbz\" (UID: \"3a91e6fb-a1ff-4dec-a854-024ff312a9b6\") " pod="openshift-marketplace/redhat-marketplace-ncdbz"
Mar 13 14:44:52 crc kubenswrapper[4898]: I0313 14:44:52.111056 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4f5x\" (UniqueName: \"kubernetes.io/projected/3a91e6fb-a1ff-4dec-a854-024ff312a9b6-kube-api-access-q4f5x\") pod \"redhat-marketplace-ncdbz\" (UID: \"3a91e6fb-a1ff-4dec-a854-024ff312a9b6\") " pod="openshift-marketplace/redhat-marketplace-ncdbz"
Mar 13 14:44:52 crc kubenswrapper[4898]: I0313 14:44:52.111524 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a91e6fb-a1ff-4dec-a854-024ff312a9b6-utilities\") pod \"redhat-marketplace-ncdbz\" (UID: \"3a91e6fb-a1ff-4dec-a854-024ff312a9b6\") " pod="openshift-marketplace/redhat-marketplace-ncdbz"
Mar 13 14:44:52 crc kubenswrapper[4898]: I0313 14:44:52.111604 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a91e6fb-a1ff-4dec-a854-024ff312a9b6-catalog-content\") pod \"redhat-marketplace-ncdbz\" (UID: \"3a91e6fb-a1ff-4dec-a854-024ff312a9b6\") " pod="openshift-marketplace/redhat-marketplace-ncdbz"
Mar 13 14:44:52 crc kubenswrapper[4898]: I0313 14:44:52.111928 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a91e6fb-a1ff-4dec-a854-024ff312a9b6-utilities\") pod \"redhat-marketplace-ncdbz\" (UID: \"3a91e6fb-a1ff-4dec-a854-024ff312a9b6\") " pod="openshift-marketplace/redhat-marketplace-ncdbz"
Mar 13 14:44:52 crc kubenswrapper[4898]: I0313 14:44:52.140165 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4f5x\" (UniqueName: \"kubernetes.io/projected/3a91e6fb-a1ff-4dec-a854-024ff312a9b6-kube-api-access-q4f5x\") pod \"redhat-marketplace-ncdbz\" (UID: \"3a91e6fb-a1ff-4dec-a854-024ff312a9b6\") " pod="openshift-marketplace/redhat-marketplace-ncdbz"
Mar 13 14:44:52 crc kubenswrapper[4898]: I0313 14:44:52.286317 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ncdbz"
Mar 13 14:44:52 crc kubenswrapper[4898]: W0313 14:44:52.818102 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a91e6fb_a1ff_4dec_a854_024ff312a9b6.slice/crio-4affc25dada5bfb7785755f3b2b81aead858ebbb458092ae3682efd511112a6e WatchSource:0}: Error finding container 4affc25dada5bfb7785755f3b2b81aead858ebbb458092ae3682efd511112a6e: Status 404 returned error can't find the container with id 4affc25dada5bfb7785755f3b2b81aead858ebbb458092ae3682efd511112a6e
Mar 13 14:44:52 crc kubenswrapper[4898]: I0313 14:44:52.821518 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ncdbz"]
Mar 13 14:44:53 crc kubenswrapper[4898]: I0313 14:44:53.846063 4898 generic.go:334] "Generic (PLEG): container finished" podID="3a91e6fb-a1ff-4dec-a854-024ff312a9b6" containerID="9ff48a4e3b925be9aa523bb5c54c537c3e87325476999ae2aa5cff85960abc79" exitCode=0
Mar 13 14:44:53 crc kubenswrapper[4898]: I0313 14:44:53.846265 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ncdbz" event={"ID":"3a91e6fb-a1ff-4dec-a854-024ff312a9b6","Type":"ContainerDied","Data":"9ff48a4e3b925be9aa523bb5c54c537c3e87325476999ae2aa5cff85960abc79"}
Mar 13 14:44:53 crc kubenswrapper[4898]: I0313 14:44:53.846567 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ncdbz" event={"ID":"3a91e6fb-a1ff-4dec-a854-024ff312a9b6","Type":"ContainerStarted","Data":"4affc25dada5bfb7785755f3b2b81aead858ebbb458092ae3682efd511112a6e"}
Mar 13 14:44:54 crc kubenswrapper[4898]: I0313 14:44:54.857979 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ncdbz" event={"ID":"3a91e6fb-a1ff-4dec-a854-024ff312a9b6","Type":"ContainerStarted","Data":"dd734696b7f03a85bec3d008a2f4e9e12a1c13aee061ccea952c8ba25d5240d4"}
Mar 13 14:44:55 crc kubenswrapper[4898]: I0313 14:44:55.874025 4898 generic.go:334] "Generic (PLEG): container finished" podID="3a91e6fb-a1ff-4dec-a854-024ff312a9b6" containerID="dd734696b7f03a85bec3d008a2f4e9e12a1c13aee061ccea952c8ba25d5240d4" exitCode=0
Mar 13 14:44:55 crc kubenswrapper[4898]: I0313 14:44:55.874054 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ncdbz" event={"ID":"3a91e6fb-a1ff-4dec-a854-024ff312a9b6","Type":"ContainerDied","Data":"dd734696b7f03a85bec3d008a2f4e9e12a1c13aee061ccea952c8ba25d5240d4"}
Mar 13 14:44:56 crc kubenswrapper[4898]: I0313 14:44:56.885711 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ncdbz" event={"ID":"3a91e6fb-a1ff-4dec-a854-024ff312a9b6","Type":"ContainerStarted","Data":"f1cc536f9a14da4dcfb4c53f9da5f50d2401fc70bf15eef66b76fa00896de54a"}
Mar 13 14:44:56 crc kubenswrapper[4898]: I0313 14:44:56.917489 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ncdbz" podStartSLOduration=3.4543903990000002 podStartE2EDuration="5.917471683s" podCreationTimestamp="2026-03-13 14:44:51 +0000 UTC" firstStartedPulling="2026-03-13 14:44:53.850697884 +0000 UTC m=+2928.852286133" lastFinishedPulling="2026-03-13 14:44:56.313779168 +0000 UTC m=+2931.315367417" observedRunningTime="2026-03-13 14:44:56.911080849 +0000 UTC m=+2931.912669098" watchObservedRunningTime="2026-03-13 14:44:56.917471683 +0000 UTC m=+2931.919059922"
Mar 13 14:45:00 crc kubenswrapper[4898]: I0313 14:45:00.178338 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556885-ncmr6"]
Mar 13 14:45:00 crc kubenswrapper[4898]: I0313 14:45:00.181484 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556885-ncmr6"
Mar 13 14:45:00 crc kubenswrapper[4898]: I0313 14:45:00.185326 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 13 14:45:00 crc kubenswrapper[4898]: I0313 14:45:00.185884 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 13 14:45:00 crc kubenswrapper[4898]: I0313 14:45:00.215441 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556885-ncmr6"]
Mar 13 14:45:00 crc kubenswrapper[4898]: I0313 14:45:00.325141 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1711d9ce-262c-4c6c-930a-4148e62fae9e-config-volume\") pod \"collect-profiles-29556885-ncmr6\" (UID: \"1711d9ce-262c-4c6c-930a-4148e62fae9e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556885-ncmr6"
Mar 13 14:45:00 crc kubenswrapper[4898]: I0313 14:45:00.325496 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1711d9ce-262c-4c6c-930a-4148e62fae9e-secret-volume\") pod \"collect-profiles-29556885-ncmr6\" (UID: \"1711d9ce-262c-4c6c-930a-4148e62fae9e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556885-ncmr6"
Mar 13 14:45:00 crc kubenswrapper[4898]: I0313 14:45:00.325722 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tm77c\" (UniqueName: \"kubernetes.io/projected/1711d9ce-262c-4c6c-930a-4148e62fae9e-kube-api-access-tm77c\") pod \"collect-profiles-29556885-ncmr6\" (UID: \"1711d9ce-262c-4c6c-930a-4148e62fae9e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556885-ncmr6"
Mar 13 14:45:00 crc kubenswrapper[4898]: I0313 14:45:00.427889 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1711d9ce-262c-4c6c-930a-4148e62fae9e-secret-volume\") pod \"collect-profiles-29556885-ncmr6\" (UID: \"1711d9ce-262c-4c6c-930a-4148e62fae9e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556885-ncmr6"
Mar 13 14:45:00 crc kubenswrapper[4898]: I0313 14:45:00.428066 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tm77c\" (UniqueName: \"kubernetes.io/projected/1711d9ce-262c-4c6c-930a-4148e62fae9e-kube-api-access-tm77c\") pod \"collect-profiles-29556885-ncmr6\" (UID: \"1711d9ce-262c-4c6c-930a-4148e62fae9e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556885-ncmr6"
Mar 13 14:45:00 crc kubenswrapper[4898]: I0313 14:45:00.428240 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1711d9ce-262c-4c6c-930a-4148e62fae9e-config-volume\") pod \"collect-profiles-29556885-ncmr6\" (UID: \"1711d9ce-262c-4c6c-930a-4148e62fae9e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556885-ncmr6"
Mar 13 14:45:00 crc kubenswrapper[4898]: I0313 14:45:00.429124 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1711d9ce-262c-4c6c-930a-4148e62fae9e-config-volume\") pod \"collect-profiles-29556885-ncmr6\" (UID: \"1711d9ce-262c-4c6c-930a-4148e62fae9e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556885-ncmr6"
Mar 13 14:45:00 crc kubenswrapper[4898]: I0313 14:45:00.437782 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1711d9ce-262c-4c6c-930a-4148e62fae9e-secret-volume\") pod \"collect-profiles-29556885-ncmr6\" (UID: \"1711d9ce-262c-4c6c-930a-4148e62fae9e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556885-ncmr6"
Mar 13 14:45:00 crc kubenswrapper[4898]: I0313 14:45:00.452644 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tm77c\" (UniqueName: \"kubernetes.io/projected/1711d9ce-262c-4c6c-930a-4148e62fae9e-kube-api-access-tm77c\") pod \"collect-profiles-29556885-ncmr6\" (UID: \"1711d9ce-262c-4c6c-930a-4148e62fae9e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556885-ncmr6"
Mar 13 14:45:00 crc kubenswrapper[4898]: I0313 14:45:00.512793 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556885-ncmr6"
Mar 13 14:45:01 crc kubenswrapper[4898]: I0313 14:45:01.019841 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556885-ncmr6"]
Mar 13 14:45:01 crc kubenswrapper[4898]: W0313 14:45:01.025793 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1711d9ce_262c_4c6c_930a_4148e62fae9e.slice/crio-62729d26c2d8567b0f26cb53313f6a0403258a8124e902c3a511817667c898c7 WatchSource:0}: Error finding container 62729d26c2d8567b0f26cb53313f6a0403258a8124e902c3a511817667c898c7: Status 404 returned error can't find the container with id 62729d26c2d8567b0f26cb53313f6a0403258a8124e902c3a511817667c898c7
Mar 13 14:45:01 crc kubenswrapper[4898]: I0313 14:45:01.955361 4898 generic.go:334] "Generic (PLEG): container finished" podID="1711d9ce-262c-4c6c-930a-4148e62fae9e" containerID="586dd830bc412bf8d165f328ec0120d6ccafcd1b6e8c6a0642a7f4464c15681b" exitCode=0
Mar 13 14:45:01 crc kubenswrapper[4898]: I0313 14:45:01.955465 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556885-ncmr6" event={"ID":"1711d9ce-262c-4c6c-930a-4148e62fae9e","Type":"ContainerDied","Data":"586dd830bc412bf8d165f328ec0120d6ccafcd1b6e8c6a0642a7f4464c15681b"}
Mar 13 14:45:01 crc kubenswrapper[4898]: I0313 14:45:01.957397 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556885-ncmr6" event={"ID":"1711d9ce-262c-4c6c-930a-4148e62fae9e","Type":"ContainerStarted","Data":"62729d26c2d8567b0f26cb53313f6a0403258a8124e902c3a511817667c898c7"}
Mar 13 14:45:02 crc kubenswrapper[4898]: I0313 14:45:02.288343 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ncdbz"
Mar 13 14:45:02 crc kubenswrapper[4898]: I0313 14:45:02.288742 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ncdbz"
Mar 13 14:45:02 crc kubenswrapper[4898]: I0313 14:45:02.380550 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ncdbz"
Mar 13 14:45:03 crc kubenswrapper[4898]: I0313 14:45:03.039878 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ncdbz"
Mar 13 14:45:03 crc kubenswrapper[4898]: I0313 14:45:03.104248 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ncdbz"]
Mar 13 14:45:03 crc kubenswrapper[4898]: I0313 14:45:03.439453 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556885-ncmr6"
Mar 13 14:45:03 crc kubenswrapper[4898]: I0313 14:45:03.541119 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1711d9ce-262c-4c6c-930a-4148e62fae9e-secret-volume\") pod \"1711d9ce-262c-4c6c-930a-4148e62fae9e\" (UID: \"1711d9ce-262c-4c6c-930a-4148e62fae9e\") "
Mar 13 14:45:03 crc kubenswrapper[4898]: I0313 14:45:03.541176 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tm77c\" (UniqueName: \"kubernetes.io/projected/1711d9ce-262c-4c6c-930a-4148e62fae9e-kube-api-access-tm77c\") pod \"1711d9ce-262c-4c6c-930a-4148e62fae9e\" (UID: \"1711d9ce-262c-4c6c-930a-4148e62fae9e\") "
Mar 13 14:45:03 crc kubenswrapper[4898]: I0313 14:45:03.541411 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1711d9ce-262c-4c6c-930a-4148e62fae9e-config-volume\") pod \"1711d9ce-262c-4c6c-930a-4148e62fae9e\" (UID: \"1711d9ce-262c-4c6c-930a-4148e62fae9e\") "
Mar 13 14:45:03 crc kubenswrapper[4898]: I0313 14:45:03.542079 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1711d9ce-262c-4c6c-930a-4148e62fae9e-config-volume" (OuterVolumeSpecName: "config-volume") pod "1711d9ce-262c-4c6c-930a-4148e62fae9e" (UID: "1711d9ce-262c-4c6c-930a-4148e62fae9e"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 14:45:03 crc kubenswrapper[4898]: I0313 14:45:03.548982 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1711d9ce-262c-4c6c-930a-4148e62fae9e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1711d9ce-262c-4c6c-930a-4148e62fae9e" (UID: "1711d9ce-262c-4c6c-930a-4148e62fae9e"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:45:03 crc kubenswrapper[4898]: I0313 14:45:03.551220 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1711d9ce-262c-4c6c-930a-4148e62fae9e-kube-api-access-tm77c" (OuterVolumeSpecName: "kube-api-access-tm77c") pod "1711d9ce-262c-4c6c-930a-4148e62fae9e" (UID: "1711d9ce-262c-4c6c-930a-4148e62fae9e"). InnerVolumeSpecName "kube-api-access-tm77c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:45:03 crc kubenswrapper[4898]: I0313 14:45:03.645550 4898 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1711d9ce-262c-4c6c-930a-4148e62fae9e-config-volume\") on node \"crc\" DevicePath \"\""
Mar 13 14:45:03 crc kubenswrapper[4898]: I0313 14:45:03.645592 4898 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1711d9ce-262c-4c6c-930a-4148e62fae9e-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 13 14:45:03 crc kubenswrapper[4898]: I0313 14:45:03.645606 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tm77c\" (UniqueName: \"kubernetes.io/projected/1711d9ce-262c-4c6c-930a-4148e62fae9e-kube-api-access-tm77c\") on node \"crc\" DevicePath \"\""
Mar 13 14:45:03 crc kubenswrapper[4898]: I0313 14:45:03.995213 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556885-ncmr6"
Mar 13 14:45:03 crc kubenswrapper[4898]: I0313 14:45:03.995835 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556885-ncmr6" event={"ID":"1711d9ce-262c-4c6c-930a-4148e62fae9e","Type":"ContainerDied","Data":"62729d26c2d8567b0f26cb53313f6a0403258a8124e902c3a511817667c898c7"}
Mar 13 14:45:03 crc kubenswrapper[4898]: I0313 14:45:03.995871 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62729d26c2d8567b0f26cb53313f6a0403258a8124e902c3a511817667c898c7"
Mar 13 14:45:04 crc kubenswrapper[4898]: I0313 14:45:04.541831 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556840-sfz5h"]
Mar 13 14:45:04 crc kubenswrapper[4898]: I0313 14:45:04.560632 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556840-sfz5h"]
Mar 13 14:45:05 crc kubenswrapper[4898]: I0313 14:45:05.010023 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ncdbz" podUID="3a91e6fb-a1ff-4dec-a854-024ff312a9b6" containerName="registry-server" containerID="cri-o://f1cc536f9a14da4dcfb4c53f9da5f50d2401fc70bf15eef66b76fa00896de54a" gracePeriod=2
Mar 13 14:45:05 crc kubenswrapper[4898]: I0313 14:45:05.669867 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ncdbz"
Mar 13 14:45:05 crc kubenswrapper[4898]: I0313 14:45:05.756816 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c222126e-abe0-43e6-95c8-cc6946c967ae" path="/var/lib/kubelet/pods/c222126e-abe0-43e6-95c8-cc6946c967ae/volumes"
Mar 13 14:45:05 crc kubenswrapper[4898]: I0313 14:45:05.804468 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4f5x\" (UniqueName: \"kubernetes.io/projected/3a91e6fb-a1ff-4dec-a854-024ff312a9b6-kube-api-access-q4f5x\") pod \"3a91e6fb-a1ff-4dec-a854-024ff312a9b6\" (UID: \"3a91e6fb-a1ff-4dec-a854-024ff312a9b6\") "
Mar 13 14:45:05 crc kubenswrapper[4898]: I0313 14:45:05.804566 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a91e6fb-a1ff-4dec-a854-024ff312a9b6-catalog-content\") pod \"3a91e6fb-a1ff-4dec-a854-024ff312a9b6\" (UID: \"3a91e6fb-a1ff-4dec-a854-024ff312a9b6\") "
Mar 13 14:45:05 crc kubenswrapper[4898]: I0313 14:45:05.804771 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a91e6fb-a1ff-4dec-a854-024ff312a9b6-utilities\") pod \"3a91e6fb-a1ff-4dec-a854-024ff312a9b6\" (UID: \"3a91e6fb-a1ff-4dec-a854-024ff312a9b6\") "
Mar 13 14:45:05 crc kubenswrapper[4898]: I0313 14:45:05.805595 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a91e6fb-a1ff-4dec-a854-024ff312a9b6-utilities" (OuterVolumeSpecName: "utilities") pod "3a91e6fb-a1ff-4dec-a854-024ff312a9b6" (UID: "3a91e6fb-a1ff-4dec-a854-024ff312a9b6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 14:45:05 crc kubenswrapper[4898]: I0313 14:45:05.812321 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a91e6fb-a1ff-4dec-a854-024ff312a9b6-kube-api-access-q4f5x" (OuterVolumeSpecName: "kube-api-access-q4f5x") pod "3a91e6fb-a1ff-4dec-a854-024ff312a9b6" (UID: "3a91e6fb-a1ff-4dec-a854-024ff312a9b6"). InnerVolumeSpecName "kube-api-access-q4f5x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:45:05 crc kubenswrapper[4898]: I0313 14:45:05.908357 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a91e6fb-a1ff-4dec-a854-024ff312a9b6-utilities\") on node \"crc\" DevicePath \"\""
Mar 13 14:45:05 crc kubenswrapper[4898]: I0313 14:45:05.908614 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4f5x\" (UniqueName: \"kubernetes.io/projected/3a91e6fb-a1ff-4dec-a854-024ff312a9b6-kube-api-access-q4f5x\") on node \"crc\" DevicePath \"\""
Mar 13 14:45:06 crc kubenswrapper[4898]: I0313 14:45:06.024389 4898 generic.go:334] "Generic (PLEG): container finished" podID="3a91e6fb-a1ff-4dec-a854-024ff312a9b6" containerID="f1cc536f9a14da4dcfb4c53f9da5f50d2401fc70bf15eef66b76fa00896de54a" exitCode=0
Mar 13 14:45:06 crc kubenswrapper[4898]: I0313 14:45:06.024433 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ncdbz" event={"ID":"3a91e6fb-a1ff-4dec-a854-024ff312a9b6","Type":"ContainerDied","Data":"f1cc536f9a14da4dcfb4c53f9da5f50d2401fc70bf15eef66b76fa00896de54a"}
Mar 13 14:45:06 crc kubenswrapper[4898]: I0313 14:45:06.024493 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ncdbz" event={"ID":"3a91e6fb-a1ff-4dec-a854-024ff312a9b6","Type":"ContainerDied","Data":"4affc25dada5bfb7785755f3b2b81aead858ebbb458092ae3682efd511112a6e"}
Mar 13 14:45:06 crc kubenswrapper[4898]: I0313 14:45:06.024515 4898 scope.go:117] "RemoveContainer" containerID="f1cc536f9a14da4dcfb4c53f9da5f50d2401fc70bf15eef66b76fa00896de54a"
Mar 13 14:45:06 crc kubenswrapper[4898]: I0313 14:45:06.024525 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ncdbz"
Mar 13 14:45:06 crc kubenswrapper[4898]: I0313 14:45:06.051820 4898 scope.go:117] "RemoveContainer" containerID="dd734696b7f03a85bec3d008a2f4e9e12a1c13aee061ccea952c8ba25d5240d4"
Mar 13 14:45:06 crc kubenswrapper[4898]: I0313 14:45:06.081148 4898 scope.go:117] "RemoveContainer" containerID="9ff48a4e3b925be9aa523bb5c54c537c3e87325476999ae2aa5cff85960abc79"
Mar 13 14:45:06 crc kubenswrapper[4898]: I0313 14:45:06.159630 4898 scope.go:117] "RemoveContainer" containerID="f1cc536f9a14da4dcfb4c53f9da5f50d2401fc70bf15eef66b76fa00896de54a"
Mar 13 14:45:06 crc kubenswrapper[4898]: E0313 14:45:06.160288 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1cc536f9a14da4dcfb4c53f9da5f50d2401fc70bf15eef66b76fa00896de54a\": container with ID starting with f1cc536f9a14da4dcfb4c53f9da5f50d2401fc70bf15eef66b76fa00896de54a not found: ID does not exist" containerID="f1cc536f9a14da4dcfb4c53f9da5f50d2401fc70bf15eef66b76fa00896de54a"
Mar 13 14:45:06 crc kubenswrapper[4898]: I0313 14:45:06.160316 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1cc536f9a14da4dcfb4c53f9da5f50d2401fc70bf15eef66b76fa00896de54a"} err="failed to get container status \"f1cc536f9a14da4dcfb4c53f9da5f50d2401fc70bf15eef66b76fa00896de54a\": rpc error: code = NotFound desc = could not find container \"f1cc536f9a14da4dcfb4c53f9da5f50d2401fc70bf15eef66b76fa00896de54a\": container with ID starting with f1cc536f9a14da4dcfb4c53f9da5f50d2401fc70bf15eef66b76fa00896de54a not found: ID does not exist"
Mar 13 14:45:06 crc kubenswrapper[4898]: I0313 14:45:06.160336 4898 scope.go:117] "RemoveContainer" containerID="dd734696b7f03a85bec3d008a2f4e9e12a1c13aee061ccea952c8ba25d5240d4"
Mar 13 14:45:06 crc kubenswrapper[4898]: E0313 14:45:06.160813 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd734696b7f03a85bec3d008a2f4e9e12a1c13aee061ccea952c8ba25d5240d4\": container with ID starting with dd734696b7f03a85bec3d008a2f4e9e12a1c13aee061ccea952c8ba25d5240d4 not found: ID does not exist" containerID="dd734696b7f03a85bec3d008a2f4e9e12a1c13aee061ccea952c8ba25d5240d4"
Mar 13 14:45:06 crc kubenswrapper[4898]: I0313 14:45:06.160880 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd734696b7f03a85bec3d008a2f4e9e12a1c13aee061ccea952c8ba25d5240d4"} err="failed to get container status \"dd734696b7f03a85bec3d008a2f4e9e12a1c13aee061ccea952c8ba25d5240d4\": rpc error: code = NotFound desc = could not find container \"dd734696b7f03a85bec3d008a2f4e9e12a1c13aee061ccea952c8ba25d5240d4\": container with ID starting with dd734696b7f03a85bec3d008a2f4e9e12a1c13aee061ccea952c8ba25d5240d4 not found: ID does not exist"
Mar 13 14:45:06 crc kubenswrapper[4898]: I0313 14:45:06.160945 4898 scope.go:117] "RemoveContainer" containerID="9ff48a4e3b925be9aa523bb5c54c537c3e87325476999ae2aa5cff85960abc79"
Mar 13 14:45:06 crc kubenswrapper[4898]: E0313 14:45:06.165409 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ff48a4e3b925be9aa523bb5c54c537c3e87325476999ae2aa5cff85960abc79\": container with ID starting with 9ff48a4e3b925be9aa523bb5c54c537c3e87325476999ae2aa5cff85960abc79 not found: ID does not exist" containerID="9ff48a4e3b925be9aa523bb5c54c537c3e87325476999ae2aa5cff85960abc79"
Mar 13 14:45:06 crc kubenswrapper[4898]: I0313 14:45:06.165494 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ff48a4e3b925be9aa523bb5c54c537c3e87325476999ae2aa5cff85960abc79"} err="failed to get container status \"9ff48a4e3b925be9aa523bb5c54c537c3e87325476999ae2aa5cff85960abc79\": rpc error: code = NotFound desc = could not find container \"9ff48a4e3b925be9aa523bb5c54c537c3e87325476999ae2aa5cff85960abc79\": container with ID starting with 9ff48a4e3b925be9aa523bb5c54c537c3e87325476999ae2aa5cff85960abc79 not found: ID does not exist"
Mar 13 14:45:06 crc kubenswrapper[4898]: I0313 14:45:06.205271 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a91e6fb-a1ff-4dec-a854-024ff312a9b6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3a91e6fb-a1ff-4dec-a854-024ff312a9b6" (UID: "3a91e6fb-a1ff-4dec-a854-024ff312a9b6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 14:45:06 crc kubenswrapper[4898]: I0313 14:45:06.216822 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a91e6fb-a1ff-4dec-a854-024ff312a9b6-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 13 14:45:06 crc kubenswrapper[4898]: I0313 14:45:06.376273 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ncdbz"]
Mar 13 14:45:06 crc kubenswrapper[4898]: I0313 14:45:06.395091 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ncdbz"]
Mar 13 14:45:07 crc kubenswrapper[4898]: I0313 14:45:07.767981 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a91e6fb-a1ff-4dec-a854-024ff312a9b6" path="/var/lib/kubelet/pods/3a91e6fb-a1ff-4dec-a854-024ff312a9b6/volumes"
Mar 13 14:45:19 crc kubenswrapper[4898]: I0313 14:45:19.134831 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 14:45:19 crc kubenswrapper[4898]: I0313 14:45:19.135383 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 14:45:19 crc kubenswrapper[4898]: I0313 14:45:19.135432 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj"
Mar 13 14:45:19 crc kubenswrapper[4898]: I0313 14:45:19.136454 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3380a5581d362c5b3dc0ed7955d410611389b7b46492a3c8939f1604d630d7f5"} pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 13 14:45:19 crc kubenswrapper[4898]: I0313 14:45:19.136518 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" containerID="cri-o://3380a5581d362c5b3dc0ed7955d410611389b7b46492a3c8939f1604d630d7f5" gracePeriod=600
Mar 13 14:45:19 crc kubenswrapper[4898]: E0313 14:45:19.282236 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d"
Mar 13 14:45:20 crc kubenswrapper[4898]: I0313 14:45:20.230177 4898 generic.go:334] "Generic (PLEG): container finished" podID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerID="3380a5581d362c5b3dc0ed7955d410611389b7b46492a3c8939f1604d630d7f5" exitCode=0
Mar 13 14:45:20 crc kubenswrapper[4898]: I0313 14:45:20.230415 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerDied","Data":"3380a5581d362c5b3dc0ed7955d410611389b7b46492a3c8939f1604d630d7f5"}
Mar 13 14:45:20 crc kubenswrapper[4898]: I0313 14:45:20.230572 4898 scope.go:117] "RemoveContainer" containerID="610faaa089f666c18409ed2be816ac54810449ed44b96f98a2303a40ac9c9836"
Mar 13 14:45:20 crc kubenswrapper[4898]: I0313 14:45:20.231686 4898 scope.go:117] "RemoveContainer" containerID="3380a5581d362c5b3dc0ed7955d410611389b7b46492a3c8939f1604d630d7f5"
Mar 13 14:45:20 crc kubenswrapper[4898]: E0313 14:45:20.232252 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d"
Mar 13 14:45:31 crc kubenswrapper[4898]: I0313 14:45:31.740293 4898 scope.go:117] "RemoveContainer" containerID="3380a5581d362c5b3dc0ed7955d410611389b7b46492a3c8939f1604d630d7f5"
Mar 13 14:45:31 crc kubenswrapper[4898]: E0313 14:45:31.743839 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d"
Mar 13 14:45:41 crc kubenswrapper[4898]: I0313 14:45:41.384418 4898 scope.go:117] "RemoveContainer" containerID="dac7072f1900557a02d6c49c5a63ec387e9ac4b9e0b548d071acd63216fda826"
Mar 13 14:45:45 crc kubenswrapper[4898]: I0313 14:45:45.746807 4898 scope.go:117] "RemoveContainer" containerID="3380a5581d362c5b3dc0ed7955d410611389b7b46492a3c8939f1604d630d7f5"
Mar 13 14:45:45 crc kubenswrapper[4898]: E0313 14:45:45.747829 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d"
Mar 13 14:45:54 crc kubenswrapper[4898]: I0313 14:45:54.972328 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2rfzq"]
Mar 13 14:45:54 crc kubenswrapper[4898]: E0313 14:45:54.973828 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a91e6fb-a1ff-4dec-a854-024ff312a9b6" containerName="extract-content"
Mar 13 14:45:54 crc kubenswrapper[4898]: I0313 14:45:54.973850 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a91e6fb-a1ff-4dec-a854-024ff312a9b6" containerName="extract-content"
Mar 13 14:45:54 crc kubenswrapper[4898]: E0313 14:45:54.973878 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1711d9ce-262c-4c6c-930a-4148e62fae9e" containerName="collect-profiles"
Mar 13 14:45:54 crc kubenswrapper[4898]: I0313 14:45:54.973956 4898 state_mem.go:107] "Deleted CPUSet assignment"
podUID="1711d9ce-262c-4c6c-930a-4148e62fae9e" containerName="collect-profiles" Mar 13 14:45:54 crc kubenswrapper[4898]: E0313 14:45:54.974007 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a91e6fb-a1ff-4dec-a854-024ff312a9b6" containerName="registry-server" Mar 13 14:45:54 crc kubenswrapper[4898]: I0313 14:45:54.974021 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a91e6fb-a1ff-4dec-a854-024ff312a9b6" containerName="registry-server" Mar 13 14:45:54 crc kubenswrapper[4898]: E0313 14:45:54.974051 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a91e6fb-a1ff-4dec-a854-024ff312a9b6" containerName="extract-utilities" Mar 13 14:45:54 crc kubenswrapper[4898]: I0313 14:45:54.974064 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a91e6fb-a1ff-4dec-a854-024ff312a9b6" containerName="extract-utilities" Mar 13 14:45:54 crc kubenswrapper[4898]: I0313 14:45:54.974504 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a91e6fb-a1ff-4dec-a854-024ff312a9b6" containerName="registry-server" Mar 13 14:45:54 crc kubenswrapper[4898]: I0313 14:45:54.974525 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="1711d9ce-262c-4c6c-930a-4148e62fae9e" containerName="collect-profiles" Mar 13 14:45:54 crc kubenswrapper[4898]: I0313 14:45:54.977723 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2rfzq" Mar 13 14:45:54 crc kubenswrapper[4898]: I0313 14:45:54.992019 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2rfzq"] Mar 13 14:45:55 crc kubenswrapper[4898]: I0313 14:45:55.084805 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95sdh\" (UniqueName: \"kubernetes.io/projected/c6f2443d-86d2-440c-8039-b04fb5eeeeb3-kube-api-access-95sdh\") pod \"certified-operators-2rfzq\" (UID: \"c6f2443d-86d2-440c-8039-b04fb5eeeeb3\") " pod="openshift-marketplace/certified-operators-2rfzq" Mar 13 14:45:55 crc kubenswrapper[4898]: I0313 14:45:55.085029 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6f2443d-86d2-440c-8039-b04fb5eeeeb3-catalog-content\") pod \"certified-operators-2rfzq\" (UID: \"c6f2443d-86d2-440c-8039-b04fb5eeeeb3\") " pod="openshift-marketplace/certified-operators-2rfzq" Mar 13 14:45:55 crc kubenswrapper[4898]: I0313 14:45:55.085080 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6f2443d-86d2-440c-8039-b04fb5eeeeb3-utilities\") pod \"certified-operators-2rfzq\" (UID: \"c6f2443d-86d2-440c-8039-b04fb5eeeeb3\") " pod="openshift-marketplace/certified-operators-2rfzq" Mar 13 14:45:55 crc kubenswrapper[4898]: I0313 14:45:55.187722 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6f2443d-86d2-440c-8039-b04fb5eeeeb3-catalog-content\") pod \"certified-operators-2rfzq\" (UID: \"c6f2443d-86d2-440c-8039-b04fb5eeeeb3\") " pod="openshift-marketplace/certified-operators-2rfzq" Mar 13 14:45:55 crc kubenswrapper[4898]: I0313 14:45:55.187799 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6f2443d-86d2-440c-8039-b04fb5eeeeb3-utilities\") pod \"certified-operators-2rfzq\" (UID: \"c6f2443d-86d2-440c-8039-b04fb5eeeeb3\") " pod="openshift-marketplace/certified-operators-2rfzq" Mar 13 14:45:55 crc kubenswrapper[4898]: I0313 14:45:55.187982 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95sdh\" (UniqueName: \"kubernetes.io/projected/c6f2443d-86d2-440c-8039-b04fb5eeeeb3-kube-api-access-95sdh\") pod \"certified-operators-2rfzq\" (UID: \"c6f2443d-86d2-440c-8039-b04fb5eeeeb3\") " pod="openshift-marketplace/certified-operators-2rfzq" Mar 13 14:45:55 crc kubenswrapper[4898]: I0313 14:45:55.188310 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6f2443d-86d2-440c-8039-b04fb5eeeeb3-catalog-content\") pod \"certified-operators-2rfzq\" (UID: \"c6f2443d-86d2-440c-8039-b04fb5eeeeb3\") " pod="openshift-marketplace/certified-operators-2rfzq" Mar 13 14:45:55 crc kubenswrapper[4898]: I0313 14:45:55.188480 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6f2443d-86d2-440c-8039-b04fb5eeeeb3-utilities\") pod \"certified-operators-2rfzq\" (UID: \"c6f2443d-86d2-440c-8039-b04fb5eeeeb3\") " pod="openshift-marketplace/certified-operators-2rfzq" Mar 13 14:45:55 crc kubenswrapper[4898]: I0313 14:45:55.211992 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95sdh\" (UniqueName: \"kubernetes.io/projected/c6f2443d-86d2-440c-8039-b04fb5eeeeb3-kube-api-access-95sdh\") pod \"certified-operators-2rfzq\" (UID: \"c6f2443d-86d2-440c-8039-b04fb5eeeeb3\") " pod="openshift-marketplace/certified-operators-2rfzq" Mar 13 14:45:55 crc kubenswrapper[4898]: I0313 14:45:55.317457 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2rfzq" Mar 13 14:45:55 crc kubenswrapper[4898]: I0313 14:45:55.831514 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2rfzq"] Mar 13 14:45:56 crc kubenswrapper[4898]: I0313 14:45:56.774055 4898 generic.go:334] "Generic (PLEG): container finished" podID="c6f2443d-86d2-440c-8039-b04fb5eeeeb3" containerID="77f0e252c7f4f4d769c583e3fab1155192864e78322af8bd2a7b2a9258102547" exitCode=0 Mar 13 14:45:56 crc kubenswrapper[4898]: I0313 14:45:56.774128 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2rfzq" event={"ID":"c6f2443d-86d2-440c-8039-b04fb5eeeeb3","Type":"ContainerDied","Data":"77f0e252c7f4f4d769c583e3fab1155192864e78322af8bd2a7b2a9258102547"} Mar 13 14:45:56 crc kubenswrapper[4898]: I0313 14:45:56.774510 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2rfzq" event={"ID":"c6f2443d-86d2-440c-8039-b04fb5eeeeb3","Type":"ContainerStarted","Data":"90b947e908accbf4ac70479773da53012e339d540d41d5d9eb2140bc0236f4bf"} Mar 13 14:45:58 crc kubenswrapper[4898]: I0313 14:45:58.740117 4898 scope.go:117] "RemoveContainer" containerID="3380a5581d362c5b3dc0ed7955d410611389b7b46492a3c8939f1604d630d7f5" Mar 13 14:45:58 crc kubenswrapper[4898]: E0313 14:45:58.741259 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:45:58 crc kubenswrapper[4898]: I0313 14:45:58.814119 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2rfzq" 
event={"ID":"c6f2443d-86d2-440c-8039-b04fb5eeeeb3","Type":"ContainerStarted","Data":"782ae3224eb428acf74866a5aa992fcc458d6183436c34435e21bcbf134ec057"} Mar 13 14:45:59 crc kubenswrapper[4898]: I0313 14:45:59.829379 4898 generic.go:334] "Generic (PLEG): container finished" podID="c6f2443d-86d2-440c-8039-b04fb5eeeeb3" containerID="782ae3224eb428acf74866a5aa992fcc458d6183436c34435e21bcbf134ec057" exitCode=0 Mar 13 14:45:59 crc kubenswrapper[4898]: I0313 14:45:59.829472 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2rfzq" event={"ID":"c6f2443d-86d2-440c-8039-b04fb5eeeeb3","Type":"ContainerDied","Data":"782ae3224eb428acf74866a5aa992fcc458d6183436c34435e21bcbf134ec057"} Mar 13 14:46:00 crc kubenswrapper[4898]: I0313 14:46:00.161309 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556886-vxgm7"] Mar 13 14:46:00 crc kubenswrapper[4898]: I0313 14:46:00.164847 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556886-vxgm7" Mar 13 14:46:00 crc kubenswrapper[4898]: I0313 14:46:00.169632 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps" Mar 13 14:46:00 crc kubenswrapper[4898]: I0313 14:46:00.171683 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 14:46:00 crc kubenswrapper[4898]: I0313 14:46:00.174442 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 14:46:00 crc kubenswrapper[4898]: I0313 14:46:00.185065 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556886-vxgm7"] Mar 13 14:46:00 crc kubenswrapper[4898]: I0313 14:46:00.250537 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gklq7\" (UniqueName: \"kubernetes.io/projected/7e6f3996-1b26-4a53-8c2d-f74aa89ef944-kube-api-access-gklq7\") pod \"auto-csr-approver-29556886-vxgm7\" (UID: \"7e6f3996-1b26-4a53-8c2d-f74aa89ef944\") " pod="openshift-infra/auto-csr-approver-29556886-vxgm7" Mar 13 14:46:00 crc kubenswrapper[4898]: I0313 14:46:00.353234 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gklq7\" (UniqueName: \"kubernetes.io/projected/7e6f3996-1b26-4a53-8c2d-f74aa89ef944-kube-api-access-gklq7\") pod \"auto-csr-approver-29556886-vxgm7\" (UID: \"7e6f3996-1b26-4a53-8c2d-f74aa89ef944\") " pod="openshift-infra/auto-csr-approver-29556886-vxgm7" Mar 13 14:46:00 crc kubenswrapper[4898]: I0313 14:46:00.374477 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gklq7\" (UniqueName: \"kubernetes.io/projected/7e6f3996-1b26-4a53-8c2d-f74aa89ef944-kube-api-access-gklq7\") pod \"auto-csr-approver-29556886-vxgm7\" (UID: \"7e6f3996-1b26-4a53-8c2d-f74aa89ef944\") " 
pod="openshift-infra/auto-csr-approver-29556886-vxgm7" Mar 13 14:46:00 crc kubenswrapper[4898]: I0313 14:46:00.501946 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556886-vxgm7" Mar 13 14:46:00 crc kubenswrapper[4898]: I0313 14:46:00.841042 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2rfzq" event={"ID":"c6f2443d-86d2-440c-8039-b04fb5eeeeb3","Type":"ContainerStarted","Data":"fe9f6664ec90867b4d3d9688a1bf7932bcc8ee4c22fa6c8ca8f930fb37179b6d"} Mar 13 14:46:00 crc kubenswrapper[4898]: I0313 14:46:00.865034 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2rfzq" podStartSLOduration=3.359263488 podStartE2EDuration="6.865015662s" podCreationTimestamp="2026-03-13 14:45:54 +0000 UTC" firstStartedPulling="2026-03-13 14:45:56.77751347 +0000 UTC m=+2991.779101719" lastFinishedPulling="2026-03-13 14:46:00.283265644 +0000 UTC m=+2995.284853893" observedRunningTime="2026-03-13 14:46:00.856861502 +0000 UTC m=+2995.858449741" watchObservedRunningTime="2026-03-13 14:46:00.865015662 +0000 UTC m=+2995.866603901" Mar 13 14:46:01 crc kubenswrapper[4898]: I0313 14:46:01.017678 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556886-vxgm7"] Mar 13 14:46:01 crc kubenswrapper[4898]: I0313 14:46:01.858650 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556886-vxgm7" event={"ID":"7e6f3996-1b26-4a53-8c2d-f74aa89ef944","Type":"ContainerStarted","Data":"2babf4d42215345ab890f111f267fb619ebc479d5a10a4bdbcb5c63efa902834"} Mar 13 14:46:02 crc kubenswrapper[4898]: I0313 14:46:02.872333 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556886-vxgm7" 
event={"ID":"7e6f3996-1b26-4a53-8c2d-f74aa89ef944","Type":"ContainerStarted","Data":"c50d215e83c79e2c4ba98ed3d21204bd33821f4c3e54e5173055aa94bae3e42e"} Mar 13 14:46:02 crc kubenswrapper[4898]: I0313 14:46:02.915237 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556886-vxgm7" podStartSLOduration=1.592971773 podStartE2EDuration="2.915217128s" podCreationTimestamp="2026-03-13 14:46:00 +0000 UTC" firstStartedPulling="2026-03-13 14:46:01.019800065 +0000 UTC m=+2996.021388304" lastFinishedPulling="2026-03-13 14:46:02.34204539 +0000 UTC m=+2997.343633659" observedRunningTime="2026-03-13 14:46:02.892771265 +0000 UTC m=+2997.894359524" watchObservedRunningTime="2026-03-13 14:46:02.915217128 +0000 UTC m=+2997.916805377" Mar 13 14:46:03 crc kubenswrapper[4898]: I0313 14:46:03.884184 4898 generic.go:334] "Generic (PLEG): container finished" podID="7e6f3996-1b26-4a53-8c2d-f74aa89ef944" containerID="c50d215e83c79e2c4ba98ed3d21204bd33821f4c3e54e5173055aa94bae3e42e" exitCode=0 Mar 13 14:46:03 crc kubenswrapper[4898]: I0313 14:46:03.884255 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556886-vxgm7" event={"ID":"7e6f3996-1b26-4a53-8c2d-f74aa89ef944","Type":"ContainerDied","Data":"c50d215e83c79e2c4ba98ed3d21204bd33821f4c3e54e5173055aa94bae3e42e"} Mar 13 14:46:05 crc kubenswrapper[4898]: I0313 14:46:05.320033 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2rfzq" Mar 13 14:46:05 crc kubenswrapper[4898]: I0313 14:46:05.321367 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2rfzq" Mar 13 14:46:05 crc kubenswrapper[4898]: I0313 14:46:05.428667 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2rfzq" Mar 13 14:46:05 crc kubenswrapper[4898]: I0313 14:46:05.697805 
4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556886-vxgm7" Mar 13 14:46:05 crc kubenswrapper[4898]: I0313 14:46:05.805141 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gklq7\" (UniqueName: \"kubernetes.io/projected/7e6f3996-1b26-4a53-8c2d-f74aa89ef944-kube-api-access-gklq7\") pod \"7e6f3996-1b26-4a53-8c2d-f74aa89ef944\" (UID: \"7e6f3996-1b26-4a53-8c2d-f74aa89ef944\") " Mar 13 14:46:05 crc kubenswrapper[4898]: I0313 14:46:05.810962 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e6f3996-1b26-4a53-8c2d-f74aa89ef944-kube-api-access-gklq7" (OuterVolumeSpecName: "kube-api-access-gklq7") pod "7e6f3996-1b26-4a53-8c2d-f74aa89ef944" (UID: "7e6f3996-1b26-4a53-8c2d-f74aa89ef944"). InnerVolumeSpecName "kube-api-access-gklq7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:46:05 crc kubenswrapper[4898]: I0313 14:46:05.910137 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gklq7\" (UniqueName: \"kubernetes.io/projected/7e6f3996-1b26-4a53-8c2d-f74aa89ef944-kube-api-access-gklq7\") on node \"crc\" DevicePath \"\"" Mar 13 14:46:05 crc kubenswrapper[4898]: I0313 14:46:05.916639 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556886-vxgm7" event={"ID":"7e6f3996-1b26-4a53-8c2d-f74aa89ef944","Type":"ContainerDied","Data":"2babf4d42215345ab890f111f267fb619ebc479d5a10a4bdbcb5c63efa902834"} Mar 13 14:46:05 crc kubenswrapper[4898]: I0313 14:46:05.916710 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2babf4d42215345ab890f111f267fb619ebc479d5a10a4bdbcb5c63efa902834" Mar 13 14:46:05 crc kubenswrapper[4898]: I0313 14:46:05.917013 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556886-vxgm7" Mar 13 14:46:05 crc kubenswrapper[4898]: I0313 14:46:05.964134 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556880-2rlcl"] Mar 13 14:46:05 crc kubenswrapper[4898]: I0313 14:46:05.975563 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2rfzq" Mar 13 14:46:06 crc kubenswrapper[4898]: I0313 14:46:06.004154 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556880-2rlcl"] Mar 13 14:46:06 crc kubenswrapper[4898]: I0313 14:46:06.029189 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2rfzq"] Mar 13 14:46:07 crc kubenswrapper[4898]: I0313 14:46:07.768668 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23d4e2ed-7457-458a-9c76-dcf8f3aadd99" path="/var/lib/kubelet/pods/23d4e2ed-7457-458a-9c76-dcf8f3aadd99/volumes" Mar 13 14:46:07 crc kubenswrapper[4898]: I0313 14:46:07.941132 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2rfzq" podUID="c6f2443d-86d2-440c-8039-b04fb5eeeeb3" containerName="registry-server" containerID="cri-o://fe9f6664ec90867b4d3d9688a1bf7932bcc8ee4c22fa6c8ca8f930fb37179b6d" gracePeriod=2 Mar 13 14:46:08 crc kubenswrapper[4898]: I0313 14:46:08.569402 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2rfzq" Mar 13 14:46:08 crc kubenswrapper[4898]: I0313 14:46:08.686693 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6f2443d-86d2-440c-8039-b04fb5eeeeb3-catalog-content\") pod \"c6f2443d-86d2-440c-8039-b04fb5eeeeb3\" (UID: \"c6f2443d-86d2-440c-8039-b04fb5eeeeb3\") " Mar 13 14:46:08 crc kubenswrapper[4898]: I0313 14:46:08.686775 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6f2443d-86d2-440c-8039-b04fb5eeeeb3-utilities\") pod \"c6f2443d-86d2-440c-8039-b04fb5eeeeb3\" (UID: \"c6f2443d-86d2-440c-8039-b04fb5eeeeb3\") " Mar 13 14:46:08 crc kubenswrapper[4898]: I0313 14:46:08.686800 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95sdh\" (UniqueName: \"kubernetes.io/projected/c6f2443d-86d2-440c-8039-b04fb5eeeeb3-kube-api-access-95sdh\") pod \"c6f2443d-86d2-440c-8039-b04fb5eeeeb3\" (UID: \"c6f2443d-86d2-440c-8039-b04fb5eeeeb3\") " Mar 13 14:46:08 crc kubenswrapper[4898]: I0313 14:46:08.687784 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6f2443d-86d2-440c-8039-b04fb5eeeeb3-utilities" (OuterVolumeSpecName: "utilities") pod "c6f2443d-86d2-440c-8039-b04fb5eeeeb3" (UID: "c6f2443d-86d2-440c-8039-b04fb5eeeeb3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:46:08 crc kubenswrapper[4898]: I0313 14:46:08.693262 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6f2443d-86d2-440c-8039-b04fb5eeeeb3-kube-api-access-95sdh" (OuterVolumeSpecName: "kube-api-access-95sdh") pod "c6f2443d-86d2-440c-8039-b04fb5eeeeb3" (UID: "c6f2443d-86d2-440c-8039-b04fb5eeeeb3"). InnerVolumeSpecName "kube-api-access-95sdh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:46:08 crc kubenswrapper[4898]: I0313 14:46:08.771116 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6f2443d-86d2-440c-8039-b04fb5eeeeb3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c6f2443d-86d2-440c-8039-b04fb5eeeeb3" (UID: "c6f2443d-86d2-440c-8039-b04fb5eeeeb3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:46:08 crc kubenswrapper[4898]: I0313 14:46:08.789375 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6f2443d-86d2-440c-8039-b04fb5eeeeb3-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 14:46:08 crc kubenswrapper[4898]: I0313 14:46:08.789402 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6f2443d-86d2-440c-8039-b04fb5eeeeb3-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 14:46:08 crc kubenswrapper[4898]: I0313 14:46:08.789412 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95sdh\" (UniqueName: \"kubernetes.io/projected/c6f2443d-86d2-440c-8039-b04fb5eeeeb3-kube-api-access-95sdh\") on node \"crc\" DevicePath \"\"" Mar 13 14:46:08 crc kubenswrapper[4898]: I0313 14:46:08.960117 4898 generic.go:334] "Generic (PLEG): container finished" podID="c6f2443d-86d2-440c-8039-b04fb5eeeeb3" containerID="fe9f6664ec90867b4d3d9688a1bf7932bcc8ee4c22fa6c8ca8f930fb37179b6d" exitCode=0 Mar 13 14:46:08 crc kubenswrapper[4898]: I0313 14:46:08.960220 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2rfzq" Mar 13 14:46:08 crc kubenswrapper[4898]: I0313 14:46:08.960212 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2rfzq" event={"ID":"c6f2443d-86d2-440c-8039-b04fb5eeeeb3","Type":"ContainerDied","Data":"fe9f6664ec90867b4d3d9688a1bf7932bcc8ee4c22fa6c8ca8f930fb37179b6d"} Mar 13 14:46:08 crc kubenswrapper[4898]: I0313 14:46:08.960512 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2rfzq" event={"ID":"c6f2443d-86d2-440c-8039-b04fb5eeeeb3","Type":"ContainerDied","Data":"90b947e908accbf4ac70479773da53012e339d540d41d5d9eb2140bc0236f4bf"} Mar 13 14:46:08 crc kubenswrapper[4898]: I0313 14:46:08.960544 4898 scope.go:117] "RemoveContainer" containerID="fe9f6664ec90867b4d3d9688a1bf7932bcc8ee4c22fa6c8ca8f930fb37179b6d" Mar 13 14:46:08 crc kubenswrapper[4898]: I0313 14:46:08.999832 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2rfzq"] Mar 13 14:46:09 crc kubenswrapper[4898]: I0313 14:46:08.999980 4898 scope.go:117] "RemoveContainer" containerID="782ae3224eb428acf74866a5aa992fcc458d6183436c34435e21bcbf134ec057" Mar 13 14:46:09 crc kubenswrapper[4898]: I0313 14:46:09.012190 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2rfzq"] Mar 13 14:46:09 crc kubenswrapper[4898]: I0313 14:46:09.037367 4898 scope.go:117] "RemoveContainer" containerID="77f0e252c7f4f4d769c583e3fab1155192864e78322af8bd2a7b2a9258102547" Mar 13 14:46:09 crc kubenswrapper[4898]: I0313 14:46:09.132303 4898 scope.go:117] "RemoveContainer" containerID="fe9f6664ec90867b4d3d9688a1bf7932bcc8ee4c22fa6c8ca8f930fb37179b6d" Mar 13 14:46:09 crc kubenswrapper[4898]: E0313 14:46:09.132838 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"fe9f6664ec90867b4d3d9688a1bf7932bcc8ee4c22fa6c8ca8f930fb37179b6d\": container with ID starting with fe9f6664ec90867b4d3d9688a1bf7932bcc8ee4c22fa6c8ca8f930fb37179b6d not found: ID does not exist" containerID="fe9f6664ec90867b4d3d9688a1bf7932bcc8ee4c22fa6c8ca8f930fb37179b6d" Mar 13 14:46:09 crc kubenswrapper[4898]: I0313 14:46:09.132877 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe9f6664ec90867b4d3d9688a1bf7932bcc8ee4c22fa6c8ca8f930fb37179b6d"} err="failed to get container status \"fe9f6664ec90867b4d3d9688a1bf7932bcc8ee4c22fa6c8ca8f930fb37179b6d\": rpc error: code = NotFound desc = could not find container \"fe9f6664ec90867b4d3d9688a1bf7932bcc8ee4c22fa6c8ca8f930fb37179b6d\": container with ID starting with fe9f6664ec90867b4d3d9688a1bf7932bcc8ee4c22fa6c8ca8f930fb37179b6d not found: ID does not exist" Mar 13 14:46:09 crc kubenswrapper[4898]: I0313 14:46:09.132918 4898 scope.go:117] "RemoveContainer" containerID="782ae3224eb428acf74866a5aa992fcc458d6183436c34435e21bcbf134ec057" Mar 13 14:46:09 crc kubenswrapper[4898]: E0313 14:46:09.133202 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"782ae3224eb428acf74866a5aa992fcc458d6183436c34435e21bcbf134ec057\": container with ID starting with 782ae3224eb428acf74866a5aa992fcc458d6183436c34435e21bcbf134ec057 not found: ID does not exist" containerID="782ae3224eb428acf74866a5aa992fcc458d6183436c34435e21bcbf134ec057" Mar 13 14:46:09 crc kubenswrapper[4898]: I0313 14:46:09.133232 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"782ae3224eb428acf74866a5aa992fcc458d6183436c34435e21bcbf134ec057"} err="failed to get container status \"782ae3224eb428acf74866a5aa992fcc458d6183436c34435e21bcbf134ec057\": rpc error: code = NotFound desc = could not find container \"782ae3224eb428acf74866a5aa992fcc458d6183436c34435e21bcbf134ec057\": container with ID 
starting with 782ae3224eb428acf74866a5aa992fcc458d6183436c34435e21bcbf134ec057 not found: ID does not exist" Mar 13 14:46:09 crc kubenswrapper[4898]: I0313 14:46:09.133249 4898 scope.go:117] "RemoveContainer" containerID="77f0e252c7f4f4d769c583e3fab1155192864e78322af8bd2a7b2a9258102547" Mar 13 14:46:09 crc kubenswrapper[4898]: E0313 14:46:09.133559 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77f0e252c7f4f4d769c583e3fab1155192864e78322af8bd2a7b2a9258102547\": container with ID starting with 77f0e252c7f4f4d769c583e3fab1155192864e78322af8bd2a7b2a9258102547 not found: ID does not exist" containerID="77f0e252c7f4f4d769c583e3fab1155192864e78322af8bd2a7b2a9258102547" Mar 13 14:46:09 crc kubenswrapper[4898]: I0313 14:46:09.133587 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77f0e252c7f4f4d769c583e3fab1155192864e78322af8bd2a7b2a9258102547"} err="failed to get container status \"77f0e252c7f4f4d769c583e3fab1155192864e78322af8bd2a7b2a9258102547\": rpc error: code = NotFound desc = could not find container \"77f0e252c7f4f4d769c583e3fab1155192864e78322af8bd2a7b2a9258102547\": container with ID starting with 77f0e252c7f4f4d769c583e3fab1155192864e78322af8bd2a7b2a9258102547 not found: ID does not exist" Mar 13 14:46:09 crc kubenswrapper[4898]: I0313 14:46:09.766805 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6f2443d-86d2-440c-8039-b04fb5eeeeb3" path="/var/lib/kubelet/pods/c6f2443d-86d2-440c-8039-b04fb5eeeeb3/volumes" Mar 13 14:46:12 crc kubenswrapper[4898]: I0313 14:46:12.741244 4898 scope.go:117] "RemoveContainer" containerID="3380a5581d362c5b3dc0ed7955d410611389b7b46492a3c8939f1604d630d7f5" Mar 13 14:46:12 crc kubenswrapper[4898]: E0313 14:46:12.742237 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:46:25 crc kubenswrapper[4898]: I0313 14:46:25.740503 4898 scope.go:117] "RemoveContainer" containerID="3380a5581d362c5b3dc0ed7955d410611389b7b46492a3c8939f1604d630d7f5" Mar 13 14:46:25 crc kubenswrapper[4898]: E0313 14:46:25.742235 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:46:40 crc kubenswrapper[4898]: I0313 14:46:40.740154 4898 scope.go:117] "RemoveContainer" containerID="3380a5581d362c5b3dc0ed7955d410611389b7b46492a3c8939f1604d630d7f5" Mar 13 14:46:40 crc kubenswrapper[4898]: E0313 14:46:40.740941 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:46:41 crc kubenswrapper[4898]: I0313 14:46:41.527386 4898 scope.go:117] "RemoveContainer" containerID="a4b672dd5f62f7db5f72a2ba461417e4b17e1ad5affad388a08bfa992e5aa45e" Mar 13 14:46:51 crc kubenswrapper[4898]: I0313 14:46:51.740513 4898 scope.go:117] "RemoveContainer" containerID="3380a5581d362c5b3dc0ed7955d410611389b7b46492a3c8939f1604d630d7f5" Mar 13 14:46:51 crc 
kubenswrapper[4898]: E0313 14:46:51.741565 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:46:58 crc kubenswrapper[4898]: I0313 14:46:58.678388 4898 generic.go:334] "Generic (PLEG): container finished" podID="9a62fd58-a586-4473-abfe-4e227cad9900" containerID="b68526abfbbd65229d0fa636d4985f81e76151523259cd0cea1b6513d33ed080" exitCode=0 Mar 13 14:46:58 crc kubenswrapper[4898]: I0313 14:46:58.678823 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk" event={"ID":"9a62fd58-a586-4473-abfe-4e227cad9900","Type":"ContainerDied","Data":"b68526abfbbd65229d0fa636d4985f81e76151523259cd0cea1b6513d33ed080"} Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.336566 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk" Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.438383 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9a62fd58-a586-4473-abfe-4e227cad9900-inventory\") pod \"9a62fd58-a586-4473-abfe-4e227cad9900\" (UID: \"9a62fd58-a586-4473-abfe-4e227cad9900\") " Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.438722 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9a62fd58-a586-4473-abfe-4e227cad9900-ssh-key-openstack-edpm-ipam\") pod \"9a62fd58-a586-4473-abfe-4e227cad9900\" (UID: \"9a62fd58-a586-4473-abfe-4e227cad9900\") " Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.438766 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/9a62fd58-a586-4473-abfe-4e227cad9900-ceilometer-compute-config-data-1\") pod \"9a62fd58-a586-4473-abfe-4e227cad9900\" (UID: \"9a62fd58-a586-4473-abfe-4e227cad9900\") " Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.438829 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a62fd58-a586-4473-abfe-4e227cad9900-telemetry-combined-ca-bundle\") pod \"9a62fd58-a586-4473-abfe-4e227cad9900\" (UID: \"9a62fd58-a586-4473-abfe-4e227cad9900\") " Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.438958 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/9a62fd58-a586-4473-abfe-4e227cad9900-ceilometer-compute-config-data-0\") pod \"9a62fd58-a586-4473-abfe-4e227cad9900\" (UID: \"9a62fd58-a586-4473-abfe-4e227cad9900\") " Mar 13 14:47:00 crc 
kubenswrapper[4898]: I0313 14:47:00.438987 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmwct\" (UniqueName: \"kubernetes.io/projected/9a62fd58-a586-4473-abfe-4e227cad9900-kube-api-access-vmwct\") pod \"9a62fd58-a586-4473-abfe-4e227cad9900\" (UID: \"9a62fd58-a586-4473-abfe-4e227cad9900\") " Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.439076 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/9a62fd58-a586-4473-abfe-4e227cad9900-ceilometer-compute-config-data-2\") pod \"9a62fd58-a586-4473-abfe-4e227cad9900\" (UID: \"9a62fd58-a586-4473-abfe-4e227cad9900\") " Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.444184 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a62fd58-a586-4473-abfe-4e227cad9900-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "9a62fd58-a586-4473-abfe-4e227cad9900" (UID: "9a62fd58-a586-4473-abfe-4e227cad9900"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.446829 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a62fd58-a586-4473-abfe-4e227cad9900-kube-api-access-vmwct" (OuterVolumeSpecName: "kube-api-access-vmwct") pod "9a62fd58-a586-4473-abfe-4e227cad9900" (UID: "9a62fd58-a586-4473-abfe-4e227cad9900"). InnerVolumeSpecName "kube-api-access-vmwct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.474237 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a62fd58-a586-4473-abfe-4e227cad9900-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "9a62fd58-a586-4473-abfe-4e227cad9900" (UID: "9a62fd58-a586-4473-abfe-4e227cad9900"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.476707 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a62fd58-a586-4473-abfe-4e227cad9900-inventory" (OuterVolumeSpecName: "inventory") pod "9a62fd58-a586-4473-abfe-4e227cad9900" (UID: "9a62fd58-a586-4473-abfe-4e227cad9900"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.476781 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a62fd58-a586-4473-abfe-4e227cad9900-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "9a62fd58-a586-4473-abfe-4e227cad9900" (UID: "9a62fd58-a586-4473-abfe-4e227cad9900"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.500551 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a62fd58-a586-4473-abfe-4e227cad9900-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "9a62fd58-a586-4473-abfe-4e227cad9900" (UID: "9a62fd58-a586-4473-abfe-4e227cad9900"). InnerVolumeSpecName "ceilometer-compute-config-data-2". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.510663 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a62fd58-a586-4473-abfe-4e227cad9900-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "9a62fd58-a586-4473-abfe-4e227cad9900" (UID: "9a62fd58-a586-4473-abfe-4e227cad9900"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.542416 4898 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/9a62fd58-a586-4473-abfe-4e227cad9900-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.542451 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmwct\" (UniqueName: \"kubernetes.io/projected/9a62fd58-a586-4473-abfe-4e227cad9900-kube-api-access-vmwct\") on node \"crc\" DevicePath \"\"" Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.542463 4898 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/9a62fd58-a586-4473-abfe-4e227cad9900-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.542474 4898 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9a62fd58-a586-4473-abfe-4e227cad9900-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.542484 4898 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9a62fd58-a586-4473-abfe-4e227cad9900-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 14:47:00 crc 
kubenswrapper[4898]: I0313 14:47:00.542493 4898 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/9a62fd58-a586-4473-abfe-4e227cad9900-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.542503 4898 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a62fd58-a586-4473-abfe-4e227cad9900-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.708454 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk" event={"ID":"9a62fd58-a586-4473-abfe-4e227cad9900","Type":"ContainerDied","Data":"eb475c12bddce2c97f158483e36ec6344049764d71c6a03112569a01e066938c"} Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.708511 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb475c12bddce2c97f158483e36ec6344049764d71c6a03112569a01e066938c" Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.708607 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk" Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.837195 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6"] Mar 13 14:47:00 crc kubenswrapper[4898]: E0313 14:47:00.837739 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e6f3996-1b26-4a53-8c2d-f74aa89ef944" containerName="oc" Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.837755 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e6f3996-1b26-4a53-8c2d-f74aa89ef944" containerName="oc" Mar 13 14:47:00 crc kubenswrapper[4898]: E0313 14:47:00.837772 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6f2443d-86d2-440c-8039-b04fb5eeeeb3" containerName="registry-server" Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.837779 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6f2443d-86d2-440c-8039-b04fb5eeeeb3" containerName="registry-server" Mar 13 14:47:00 crc kubenswrapper[4898]: E0313 14:47:00.837795 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6f2443d-86d2-440c-8039-b04fb5eeeeb3" containerName="extract-utilities" Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.837802 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6f2443d-86d2-440c-8039-b04fb5eeeeb3" containerName="extract-utilities" Mar 13 14:47:00 crc kubenswrapper[4898]: E0313 14:47:00.837819 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6f2443d-86d2-440c-8039-b04fb5eeeeb3" containerName="extract-content" Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.837825 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6f2443d-86d2-440c-8039-b04fb5eeeeb3" containerName="extract-content" Mar 13 14:47:00 crc kubenswrapper[4898]: E0313 14:47:00.837849 4898 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9a62fd58-a586-4473-abfe-4e227cad9900" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.837857 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a62fd58-a586-4473-abfe-4e227cad9900" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.838114 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e6f3996-1b26-4a53-8c2d-f74aa89ef944" containerName="oc" Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.838149 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a62fd58-a586-4473-abfe-4e227cad9900" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.838167 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6f2443d-86d2-440c-8039-b04fb5eeeeb3" containerName="registry-server" Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.838984 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6" Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.841804 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.842152 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zsddr" Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.842376 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-ipmi-config-data" Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.843394 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.844063 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.847896 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6"] Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.963694 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-ceilometer-ipmi-config-data-1\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6\" (UID: \"5139c85e-1d3d-4fe7-94aa-8efde03b43e0\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6" Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.963981 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-inventory\") pod 
\"telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6\" (UID: \"5139c85e-1d3d-4fe7-94aa-8efde03b43e0\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6" Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.964561 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-ssh-key-openstack-edpm-ipam\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6\" (UID: \"5139c85e-1d3d-4fe7-94aa-8efde03b43e0\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6" Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.964627 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lg5fp\" (UniqueName: \"kubernetes.io/projected/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-kube-api-access-lg5fp\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6\" (UID: \"5139c85e-1d3d-4fe7-94aa-8efde03b43e0\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6" Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.964666 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6\" (UID: \"5139c85e-1d3d-4fe7-94aa-8efde03b43e0\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6" Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.964745 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-ceilometer-ipmi-config-data-2\") 
pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6\" (UID: \"5139c85e-1d3d-4fe7-94aa-8efde03b43e0\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6" Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.964927 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6\" (UID: \"5139c85e-1d3d-4fe7-94aa-8efde03b43e0\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6" Mar 13 14:47:01 crc kubenswrapper[4898]: I0313 14:47:01.067000 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6\" (UID: \"5139c85e-1d3d-4fe7-94aa-8efde03b43e0\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6" Mar 13 14:47:01 crc kubenswrapper[4898]: I0313 14:47:01.067090 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-ceilometer-ipmi-config-data-1\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6\" (UID: \"5139c85e-1d3d-4fe7-94aa-8efde03b43e0\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6" Mar 13 14:47:01 crc kubenswrapper[4898]: I0313 14:47:01.067143 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-inventory\") pod 
\"telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6\" (UID: \"5139c85e-1d3d-4fe7-94aa-8efde03b43e0\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6" Mar 13 14:47:01 crc kubenswrapper[4898]: I0313 14:47:01.067292 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-ssh-key-openstack-edpm-ipam\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6\" (UID: \"5139c85e-1d3d-4fe7-94aa-8efde03b43e0\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6" Mar 13 14:47:01 crc kubenswrapper[4898]: I0313 14:47:01.067324 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lg5fp\" (UniqueName: \"kubernetes.io/projected/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-kube-api-access-lg5fp\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6\" (UID: \"5139c85e-1d3d-4fe7-94aa-8efde03b43e0\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6" Mar 13 14:47:01 crc kubenswrapper[4898]: I0313 14:47:01.067350 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6\" (UID: \"5139c85e-1d3d-4fe7-94aa-8efde03b43e0\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6" Mar 13 14:47:01 crc kubenswrapper[4898]: I0313 14:47:01.067390 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-ceilometer-ipmi-config-data-2\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6\" 
(UID: \"5139c85e-1d3d-4fe7-94aa-8efde03b43e0\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6" Mar 13 14:47:01 crc kubenswrapper[4898]: I0313 14:47:01.072757 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-ceilometer-ipmi-config-data-1\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6\" (UID: \"5139c85e-1d3d-4fe7-94aa-8efde03b43e0\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6" Mar 13 14:47:01 crc kubenswrapper[4898]: I0313 14:47:01.073195 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6\" (UID: \"5139c85e-1d3d-4fe7-94aa-8efde03b43e0\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6" Mar 13 14:47:01 crc kubenswrapper[4898]: I0313 14:47:01.073401 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6\" (UID: \"5139c85e-1d3d-4fe7-94aa-8efde03b43e0\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6" Mar 13 14:47:01 crc kubenswrapper[4898]: I0313 14:47:01.074522 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-ceilometer-ipmi-config-data-2\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6\" (UID: \"5139c85e-1d3d-4fe7-94aa-8efde03b43e0\") " 
pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6" Mar 13 14:47:01 crc kubenswrapper[4898]: I0313 14:47:01.074607 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-inventory\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6\" (UID: \"5139c85e-1d3d-4fe7-94aa-8efde03b43e0\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6" Mar 13 14:47:01 crc kubenswrapper[4898]: I0313 14:47:01.074687 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-ssh-key-openstack-edpm-ipam\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6\" (UID: \"5139c85e-1d3d-4fe7-94aa-8efde03b43e0\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6" Mar 13 14:47:01 crc kubenswrapper[4898]: I0313 14:47:01.093881 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lg5fp\" (UniqueName: \"kubernetes.io/projected/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-kube-api-access-lg5fp\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6\" (UID: \"5139c85e-1d3d-4fe7-94aa-8efde03b43e0\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6" Mar 13 14:47:01 crc kubenswrapper[4898]: I0313 14:47:01.169666 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6" Mar 13 14:47:01 crc kubenswrapper[4898]: I0313 14:47:01.926674 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6"] Mar 13 14:47:02 crc kubenswrapper[4898]: I0313 14:47:02.736013 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6" event={"ID":"5139c85e-1d3d-4fe7-94aa-8efde03b43e0","Type":"ContainerStarted","Data":"3f68a521805ceaebc4381fbbd273e0827e9decfdbc53a23464118ffd0aba9594"} Mar 13 14:47:02 crc kubenswrapper[4898]: I0313 14:47:02.739427 4898 scope.go:117] "RemoveContainer" containerID="3380a5581d362c5b3dc0ed7955d410611389b7b46492a3c8939f1604d630d7f5" Mar 13 14:47:02 crc kubenswrapper[4898]: E0313 14:47:02.739807 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:47:03 crc kubenswrapper[4898]: I0313 14:47:03.785061 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6" event={"ID":"5139c85e-1d3d-4fe7-94aa-8efde03b43e0","Type":"ContainerStarted","Data":"fa89130040e3f48f6b09e015edf3ef67fd27d3f545d629365a77745abc0aef24"} Mar 13 14:47:03 crc kubenswrapper[4898]: I0313 14:47:03.812479 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6" podStartSLOduration=3.2589876 podStartE2EDuration="3.812457792s" podCreationTimestamp="2026-03-13 14:47:00 
+0000 UTC" firstStartedPulling="2026-03-13 14:47:01.9215601 +0000 UTC m=+3056.923148379" lastFinishedPulling="2026-03-13 14:47:02.475030292 +0000 UTC m=+3057.476618571" observedRunningTime="2026-03-13 14:47:03.798810435 +0000 UTC m=+3058.800398684" watchObservedRunningTime="2026-03-13 14:47:03.812457792 +0000 UTC m=+3058.814046041" Mar 13 14:47:14 crc kubenswrapper[4898]: I0313 14:47:14.740096 4898 scope.go:117] "RemoveContainer" containerID="3380a5581d362c5b3dc0ed7955d410611389b7b46492a3c8939f1604d630d7f5" Mar 13 14:47:14 crc kubenswrapper[4898]: E0313 14:47:14.741220 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:47:27 crc kubenswrapper[4898]: I0313 14:47:27.763353 4898 scope.go:117] "RemoveContainer" containerID="3380a5581d362c5b3dc0ed7955d410611389b7b46492a3c8939f1604d630d7f5" Mar 13 14:47:27 crc kubenswrapper[4898]: E0313 14:47:27.764614 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:47:38 crc kubenswrapper[4898]: I0313 14:47:38.741091 4898 scope.go:117] "RemoveContainer" containerID="3380a5581d362c5b3dc0ed7955d410611389b7b46492a3c8939f1604d630d7f5" Mar 13 14:47:38 crc kubenswrapper[4898]: E0313 14:47:38.743856 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:47:51 crc kubenswrapper[4898]: I0313 14:47:51.740600 4898 scope.go:117] "RemoveContainer" containerID="3380a5581d362c5b3dc0ed7955d410611389b7b46492a3c8939f1604d630d7f5" Mar 13 14:47:51 crc kubenswrapper[4898]: E0313 14:47:51.741763 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:48:00 crc kubenswrapper[4898]: I0313 14:48:00.168721 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556888-gn68v"] Mar 13 14:48:00 crc kubenswrapper[4898]: I0313 14:48:00.174293 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556888-gn68v" Mar 13 14:48:00 crc kubenswrapper[4898]: I0313 14:48:00.183292 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556888-gn68v"] Mar 13 14:48:00 crc kubenswrapper[4898]: I0313 14:48:00.183680 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 14:48:00 crc kubenswrapper[4898]: I0313 14:48:00.183933 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps" Mar 13 14:48:00 crc kubenswrapper[4898]: I0313 14:48:00.184079 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 14:48:00 crc kubenswrapper[4898]: I0313 14:48:00.256473 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dh2vk\" (UniqueName: \"kubernetes.io/projected/2813f8b3-81f9-48a3-9a55-173dead5d7a7-kube-api-access-dh2vk\") pod \"auto-csr-approver-29556888-gn68v\" (UID: \"2813f8b3-81f9-48a3-9a55-173dead5d7a7\") " pod="openshift-infra/auto-csr-approver-29556888-gn68v" Mar 13 14:48:00 crc kubenswrapper[4898]: I0313 14:48:00.360049 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dh2vk\" (UniqueName: \"kubernetes.io/projected/2813f8b3-81f9-48a3-9a55-173dead5d7a7-kube-api-access-dh2vk\") pod \"auto-csr-approver-29556888-gn68v\" (UID: \"2813f8b3-81f9-48a3-9a55-173dead5d7a7\") " pod="openshift-infra/auto-csr-approver-29556888-gn68v" Mar 13 14:48:00 crc kubenswrapper[4898]: I0313 14:48:00.380427 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dh2vk\" (UniqueName: \"kubernetes.io/projected/2813f8b3-81f9-48a3-9a55-173dead5d7a7-kube-api-access-dh2vk\") pod \"auto-csr-approver-29556888-gn68v\" (UID: \"2813f8b3-81f9-48a3-9a55-173dead5d7a7\") " 
pod="openshift-infra/auto-csr-approver-29556888-gn68v" Mar 13 14:48:00 crc kubenswrapper[4898]: I0313 14:48:00.504824 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556888-gn68v" Mar 13 14:48:01 crc kubenswrapper[4898]: I0313 14:48:01.008438 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556888-gn68v"] Mar 13 14:48:01 crc kubenswrapper[4898]: I0313 14:48:01.484655 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556888-gn68v" event={"ID":"2813f8b3-81f9-48a3-9a55-173dead5d7a7","Type":"ContainerStarted","Data":"6c71d25075e5d652a353986d1a9cd606119869eb24adf85caa486d10c30b2418"} Mar 13 14:48:03 crc kubenswrapper[4898]: I0313 14:48:03.514895 4898 generic.go:334] "Generic (PLEG): container finished" podID="2813f8b3-81f9-48a3-9a55-173dead5d7a7" containerID="916c6a5f721d9ac559010e8682bf9fc9eec602069822f7f9c7c610579bc40a49" exitCode=0 Mar 13 14:48:03 crc kubenswrapper[4898]: I0313 14:48:03.515193 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556888-gn68v" event={"ID":"2813f8b3-81f9-48a3-9a55-173dead5d7a7","Type":"ContainerDied","Data":"916c6a5f721d9ac559010e8682bf9fc9eec602069822f7f9c7c610579bc40a49"} Mar 13 14:48:03 crc kubenswrapper[4898]: I0313 14:48:03.739837 4898 scope.go:117] "RemoveContainer" containerID="3380a5581d362c5b3dc0ed7955d410611389b7b46492a3c8939f1604d630d7f5" Mar 13 14:48:03 crc kubenswrapper[4898]: E0313 14:48:03.740339 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" 
Mar 13 14:48:05 crc kubenswrapper[4898]: I0313 14:48:05.004221 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556888-gn68v" Mar 13 14:48:05 crc kubenswrapper[4898]: I0313 14:48:05.087394 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dh2vk\" (UniqueName: \"kubernetes.io/projected/2813f8b3-81f9-48a3-9a55-173dead5d7a7-kube-api-access-dh2vk\") pod \"2813f8b3-81f9-48a3-9a55-173dead5d7a7\" (UID: \"2813f8b3-81f9-48a3-9a55-173dead5d7a7\") " Mar 13 14:48:05 crc kubenswrapper[4898]: I0313 14:48:05.094193 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2813f8b3-81f9-48a3-9a55-173dead5d7a7-kube-api-access-dh2vk" (OuterVolumeSpecName: "kube-api-access-dh2vk") pod "2813f8b3-81f9-48a3-9a55-173dead5d7a7" (UID: "2813f8b3-81f9-48a3-9a55-173dead5d7a7"). InnerVolumeSpecName "kube-api-access-dh2vk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:48:05 crc kubenswrapper[4898]: I0313 14:48:05.190780 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dh2vk\" (UniqueName: \"kubernetes.io/projected/2813f8b3-81f9-48a3-9a55-173dead5d7a7-kube-api-access-dh2vk\") on node \"crc\" DevicePath \"\"" Mar 13 14:48:05 crc kubenswrapper[4898]: I0313 14:48:05.541351 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556888-gn68v" event={"ID":"2813f8b3-81f9-48a3-9a55-173dead5d7a7","Type":"ContainerDied","Data":"6c71d25075e5d652a353986d1a9cd606119869eb24adf85caa486d10c30b2418"} Mar 13 14:48:05 crc kubenswrapper[4898]: I0313 14:48:05.541390 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c71d25075e5d652a353986d1a9cd606119869eb24adf85caa486d10c30b2418" Mar 13 14:48:05 crc kubenswrapper[4898]: I0313 14:48:05.541984 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556888-gn68v" Mar 13 14:48:06 crc kubenswrapper[4898]: I0313 14:48:06.486263 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556882-6vdgp"] Mar 13 14:48:06 crc kubenswrapper[4898]: I0313 14:48:06.501332 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556882-6vdgp"] Mar 13 14:48:07 crc kubenswrapper[4898]: I0313 14:48:07.757859 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46dff21f-c9aa-443a-b1c7-988721788744" path="/var/lib/kubelet/pods/46dff21f-c9aa-443a-b1c7-988721788744/volumes" Mar 13 14:48:16 crc kubenswrapper[4898]: I0313 14:48:16.739612 4898 scope.go:117] "RemoveContainer" containerID="3380a5581d362c5b3dc0ed7955d410611389b7b46492a3c8939f1604d630d7f5" Mar 13 14:48:16 crc kubenswrapper[4898]: E0313 14:48:16.740431 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:48:27 crc kubenswrapper[4898]: I0313 14:48:27.740130 4898 scope.go:117] "RemoveContainer" containerID="3380a5581d362c5b3dc0ed7955d410611389b7b46492a3c8939f1604d630d7f5" Mar 13 14:48:27 crc kubenswrapper[4898]: E0313 14:48:27.741418 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" 
podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:48:39 crc kubenswrapper[4898]: I0313 14:48:39.740364 4898 scope.go:117] "RemoveContainer" containerID="3380a5581d362c5b3dc0ed7955d410611389b7b46492a3c8939f1604d630d7f5" Mar 13 14:48:39 crc kubenswrapper[4898]: E0313 14:48:39.741442 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:48:41 crc kubenswrapper[4898]: I0313 14:48:41.676969 4898 scope.go:117] "RemoveContainer" containerID="5dfdf7dc37e2c03d23dcf11c2bda6721f5e0189a55bee1c413509ed3a8808306" Mar 13 14:48:53 crc kubenswrapper[4898]: I0313 14:48:53.740857 4898 scope.go:117] "RemoveContainer" containerID="3380a5581d362c5b3dc0ed7955d410611389b7b46492a3c8939f1604d630d7f5" Mar 13 14:48:53 crc kubenswrapper[4898]: E0313 14:48:53.742266 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:49:05 crc kubenswrapper[4898]: I0313 14:49:04.739785 4898 scope.go:117] "RemoveContainer" containerID="3380a5581d362c5b3dc0ed7955d410611389b7b46492a3c8939f1604d630d7f5" Mar 13 14:49:05 crc kubenswrapper[4898]: E0313 14:49:04.741236 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:49:06 crc kubenswrapper[4898]: I0313 14:49:06.362307 4898 generic.go:334] "Generic (PLEG): container finished" podID="5139c85e-1d3d-4fe7-94aa-8efde03b43e0" containerID="fa89130040e3f48f6b09e015edf3ef67fd27d3f545d629365a77745abc0aef24" exitCode=0 Mar 13 14:49:06 crc kubenswrapper[4898]: I0313 14:49:06.362412 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6" event={"ID":"5139c85e-1d3d-4fe7-94aa-8efde03b43e0","Type":"ContainerDied","Data":"fa89130040e3f48f6b09e015edf3ef67fd27d3f545d629365a77745abc0aef24"} Mar 13 14:49:07 crc kubenswrapper[4898]: I0313 14:49:07.951868 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.053415 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-telemetry-power-monitoring-combined-ca-bundle\") pod \"5139c85e-1d3d-4fe7-94aa-8efde03b43e0\" (UID: \"5139c85e-1d3d-4fe7-94aa-8efde03b43e0\") " Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.053494 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-ceilometer-ipmi-config-data-2\") pod \"5139c85e-1d3d-4fe7-94aa-8efde03b43e0\" (UID: \"5139c85e-1d3d-4fe7-94aa-8efde03b43e0\") " Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.053669 4898 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-ssh-key-openstack-edpm-ipam\") pod \"5139c85e-1d3d-4fe7-94aa-8efde03b43e0\" (UID: \"5139c85e-1d3d-4fe7-94aa-8efde03b43e0\") " Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.053719 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-ceilometer-ipmi-config-data-0\") pod \"5139c85e-1d3d-4fe7-94aa-8efde03b43e0\" (UID: \"5139c85e-1d3d-4fe7-94aa-8efde03b43e0\") " Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.053819 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-inventory\") pod \"5139c85e-1d3d-4fe7-94aa-8efde03b43e0\" (UID: \"5139c85e-1d3d-4fe7-94aa-8efde03b43e0\") " Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.053873 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lg5fp\" (UniqueName: \"kubernetes.io/projected/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-kube-api-access-lg5fp\") pod \"5139c85e-1d3d-4fe7-94aa-8efde03b43e0\" (UID: \"5139c85e-1d3d-4fe7-94aa-8efde03b43e0\") " Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.053933 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-ceilometer-ipmi-config-data-1\") pod \"5139c85e-1d3d-4fe7-94aa-8efde03b43e0\" (UID: \"5139c85e-1d3d-4fe7-94aa-8efde03b43e0\") " Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.059884 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-telemetry-power-monitoring-combined-ca-bundle" 
(OuterVolumeSpecName: "telemetry-power-monitoring-combined-ca-bundle") pod "5139c85e-1d3d-4fe7-94aa-8efde03b43e0" (UID: "5139c85e-1d3d-4fe7-94aa-8efde03b43e0"). InnerVolumeSpecName "telemetry-power-monitoring-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.059917 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-kube-api-access-lg5fp" (OuterVolumeSpecName: "kube-api-access-lg5fp") pod "5139c85e-1d3d-4fe7-94aa-8efde03b43e0" (UID: "5139c85e-1d3d-4fe7-94aa-8efde03b43e0"). InnerVolumeSpecName "kube-api-access-lg5fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.084076 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-ceilometer-ipmi-config-data-0" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-0") pod "5139c85e-1d3d-4fe7-94aa-8efde03b43e0" (UID: "5139c85e-1d3d-4fe7-94aa-8efde03b43e0"). InnerVolumeSpecName "ceilometer-ipmi-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.094993 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-ceilometer-ipmi-config-data-2" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-2") pod "5139c85e-1d3d-4fe7-94aa-8efde03b43e0" (UID: "5139c85e-1d3d-4fe7-94aa-8efde03b43e0"). InnerVolumeSpecName "ceilometer-ipmi-config-data-2". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.096346 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-ceilometer-ipmi-config-data-1" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-1") pod "5139c85e-1d3d-4fe7-94aa-8efde03b43e0" (UID: "5139c85e-1d3d-4fe7-94aa-8efde03b43e0"). InnerVolumeSpecName "ceilometer-ipmi-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.099833 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-inventory" (OuterVolumeSpecName: "inventory") pod "5139c85e-1d3d-4fe7-94aa-8efde03b43e0" (UID: "5139c85e-1d3d-4fe7-94aa-8efde03b43e0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.105087 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "5139c85e-1d3d-4fe7-94aa-8efde03b43e0" (UID: "5139c85e-1d3d-4fe7-94aa-8efde03b43e0"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.157381 4898 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.157409 4898 reconciler_common.go:293] "Volume detached for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-ceilometer-ipmi-config-data-0\") on node \"crc\" DevicePath \"\"" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.157421 4898 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.157430 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lg5fp\" (UniqueName: \"kubernetes.io/projected/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-kube-api-access-lg5fp\") on node \"crc\" DevicePath \"\"" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.157439 4898 reconciler_common.go:293] "Volume detached for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-ceilometer-ipmi-config-data-1\") on node \"crc\" DevicePath \"\"" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.157448 4898 reconciler_common.go:293] "Volume detached for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-telemetry-power-monitoring-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.157461 4898 reconciler_common.go:293] "Volume detached for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: 
\"kubernetes.io/secret/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-ceilometer-ipmi-config-data-2\") on node \"crc\" DevicePath \"\"" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.394363 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6" event={"ID":"5139c85e-1d3d-4fe7-94aa-8efde03b43e0","Type":"ContainerDied","Data":"3f68a521805ceaebc4381fbbd273e0827e9decfdbc53a23464118ffd0aba9594"} Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.394713 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f68a521805ceaebc4381fbbd273e0827e9decfdbc53a23464118ffd0aba9594" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.394586 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.530750 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-tw885"] Mar 13 14:49:08 crc kubenswrapper[4898]: E0313 14:49:08.531315 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5139c85e-1d3d-4fe7-94aa-8efde03b43e0" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.531337 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="5139c85e-1d3d-4fe7-94aa-8efde03b43e0" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam" Mar 13 14:49:08 crc kubenswrapper[4898]: E0313 14:49:08.531366 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2813f8b3-81f9-48a3-9a55-173dead5d7a7" containerName="oc" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.531375 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="2813f8b3-81f9-48a3-9a55-173dead5d7a7" containerName="oc" Mar 13 14:49:08 crc 
kubenswrapper[4898]: I0313 14:49:08.531637 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="2813f8b3-81f9-48a3-9a55-173dead5d7a7" containerName="oc" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.531664 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="5139c85e-1d3d-4fe7-94aa-8efde03b43e0" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.535412 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-tw885" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.541162 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"logging-compute-config-data" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.541954 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.542576 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.545055 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zsddr" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.550273 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.555460 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-tw885"] Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.568479 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-compute-config-data-1\" (UniqueName: 
\"kubernetes.io/secret/efff948d-3073-4635-bc2c-2a8fc746c6b8-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-tw885\" (UID: \"efff948d-3073-4635-bc2c-2a8fc746c6b8\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-tw885" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.568818 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/efff948d-3073-4635-bc2c-2a8fc746c6b8-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-tw885\" (UID: \"efff948d-3073-4635-bc2c-2a8fc746c6b8\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-tw885" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.568991 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/efff948d-3073-4635-bc2c-2a8fc746c6b8-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-tw885\" (UID: \"efff948d-3073-4635-bc2c-2a8fc746c6b8\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-tw885" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.569139 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/efff948d-3073-4635-bc2c-2a8fc746c6b8-ssh-key-openstack-edpm-ipam\") pod \"logging-edpm-deployment-openstack-edpm-ipam-tw885\" (UID: \"efff948d-3073-4635-bc2c-2a8fc746c6b8\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-tw885" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.569259 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ww6vk\" (UniqueName: \"kubernetes.io/projected/efff948d-3073-4635-bc2c-2a8fc746c6b8-kube-api-access-ww6vk\") pod 
\"logging-edpm-deployment-openstack-edpm-ipam-tw885\" (UID: \"efff948d-3073-4635-bc2c-2a8fc746c6b8\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-tw885" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.671182 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/efff948d-3073-4635-bc2c-2a8fc746c6b8-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-tw885\" (UID: \"efff948d-3073-4635-bc2c-2a8fc746c6b8\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-tw885" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.671270 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/efff948d-3073-4635-bc2c-2a8fc746c6b8-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-tw885\" (UID: \"efff948d-3073-4635-bc2c-2a8fc746c6b8\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-tw885" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.671368 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/efff948d-3073-4635-bc2c-2a8fc746c6b8-ssh-key-openstack-edpm-ipam\") pod \"logging-edpm-deployment-openstack-edpm-ipam-tw885\" (UID: \"efff948d-3073-4635-bc2c-2a8fc746c6b8\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-tw885" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.671458 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ww6vk\" (UniqueName: \"kubernetes.io/projected/efff948d-3073-4635-bc2c-2a8fc746c6b8-kube-api-access-ww6vk\") pod \"logging-edpm-deployment-openstack-edpm-ipam-tw885\" (UID: \"efff948d-3073-4635-bc2c-2a8fc746c6b8\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-tw885" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 
14:49:08.671643 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/efff948d-3073-4635-bc2c-2a8fc746c6b8-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-tw885\" (UID: \"efff948d-3073-4635-bc2c-2a8fc746c6b8\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-tw885" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.690987 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/efff948d-3073-4635-bc2c-2a8fc746c6b8-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-tw885\" (UID: \"efff948d-3073-4635-bc2c-2a8fc746c6b8\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-tw885" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.691001 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/efff948d-3073-4635-bc2c-2a8fc746c6b8-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-tw885\" (UID: \"efff948d-3073-4635-bc2c-2a8fc746c6b8\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-tw885" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.691194 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/efff948d-3073-4635-bc2c-2a8fc746c6b8-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-tw885\" (UID: \"efff948d-3073-4635-bc2c-2a8fc746c6b8\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-tw885" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.691204 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/efff948d-3073-4635-bc2c-2a8fc746c6b8-ssh-key-openstack-edpm-ipam\") pod 
\"logging-edpm-deployment-openstack-edpm-ipam-tw885\" (UID: \"efff948d-3073-4635-bc2c-2a8fc746c6b8\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-tw885" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.694274 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ww6vk\" (UniqueName: \"kubernetes.io/projected/efff948d-3073-4635-bc2c-2a8fc746c6b8-kube-api-access-ww6vk\") pod \"logging-edpm-deployment-openstack-edpm-ipam-tw885\" (UID: \"efff948d-3073-4635-bc2c-2a8fc746c6b8\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-tw885" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.854875 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-tw885" Mar 13 14:49:09 crc kubenswrapper[4898]: I0313 14:49:09.443128 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-tw885"] Mar 13 14:49:09 crc kubenswrapper[4898]: W0313 14:49:09.447302 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podefff948d_3073_4635_bc2c_2a8fc746c6b8.slice/crio-0e670a0eb159f93d979313e41f2d96d2d655eb9fa6cd260efb1423c9ebf3ac66 WatchSource:0}: Error finding container 0e670a0eb159f93d979313e41f2d96d2d655eb9fa6cd260efb1423c9ebf3ac66: Status 404 returned error can't find the container with id 0e670a0eb159f93d979313e41f2d96d2d655eb9fa6cd260efb1423c9ebf3ac66 Mar 13 14:49:09 crc kubenswrapper[4898]: I0313 14:49:09.452267 4898 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 14:49:10 crc kubenswrapper[4898]: I0313 14:49:10.417553 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-tw885" 
event={"ID":"efff948d-3073-4635-bc2c-2a8fc746c6b8","Type":"ContainerStarted","Data":"56b8fdcde2f4108f5197df0d9ca2d30c280b616e640941f05782549d26cd35cd"} Mar 13 14:49:10 crc kubenswrapper[4898]: I0313 14:49:10.417992 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-tw885" event={"ID":"efff948d-3073-4635-bc2c-2a8fc746c6b8","Type":"ContainerStarted","Data":"0e670a0eb159f93d979313e41f2d96d2d655eb9fa6cd260efb1423c9ebf3ac66"} Mar 13 14:49:10 crc kubenswrapper[4898]: I0313 14:49:10.460662 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-tw885" podStartSLOduration=1.9194945159999999 podStartE2EDuration="2.460647655s" podCreationTimestamp="2026-03-13 14:49:08 +0000 UTC" firstStartedPulling="2026-03-13 14:49:09.452016912 +0000 UTC m=+3184.453605141" lastFinishedPulling="2026-03-13 14:49:09.993170001 +0000 UTC m=+3184.994758280" observedRunningTime="2026-03-13 14:49:10.45397294 +0000 UTC m=+3185.455561179" watchObservedRunningTime="2026-03-13 14:49:10.460647655 +0000 UTC m=+3185.462235894" Mar 13 14:49:16 crc kubenswrapper[4898]: I0313 14:49:16.739952 4898 scope.go:117] "RemoveContainer" containerID="3380a5581d362c5b3dc0ed7955d410611389b7b46492a3c8939f1604d630d7f5" Mar 13 14:49:16 crc kubenswrapper[4898]: E0313 14:49:16.741240 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:49:25 crc kubenswrapper[4898]: I0313 14:49:25.660244 4898 generic.go:334] "Generic (PLEG): container finished" podID="efff948d-3073-4635-bc2c-2a8fc746c6b8" 
containerID="56b8fdcde2f4108f5197df0d9ca2d30c280b616e640941f05782549d26cd35cd" exitCode=0 Mar 13 14:49:25 crc kubenswrapper[4898]: I0313 14:49:25.660365 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-tw885" event={"ID":"efff948d-3073-4635-bc2c-2a8fc746c6b8","Type":"ContainerDied","Data":"56b8fdcde2f4108f5197df0d9ca2d30c280b616e640941f05782549d26cd35cd"} Mar 13 14:49:27 crc kubenswrapper[4898]: I0313 14:49:27.239567 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-tw885" Mar 13 14:49:27 crc kubenswrapper[4898]: I0313 14:49:27.412356 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/efff948d-3073-4635-bc2c-2a8fc746c6b8-logging-compute-config-data-1\") pod \"efff948d-3073-4635-bc2c-2a8fc746c6b8\" (UID: \"efff948d-3073-4635-bc2c-2a8fc746c6b8\") " Mar 13 14:49:27 crc kubenswrapper[4898]: I0313 14:49:27.412455 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/efff948d-3073-4635-bc2c-2a8fc746c6b8-logging-compute-config-data-0\") pod \"efff948d-3073-4635-bc2c-2a8fc746c6b8\" (UID: \"efff948d-3073-4635-bc2c-2a8fc746c6b8\") " Mar 13 14:49:27 crc kubenswrapper[4898]: I0313 14:49:27.412517 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/efff948d-3073-4635-bc2c-2a8fc746c6b8-ssh-key-openstack-edpm-ipam\") pod \"efff948d-3073-4635-bc2c-2a8fc746c6b8\" (UID: \"efff948d-3073-4635-bc2c-2a8fc746c6b8\") " Mar 13 14:49:27 crc kubenswrapper[4898]: I0313 14:49:27.412641 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/efff948d-3073-4635-bc2c-2a8fc746c6b8-inventory\") pod \"efff948d-3073-4635-bc2c-2a8fc746c6b8\" (UID: \"efff948d-3073-4635-bc2c-2a8fc746c6b8\") " Mar 13 14:49:27 crc kubenswrapper[4898]: I0313 14:49:27.412763 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ww6vk\" (UniqueName: \"kubernetes.io/projected/efff948d-3073-4635-bc2c-2a8fc746c6b8-kube-api-access-ww6vk\") pod \"efff948d-3073-4635-bc2c-2a8fc746c6b8\" (UID: \"efff948d-3073-4635-bc2c-2a8fc746c6b8\") " Mar 13 14:49:27 crc kubenswrapper[4898]: I0313 14:49:27.422180 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efff948d-3073-4635-bc2c-2a8fc746c6b8-kube-api-access-ww6vk" (OuterVolumeSpecName: "kube-api-access-ww6vk") pod "efff948d-3073-4635-bc2c-2a8fc746c6b8" (UID: "efff948d-3073-4635-bc2c-2a8fc746c6b8"). InnerVolumeSpecName "kube-api-access-ww6vk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:49:27 crc kubenswrapper[4898]: I0313 14:49:27.465136 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efff948d-3073-4635-bc2c-2a8fc746c6b8-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "efff948d-3073-4635-bc2c-2a8fc746c6b8" (UID: "efff948d-3073-4635-bc2c-2a8fc746c6b8"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:49:27 crc kubenswrapper[4898]: I0313 14:49:27.467033 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efff948d-3073-4635-bc2c-2a8fc746c6b8-logging-compute-config-data-0" (OuterVolumeSpecName: "logging-compute-config-data-0") pod "efff948d-3073-4635-bc2c-2a8fc746c6b8" (UID: "efff948d-3073-4635-bc2c-2a8fc746c6b8"). InnerVolumeSpecName "logging-compute-config-data-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:49:27 crc kubenswrapper[4898]: I0313 14:49:27.467915 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efff948d-3073-4635-bc2c-2a8fc746c6b8-logging-compute-config-data-1" (OuterVolumeSpecName: "logging-compute-config-data-1") pod "efff948d-3073-4635-bc2c-2a8fc746c6b8" (UID: "efff948d-3073-4635-bc2c-2a8fc746c6b8"). InnerVolumeSpecName "logging-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:49:27 crc kubenswrapper[4898]: I0313 14:49:27.475480 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efff948d-3073-4635-bc2c-2a8fc746c6b8-inventory" (OuterVolumeSpecName: "inventory") pod "efff948d-3073-4635-bc2c-2a8fc746c6b8" (UID: "efff948d-3073-4635-bc2c-2a8fc746c6b8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:49:27 crc kubenswrapper[4898]: I0313 14:49:27.516968 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ww6vk\" (UniqueName: \"kubernetes.io/projected/efff948d-3073-4635-bc2c-2a8fc746c6b8-kube-api-access-ww6vk\") on node \"crc\" DevicePath \"\"" Mar 13 14:49:27 crc kubenswrapper[4898]: I0313 14:49:27.517010 4898 reconciler_common.go:293] "Volume detached for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/efff948d-3073-4635-bc2c-2a8fc746c6b8-logging-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Mar 13 14:49:27 crc kubenswrapper[4898]: I0313 14:49:27.517024 4898 reconciler_common.go:293] "Volume detached for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/efff948d-3073-4635-bc2c-2a8fc746c6b8-logging-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Mar 13 14:49:27 crc kubenswrapper[4898]: I0313 14:49:27.517040 4898 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/efff948d-3073-4635-bc2c-2a8fc746c6b8-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 14:49:27 crc kubenswrapper[4898]: I0313 14:49:27.517052 4898 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/efff948d-3073-4635-bc2c-2a8fc746c6b8-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 14:49:27 crc kubenswrapper[4898]: I0313 14:49:27.690488 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-tw885" event={"ID":"efff948d-3073-4635-bc2c-2a8fc746c6b8","Type":"ContainerDied","Data":"0e670a0eb159f93d979313e41f2d96d2d655eb9fa6cd260efb1423c9ebf3ac66"} Mar 13 14:49:27 crc kubenswrapper[4898]: I0313 14:49:27.690529 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e670a0eb159f93d979313e41f2d96d2d655eb9fa6cd260efb1423c9ebf3ac66" Mar 13 14:49:27 crc kubenswrapper[4898]: I0313 14:49:27.690557 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-tw885" Mar 13 14:49:29 crc kubenswrapper[4898]: I0313 14:49:29.740503 4898 scope.go:117] "RemoveContainer" containerID="3380a5581d362c5b3dc0ed7955d410611389b7b46492a3c8939f1604d630d7f5" Mar 13 14:49:29 crc kubenswrapper[4898]: E0313 14:49:29.741615 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:49:42 crc kubenswrapper[4898]: I0313 14:49:42.739985 4898 scope.go:117] "RemoveContainer" containerID="3380a5581d362c5b3dc0ed7955d410611389b7b46492a3c8939f1604d630d7f5" Mar 13 14:49:42 crc kubenswrapper[4898]: E0313 14:49:42.742769 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:49:56 crc kubenswrapper[4898]: I0313 14:49:56.741528 4898 scope.go:117] "RemoveContainer" containerID="3380a5581d362c5b3dc0ed7955d410611389b7b46492a3c8939f1604d630d7f5" Mar 13 14:49:56 crc kubenswrapper[4898]: E0313 14:49:56.742775 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:50:00 crc kubenswrapper[4898]: I0313 14:50:00.172670 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556890-vl72z"] Mar 13 14:50:00 crc kubenswrapper[4898]: E0313 14:50:00.174311 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efff948d-3073-4635-bc2c-2a8fc746c6b8" containerName="logging-edpm-deployment-openstack-edpm-ipam" Mar 13 14:50:00 crc kubenswrapper[4898]: I0313 14:50:00.174350 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="efff948d-3073-4635-bc2c-2a8fc746c6b8" containerName="logging-edpm-deployment-openstack-edpm-ipam" Mar 13 14:50:00 crc kubenswrapper[4898]: I0313 14:50:00.175169 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="efff948d-3073-4635-bc2c-2a8fc746c6b8" containerName="logging-edpm-deployment-openstack-edpm-ipam" Mar 13 14:50:00 crc kubenswrapper[4898]: I0313 14:50:00.177310 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556890-vl72z" Mar 13 14:50:00 crc kubenswrapper[4898]: I0313 14:50:00.184706 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps" Mar 13 14:50:00 crc kubenswrapper[4898]: I0313 14:50:00.184973 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 14:50:00 crc kubenswrapper[4898]: I0313 14:50:00.185368 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 14:50:00 crc kubenswrapper[4898]: I0313 14:50:00.190317 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556890-vl72z"] Mar 13 14:50:00 crc kubenswrapper[4898]: I0313 14:50:00.341478 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pgxx\" (UniqueName: \"kubernetes.io/projected/5795e677-fa9a-4235-9a30-a040ac18eebd-kube-api-access-9pgxx\") pod \"auto-csr-approver-29556890-vl72z\" (UID: \"5795e677-fa9a-4235-9a30-a040ac18eebd\") " pod="openshift-infra/auto-csr-approver-29556890-vl72z" Mar 13 14:50:00 crc kubenswrapper[4898]: I0313 14:50:00.444954 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pgxx\" (UniqueName: \"kubernetes.io/projected/5795e677-fa9a-4235-9a30-a040ac18eebd-kube-api-access-9pgxx\") pod \"auto-csr-approver-29556890-vl72z\" (UID: \"5795e677-fa9a-4235-9a30-a040ac18eebd\") " pod="openshift-infra/auto-csr-approver-29556890-vl72z" Mar 13 14:50:00 crc kubenswrapper[4898]: I0313 14:50:00.487913 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pgxx\" (UniqueName: \"kubernetes.io/projected/5795e677-fa9a-4235-9a30-a040ac18eebd-kube-api-access-9pgxx\") pod \"auto-csr-approver-29556890-vl72z\" (UID: \"5795e677-fa9a-4235-9a30-a040ac18eebd\") " 
pod="openshift-infra/auto-csr-approver-29556890-vl72z" Mar 13 14:50:00 crc kubenswrapper[4898]: I0313 14:50:00.522172 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556890-vl72z" Mar 13 14:50:01 crc kubenswrapper[4898]: I0313 14:50:01.032040 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556890-vl72z"] Mar 13 14:50:01 crc kubenswrapper[4898]: W0313 14:50:01.040037 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5795e677_fa9a_4235_9a30_a040ac18eebd.slice/crio-0940c7aea401f92f6b87b35355fc8bb2b356dd83a404368f92cf4cfa1c5617b9 WatchSource:0}: Error finding container 0940c7aea401f92f6b87b35355fc8bb2b356dd83a404368f92cf4cfa1c5617b9: Status 404 returned error can't find the container with id 0940c7aea401f92f6b87b35355fc8bb2b356dd83a404368f92cf4cfa1c5617b9 Mar 13 14:50:01 crc kubenswrapper[4898]: I0313 14:50:01.141948 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556890-vl72z" event={"ID":"5795e677-fa9a-4235-9a30-a040ac18eebd","Type":"ContainerStarted","Data":"0940c7aea401f92f6b87b35355fc8bb2b356dd83a404368f92cf4cfa1c5617b9"} Mar 13 14:50:03 crc kubenswrapper[4898]: I0313 14:50:03.186503 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556890-vl72z" event={"ID":"5795e677-fa9a-4235-9a30-a040ac18eebd","Type":"ContainerStarted","Data":"5a392b03ee567f4c66c2416d5b9e2dae2dde45c43076100867e0284651bd3e3b"} Mar 13 14:50:03 crc kubenswrapper[4898]: I0313 14:50:03.204569 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556890-vl72z" podStartSLOduration=1.6491670360000001 podStartE2EDuration="3.204546254s" podCreationTimestamp="2026-03-13 14:50:00 +0000 UTC" firstStartedPulling="2026-03-13 14:50:01.043833487 +0000 UTC 
m=+3236.045421726" lastFinishedPulling="2026-03-13 14:50:02.599212665 +0000 UTC m=+3237.600800944" observedRunningTime="2026-03-13 14:50:03.200631268 +0000 UTC m=+3238.202219517" watchObservedRunningTime="2026-03-13 14:50:03.204546254 +0000 UTC m=+3238.206134513" Mar 13 14:50:04 crc kubenswrapper[4898]: I0313 14:50:04.202014 4898 generic.go:334] "Generic (PLEG): container finished" podID="5795e677-fa9a-4235-9a30-a040ac18eebd" containerID="5a392b03ee567f4c66c2416d5b9e2dae2dde45c43076100867e0284651bd3e3b" exitCode=0 Mar 13 14:50:04 crc kubenswrapper[4898]: I0313 14:50:04.202150 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556890-vl72z" event={"ID":"5795e677-fa9a-4235-9a30-a040ac18eebd","Type":"ContainerDied","Data":"5a392b03ee567f4c66c2416d5b9e2dae2dde45c43076100867e0284651bd3e3b"} Mar 13 14:50:05 crc kubenswrapper[4898]: I0313 14:50:05.991261 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556890-vl72z" Mar 13 14:50:06 crc kubenswrapper[4898]: I0313 14:50:06.153634 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9pgxx\" (UniqueName: \"kubernetes.io/projected/5795e677-fa9a-4235-9a30-a040ac18eebd-kube-api-access-9pgxx\") pod \"5795e677-fa9a-4235-9a30-a040ac18eebd\" (UID: \"5795e677-fa9a-4235-9a30-a040ac18eebd\") " Mar 13 14:50:06 crc kubenswrapper[4898]: I0313 14:50:06.161213 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5795e677-fa9a-4235-9a30-a040ac18eebd-kube-api-access-9pgxx" (OuterVolumeSpecName: "kube-api-access-9pgxx") pod "5795e677-fa9a-4235-9a30-a040ac18eebd" (UID: "5795e677-fa9a-4235-9a30-a040ac18eebd"). InnerVolumeSpecName "kube-api-access-9pgxx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:50:06 crc kubenswrapper[4898]: I0313 14:50:06.237191 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556890-vl72z" event={"ID":"5795e677-fa9a-4235-9a30-a040ac18eebd","Type":"ContainerDied","Data":"0940c7aea401f92f6b87b35355fc8bb2b356dd83a404368f92cf4cfa1c5617b9"} Mar 13 14:50:06 crc kubenswrapper[4898]: I0313 14:50:06.237247 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0940c7aea401f92f6b87b35355fc8bb2b356dd83a404368f92cf4cfa1c5617b9" Mar 13 14:50:06 crc kubenswrapper[4898]: I0313 14:50:06.237339 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556890-vl72z" Mar 13 14:50:06 crc kubenswrapper[4898]: I0313 14:50:06.261483 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9pgxx\" (UniqueName: \"kubernetes.io/projected/5795e677-fa9a-4235-9a30-a040ac18eebd-kube-api-access-9pgxx\") on node \"crc\" DevicePath \"\"" Mar 13 14:50:06 crc kubenswrapper[4898]: I0313 14:50:06.604287 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556884-wvc75"] Mar 13 14:50:06 crc kubenswrapper[4898]: I0313 14:50:06.624441 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556884-wvc75"] Mar 13 14:50:07 crc kubenswrapper[4898]: I0313 14:50:07.763568 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22d70d9e-a058-43a7-b692-19cd302d65ca" path="/var/lib/kubelet/pods/22d70d9e-a058-43a7-b692-19cd302d65ca/volumes" Mar 13 14:50:09 crc kubenswrapper[4898]: I0313 14:50:09.740588 4898 scope.go:117] "RemoveContainer" containerID="3380a5581d362c5b3dc0ed7955d410611389b7b46492a3c8939f1604d630d7f5" Mar 13 14:50:09 crc kubenswrapper[4898]: E0313 14:50:09.742130 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:50:24 crc kubenswrapper[4898]: I0313 14:50:24.739980 4898 scope.go:117] "RemoveContainer" containerID="3380a5581d362c5b3dc0ed7955d410611389b7b46492a3c8939f1604d630d7f5" Mar 13 14:50:25 crc kubenswrapper[4898]: I0313 14:50:25.591216 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerStarted","Data":"14bad097122012834e04d7733e2c56b7cb22ecdcbbe4dbb2cc5f1822098d6e46"} Mar 13 14:50:41 crc kubenswrapper[4898]: I0313 14:50:41.828189 4898 scope.go:117] "RemoveContainer" containerID="668ec92e00de6e908be7a4b238e021ba041b8ee50f571d510eea90e125398f41" Mar 13 14:52:00 crc kubenswrapper[4898]: I0313 14:52:00.203255 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556892-gg8mm"] Mar 13 14:52:00 crc kubenswrapper[4898]: E0313 14:52:00.204385 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5795e677-fa9a-4235-9a30-a040ac18eebd" containerName="oc" Mar 13 14:52:00 crc kubenswrapper[4898]: I0313 14:52:00.204401 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="5795e677-fa9a-4235-9a30-a040ac18eebd" containerName="oc" Mar 13 14:52:00 crc kubenswrapper[4898]: I0313 14:52:00.204718 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="5795e677-fa9a-4235-9a30-a040ac18eebd" containerName="oc" Mar 13 14:52:00 crc kubenswrapper[4898]: I0313 14:52:00.205772 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556892-gg8mm" Mar 13 14:52:00 crc kubenswrapper[4898]: I0313 14:52:00.211882 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 14:52:00 crc kubenswrapper[4898]: I0313 14:52:00.215259 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 14:52:00 crc kubenswrapper[4898]: I0313 14:52:00.215492 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps" Mar 13 14:52:00 crc kubenswrapper[4898]: I0313 14:52:00.217759 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556892-gg8mm"] Mar 13 14:52:00 crc kubenswrapper[4898]: I0313 14:52:00.316001 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wt7z2\" (UniqueName: \"kubernetes.io/projected/bba89630-e09c-4d6d-b7c3-89aecad3889f-kube-api-access-wt7z2\") pod \"auto-csr-approver-29556892-gg8mm\" (UID: \"bba89630-e09c-4d6d-b7c3-89aecad3889f\") " pod="openshift-infra/auto-csr-approver-29556892-gg8mm" Mar 13 14:52:00 crc kubenswrapper[4898]: I0313 14:52:00.419137 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wt7z2\" (UniqueName: \"kubernetes.io/projected/bba89630-e09c-4d6d-b7c3-89aecad3889f-kube-api-access-wt7z2\") pod \"auto-csr-approver-29556892-gg8mm\" (UID: \"bba89630-e09c-4d6d-b7c3-89aecad3889f\") " pod="openshift-infra/auto-csr-approver-29556892-gg8mm" Mar 13 14:52:00 crc kubenswrapper[4898]: I0313 14:52:00.446738 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wt7z2\" (UniqueName: \"kubernetes.io/projected/bba89630-e09c-4d6d-b7c3-89aecad3889f-kube-api-access-wt7z2\") pod \"auto-csr-approver-29556892-gg8mm\" (UID: \"bba89630-e09c-4d6d-b7c3-89aecad3889f\") " 
pod="openshift-infra/auto-csr-approver-29556892-gg8mm" Mar 13 14:52:00 crc kubenswrapper[4898]: I0313 14:52:00.560965 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556892-gg8mm" Mar 13 14:52:01 crc kubenswrapper[4898]: I0313 14:52:01.020398 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556892-gg8mm"] Mar 13 14:52:01 crc kubenswrapper[4898]: I0313 14:52:01.336700 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556892-gg8mm" event={"ID":"bba89630-e09c-4d6d-b7c3-89aecad3889f","Type":"ContainerStarted","Data":"7d7b019ddbf4d1930bed24662580f143f3fdb04fdd12047fb71e8598751eaea3"} Mar 13 14:52:03 crc kubenswrapper[4898]: I0313 14:52:03.364260 4898 generic.go:334] "Generic (PLEG): container finished" podID="bba89630-e09c-4d6d-b7c3-89aecad3889f" containerID="168730dbf03823de4c3d331997904d8f68ac2eebf4c5ae0fee8108df4bd5aa88" exitCode=0 Mar 13 14:52:03 crc kubenswrapper[4898]: I0313 14:52:03.364488 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556892-gg8mm" event={"ID":"bba89630-e09c-4d6d-b7c3-89aecad3889f","Type":"ContainerDied","Data":"168730dbf03823de4c3d331997904d8f68ac2eebf4c5ae0fee8108df4bd5aa88"} Mar 13 14:52:04 crc kubenswrapper[4898]: I0313 14:52:04.922683 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556892-gg8mm" Mar 13 14:52:05 crc kubenswrapper[4898]: I0313 14:52:05.105200 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wt7z2\" (UniqueName: \"kubernetes.io/projected/bba89630-e09c-4d6d-b7c3-89aecad3889f-kube-api-access-wt7z2\") pod \"bba89630-e09c-4d6d-b7c3-89aecad3889f\" (UID: \"bba89630-e09c-4d6d-b7c3-89aecad3889f\") " Mar 13 14:52:05 crc kubenswrapper[4898]: I0313 14:52:05.112385 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bba89630-e09c-4d6d-b7c3-89aecad3889f-kube-api-access-wt7z2" (OuterVolumeSpecName: "kube-api-access-wt7z2") pod "bba89630-e09c-4d6d-b7c3-89aecad3889f" (UID: "bba89630-e09c-4d6d-b7c3-89aecad3889f"). InnerVolumeSpecName "kube-api-access-wt7z2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:52:05 crc kubenswrapper[4898]: I0313 14:52:05.210228 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wt7z2\" (UniqueName: \"kubernetes.io/projected/bba89630-e09c-4d6d-b7c3-89aecad3889f-kube-api-access-wt7z2\") on node \"crc\" DevicePath \"\"" Mar 13 14:52:05 crc kubenswrapper[4898]: I0313 14:52:05.389527 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556892-gg8mm" event={"ID":"bba89630-e09c-4d6d-b7c3-89aecad3889f","Type":"ContainerDied","Data":"7d7b019ddbf4d1930bed24662580f143f3fdb04fdd12047fb71e8598751eaea3"} Mar 13 14:52:05 crc kubenswrapper[4898]: I0313 14:52:05.389584 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d7b019ddbf4d1930bed24662580f143f3fdb04fdd12047fb71e8598751eaea3" Mar 13 14:52:05 crc kubenswrapper[4898]: I0313 14:52:05.389632 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556892-gg8mm" Mar 13 14:52:06 crc kubenswrapper[4898]: I0313 14:52:06.006859 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556886-vxgm7"] Mar 13 14:52:06 crc kubenswrapper[4898]: I0313 14:52:06.020492 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556886-vxgm7"] Mar 13 14:52:07 crc kubenswrapper[4898]: I0313 14:52:07.763442 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e6f3996-1b26-4a53-8c2d-f74aa89ef944" path="/var/lib/kubelet/pods/7e6f3996-1b26-4a53-8c2d-f74aa89ef944/volumes" Mar 13 14:52:41 crc kubenswrapper[4898]: I0313 14:52:41.941354 4898 scope.go:117] "RemoveContainer" containerID="c50d215e83c79e2c4ba98ed3d21204bd33821f4c3e54e5173055aa94bae3e42e" Mar 13 14:52:49 crc kubenswrapper[4898]: I0313 14:52:49.134413 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 14:52:49 crc kubenswrapper[4898]: I0313 14:52:49.135052 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 14:53:19 crc kubenswrapper[4898]: I0313 14:53:19.134181 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 14:53:19 crc kubenswrapper[4898]: 
I0313 14:53:19.134622 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 14:53:49 crc kubenswrapper[4898]: I0313 14:53:49.134676 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 14:53:49 crc kubenswrapper[4898]: I0313 14:53:49.135294 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 14:53:49 crc kubenswrapper[4898]: I0313 14:53:49.135351 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" Mar 13 14:53:49 crc kubenswrapper[4898]: I0313 14:53:49.136133 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"14bad097122012834e04d7733e2c56b7cb22ecdcbbe4dbb2cc5f1822098d6e46"} pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 14:53:49 crc kubenswrapper[4898]: I0313 14:53:49.136204 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" 
containerName="machine-config-daemon" containerID="cri-o://14bad097122012834e04d7733e2c56b7cb22ecdcbbe4dbb2cc5f1822098d6e46" gracePeriod=600 Mar 13 14:53:49 crc kubenswrapper[4898]: I0313 14:53:49.698513 4898 generic.go:334] "Generic (PLEG): container finished" podID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerID="14bad097122012834e04d7733e2c56b7cb22ecdcbbe4dbb2cc5f1822098d6e46" exitCode=0 Mar 13 14:53:49 crc kubenswrapper[4898]: I0313 14:53:49.698586 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerDied","Data":"14bad097122012834e04d7733e2c56b7cb22ecdcbbe4dbb2cc5f1822098d6e46"} Mar 13 14:53:49 crc kubenswrapper[4898]: I0313 14:53:49.699704 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerStarted","Data":"e2755f8bb613c4ca067aa4bb55241b92843fd6eb20b3661f1bc3e09f0a4a33da"} Mar 13 14:53:49 crc kubenswrapper[4898]: I0313 14:53:49.699752 4898 scope.go:117] "RemoveContainer" containerID="3380a5581d362c5b3dc0ed7955d410611389b7b46492a3c8939f1604d630d7f5" Mar 13 14:54:00 crc kubenswrapper[4898]: I0313 14:54:00.163989 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556894-4f5n2"] Mar 13 14:54:00 crc kubenswrapper[4898]: E0313 14:54:00.165582 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bba89630-e09c-4d6d-b7c3-89aecad3889f" containerName="oc" Mar 13 14:54:00 crc kubenswrapper[4898]: I0313 14:54:00.165779 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="bba89630-e09c-4d6d-b7c3-89aecad3889f" containerName="oc" Mar 13 14:54:00 crc kubenswrapper[4898]: I0313 14:54:00.166408 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="bba89630-e09c-4d6d-b7c3-89aecad3889f" containerName="oc" Mar 13 14:54:00 
crc kubenswrapper[4898]: I0313 14:54:00.167883 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556894-4f5n2" Mar 13 14:54:00 crc kubenswrapper[4898]: I0313 14:54:00.170443 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps" Mar 13 14:54:00 crc kubenswrapper[4898]: I0313 14:54:00.170613 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 14:54:00 crc kubenswrapper[4898]: I0313 14:54:00.173243 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 14:54:00 crc kubenswrapper[4898]: I0313 14:54:00.183572 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556894-4f5n2"] Mar 13 14:54:00 crc kubenswrapper[4898]: I0313 14:54:00.243706 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jx9w\" (UniqueName: \"kubernetes.io/projected/d9bfc1e4-be1f-4495-a7de-2b4f94e901d8-kube-api-access-2jx9w\") pod \"auto-csr-approver-29556894-4f5n2\" (UID: \"d9bfc1e4-be1f-4495-a7de-2b4f94e901d8\") " pod="openshift-infra/auto-csr-approver-29556894-4f5n2" Mar 13 14:54:00 crc kubenswrapper[4898]: I0313 14:54:00.346091 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jx9w\" (UniqueName: \"kubernetes.io/projected/d9bfc1e4-be1f-4495-a7de-2b4f94e901d8-kube-api-access-2jx9w\") pod \"auto-csr-approver-29556894-4f5n2\" (UID: \"d9bfc1e4-be1f-4495-a7de-2b4f94e901d8\") " pod="openshift-infra/auto-csr-approver-29556894-4f5n2" Mar 13 14:54:00 crc kubenswrapper[4898]: I0313 14:54:00.370800 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jx9w\" (UniqueName: \"kubernetes.io/projected/d9bfc1e4-be1f-4495-a7de-2b4f94e901d8-kube-api-access-2jx9w\") 
pod \"auto-csr-approver-29556894-4f5n2\" (UID: \"d9bfc1e4-be1f-4495-a7de-2b4f94e901d8\") " pod="openshift-infra/auto-csr-approver-29556894-4f5n2" Mar 13 14:54:00 crc kubenswrapper[4898]: I0313 14:54:00.498309 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556894-4f5n2" Mar 13 14:54:01 crc kubenswrapper[4898]: I0313 14:54:01.038421 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556894-4f5n2"] Mar 13 14:54:01 crc kubenswrapper[4898]: I0313 14:54:01.897677 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556894-4f5n2" event={"ID":"d9bfc1e4-be1f-4495-a7de-2b4f94e901d8","Type":"ContainerStarted","Data":"9b2e72bacd3af95a2707b1008042c42f07b142a4c03b6c85b2fb65b533aed3e6"} Mar 13 14:54:02 crc kubenswrapper[4898]: I0313 14:54:02.907808 4898 generic.go:334] "Generic (PLEG): container finished" podID="d9bfc1e4-be1f-4495-a7de-2b4f94e901d8" containerID="508333f96d89e6fb34c9fb0fe392b0bcdb91535cabc45655a886e6d88f90fef5" exitCode=0 Mar 13 14:54:02 crc kubenswrapper[4898]: I0313 14:54:02.907960 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556894-4f5n2" event={"ID":"d9bfc1e4-be1f-4495-a7de-2b4f94e901d8","Type":"ContainerDied","Data":"508333f96d89e6fb34c9fb0fe392b0bcdb91535cabc45655a886e6d88f90fef5"} Mar 13 14:54:04 crc kubenswrapper[4898]: I0313 14:54:04.395868 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556894-4f5n2" Mar 13 14:54:04 crc kubenswrapper[4898]: I0313 14:54:04.448206 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jx9w\" (UniqueName: \"kubernetes.io/projected/d9bfc1e4-be1f-4495-a7de-2b4f94e901d8-kube-api-access-2jx9w\") pod \"d9bfc1e4-be1f-4495-a7de-2b4f94e901d8\" (UID: \"d9bfc1e4-be1f-4495-a7de-2b4f94e901d8\") " Mar 13 14:54:04 crc kubenswrapper[4898]: I0313 14:54:04.467322 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9bfc1e4-be1f-4495-a7de-2b4f94e901d8-kube-api-access-2jx9w" (OuterVolumeSpecName: "kube-api-access-2jx9w") pod "d9bfc1e4-be1f-4495-a7de-2b4f94e901d8" (UID: "d9bfc1e4-be1f-4495-a7de-2b4f94e901d8"). InnerVolumeSpecName "kube-api-access-2jx9w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:54:04 crc kubenswrapper[4898]: I0313 14:54:04.556178 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jx9w\" (UniqueName: \"kubernetes.io/projected/d9bfc1e4-be1f-4495-a7de-2b4f94e901d8-kube-api-access-2jx9w\") on node \"crc\" DevicePath \"\"" Mar 13 14:54:04 crc kubenswrapper[4898]: I0313 14:54:04.935885 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556894-4f5n2" event={"ID":"d9bfc1e4-be1f-4495-a7de-2b4f94e901d8","Type":"ContainerDied","Data":"9b2e72bacd3af95a2707b1008042c42f07b142a4c03b6c85b2fb65b533aed3e6"} Mar 13 14:54:04 crc kubenswrapper[4898]: I0313 14:54:04.935981 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b2e72bacd3af95a2707b1008042c42f07b142a4c03b6c85b2fb65b533aed3e6" Mar 13 14:54:04 crc kubenswrapper[4898]: I0313 14:54:04.935999 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556894-4f5n2" Mar 13 14:54:05 crc kubenswrapper[4898]: I0313 14:54:05.467060 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556888-gn68v"] Mar 13 14:54:05 crc kubenswrapper[4898]: I0313 14:54:05.475326 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556888-gn68v"] Mar 13 14:54:05 crc kubenswrapper[4898]: I0313 14:54:05.765726 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2813f8b3-81f9-48a3-9a55-173dead5d7a7" path="/var/lib/kubelet/pods/2813f8b3-81f9-48a3-9a55-173dead5d7a7/volumes" Mar 13 14:54:32 crc kubenswrapper[4898]: I0313 14:54:32.803714 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kg5bc"] Mar 13 14:54:32 crc kubenswrapper[4898]: E0313 14:54:32.805012 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9bfc1e4-be1f-4495-a7de-2b4f94e901d8" containerName="oc" Mar 13 14:54:32 crc kubenswrapper[4898]: I0313 14:54:32.805033 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9bfc1e4-be1f-4495-a7de-2b4f94e901d8" containerName="oc" Mar 13 14:54:32 crc kubenswrapper[4898]: I0313 14:54:32.805319 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9bfc1e4-be1f-4495-a7de-2b4f94e901d8" containerName="oc" Mar 13 14:54:32 crc kubenswrapper[4898]: I0313 14:54:32.807473 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kg5bc" Mar 13 14:54:32 crc kubenswrapper[4898]: I0313 14:54:32.816167 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kg5bc"] Mar 13 14:54:32 crc kubenswrapper[4898]: I0313 14:54:32.991282 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2447a834-934b-4e95-a373-2f98aa976716-utilities\") pod \"redhat-operators-kg5bc\" (UID: \"2447a834-934b-4e95-a373-2f98aa976716\") " pod="openshift-marketplace/redhat-operators-kg5bc" Mar 13 14:54:32 crc kubenswrapper[4898]: I0313 14:54:32.992065 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2447a834-934b-4e95-a373-2f98aa976716-catalog-content\") pod \"redhat-operators-kg5bc\" (UID: \"2447a834-934b-4e95-a373-2f98aa976716\") " pod="openshift-marketplace/redhat-operators-kg5bc" Mar 13 14:54:32 crc kubenswrapper[4898]: I0313 14:54:32.992212 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfgvn\" (UniqueName: \"kubernetes.io/projected/2447a834-934b-4e95-a373-2f98aa976716-kube-api-access-dfgvn\") pod \"redhat-operators-kg5bc\" (UID: \"2447a834-934b-4e95-a373-2f98aa976716\") " pod="openshift-marketplace/redhat-operators-kg5bc" Mar 13 14:54:33 crc kubenswrapper[4898]: I0313 14:54:33.094325 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2447a834-934b-4e95-a373-2f98aa976716-catalog-content\") pod \"redhat-operators-kg5bc\" (UID: \"2447a834-934b-4e95-a373-2f98aa976716\") " pod="openshift-marketplace/redhat-operators-kg5bc" Mar 13 14:54:33 crc kubenswrapper[4898]: I0313 14:54:33.094711 4898 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-dfgvn\" (UniqueName: \"kubernetes.io/projected/2447a834-934b-4e95-a373-2f98aa976716-kube-api-access-dfgvn\") pod \"redhat-operators-kg5bc\" (UID: \"2447a834-934b-4e95-a373-2f98aa976716\") " pod="openshift-marketplace/redhat-operators-kg5bc" Mar 13 14:54:33 crc kubenswrapper[4898]: I0313 14:54:33.094822 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2447a834-934b-4e95-a373-2f98aa976716-utilities\") pod \"redhat-operators-kg5bc\" (UID: \"2447a834-934b-4e95-a373-2f98aa976716\") " pod="openshift-marketplace/redhat-operators-kg5bc" Mar 13 14:54:33 crc kubenswrapper[4898]: I0313 14:54:33.094980 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2447a834-934b-4e95-a373-2f98aa976716-catalog-content\") pod \"redhat-operators-kg5bc\" (UID: \"2447a834-934b-4e95-a373-2f98aa976716\") " pod="openshift-marketplace/redhat-operators-kg5bc" Mar 13 14:54:33 crc kubenswrapper[4898]: I0313 14:54:33.095484 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2447a834-934b-4e95-a373-2f98aa976716-utilities\") pod \"redhat-operators-kg5bc\" (UID: \"2447a834-934b-4e95-a373-2f98aa976716\") " pod="openshift-marketplace/redhat-operators-kg5bc" Mar 13 14:54:33 crc kubenswrapper[4898]: I0313 14:54:33.116833 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfgvn\" (UniqueName: \"kubernetes.io/projected/2447a834-934b-4e95-a373-2f98aa976716-kube-api-access-dfgvn\") pod \"redhat-operators-kg5bc\" (UID: \"2447a834-934b-4e95-a373-2f98aa976716\") " pod="openshift-marketplace/redhat-operators-kg5bc" Mar 13 14:54:33 crc kubenswrapper[4898]: I0313 14:54:33.153858 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kg5bc" Mar 13 14:54:33 crc kubenswrapper[4898]: I0313 14:54:33.663307 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kg5bc"] Mar 13 14:54:34 crc kubenswrapper[4898]: I0313 14:54:34.326834 4898 generic.go:334] "Generic (PLEG): container finished" podID="2447a834-934b-4e95-a373-2f98aa976716" containerID="211c6e55c9c729300f3eeeaf49cc57df8d6100a6437b61cd51516a756fb14922" exitCode=0 Mar 13 14:54:34 crc kubenswrapper[4898]: I0313 14:54:34.326933 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kg5bc" event={"ID":"2447a834-934b-4e95-a373-2f98aa976716","Type":"ContainerDied","Data":"211c6e55c9c729300f3eeeaf49cc57df8d6100a6437b61cd51516a756fb14922"} Mar 13 14:54:34 crc kubenswrapper[4898]: I0313 14:54:34.327113 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kg5bc" event={"ID":"2447a834-934b-4e95-a373-2f98aa976716","Type":"ContainerStarted","Data":"477d5672b30264827c27894fe05766aedc024459c1dfcbc7f3e7ee7ffe289e7d"} Mar 13 14:54:34 crc kubenswrapper[4898]: I0313 14:54:34.329140 4898 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 14:54:35 crc kubenswrapper[4898]: I0313 14:54:35.340531 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kg5bc" event={"ID":"2447a834-934b-4e95-a373-2f98aa976716","Type":"ContainerStarted","Data":"f6b89b05518578a7184c50d07ceb43b7188c31361928503d2b5c6491b1d4f558"} Mar 13 14:54:39 crc kubenswrapper[4898]: I0313 14:54:39.392330 4898 generic.go:334] "Generic (PLEG): container finished" podID="2447a834-934b-4e95-a373-2f98aa976716" containerID="f6b89b05518578a7184c50d07ceb43b7188c31361928503d2b5c6491b1d4f558" exitCode=0 Mar 13 14:54:39 crc kubenswrapper[4898]: I0313 14:54:39.392715 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-kg5bc" event={"ID":"2447a834-934b-4e95-a373-2f98aa976716","Type":"ContainerDied","Data":"f6b89b05518578a7184c50d07ceb43b7188c31361928503d2b5c6491b1d4f558"} Mar 13 14:54:40 crc kubenswrapper[4898]: I0313 14:54:40.410391 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kg5bc" event={"ID":"2447a834-934b-4e95-a373-2f98aa976716","Type":"ContainerStarted","Data":"89cb8536dc0d3dd98ed9fc9c57ad307ad0709d2f015c3a7da74c48cf64ad5a80"} Mar 13 14:54:40 crc kubenswrapper[4898]: I0313 14:54:40.444735 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kg5bc" podStartSLOduration=2.946857204 podStartE2EDuration="8.444713976s" podCreationTimestamp="2026-03-13 14:54:32 +0000 UTC" firstStartedPulling="2026-03-13 14:54:34.328856108 +0000 UTC m=+3509.330444347" lastFinishedPulling="2026-03-13 14:54:39.82671287 +0000 UTC m=+3514.828301119" observedRunningTime="2026-03-13 14:54:40.430205297 +0000 UTC m=+3515.431793546" watchObservedRunningTime="2026-03-13 14:54:40.444713976 +0000 UTC m=+3515.446302225" Mar 13 14:54:42 crc kubenswrapper[4898]: I0313 14:54:42.107395 4898 scope.go:117] "RemoveContainer" containerID="916c6a5f721d9ac559010e8682bf9fc9eec602069822f7f9c7c610579bc40a49" Mar 13 14:54:42 crc kubenswrapper[4898]: I0313 14:54:42.977125 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4b5mh"] Mar 13 14:54:42 crc kubenswrapper[4898]: I0313 14:54:42.980550 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4b5mh" Mar 13 14:54:42 crc kubenswrapper[4898]: I0313 14:54:42.989503 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4b5mh"] Mar 13 14:54:43 crc kubenswrapper[4898]: I0313 14:54:43.154755 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kg5bc" Mar 13 14:54:43 crc kubenswrapper[4898]: I0313 14:54:43.154813 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kg5bc" Mar 13 14:54:43 crc kubenswrapper[4898]: I0313 14:54:43.167284 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcsnj\" (UniqueName: \"kubernetes.io/projected/cd506070-c0f3-404f-9d20-fe9dd29cb86d-kube-api-access-wcsnj\") pod \"community-operators-4b5mh\" (UID: \"cd506070-c0f3-404f-9d20-fe9dd29cb86d\") " pod="openshift-marketplace/community-operators-4b5mh" Mar 13 14:54:43 crc kubenswrapper[4898]: I0313 14:54:43.167359 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd506070-c0f3-404f-9d20-fe9dd29cb86d-catalog-content\") pod \"community-operators-4b5mh\" (UID: \"cd506070-c0f3-404f-9d20-fe9dd29cb86d\") " pod="openshift-marketplace/community-operators-4b5mh" Mar 13 14:54:43 crc kubenswrapper[4898]: I0313 14:54:43.167400 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd506070-c0f3-404f-9d20-fe9dd29cb86d-utilities\") pod \"community-operators-4b5mh\" (UID: \"cd506070-c0f3-404f-9d20-fe9dd29cb86d\") " pod="openshift-marketplace/community-operators-4b5mh" Mar 13 14:54:43 crc kubenswrapper[4898]: I0313 14:54:43.269647 4898 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd506070-c0f3-404f-9d20-fe9dd29cb86d-catalog-content\") pod \"community-operators-4b5mh\" (UID: \"cd506070-c0f3-404f-9d20-fe9dd29cb86d\") " pod="openshift-marketplace/community-operators-4b5mh" Mar 13 14:54:43 crc kubenswrapper[4898]: I0313 14:54:43.270070 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd506070-c0f3-404f-9d20-fe9dd29cb86d-catalog-content\") pod \"community-operators-4b5mh\" (UID: \"cd506070-c0f3-404f-9d20-fe9dd29cb86d\") " pod="openshift-marketplace/community-operators-4b5mh" Mar 13 14:54:43 crc kubenswrapper[4898]: I0313 14:54:43.270366 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd506070-c0f3-404f-9d20-fe9dd29cb86d-utilities\") pod \"community-operators-4b5mh\" (UID: \"cd506070-c0f3-404f-9d20-fe9dd29cb86d\") " pod="openshift-marketplace/community-operators-4b5mh" Mar 13 14:54:43 crc kubenswrapper[4898]: I0313 14:54:43.270160 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd506070-c0f3-404f-9d20-fe9dd29cb86d-utilities\") pod \"community-operators-4b5mh\" (UID: \"cd506070-c0f3-404f-9d20-fe9dd29cb86d\") " pod="openshift-marketplace/community-operators-4b5mh" Mar 13 14:54:43 crc kubenswrapper[4898]: I0313 14:54:43.271549 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcsnj\" (UniqueName: \"kubernetes.io/projected/cd506070-c0f3-404f-9d20-fe9dd29cb86d-kube-api-access-wcsnj\") pod \"community-operators-4b5mh\" (UID: \"cd506070-c0f3-404f-9d20-fe9dd29cb86d\") " pod="openshift-marketplace/community-operators-4b5mh" Mar 13 14:54:43 crc kubenswrapper[4898]: I0313 14:54:43.294725 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcsnj\" 
(UniqueName: \"kubernetes.io/projected/cd506070-c0f3-404f-9d20-fe9dd29cb86d-kube-api-access-wcsnj\") pod \"community-operators-4b5mh\" (UID: \"cd506070-c0f3-404f-9d20-fe9dd29cb86d\") " pod="openshift-marketplace/community-operators-4b5mh" Mar 13 14:54:43 crc kubenswrapper[4898]: I0313 14:54:43.307998 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4b5mh" Mar 13 14:54:43 crc kubenswrapper[4898]: I0313 14:54:43.865049 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4b5mh"] Mar 13 14:54:44 crc kubenswrapper[4898]: I0313 14:54:44.206491 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kg5bc" podUID="2447a834-934b-4e95-a373-2f98aa976716" containerName="registry-server" probeResult="failure" output=< Mar 13 14:54:44 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 14:54:44 crc kubenswrapper[4898]: > Mar 13 14:54:44 crc kubenswrapper[4898]: I0313 14:54:44.452923 4898 generic.go:334] "Generic (PLEG): container finished" podID="cd506070-c0f3-404f-9d20-fe9dd29cb86d" containerID="b2cddcf430a5d8bfc6ff84fa02a0b4ab44edbd4aad2d2e2fd193c7067b3a67f9" exitCode=0 Mar 13 14:54:44 crc kubenswrapper[4898]: I0313 14:54:44.453013 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4b5mh" event={"ID":"cd506070-c0f3-404f-9d20-fe9dd29cb86d","Type":"ContainerDied","Data":"b2cddcf430a5d8bfc6ff84fa02a0b4ab44edbd4aad2d2e2fd193c7067b3a67f9"} Mar 13 14:54:44 crc kubenswrapper[4898]: I0313 14:54:44.453263 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4b5mh" event={"ID":"cd506070-c0f3-404f-9d20-fe9dd29cb86d","Type":"ContainerStarted","Data":"923a3f0dc6bb0acd6ca081740083507d0787a55d0525314456f8d61b9d354939"} Mar 13 14:54:45 crc kubenswrapper[4898]: I0313 14:54:45.467259 4898 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4b5mh" event={"ID":"cd506070-c0f3-404f-9d20-fe9dd29cb86d","Type":"ContainerStarted","Data":"e32e7cc94c1deeabc6f4817c936e65f006d0e3daea1cc97a8088c99bc717b8d8"} Mar 13 14:54:47 crc kubenswrapper[4898]: I0313 14:54:47.502651 4898 generic.go:334] "Generic (PLEG): container finished" podID="cd506070-c0f3-404f-9d20-fe9dd29cb86d" containerID="e32e7cc94c1deeabc6f4817c936e65f006d0e3daea1cc97a8088c99bc717b8d8" exitCode=0 Mar 13 14:54:47 crc kubenswrapper[4898]: I0313 14:54:47.502745 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4b5mh" event={"ID":"cd506070-c0f3-404f-9d20-fe9dd29cb86d","Type":"ContainerDied","Data":"e32e7cc94c1deeabc6f4817c936e65f006d0e3daea1cc97a8088c99bc717b8d8"} Mar 13 14:54:48 crc kubenswrapper[4898]: I0313 14:54:48.516667 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4b5mh" event={"ID":"cd506070-c0f3-404f-9d20-fe9dd29cb86d","Type":"ContainerStarted","Data":"5dfed7035501ce3768e690b6d346aa74c4e01f07dd784c02a16c24bb8ef0bb4c"} Mar 13 14:54:48 crc kubenswrapper[4898]: I0313 14:54:48.542325 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4b5mh" podStartSLOduration=2.83453774 podStartE2EDuration="6.542304109s" podCreationTimestamp="2026-03-13 14:54:42 +0000 UTC" firstStartedPulling="2026-03-13 14:54:44.456064264 +0000 UTC m=+3519.457652513" lastFinishedPulling="2026-03-13 14:54:48.163830633 +0000 UTC m=+3523.165418882" observedRunningTime="2026-03-13 14:54:48.537307445 +0000 UTC m=+3523.538895704" watchObservedRunningTime="2026-03-13 14:54:48.542304109 +0000 UTC m=+3523.543892348" Mar 13 14:54:53 crc kubenswrapper[4898]: I0313 14:54:53.309187 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4b5mh" Mar 13 14:54:53 crc 
kubenswrapper[4898]: I0313 14:54:53.309727 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4b5mh" Mar 13 14:54:53 crc kubenswrapper[4898]: I0313 14:54:53.383157 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4b5mh" Mar 13 14:54:53 crc kubenswrapper[4898]: I0313 14:54:53.632418 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4b5mh" Mar 13 14:54:54 crc kubenswrapper[4898]: I0313 14:54:54.251336 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kg5bc" podUID="2447a834-934b-4e95-a373-2f98aa976716" containerName="registry-server" probeResult="failure" output=< Mar 13 14:54:54 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 14:54:54 crc kubenswrapper[4898]: > Mar 13 14:54:54 crc kubenswrapper[4898]: I0313 14:54:54.545916 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4b5mh"] Mar 13 14:54:55 crc kubenswrapper[4898]: I0313 14:54:55.620436 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4b5mh" podUID="cd506070-c0f3-404f-9d20-fe9dd29cb86d" containerName="registry-server" containerID="cri-o://5dfed7035501ce3768e690b6d346aa74c4e01f07dd784c02a16c24bb8ef0bb4c" gracePeriod=2 Mar 13 14:54:56 crc kubenswrapper[4898]: I0313 14:54:56.219380 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4b5mh" Mar 13 14:54:56 crc kubenswrapper[4898]: I0313 14:54:56.393102 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wcsnj\" (UniqueName: \"kubernetes.io/projected/cd506070-c0f3-404f-9d20-fe9dd29cb86d-kube-api-access-wcsnj\") pod \"cd506070-c0f3-404f-9d20-fe9dd29cb86d\" (UID: \"cd506070-c0f3-404f-9d20-fe9dd29cb86d\") " Mar 13 14:54:56 crc kubenswrapper[4898]: I0313 14:54:56.393542 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd506070-c0f3-404f-9d20-fe9dd29cb86d-catalog-content\") pod \"cd506070-c0f3-404f-9d20-fe9dd29cb86d\" (UID: \"cd506070-c0f3-404f-9d20-fe9dd29cb86d\") " Mar 13 14:54:56 crc kubenswrapper[4898]: I0313 14:54:56.393681 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd506070-c0f3-404f-9d20-fe9dd29cb86d-utilities\") pod \"cd506070-c0f3-404f-9d20-fe9dd29cb86d\" (UID: \"cd506070-c0f3-404f-9d20-fe9dd29cb86d\") " Mar 13 14:54:56 crc kubenswrapper[4898]: I0313 14:54:56.394886 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd506070-c0f3-404f-9d20-fe9dd29cb86d-utilities" (OuterVolumeSpecName: "utilities") pod "cd506070-c0f3-404f-9d20-fe9dd29cb86d" (UID: "cd506070-c0f3-404f-9d20-fe9dd29cb86d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:54:56 crc kubenswrapper[4898]: I0313 14:54:56.406950 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd506070-c0f3-404f-9d20-fe9dd29cb86d-kube-api-access-wcsnj" (OuterVolumeSpecName: "kube-api-access-wcsnj") pod "cd506070-c0f3-404f-9d20-fe9dd29cb86d" (UID: "cd506070-c0f3-404f-9d20-fe9dd29cb86d"). InnerVolumeSpecName "kube-api-access-wcsnj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:54:56 crc kubenswrapper[4898]: I0313 14:54:56.452678 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd506070-c0f3-404f-9d20-fe9dd29cb86d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cd506070-c0f3-404f-9d20-fe9dd29cb86d" (UID: "cd506070-c0f3-404f-9d20-fe9dd29cb86d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:54:56 crc kubenswrapper[4898]: I0313 14:54:56.496321 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wcsnj\" (UniqueName: \"kubernetes.io/projected/cd506070-c0f3-404f-9d20-fe9dd29cb86d-kube-api-access-wcsnj\") on node \"crc\" DevicePath \"\"" Mar 13 14:54:56 crc kubenswrapper[4898]: I0313 14:54:56.496550 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd506070-c0f3-404f-9d20-fe9dd29cb86d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 14:54:56 crc kubenswrapper[4898]: I0313 14:54:56.496583 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd506070-c0f3-404f-9d20-fe9dd29cb86d-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 14:54:56 crc kubenswrapper[4898]: I0313 14:54:56.631533 4898 generic.go:334] "Generic (PLEG): container finished" podID="cd506070-c0f3-404f-9d20-fe9dd29cb86d" containerID="5dfed7035501ce3768e690b6d346aa74c4e01f07dd784c02a16c24bb8ef0bb4c" exitCode=0 Mar 13 14:54:56 crc kubenswrapper[4898]: I0313 14:54:56.632602 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4b5mh" event={"ID":"cd506070-c0f3-404f-9d20-fe9dd29cb86d","Type":"ContainerDied","Data":"5dfed7035501ce3768e690b6d346aa74c4e01f07dd784c02a16c24bb8ef0bb4c"} Mar 13 14:54:56 crc kubenswrapper[4898]: I0313 14:54:56.632694 4898 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-4b5mh" event={"ID":"cd506070-c0f3-404f-9d20-fe9dd29cb86d","Type":"ContainerDied","Data":"923a3f0dc6bb0acd6ca081740083507d0787a55d0525314456f8d61b9d354939"} Mar 13 14:54:56 crc kubenswrapper[4898]: I0313 14:54:56.632765 4898 scope.go:117] "RemoveContainer" containerID="5dfed7035501ce3768e690b6d346aa74c4e01f07dd784c02a16c24bb8ef0bb4c" Mar 13 14:54:56 crc kubenswrapper[4898]: I0313 14:54:56.632954 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4b5mh" Mar 13 14:54:56 crc kubenswrapper[4898]: I0313 14:54:56.678152 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4b5mh"] Mar 13 14:54:56 crc kubenswrapper[4898]: I0313 14:54:56.682421 4898 scope.go:117] "RemoveContainer" containerID="e32e7cc94c1deeabc6f4817c936e65f006d0e3daea1cc97a8088c99bc717b8d8" Mar 13 14:54:56 crc kubenswrapper[4898]: I0313 14:54:56.687913 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4b5mh"] Mar 13 14:54:56 crc kubenswrapper[4898]: I0313 14:54:56.710070 4898 scope.go:117] "RemoveContainer" containerID="b2cddcf430a5d8bfc6ff84fa02a0b4ab44edbd4aad2d2e2fd193c7067b3a67f9" Mar 13 14:54:56 crc kubenswrapper[4898]: I0313 14:54:56.782792 4898 scope.go:117] "RemoveContainer" containerID="5dfed7035501ce3768e690b6d346aa74c4e01f07dd784c02a16c24bb8ef0bb4c" Mar 13 14:54:56 crc kubenswrapper[4898]: E0313 14:54:56.783302 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5dfed7035501ce3768e690b6d346aa74c4e01f07dd784c02a16c24bb8ef0bb4c\": container with ID starting with 5dfed7035501ce3768e690b6d346aa74c4e01f07dd784c02a16c24bb8ef0bb4c not found: ID does not exist" containerID="5dfed7035501ce3768e690b6d346aa74c4e01f07dd784c02a16c24bb8ef0bb4c" Mar 13 14:54:56 crc kubenswrapper[4898]: I0313 
14:54:56.783383 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5dfed7035501ce3768e690b6d346aa74c4e01f07dd784c02a16c24bb8ef0bb4c"} err="failed to get container status \"5dfed7035501ce3768e690b6d346aa74c4e01f07dd784c02a16c24bb8ef0bb4c\": rpc error: code = NotFound desc = could not find container \"5dfed7035501ce3768e690b6d346aa74c4e01f07dd784c02a16c24bb8ef0bb4c\": container with ID starting with 5dfed7035501ce3768e690b6d346aa74c4e01f07dd784c02a16c24bb8ef0bb4c not found: ID does not exist" Mar 13 14:54:56 crc kubenswrapper[4898]: I0313 14:54:56.783409 4898 scope.go:117] "RemoveContainer" containerID="e32e7cc94c1deeabc6f4817c936e65f006d0e3daea1cc97a8088c99bc717b8d8" Mar 13 14:54:56 crc kubenswrapper[4898]: E0313 14:54:56.783700 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e32e7cc94c1deeabc6f4817c936e65f006d0e3daea1cc97a8088c99bc717b8d8\": container with ID starting with e32e7cc94c1deeabc6f4817c936e65f006d0e3daea1cc97a8088c99bc717b8d8 not found: ID does not exist" containerID="e32e7cc94c1deeabc6f4817c936e65f006d0e3daea1cc97a8088c99bc717b8d8" Mar 13 14:54:56 crc kubenswrapper[4898]: I0313 14:54:56.783718 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e32e7cc94c1deeabc6f4817c936e65f006d0e3daea1cc97a8088c99bc717b8d8"} err="failed to get container status \"e32e7cc94c1deeabc6f4817c936e65f006d0e3daea1cc97a8088c99bc717b8d8\": rpc error: code = NotFound desc = could not find container \"e32e7cc94c1deeabc6f4817c936e65f006d0e3daea1cc97a8088c99bc717b8d8\": container with ID starting with e32e7cc94c1deeabc6f4817c936e65f006d0e3daea1cc97a8088c99bc717b8d8 not found: ID does not exist" Mar 13 14:54:56 crc kubenswrapper[4898]: I0313 14:54:56.783730 4898 scope.go:117] "RemoveContainer" containerID="b2cddcf430a5d8bfc6ff84fa02a0b4ab44edbd4aad2d2e2fd193c7067b3a67f9" Mar 13 14:54:56 crc 
kubenswrapper[4898]: E0313 14:54:56.785105 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2cddcf430a5d8bfc6ff84fa02a0b4ab44edbd4aad2d2e2fd193c7067b3a67f9\": container with ID starting with b2cddcf430a5d8bfc6ff84fa02a0b4ab44edbd4aad2d2e2fd193c7067b3a67f9 not found: ID does not exist" containerID="b2cddcf430a5d8bfc6ff84fa02a0b4ab44edbd4aad2d2e2fd193c7067b3a67f9" Mar 13 14:54:56 crc kubenswrapper[4898]: I0313 14:54:56.785166 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2cddcf430a5d8bfc6ff84fa02a0b4ab44edbd4aad2d2e2fd193c7067b3a67f9"} err="failed to get container status \"b2cddcf430a5d8bfc6ff84fa02a0b4ab44edbd4aad2d2e2fd193c7067b3a67f9\": rpc error: code = NotFound desc = could not find container \"b2cddcf430a5d8bfc6ff84fa02a0b4ab44edbd4aad2d2e2fd193c7067b3a67f9\": container with ID starting with b2cddcf430a5d8bfc6ff84fa02a0b4ab44edbd4aad2d2e2fd193c7067b3a67f9 not found: ID does not exist" Mar 13 14:54:57 crc kubenswrapper[4898]: I0313 14:54:57.757747 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd506070-c0f3-404f-9d20-fe9dd29cb86d" path="/var/lib/kubelet/pods/cd506070-c0f3-404f-9d20-fe9dd29cb86d/volumes" Mar 13 14:55:03 crc kubenswrapper[4898]: I0313 14:55:03.224593 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kg5bc" Mar 13 14:55:03 crc kubenswrapper[4898]: I0313 14:55:03.294152 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kg5bc" Mar 13 14:55:04 crc kubenswrapper[4898]: I0313 14:55:04.007340 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kg5bc"] Mar 13 14:55:04 crc kubenswrapper[4898]: I0313 14:55:04.731013 4898 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-operators-kg5bc" podUID="2447a834-934b-4e95-a373-2f98aa976716" containerName="registry-server" containerID="cri-o://89cb8536dc0d3dd98ed9fc9c57ad307ad0709d2f015c3a7da74c48cf64ad5a80" gracePeriod=2 Mar 13 14:55:05 crc kubenswrapper[4898]: I0313 14:55:05.217096 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kg5bc" Mar 13 14:55:05 crc kubenswrapper[4898]: I0313 14:55:05.273738 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2447a834-934b-4e95-a373-2f98aa976716-catalog-content\") pod \"2447a834-934b-4e95-a373-2f98aa976716\" (UID: \"2447a834-934b-4e95-a373-2f98aa976716\") " Mar 13 14:55:05 crc kubenswrapper[4898]: I0313 14:55:05.273876 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dfgvn\" (UniqueName: \"kubernetes.io/projected/2447a834-934b-4e95-a373-2f98aa976716-kube-api-access-dfgvn\") pod \"2447a834-934b-4e95-a373-2f98aa976716\" (UID: \"2447a834-934b-4e95-a373-2f98aa976716\") " Mar 13 14:55:05 crc kubenswrapper[4898]: I0313 14:55:05.279413 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2447a834-934b-4e95-a373-2f98aa976716-kube-api-access-dfgvn" (OuterVolumeSpecName: "kube-api-access-dfgvn") pod "2447a834-934b-4e95-a373-2f98aa976716" (UID: "2447a834-934b-4e95-a373-2f98aa976716"). InnerVolumeSpecName "kube-api-access-dfgvn". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:55:05 crc kubenswrapper[4898]: I0313 14:55:05.375881 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2447a834-934b-4e95-a373-2f98aa976716-utilities\") pod \"2447a834-934b-4e95-a373-2f98aa976716\" (UID: \"2447a834-934b-4e95-a373-2f98aa976716\") "
Mar 13 14:55:05 crc kubenswrapper[4898]: I0313 14:55:05.376575 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dfgvn\" (UniqueName: \"kubernetes.io/projected/2447a834-934b-4e95-a373-2f98aa976716-kube-api-access-dfgvn\") on node \"crc\" DevicePath \"\""
Mar 13 14:55:05 crc kubenswrapper[4898]: I0313 14:55:05.376686 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2447a834-934b-4e95-a373-2f98aa976716-utilities" (OuterVolumeSpecName: "utilities") pod "2447a834-934b-4e95-a373-2f98aa976716" (UID: "2447a834-934b-4e95-a373-2f98aa976716"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 14:55:05 crc kubenswrapper[4898]: I0313 14:55:05.415615 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2447a834-934b-4e95-a373-2f98aa976716-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2447a834-934b-4e95-a373-2f98aa976716" (UID: "2447a834-934b-4e95-a373-2f98aa976716"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 14:55:05 crc kubenswrapper[4898]: I0313 14:55:05.486410 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2447a834-934b-4e95-a373-2f98aa976716-utilities\") on node \"crc\" DevicePath \"\""
Mar 13 14:55:05 crc kubenswrapper[4898]: I0313 14:55:05.486446 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2447a834-934b-4e95-a373-2f98aa976716-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 13 14:55:05 crc kubenswrapper[4898]: I0313 14:55:05.744594 4898 generic.go:334] "Generic (PLEG): container finished" podID="2447a834-934b-4e95-a373-2f98aa976716" containerID="89cb8536dc0d3dd98ed9fc9c57ad307ad0709d2f015c3a7da74c48cf64ad5a80" exitCode=0
Mar 13 14:55:05 crc kubenswrapper[4898]: I0313 14:55:05.749967 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kg5bc"
Mar 13 14:55:05 crc kubenswrapper[4898]: I0313 14:55:05.760357 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kg5bc" event={"ID":"2447a834-934b-4e95-a373-2f98aa976716","Type":"ContainerDied","Data":"89cb8536dc0d3dd98ed9fc9c57ad307ad0709d2f015c3a7da74c48cf64ad5a80"}
Mar 13 14:55:05 crc kubenswrapper[4898]: I0313 14:55:05.760433 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kg5bc" event={"ID":"2447a834-934b-4e95-a373-2f98aa976716","Type":"ContainerDied","Data":"477d5672b30264827c27894fe05766aedc024459c1dfcbc7f3e7ee7ffe289e7d"}
Mar 13 14:55:05 crc kubenswrapper[4898]: I0313 14:55:05.760480 4898 scope.go:117] "RemoveContainer" containerID="89cb8536dc0d3dd98ed9fc9c57ad307ad0709d2f015c3a7da74c48cf64ad5a80"
Mar 13 14:55:05 crc kubenswrapper[4898]: I0313 14:55:05.804184 4898 scope.go:117] "RemoveContainer" containerID="f6b89b05518578a7184c50d07ceb43b7188c31361928503d2b5c6491b1d4f558"
Mar 13 14:55:05 crc kubenswrapper[4898]: I0313 14:55:05.847497 4898 scope.go:117] "RemoveContainer" containerID="211c6e55c9c729300f3eeeaf49cc57df8d6100a6437b61cd51516a756fb14922"
Mar 13 14:55:05 crc kubenswrapper[4898]: I0313 14:55:05.848664 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kg5bc"]
Mar 13 14:55:05 crc kubenswrapper[4898]: I0313 14:55:05.862791 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kg5bc"]
Mar 13 14:55:05 crc kubenswrapper[4898]: I0313 14:55:05.896409 4898 scope.go:117] "RemoveContainer" containerID="89cb8536dc0d3dd98ed9fc9c57ad307ad0709d2f015c3a7da74c48cf64ad5a80"
Mar 13 14:55:05 crc kubenswrapper[4898]: E0313 14:55:05.896779 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89cb8536dc0d3dd98ed9fc9c57ad307ad0709d2f015c3a7da74c48cf64ad5a80\": container with ID starting with 89cb8536dc0d3dd98ed9fc9c57ad307ad0709d2f015c3a7da74c48cf64ad5a80 not found: ID does not exist" containerID="89cb8536dc0d3dd98ed9fc9c57ad307ad0709d2f015c3a7da74c48cf64ad5a80"
Mar 13 14:55:05 crc kubenswrapper[4898]: I0313 14:55:05.896814 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89cb8536dc0d3dd98ed9fc9c57ad307ad0709d2f015c3a7da74c48cf64ad5a80"} err="failed to get container status \"89cb8536dc0d3dd98ed9fc9c57ad307ad0709d2f015c3a7da74c48cf64ad5a80\": rpc error: code = NotFound desc = could not find container \"89cb8536dc0d3dd98ed9fc9c57ad307ad0709d2f015c3a7da74c48cf64ad5a80\": container with ID starting with 89cb8536dc0d3dd98ed9fc9c57ad307ad0709d2f015c3a7da74c48cf64ad5a80 not found: ID does not exist"
Mar 13 14:55:05 crc kubenswrapper[4898]: I0313 14:55:05.896837 4898 scope.go:117] "RemoveContainer" containerID="f6b89b05518578a7184c50d07ceb43b7188c31361928503d2b5c6491b1d4f558"
Mar 13 14:55:05 crc kubenswrapper[4898]: E0313 14:55:05.897068 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6b89b05518578a7184c50d07ceb43b7188c31361928503d2b5c6491b1d4f558\": container with ID starting with f6b89b05518578a7184c50d07ceb43b7188c31361928503d2b5c6491b1d4f558 not found: ID does not exist" containerID="f6b89b05518578a7184c50d07ceb43b7188c31361928503d2b5c6491b1d4f558"
Mar 13 14:55:05 crc kubenswrapper[4898]: I0313 14:55:05.897086 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6b89b05518578a7184c50d07ceb43b7188c31361928503d2b5c6491b1d4f558"} err="failed to get container status \"f6b89b05518578a7184c50d07ceb43b7188c31361928503d2b5c6491b1d4f558\": rpc error: code = NotFound desc = could not find container \"f6b89b05518578a7184c50d07ceb43b7188c31361928503d2b5c6491b1d4f558\": container with ID starting with f6b89b05518578a7184c50d07ceb43b7188c31361928503d2b5c6491b1d4f558 not found: ID does not exist"
Mar 13 14:55:05 crc kubenswrapper[4898]: I0313 14:55:05.897097 4898 scope.go:117] "RemoveContainer" containerID="211c6e55c9c729300f3eeeaf49cc57df8d6100a6437b61cd51516a756fb14922"
Mar 13 14:55:05 crc kubenswrapper[4898]: E0313 14:55:05.898436 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"211c6e55c9c729300f3eeeaf49cc57df8d6100a6437b61cd51516a756fb14922\": container with ID starting with 211c6e55c9c729300f3eeeaf49cc57df8d6100a6437b61cd51516a756fb14922 not found: ID does not exist" containerID="211c6e55c9c729300f3eeeaf49cc57df8d6100a6437b61cd51516a756fb14922"
Mar 13 14:55:05 crc kubenswrapper[4898]: I0313 14:55:05.898456 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"211c6e55c9c729300f3eeeaf49cc57df8d6100a6437b61cd51516a756fb14922"} err="failed to get container status \"211c6e55c9c729300f3eeeaf49cc57df8d6100a6437b61cd51516a756fb14922\": rpc error: code = NotFound desc = could not find container \"211c6e55c9c729300f3eeeaf49cc57df8d6100a6437b61cd51516a756fb14922\": container with ID starting with 211c6e55c9c729300f3eeeaf49cc57df8d6100a6437b61cd51516a756fb14922 not found: ID does not exist"
Mar 13 14:55:07 crc kubenswrapper[4898]: I0313 14:55:07.766595 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2447a834-934b-4e95-a373-2f98aa976716" path="/var/lib/kubelet/pods/2447a834-934b-4e95-a373-2f98aa976716/volumes"
Mar 13 14:55:13 crc kubenswrapper[4898]: E0313 14:55:13.806561 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2447a834_934b_4e95_a373_2f98aa976716.slice/crio-477d5672b30264827c27894fe05766aedc024459c1dfcbc7f3e7ee7ffe289e7d\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2447a834_934b_4e95_a373_2f98aa976716.slice\": RecentStats: unable to find data in memory cache]"
Mar 13 14:55:14 crc kubenswrapper[4898]: E0313 14:55:14.997973 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2447a834_934b_4e95_a373_2f98aa976716.slice/crio-477d5672b30264827c27894fe05766aedc024459c1dfcbc7f3e7ee7ffe289e7d\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2447a834_934b_4e95_a373_2f98aa976716.slice\": RecentStats: unable to find data in memory cache]"
Mar 13 14:55:25 crc kubenswrapper[4898]: E0313 14:55:25.343364 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2447a834_934b_4e95_a373_2f98aa976716.slice/crio-477d5672b30264827c27894fe05766aedc024459c1dfcbc7f3e7ee7ffe289e7d\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2447a834_934b_4e95_a373_2f98aa976716.slice\": RecentStats: unable to find data in memory cache]"
Mar 13 14:55:28 crc kubenswrapper[4898]: I0313 14:55:28.507551 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sq9wv"]
Mar 13 14:55:28 crc kubenswrapper[4898]: E0313 14:55:28.509743 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2447a834-934b-4e95-a373-2f98aa976716" containerName="extract-utilities"
Mar 13 14:55:28 crc kubenswrapper[4898]: I0313 14:55:28.509797 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="2447a834-934b-4e95-a373-2f98aa976716" containerName="extract-utilities"
Mar 13 14:55:28 crc kubenswrapper[4898]: E0313 14:55:28.509883 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2447a834-934b-4e95-a373-2f98aa976716" containerName="registry-server"
Mar 13 14:55:28 crc kubenswrapper[4898]: I0313 14:55:28.509977 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="2447a834-934b-4e95-a373-2f98aa976716" containerName="registry-server"
Mar 13 14:55:28 crc kubenswrapper[4898]: E0313 14:55:28.510015 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd506070-c0f3-404f-9d20-fe9dd29cb86d" containerName="extract-content"
Mar 13 14:55:28 crc kubenswrapper[4898]: I0313 14:55:28.510034 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd506070-c0f3-404f-9d20-fe9dd29cb86d" containerName="extract-content"
Mar 13 14:55:28 crc kubenswrapper[4898]: E0313 14:55:28.510093 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2447a834-934b-4e95-a373-2f98aa976716" containerName="extract-content"
Mar 13 14:55:28 crc kubenswrapper[4898]: I0313 14:55:28.510114 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="2447a834-934b-4e95-a373-2f98aa976716" containerName="extract-content"
Mar 13 14:55:28 crc kubenswrapper[4898]: E0313 14:55:28.510140 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd506070-c0f3-404f-9d20-fe9dd29cb86d" containerName="registry-server"
Mar 13 14:55:28 crc kubenswrapper[4898]: I0313 14:55:28.510157 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd506070-c0f3-404f-9d20-fe9dd29cb86d" containerName="registry-server"
Mar 13 14:55:28 crc kubenswrapper[4898]: E0313 14:55:28.510206 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd506070-c0f3-404f-9d20-fe9dd29cb86d" containerName="extract-utilities"
Mar 13 14:55:28 crc kubenswrapper[4898]: I0313 14:55:28.510226 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd506070-c0f3-404f-9d20-fe9dd29cb86d" containerName="extract-utilities"
Mar 13 14:55:28 crc kubenswrapper[4898]: I0313 14:55:28.510861 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd506070-c0f3-404f-9d20-fe9dd29cb86d" containerName="registry-server"
Mar 13 14:55:28 crc kubenswrapper[4898]: I0313 14:55:28.510971 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="2447a834-934b-4e95-a373-2f98aa976716" containerName="registry-server"
Mar 13 14:55:28 crc kubenswrapper[4898]: I0313 14:55:28.515428 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sq9wv"
Mar 13 14:55:28 crc kubenswrapper[4898]: I0313 14:55:28.524216 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sq9wv"]
Mar 13 14:55:28 crc kubenswrapper[4898]: E0313 14:55:28.594787 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2447a834_934b_4e95_a373_2f98aa976716.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2447a834_934b_4e95_a373_2f98aa976716.slice/crio-477d5672b30264827c27894fe05766aedc024459c1dfcbc7f3e7ee7ffe289e7d\": RecentStats: unable to find data in memory cache]"
Mar 13 14:55:28 crc kubenswrapper[4898]: I0313 14:55:28.641496 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e00d9a59-c597-436d-ab88-9b3ecdf169f5-utilities\") pod \"redhat-marketplace-sq9wv\" (UID: \"e00d9a59-c597-436d-ab88-9b3ecdf169f5\") " pod="openshift-marketplace/redhat-marketplace-sq9wv"
Mar 13 14:55:28 crc kubenswrapper[4898]: I0313 14:55:28.641590 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e00d9a59-c597-436d-ab88-9b3ecdf169f5-catalog-content\") pod \"redhat-marketplace-sq9wv\" (UID: \"e00d9a59-c597-436d-ab88-9b3ecdf169f5\") " pod="openshift-marketplace/redhat-marketplace-sq9wv"
Mar 13 14:55:28 crc kubenswrapper[4898]: I0313 14:55:28.642044 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ghrw\" (UniqueName: \"kubernetes.io/projected/e00d9a59-c597-436d-ab88-9b3ecdf169f5-kube-api-access-5ghrw\") pod \"redhat-marketplace-sq9wv\" (UID: \"e00d9a59-c597-436d-ab88-9b3ecdf169f5\") " pod="openshift-marketplace/redhat-marketplace-sq9wv"
Mar 13 14:55:28 crc kubenswrapper[4898]: I0313 14:55:28.744152 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ghrw\" (UniqueName: \"kubernetes.io/projected/e00d9a59-c597-436d-ab88-9b3ecdf169f5-kube-api-access-5ghrw\") pod \"redhat-marketplace-sq9wv\" (UID: \"e00d9a59-c597-436d-ab88-9b3ecdf169f5\") " pod="openshift-marketplace/redhat-marketplace-sq9wv"
Mar 13 14:55:28 crc kubenswrapper[4898]: I0313 14:55:28.744279 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e00d9a59-c597-436d-ab88-9b3ecdf169f5-utilities\") pod \"redhat-marketplace-sq9wv\" (UID: \"e00d9a59-c597-436d-ab88-9b3ecdf169f5\") " pod="openshift-marketplace/redhat-marketplace-sq9wv"
Mar 13 14:55:28 crc kubenswrapper[4898]: I0313 14:55:28.744350 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e00d9a59-c597-436d-ab88-9b3ecdf169f5-catalog-content\") pod \"redhat-marketplace-sq9wv\" (UID: \"e00d9a59-c597-436d-ab88-9b3ecdf169f5\") " pod="openshift-marketplace/redhat-marketplace-sq9wv"
Mar 13 14:55:28 crc kubenswrapper[4898]: I0313 14:55:28.744950 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e00d9a59-c597-436d-ab88-9b3ecdf169f5-catalog-content\") pod \"redhat-marketplace-sq9wv\" (UID: \"e00d9a59-c597-436d-ab88-9b3ecdf169f5\") " pod="openshift-marketplace/redhat-marketplace-sq9wv"
Mar 13 14:55:28 crc kubenswrapper[4898]: I0313 14:55:28.745243 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e00d9a59-c597-436d-ab88-9b3ecdf169f5-utilities\") pod \"redhat-marketplace-sq9wv\" (UID: \"e00d9a59-c597-436d-ab88-9b3ecdf169f5\") " pod="openshift-marketplace/redhat-marketplace-sq9wv"
Mar 13 14:55:28 crc kubenswrapper[4898]: I0313 14:55:28.765073 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ghrw\" (UniqueName: \"kubernetes.io/projected/e00d9a59-c597-436d-ab88-9b3ecdf169f5-kube-api-access-5ghrw\") pod \"redhat-marketplace-sq9wv\" (UID: \"e00d9a59-c597-436d-ab88-9b3ecdf169f5\") " pod="openshift-marketplace/redhat-marketplace-sq9wv"
Mar 13 14:55:28 crc kubenswrapper[4898]: I0313 14:55:28.838433 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sq9wv"
Mar 13 14:55:29 crc kubenswrapper[4898]: I0313 14:55:29.393652 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sq9wv"]
Mar 13 14:55:30 crc kubenswrapper[4898]: I0313 14:55:30.113664 4898 generic.go:334] "Generic (PLEG): container finished" podID="e00d9a59-c597-436d-ab88-9b3ecdf169f5" containerID="49ffd373366e4f8d986751fcd6a355e9af8440adbb1ba7ed2ca3faee6acbbb10" exitCode=0
Mar 13 14:55:30 crc kubenswrapper[4898]: I0313 14:55:30.113931 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sq9wv" event={"ID":"e00d9a59-c597-436d-ab88-9b3ecdf169f5","Type":"ContainerDied","Data":"49ffd373366e4f8d986751fcd6a355e9af8440adbb1ba7ed2ca3faee6acbbb10"}
Mar 13 14:55:30 crc kubenswrapper[4898]: I0313 14:55:30.114073 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sq9wv" event={"ID":"e00d9a59-c597-436d-ab88-9b3ecdf169f5","Type":"ContainerStarted","Data":"546d5d9a91a0f1ecdf4faca633fb37afd28045f8e171cb7c207f091fd2e86d03"}
Mar 13 14:55:31 crc kubenswrapper[4898]: I0313 14:55:31.139788 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sq9wv" event={"ID":"e00d9a59-c597-436d-ab88-9b3ecdf169f5","Type":"ContainerStarted","Data":"d6b2845ad80174809092c4cf53d62a918ba53bbba71ab284849d74f5adfcc781"}
Mar 13 14:55:32 crc kubenswrapper[4898]: I0313 14:55:32.158294 4898 generic.go:334] "Generic (PLEG): container finished" podID="e00d9a59-c597-436d-ab88-9b3ecdf169f5" containerID="d6b2845ad80174809092c4cf53d62a918ba53bbba71ab284849d74f5adfcc781" exitCode=0
Mar 13 14:55:32 crc kubenswrapper[4898]: I0313 14:55:32.158475 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sq9wv" event={"ID":"e00d9a59-c597-436d-ab88-9b3ecdf169f5","Type":"ContainerDied","Data":"d6b2845ad80174809092c4cf53d62a918ba53bbba71ab284849d74f5adfcc781"}
Mar 13 14:55:33 crc kubenswrapper[4898]: I0313 14:55:33.172818 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sq9wv" event={"ID":"e00d9a59-c597-436d-ab88-9b3ecdf169f5","Type":"ContainerStarted","Data":"21ed19fb8cfdb4df85570465c522a878e49aeb5e6db3b2179ff9095fee8f08b5"}
Mar 13 14:55:33 crc kubenswrapper[4898]: I0313 14:55:33.196539 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sq9wv" podStartSLOduration=2.636702494 podStartE2EDuration="5.196519284s" podCreationTimestamp="2026-03-13 14:55:28 +0000 UTC" firstStartedPulling="2026-03-13 14:55:30.117844886 +0000 UTC m=+3565.119433165" lastFinishedPulling="2026-03-13 14:55:32.677661676 +0000 UTC m=+3567.679249955" observedRunningTime="2026-03-13 14:55:33.191301044 +0000 UTC m=+3568.192889343" watchObservedRunningTime="2026-03-13 14:55:33.196519284 +0000 UTC m=+3568.198107533"
Mar 13 14:55:35 crc kubenswrapper[4898]: E0313 14:55:35.758741 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2447a834_934b_4e95_a373_2f98aa976716.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2447a834_934b_4e95_a373_2f98aa976716.slice/crio-477d5672b30264827c27894fe05766aedc024459c1dfcbc7f3e7ee7ffe289e7d\": RecentStats: unable to find data in memory cache]"
Mar 13 14:55:36 crc kubenswrapper[4898]: E0313 14:55:36.533341 4898 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.201:49182->38.102.83.201:43395: write tcp 38.102.83.201:49182->38.102.83.201:43395: write: broken pipe
Mar 13 14:55:38 crc kubenswrapper[4898]: I0313 14:55:38.839225 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sq9wv"
Mar 13 14:55:38 crc kubenswrapper[4898]: I0313 14:55:38.839563 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sq9wv"
Mar 13 14:55:38 crc kubenswrapper[4898]: I0313 14:55:38.930371 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sq9wv"
Mar 13 14:55:39 crc kubenswrapper[4898]: I0313 14:55:39.355354 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sq9wv"
Mar 13 14:55:39 crc kubenswrapper[4898]: I0313 14:55:39.428000 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sq9wv"]
Mar 13 14:55:41 crc kubenswrapper[4898]: I0313 14:55:41.291334 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-sq9wv" podUID="e00d9a59-c597-436d-ab88-9b3ecdf169f5" containerName="registry-server" containerID="cri-o://21ed19fb8cfdb4df85570465c522a878e49aeb5e6db3b2179ff9095fee8f08b5" gracePeriod=2
Mar 13 14:55:41 crc kubenswrapper[4898]: I0313 14:55:41.954568 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sq9wv"
Mar 13 14:55:42 crc kubenswrapper[4898]: I0313 14:55:42.085164 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ghrw\" (UniqueName: \"kubernetes.io/projected/e00d9a59-c597-436d-ab88-9b3ecdf169f5-kube-api-access-5ghrw\") pod \"e00d9a59-c597-436d-ab88-9b3ecdf169f5\" (UID: \"e00d9a59-c597-436d-ab88-9b3ecdf169f5\") "
Mar 13 14:55:42 crc kubenswrapper[4898]: I0313 14:55:42.085834 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e00d9a59-c597-436d-ab88-9b3ecdf169f5-utilities\") pod \"e00d9a59-c597-436d-ab88-9b3ecdf169f5\" (UID: \"e00d9a59-c597-436d-ab88-9b3ecdf169f5\") "
Mar 13 14:55:42 crc kubenswrapper[4898]: I0313 14:55:42.085891 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e00d9a59-c597-436d-ab88-9b3ecdf169f5-catalog-content\") pod \"e00d9a59-c597-436d-ab88-9b3ecdf169f5\" (UID: \"e00d9a59-c597-436d-ab88-9b3ecdf169f5\") "
Mar 13 14:55:42 crc kubenswrapper[4898]: I0313 14:55:42.087048 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e00d9a59-c597-436d-ab88-9b3ecdf169f5-utilities" (OuterVolumeSpecName: "utilities") pod "e00d9a59-c597-436d-ab88-9b3ecdf169f5" (UID: "e00d9a59-c597-436d-ab88-9b3ecdf169f5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 14:55:42 crc kubenswrapper[4898]: I0313 14:55:42.087516 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e00d9a59-c597-436d-ab88-9b3ecdf169f5-utilities\") on node \"crc\" DevicePath \"\""
Mar 13 14:55:42 crc kubenswrapper[4898]: I0313 14:55:42.091742 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e00d9a59-c597-436d-ab88-9b3ecdf169f5-kube-api-access-5ghrw" (OuterVolumeSpecName: "kube-api-access-5ghrw") pod "e00d9a59-c597-436d-ab88-9b3ecdf169f5" (UID: "e00d9a59-c597-436d-ab88-9b3ecdf169f5"). InnerVolumeSpecName "kube-api-access-5ghrw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:55:42 crc kubenswrapper[4898]: I0313 14:55:42.112043 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e00d9a59-c597-436d-ab88-9b3ecdf169f5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e00d9a59-c597-436d-ab88-9b3ecdf169f5" (UID: "e00d9a59-c597-436d-ab88-9b3ecdf169f5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 14:55:42 crc kubenswrapper[4898]: I0313 14:55:42.189875 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ghrw\" (UniqueName: \"kubernetes.io/projected/e00d9a59-c597-436d-ab88-9b3ecdf169f5-kube-api-access-5ghrw\") on node \"crc\" DevicePath \"\""
Mar 13 14:55:42 crc kubenswrapper[4898]: I0313 14:55:42.189937 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e00d9a59-c597-436d-ab88-9b3ecdf169f5-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 13 14:55:42 crc kubenswrapper[4898]: I0313 14:55:42.306737 4898 generic.go:334] "Generic (PLEG): container finished" podID="e00d9a59-c597-436d-ab88-9b3ecdf169f5" containerID="21ed19fb8cfdb4df85570465c522a878e49aeb5e6db3b2179ff9095fee8f08b5" exitCode=0
Mar 13 14:55:42 crc kubenswrapper[4898]: I0313 14:55:42.306812 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sq9wv" event={"ID":"e00d9a59-c597-436d-ab88-9b3ecdf169f5","Type":"ContainerDied","Data":"21ed19fb8cfdb4df85570465c522a878e49aeb5e6db3b2179ff9095fee8f08b5"}
Mar 13 14:55:42 crc kubenswrapper[4898]: I0313 14:55:42.306844 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sq9wv" event={"ID":"e00d9a59-c597-436d-ab88-9b3ecdf169f5","Type":"ContainerDied","Data":"546d5d9a91a0f1ecdf4faca633fb37afd28045f8e171cb7c207f091fd2e86d03"}
Mar 13 14:55:42 crc kubenswrapper[4898]: I0313 14:55:42.306867 4898 scope.go:117] "RemoveContainer" containerID="21ed19fb8cfdb4df85570465c522a878e49aeb5e6db3b2179ff9095fee8f08b5"
Mar 13 14:55:42 crc kubenswrapper[4898]: I0313 14:55:42.309190 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sq9wv"
Mar 13 14:55:42 crc kubenswrapper[4898]: I0313 14:55:42.341320 4898 scope.go:117] "RemoveContainer" containerID="d6b2845ad80174809092c4cf53d62a918ba53bbba71ab284849d74f5adfcc781"
Mar 13 14:55:42 crc kubenswrapper[4898]: I0313 14:55:42.368306 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sq9wv"]
Mar 13 14:55:42 crc kubenswrapper[4898]: I0313 14:55:42.378873 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-sq9wv"]
Mar 13 14:55:42 crc kubenswrapper[4898]: I0313 14:55:42.391363 4898 scope.go:117] "RemoveContainer" containerID="49ffd373366e4f8d986751fcd6a355e9af8440adbb1ba7ed2ca3faee6acbbb10"
Mar 13 14:55:42 crc kubenswrapper[4898]: I0313 14:55:42.467892 4898 scope.go:117] "RemoveContainer" containerID="21ed19fb8cfdb4df85570465c522a878e49aeb5e6db3b2179ff9095fee8f08b5"
Mar 13 14:55:42 crc kubenswrapper[4898]: E0313 14:55:42.468637 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21ed19fb8cfdb4df85570465c522a878e49aeb5e6db3b2179ff9095fee8f08b5\": container with ID starting with 21ed19fb8cfdb4df85570465c522a878e49aeb5e6db3b2179ff9095fee8f08b5 not found: ID does not exist" containerID="21ed19fb8cfdb4df85570465c522a878e49aeb5e6db3b2179ff9095fee8f08b5"
Mar 13 14:55:42 crc kubenswrapper[4898]: I0313 14:55:42.468841 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21ed19fb8cfdb4df85570465c522a878e49aeb5e6db3b2179ff9095fee8f08b5"} err="failed to get container status \"21ed19fb8cfdb4df85570465c522a878e49aeb5e6db3b2179ff9095fee8f08b5\": rpc error: code = NotFound desc = could not find container \"21ed19fb8cfdb4df85570465c522a878e49aeb5e6db3b2179ff9095fee8f08b5\": container with ID starting with 21ed19fb8cfdb4df85570465c522a878e49aeb5e6db3b2179ff9095fee8f08b5 not found: ID does not exist"
Mar 13 14:55:42 crc kubenswrapper[4898]: I0313 14:55:42.469107 4898 scope.go:117] "RemoveContainer" containerID="d6b2845ad80174809092c4cf53d62a918ba53bbba71ab284849d74f5adfcc781"
Mar 13 14:55:42 crc kubenswrapper[4898]: E0313 14:55:42.469969 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6b2845ad80174809092c4cf53d62a918ba53bbba71ab284849d74f5adfcc781\": container with ID starting with d6b2845ad80174809092c4cf53d62a918ba53bbba71ab284849d74f5adfcc781 not found: ID does not exist" containerID="d6b2845ad80174809092c4cf53d62a918ba53bbba71ab284849d74f5adfcc781"
Mar 13 14:55:42 crc kubenswrapper[4898]: I0313 14:55:42.470033 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6b2845ad80174809092c4cf53d62a918ba53bbba71ab284849d74f5adfcc781"} err="failed to get container status \"d6b2845ad80174809092c4cf53d62a918ba53bbba71ab284849d74f5adfcc781\": rpc error: code = NotFound desc = could not find container \"d6b2845ad80174809092c4cf53d62a918ba53bbba71ab284849d74f5adfcc781\": container with ID starting with d6b2845ad80174809092c4cf53d62a918ba53bbba71ab284849d74f5adfcc781 not found: ID does not exist"
Mar 13 14:55:42 crc kubenswrapper[4898]: I0313 14:55:42.470075 4898 scope.go:117] "RemoveContainer" containerID="49ffd373366e4f8d986751fcd6a355e9af8440adbb1ba7ed2ca3faee6acbbb10"
Mar 13 14:55:42 crc kubenswrapper[4898]: E0313 14:55:42.470627 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49ffd373366e4f8d986751fcd6a355e9af8440adbb1ba7ed2ca3faee6acbbb10\": container with ID starting with 49ffd373366e4f8d986751fcd6a355e9af8440adbb1ba7ed2ca3faee6acbbb10 not found: ID does not exist" containerID="49ffd373366e4f8d986751fcd6a355e9af8440adbb1ba7ed2ca3faee6acbbb10"
Mar 13 14:55:42 crc kubenswrapper[4898]: I0313 14:55:42.470811 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49ffd373366e4f8d986751fcd6a355e9af8440adbb1ba7ed2ca3faee6acbbb10"} err="failed to get container status \"49ffd373366e4f8d986751fcd6a355e9af8440adbb1ba7ed2ca3faee6acbbb10\": rpc error: code = NotFound desc = could not find container \"49ffd373366e4f8d986751fcd6a355e9af8440adbb1ba7ed2ca3faee6acbbb10\": container with ID starting with 49ffd373366e4f8d986751fcd6a355e9af8440adbb1ba7ed2ca3faee6acbbb10 not found: ID does not exist"
Mar 13 14:55:43 crc kubenswrapper[4898]: I0313 14:55:43.770807 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e00d9a59-c597-436d-ab88-9b3ecdf169f5" path="/var/lib/kubelet/pods/e00d9a59-c597-436d-ab88-9b3ecdf169f5/volumes"
Mar 13 14:55:43 crc kubenswrapper[4898]: E0313 14:55:43.814168 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2447a834_934b_4e95_a373_2f98aa976716.slice/crio-477d5672b30264827c27894fe05766aedc024459c1dfcbc7f3e7ee7ffe289e7d\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2447a834_934b_4e95_a373_2f98aa976716.slice\": RecentStats: unable to find data in memory cache]"
Mar 13 14:55:45 crc kubenswrapper[4898]: E0313 14:55:45.817798 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2447a834_934b_4e95_a373_2f98aa976716.slice/crio-477d5672b30264827c27894fe05766aedc024459c1dfcbc7f3e7ee7ffe289e7d\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2447a834_934b_4e95_a373_2f98aa976716.slice\": RecentStats: unable to find data in memory cache]"
Mar 13 14:55:48 crc kubenswrapper[4898]: E0313 14:55:48.104731 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2447a834_934b_4e95_a373_2f98aa976716.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2447a834_934b_4e95_a373_2f98aa976716.slice/crio-477d5672b30264827c27894fe05766aedc024459c1dfcbc7f3e7ee7ffe289e7d\": RecentStats: unable to find data in memory cache]"
Mar 13 14:55:48 crc kubenswrapper[4898]: E0313 14:55:48.105236 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2447a834_934b_4e95_a373_2f98aa976716.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2447a834_934b_4e95_a373_2f98aa976716.slice/crio-477d5672b30264827c27894fe05766aedc024459c1dfcbc7f3e7ee7ffe289e7d\": RecentStats: unable to find data in memory cache]"
Mar 13 14:55:49 crc kubenswrapper[4898]: I0313 14:55:49.135115 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 14:55:49 crc kubenswrapper[4898]: I0313 14:55:49.135566 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 14:55:56 crc kubenswrapper[4898]: E0313 14:55:56.223927 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2447a834_934b_4e95_a373_2f98aa976716.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2447a834_934b_4e95_a373_2f98aa976716.slice/crio-477d5672b30264827c27894fe05766aedc024459c1dfcbc7f3e7ee7ffe289e7d\": RecentStats: unable to find data in memory cache]"
Mar 13 14:55:58 crc kubenswrapper[4898]: E0313 14:55:58.603521 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2447a834_934b_4e95_a373_2f98aa976716.slice/crio-477d5672b30264827c27894fe05766aedc024459c1dfcbc7f3e7ee7ffe289e7d\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2447a834_934b_4e95_a373_2f98aa976716.slice\": RecentStats: unable to find data in memory cache]"
Mar 13 14:56:00 crc kubenswrapper[4898]: I0313 14:56:00.157296 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556896-tbs8b"]
Mar 13 14:56:00 crc kubenswrapper[4898]: E0313 14:56:00.158041 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e00d9a59-c597-436d-ab88-9b3ecdf169f5" containerName="registry-server"
Mar 13 14:56:00 crc kubenswrapper[4898]: I0313 14:56:00.158055 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="e00d9a59-c597-436d-ab88-9b3ecdf169f5" containerName="registry-server"
Mar 13 14:56:00 crc kubenswrapper[4898]: E0313 14:56:00.158073 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e00d9a59-c597-436d-ab88-9b3ecdf169f5" containerName="extract-content"
Mar 13 14:56:00 crc kubenswrapper[4898]: I0313 14:56:00.158079 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="e00d9a59-c597-436d-ab88-9b3ecdf169f5" containerName="extract-content"
Mar 13 14:56:00 crc kubenswrapper[4898]: E0313 14:56:00.158092 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e00d9a59-c597-436d-ab88-9b3ecdf169f5" containerName="extract-utilities"
Mar 13 14:56:00 crc kubenswrapper[4898]: I0313 14:56:00.158098 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="e00d9a59-c597-436d-ab88-9b3ecdf169f5" containerName="extract-utilities"
Mar 13 14:56:00 crc kubenswrapper[4898]: I0313 14:56:00.158315 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="e00d9a59-c597-436d-ab88-9b3ecdf169f5" containerName="registry-server"
Mar 13 14:56:00 crc kubenswrapper[4898]: I0313 14:56:00.159240 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556896-tbs8b"
Mar 13 14:56:00 crc kubenswrapper[4898]: I0313 14:56:00.161143 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps"
Mar 13 14:56:00 crc kubenswrapper[4898]: I0313 14:56:00.162153 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 13 14:56:00 crc kubenswrapper[4898]: I0313 14:56:00.162353 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 13 14:56:00 crc kubenswrapper[4898]: I0313 14:56:00.171313 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556896-tbs8b"]
Mar 13 14:56:00 crc kubenswrapper[4898]: I0313 14:56:00.249031 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vh682\" (UniqueName: \"kubernetes.io/projected/ce9a8272-18eb-4001-a998-8e24fbe84593-kube-api-access-vh682\") pod \"auto-csr-approver-29556896-tbs8b\" (UID: \"ce9a8272-18eb-4001-a998-8e24fbe84593\") " pod="openshift-infra/auto-csr-approver-29556896-tbs8b"
Mar 13 14:56:00 crc kubenswrapper[4898]: I0313 14:56:00.351716 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vh682\" (UniqueName: \"kubernetes.io/projected/ce9a8272-18eb-4001-a998-8e24fbe84593-kube-api-access-vh682\") pod \"auto-csr-approver-29556896-tbs8b\" (UID: \"ce9a8272-18eb-4001-a998-8e24fbe84593\") " pod="openshift-infra/auto-csr-approver-29556896-tbs8b"
Mar 13 14:56:00 crc kubenswrapper[4898]: I0313 14:56:00.374113 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vh682\" (UniqueName: \"kubernetes.io/projected/ce9a8272-18eb-4001-a998-8e24fbe84593-kube-api-access-vh682\") pod \"auto-csr-approver-29556896-tbs8b\" (UID: \"ce9a8272-18eb-4001-a998-8e24fbe84593\") " pod="openshift-infra/auto-csr-approver-29556896-tbs8b"
Mar 13 14:56:00 crc kubenswrapper[4898]: I0313 14:56:00.508273 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556896-tbs8b"
Mar 13 14:56:00 crc kubenswrapper[4898]: W0313 14:56:00.997062 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce9a8272_18eb_4001_a998_8e24fbe84593.slice/crio-e99538b0e68903326ecd6d3b3eec3f5b8f18c544cc397a0d3d212e983bd1d18f WatchSource:0}: Error finding container e99538b0e68903326ecd6d3b3eec3f5b8f18c544cc397a0d3d212e983bd1d18f: Status 404 returned error can't find the container with id e99538b0e68903326ecd6d3b3eec3f5b8f18c544cc397a0d3d212e983bd1d18f
Mar 13 14:56:01 crc kubenswrapper[4898]: I0313 14:56:01.011626 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556896-tbs8b"]
Mar 13 14:56:01 crc kubenswrapper[4898]: I0313 14:56:01.623066 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556896-tbs8b" event={"ID":"ce9a8272-18eb-4001-a998-8e24fbe84593","Type":"ContainerStarted","Data":"e99538b0e68903326ecd6d3b3eec3f5b8f18c544cc397a0d3d212e983bd1d18f"}
Mar 13 14:56:02 crc kubenswrapper[4898]: I0313 14:56:02.655883
4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556896-tbs8b" event={"ID":"ce9a8272-18eb-4001-a998-8e24fbe84593","Type":"ContainerStarted","Data":"9eb62522f49fe5dd1b7c8d52fa260ebf6b039aa00a9cd1719cb7535e8637b27e"} Mar 13 14:56:02 crc kubenswrapper[4898]: I0313 14:56:02.685660 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556896-tbs8b" podStartSLOduration=1.623458466 podStartE2EDuration="2.685639527s" podCreationTimestamp="2026-03-13 14:56:00 +0000 UTC" firstStartedPulling="2026-03-13 14:56:00.999482122 +0000 UTC m=+3596.001070391" lastFinishedPulling="2026-03-13 14:56:02.061663173 +0000 UTC m=+3597.063251452" observedRunningTime="2026-03-13 14:56:02.673927027 +0000 UTC m=+3597.675515286" watchObservedRunningTime="2026-03-13 14:56:02.685639527 +0000 UTC m=+3597.687227776" Mar 13 14:56:03 crc kubenswrapper[4898]: I0313 14:56:03.674080 4898 generic.go:334] "Generic (PLEG): container finished" podID="ce9a8272-18eb-4001-a998-8e24fbe84593" containerID="9eb62522f49fe5dd1b7c8d52fa260ebf6b039aa00a9cd1719cb7535e8637b27e" exitCode=0 Mar 13 14:56:03 crc kubenswrapper[4898]: I0313 14:56:03.674186 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556896-tbs8b" event={"ID":"ce9a8272-18eb-4001-a998-8e24fbe84593","Type":"ContainerDied","Data":"9eb62522f49fe5dd1b7c8d52fa260ebf6b039aa00a9cd1719cb7535e8637b27e"} Mar 13 14:56:05 crc kubenswrapper[4898]: I0313 14:56:05.166134 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556896-tbs8b" Mar 13 14:56:05 crc kubenswrapper[4898]: I0313 14:56:05.296643 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vh682\" (UniqueName: \"kubernetes.io/projected/ce9a8272-18eb-4001-a998-8e24fbe84593-kube-api-access-vh682\") pod \"ce9a8272-18eb-4001-a998-8e24fbe84593\" (UID: \"ce9a8272-18eb-4001-a998-8e24fbe84593\") " Mar 13 14:56:05 crc kubenswrapper[4898]: I0313 14:56:05.302247 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce9a8272-18eb-4001-a998-8e24fbe84593-kube-api-access-vh682" (OuterVolumeSpecName: "kube-api-access-vh682") pod "ce9a8272-18eb-4001-a998-8e24fbe84593" (UID: "ce9a8272-18eb-4001-a998-8e24fbe84593"). InnerVolumeSpecName "kube-api-access-vh682". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:56:05 crc kubenswrapper[4898]: I0313 14:56:05.401321 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vh682\" (UniqueName: \"kubernetes.io/projected/ce9a8272-18eb-4001-a998-8e24fbe84593-kube-api-access-vh682\") on node \"crc\" DevicePath \"\"" Mar 13 14:56:05 crc kubenswrapper[4898]: I0313 14:56:05.701689 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556896-tbs8b" event={"ID":"ce9a8272-18eb-4001-a998-8e24fbe84593","Type":"ContainerDied","Data":"e99538b0e68903326ecd6d3b3eec3f5b8f18c544cc397a0d3d212e983bd1d18f"} Mar 13 14:56:05 crc kubenswrapper[4898]: I0313 14:56:05.701729 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e99538b0e68903326ecd6d3b3eec3f5b8f18c544cc397a0d3d212e983bd1d18f" Mar 13 14:56:05 crc kubenswrapper[4898]: I0313 14:56:05.702104 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556896-tbs8b" Mar 13 14:56:05 crc kubenswrapper[4898]: I0313 14:56:05.778321 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556890-vl72z"] Mar 13 14:56:05 crc kubenswrapper[4898]: I0313 14:56:05.794508 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556890-vl72z"] Mar 13 14:56:07 crc kubenswrapper[4898]: I0313 14:56:07.778304 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5795e677-fa9a-4235-9a30-a040ac18eebd" path="/var/lib/kubelet/pods/5795e677-fa9a-4235-9a30-a040ac18eebd/volumes" Mar 13 14:56:19 crc kubenswrapper[4898]: I0313 14:56:19.137006 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 14:56:19 crc kubenswrapper[4898]: I0313 14:56:19.137525 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 14:56:35 crc kubenswrapper[4898]: E0313 14:56:35.240469 4898 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.201:59526->38.102.83.201:43395: write tcp 38.102.83.201:59526->38.102.83.201:43395: write: broken pipe Mar 13 14:56:42 crc kubenswrapper[4898]: I0313 14:56:42.281938 4898 scope.go:117] "RemoveContainer" containerID="5a392b03ee567f4c66c2416d5b9e2dae2dde45c43076100867e0284651bd3e3b" Mar 13 14:56:45 crc kubenswrapper[4898]: I0313 14:56:45.243927 4898 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/certified-operators-mlqzt"] Mar 13 14:56:45 crc kubenswrapper[4898]: E0313 14:56:45.245284 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce9a8272-18eb-4001-a998-8e24fbe84593" containerName="oc" Mar 13 14:56:45 crc kubenswrapper[4898]: I0313 14:56:45.245306 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce9a8272-18eb-4001-a998-8e24fbe84593" containerName="oc" Mar 13 14:56:45 crc kubenswrapper[4898]: I0313 14:56:45.245771 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce9a8272-18eb-4001-a998-8e24fbe84593" containerName="oc" Mar 13 14:56:45 crc kubenswrapper[4898]: I0313 14:56:45.250492 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mlqzt" Mar 13 14:56:45 crc kubenswrapper[4898]: I0313 14:56:45.260753 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mlqzt"] Mar 13 14:56:45 crc kubenswrapper[4898]: I0313 14:56:45.313019 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/363c1a15-d4ba-4ed1-bb98-74e3998bc48a-catalog-content\") pod \"certified-operators-mlqzt\" (UID: \"363c1a15-d4ba-4ed1-bb98-74e3998bc48a\") " pod="openshift-marketplace/certified-operators-mlqzt" Mar 13 14:56:45 crc kubenswrapper[4898]: I0313 14:56:45.313185 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6ghj\" (UniqueName: \"kubernetes.io/projected/363c1a15-d4ba-4ed1-bb98-74e3998bc48a-kube-api-access-z6ghj\") pod \"certified-operators-mlqzt\" (UID: \"363c1a15-d4ba-4ed1-bb98-74e3998bc48a\") " pod="openshift-marketplace/certified-operators-mlqzt" Mar 13 14:56:45 crc kubenswrapper[4898]: I0313 14:56:45.313328 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/363c1a15-d4ba-4ed1-bb98-74e3998bc48a-utilities\") pod \"certified-operators-mlqzt\" (UID: \"363c1a15-d4ba-4ed1-bb98-74e3998bc48a\") " pod="openshift-marketplace/certified-operators-mlqzt" Mar 13 14:56:45 crc kubenswrapper[4898]: I0313 14:56:45.415204 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/363c1a15-d4ba-4ed1-bb98-74e3998bc48a-utilities\") pod \"certified-operators-mlqzt\" (UID: \"363c1a15-d4ba-4ed1-bb98-74e3998bc48a\") " pod="openshift-marketplace/certified-operators-mlqzt" Mar 13 14:56:45 crc kubenswrapper[4898]: I0313 14:56:45.415276 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/363c1a15-d4ba-4ed1-bb98-74e3998bc48a-catalog-content\") pod \"certified-operators-mlqzt\" (UID: \"363c1a15-d4ba-4ed1-bb98-74e3998bc48a\") " pod="openshift-marketplace/certified-operators-mlqzt" Mar 13 14:56:45 crc kubenswrapper[4898]: I0313 14:56:45.415379 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6ghj\" (UniqueName: \"kubernetes.io/projected/363c1a15-d4ba-4ed1-bb98-74e3998bc48a-kube-api-access-z6ghj\") pod \"certified-operators-mlqzt\" (UID: \"363c1a15-d4ba-4ed1-bb98-74e3998bc48a\") " pod="openshift-marketplace/certified-operators-mlqzt" Mar 13 14:56:45 crc kubenswrapper[4898]: I0313 14:56:45.415747 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/363c1a15-d4ba-4ed1-bb98-74e3998bc48a-utilities\") pod \"certified-operators-mlqzt\" (UID: \"363c1a15-d4ba-4ed1-bb98-74e3998bc48a\") " pod="openshift-marketplace/certified-operators-mlqzt" Mar 13 14:56:45 crc kubenswrapper[4898]: I0313 14:56:45.415889 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/363c1a15-d4ba-4ed1-bb98-74e3998bc48a-catalog-content\") pod \"certified-operators-mlqzt\" (UID: \"363c1a15-d4ba-4ed1-bb98-74e3998bc48a\") " pod="openshift-marketplace/certified-operators-mlqzt" Mar 13 14:56:45 crc kubenswrapper[4898]: I0313 14:56:45.438079 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6ghj\" (UniqueName: \"kubernetes.io/projected/363c1a15-d4ba-4ed1-bb98-74e3998bc48a-kube-api-access-z6ghj\") pod \"certified-operators-mlqzt\" (UID: \"363c1a15-d4ba-4ed1-bb98-74e3998bc48a\") " pod="openshift-marketplace/certified-operators-mlqzt" Mar 13 14:56:45 crc kubenswrapper[4898]: I0313 14:56:45.595011 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mlqzt" Mar 13 14:56:46 crc kubenswrapper[4898]: I0313 14:56:46.115694 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mlqzt"] Mar 13 14:56:46 crc kubenswrapper[4898]: I0313 14:56:46.300912 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mlqzt" event={"ID":"363c1a15-d4ba-4ed1-bb98-74e3998bc48a","Type":"ContainerStarted","Data":"32e5424324d0692c123e1bbe46f2f60411b281878d1cfd5ff36cef1c344778f4"} Mar 13 14:56:47 crc kubenswrapper[4898]: I0313 14:56:47.317251 4898 generic.go:334] "Generic (PLEG): container finished" podID="363c1a15-d4ba-4ed1-bb98-74e3998bc48a" containerID="c895d748e4986f8c69701e3f78590d274b25b62eab1c689d80a555d48d6e360a" exitCode=0 Mar 13 14:56:47 crc kubenswrapper[4898]: I0313 14:56:47.317352 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mlqzt" event={"ID":"363c1a15-d4ba-4ed1-bb98-74e3998bc48a","Type":"ContainerDied","Data":"c895d748e4986f8c69701e3f78590d274b25b62eab1c689d80a555d48d6e360a"} Mar 13 14:56:49 crc kubenswrapper[4898]: I0313 14:56:49.134511 4898 patch_prober.go:28] interesting 
pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 14:56:49 crc kubenswrapper[4898]: I0313 14:56:49.134843 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 14:56:49 crc kubenswrapper[4898]: I0313 14:56:49.134967 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" Mar 13 14:56:49 crc kubenswrapper[4898]: I0313 14:56:49.135991 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e2755f8bb613c4ca067aa4bb55241b92843fd6eb20b3661f1bc3e09f0a4a33da"} pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 14:56:49 crc kubenswrapper[4898]: I0313 14:56:49.136057 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" containerID="cri-o://e2755f8bb613c4ca067aa4bb55241b92843fd6eb20b3661f1bc3e09f0a4a33da" gracePeriod=600 Mar 13 14:56:49 crc kubenswrapper[4898]: E0313 14:56:49.307420 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:56:49 crc kubenswrapper[4898]: I0313 14:56:49.380623 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mlqzt" event={"ID":"363c1a15-d4ba-4ed1-bb98-74e3998bc48a","Type":"ContainerStarted","Data":"c2222921e4aa6a19737a0e506e57480b9be09544407d113c7cd9b13301b23181"} Mar 13 14:56:49 crc kubenswrapper[4898]: I0313 14:56:49.383794 4898 generic.go:334] "Generic (PLEG): container finished" podID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerID="e2755f8bb613c4ca067aa4bb55241b92843fd6eb20b3661f1bc3e09f0a4a33da" exitCode=0 Mar 13 14:56:49 crc kubenswrapper[4898]: I0313 14:56:49.383827 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerDied","Data":"e2755f8bb613c4ca067aa4bb55241b92843fd6eb20b3661f1bc3e09f0a4a33da"} Mar 13 14:56:49 crc kubenswrapper[4898]: I0313 14:56:49.383890 4898 scope.go:117] "RemoveContainer" containerID="14bad097122012834e04d7733e2c56b7cb22ecdcbbe4dbb2cc5f1822098d6e46" Mar 13 14:56:49 crc kubenswrapper[4898]: I0313 14:56:49.384852 4898 scope.go:117] "RemoveContainer" containerID="e2755f8bb613c4ca067aa4bb55241b92843fd6eb20b3661f1bc3e09f0a4a33da" Mar 13 14:56:49 crc kubenswrapper[4898]: E0313 14:56:49.385386 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 
14:56:51 crc kubenswrapper[4898]: I0313 14:56:51.428411 4898 generic.go:334] "Generic (PLEG): container finished" podID="363c1a15-d4ba-4ed1-bb98-74e3998bc48a" containerID="c2222921e4aa6a19737a0e506e57480b9be09544407d113c7cd9b13301b23181" exitCode=0 Mar 13 14:56:51 crc kubenswrapper[4898]: I0313 14:56:51.428583 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mlqzt" event={"ID":"363c1a15-d4ba-4ed1-bb98-74e3998bc48a","Type":"ContainerDied","Data":"c2222921e4aa6a19737a0e506e57480b9be09544407d113c7cd9b13301b23181"} Mar 13 14:56:52 crc kubenswrapper[4898]: I0313 14:56:52.448347 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mlqzt" event={"ID":"363c1a15-d4ba-4ed1-bb98-74e3998bc48a","Type":"ContainerStarted","Data":"48df74a161994e898e755e8bb1dc7caa0dd9618eae1c67bb09d5fd7561e988be"} Mar 13 14:56:52 crc kubenswrapper[4898]: I0313 14:56:52.472772 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mlqzt" podStartSLOduration=2.930406217 podStartE2EDuration="7.472744163s" podCreationTimestamp="2026-03-13 14:56:45 +0000 UTC" firstStartedPulling="2026-03-13 14:56:47.320780239 +0000 UTC m=+3642.322368508" lastFinishedPulling="2026-03-13 14:56:51.863118185 +0000 UTC m=+3646.864706454" observedRunningTime="2026-03-13 14:56:52.471568334 +0000 UTC m=+3647.473156643" watchObservedRunningTime="2026-03-13 14:56:52.472744163 +0000 UTC m=+3647.474332442" Mar 13 14:56:55 crc kubenswrapper[4898]: I0313 14:56:55.596271 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mlqzt" Mar 13 14:56:55 crc kubenswrapper[4898]: I0313 14:56:55.596887 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mlqzt" Mar 13 14:56:55 crc kubenswrapper[4898]: I0313 14:56:55.673113 4898 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mlqzt" Mar 13 14:57:03 crc kubenswrapper[4898]: I0313 14:57:03.740574 4898 scope.go:117] "RemoveContainer" containerID="e2755f8bb613c4ca067aa4bb55241b92843fd6eb20b3661f1bc3e09f0a4a33da" Mar 13 14:57:03 crc kubenswrapper[4898]: E0313 14:57:03.741623 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:57:05 crc kubenswrapper[4898]: I0313 14:57:05.662464 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mlqzt" Mar 13 14:57:05 crc kubenswrapper[4898]: I0313 14:57:05.721828 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mlqzt"] Mar 13 14:57:06 crc kubenswrapper[4898]: I0313 14:57:06.639197 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mlqzt" podUID="363c1a15-d4ba-4ed1-bb98-74e3998bc48a" containerName="registry-server" containerID="cri-o://48df74a161994e898e755e8bb1dc7caa0dd9618eae1c67bb09d5fd7561e988be" gracePeriod=2 Mar 13 14:57:07 crc kubenswrapper[4898]: I0313 14:57:07.262508 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mlqzt" Mar 13 14:57:07 crc kubenswrapper[4898]: I0313 14:57:07.322224 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6ghj\" (UniqueName: \"kubernetes.io/projected/363c1a15-d4ba-4ed1-bb98-74e3998bc48a-kube-api-access-z6ghj\") pod \"363c1a15-d4ba-4ed1-bb98-74e3998bc48a\" (UID: \"363c1a15-d4ba-4ed1-bb98-74e3998bc48a\") " Mar 13 14:57:07 crc kubenswrapper[4898]: I0313 14:57:07.322384 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/363c1a15-d4ba-4ed1-bb98-74e3998bc48a-catalog-content\") pod \"363c1a15-d4ba-4ed1-bb98-74e3998bc48a\" (UID: \"363c1a15-d4ba-4ed1-bb98-74e3998bc48a\") " Mar 13 14:57:07 crc kubenswrapper[4898]: I0313 14:57:07.322570 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/363c1a15-d4ba-4ed1-bb98-74e3998bc48a-utilities\") pod \"363c1a15-d4ba-4ed1-bb98-74e3998bc48a\" (UID: \"363c1a15-d4ba-4ed1-bb98-74e3998bc48a\") " Mar 13 14:57:07 crc kubenswrapper[4898]: I0313 14:57:07.323724 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/363c1a15-d4ba-4ed1-bb98-74e3998bc48a-utilities" (OuterVolumeSpecName: "utilities") pod "363c1a15-d4ba-4ed1-bb98-74e3998bc48a" (UID: "363c1a15-d4ba-4ed1-bb98-74e3998bc48a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:57:07 crc kubenswrapper[4898]: I0313 14:57:07.324658 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/363c1a15-d4ba-4ed1-bb98-74e3998bc48a-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 14:57:07 crc kubenswrapper[4898]: I0313 14:57:07.332079 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/363c1a15-d4ba-4ed1-bb98-74e3998bc48a-kube-api-access-z6ghj" (OuterVolumeSpecName: "kube-api-access-z6ghj") pod "363c1a15-d4ba-4ed1-bb98-74e3998bc48a" (UID: "363c1a15-d4ba-4ed1-bb98-74e3998bc48a"). InnerVolumeSpecName "kube-api-access-z6ghj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:57:07 crc kubenswrapper[4898]: I0313 14:57:07.393250 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/363c1a15-d4ba-4ed1-bb98-74e3998bc48a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "363c1a15-d4ba-4ed1-bb98-74e3998bc48a" (UID: "363c1a15-d4ba-4ed1-bb98-74e3998bc48a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:57:07 crc kubenswrapper[4898]: I0313 14:57:07.427590 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/363c1a15-d4ba-4ed1-bb98-74e3998bc48a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 14:57:07 crc kubenswrapper[4898]: I0313 14:57:07.427642 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6ghj\" (UniqueName: \"kubernetes.io/projected/363c1a15-d4ba-4ed1-bb98-74e3998bc48a-kube-api-access-z6ghj\") on node \"crc\" DevicePath \"\"" Mar 13 14:57:07 crc kubenswrapper[4898]: I0313 14:57:07.659965 4898 generic.go:334] "Generic (PLEG): container finished" podID="363c1a15-d4ba-4ed1-bb98-74e3998bc48a" containerID="48df74a161994e898e755e8bb1dc7caa0dd9618eae1c67bb09d5fd7561e988be" exitCode=0 Mar 13 14:57:07 crc kubenswrapper[4898]: I0313 14:57:07.660034 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mlqzt" event={"ID":"363c1a15-d4ba-4ed1-bb98-74e3998bc48a","Type":"ContainerDied","Data":"48df74a161994e898e755e8bb1dc7caa0dd9618eae1c67bb09d5fd7561e988be"} Mar 13 14:57:07 crc kubenswrapper[4898]: I0313 14:57:07.660056 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mlqzt" Mar 13 14:57:07 crc kubenswrapper[4898]: I0313 14:57:07.660087 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mlqzt" event={"ID":"363c1a15-d4ba-4ed1-bb98-74e3998bc48a","Type":"ContainerDied","Data":"32e5424324d0692c123e1bbe46f2f60411b281878d1cfd5ff36cef1c344778f4"} Mar 13 14:57:07 crc kubenswrapper[4898]: I0313 14:57:07.660116 4898 scope.go:117] "RemoveContainer" containerID="48df74a161994e898e755e8bb1dc7caa0dd9618eae1c67bb09d5fd7561e988be" Mar 13 14:57:07 crc kubenswrapper[4898]: I0313 14:57:07.713750 4898 scope.go:117] "RemoveContainer" containerID="c2222921e4aa6a19737a0e506e57480b9be09544407d113c7cd9b13301b23181" Mar 13 14:57:07 crc kubenswrapper[4898]: I0313 14:57:07.719295 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mlqzt"] Mar 13 14:57:07 crc kubenswrapper[4898]: I0313 14:57:07.761549 4898 scope.go:117] "RemoveContainer" containerID="c895d748e4986f8c69701e3f78590d274b25b62eab1c689d80a555d48d6e360a" Mar 13 14:57:07 crc kubenswrapper[4898]: I0313 14:57:07.764209 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mlqzt"] Mar 13 14:57:07 crc kubenswrapper[4898]: I0313 14:57:07.806208 4898 scope.go:117] "RemoveContainer" containerID="48df74a161994e898e755e8bb1dc7caa0dd9618eae1c67bb09d5fd7561e988be" Mar 13 14:57:07 crc kubenswrapper[4898]: E0313 14:57:07.806824 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48df74a161994e898e755e8bb1dc7caa0dd9618eae1c67bb09d5fd7561e988be\": container with ID starting with 48df74a161994e898e755e8bb1dc7caa0dd9618eae1c67bb09d5fd7561e988be not found: ID does not exist" containerID="48df74a161994e898e755e8bb1dc7caa0dd9618eae1c67bb09d5fd7561e988be" Mar 13 14:57:07 crc kubenswrapper[4898]: I0313 14:57:07.806874 4898 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48df74a161994e898e755e8bb1dc7caa0dd9618eae1c67bb09d5fd7561e988be"} err="failed to get container status \"48df74a161994e898e755e8bb1dc7caa0dd9618eae1c67bb09d5fd7561e988be\": rpc error: code = NotFound desc = could not find container \"48df74a161994e898e755e8bb1dc7caa0dd9618eae1c67bb09d5fd7561e988be\": container with ID starting with 48df74a161994e898e755e8bb1dc7caa0dd9618eae1c67bb09d5fd7561e988be not found: ID does not exist" Mar 13 14:57:07 crc kubenswrapper[4898]: I0313 14:57:07.806925 4898 scope.go:117] "RemoveContainer" containerID="c2222921e4aa6a19737a0e506e57480b9be09544407d113c7cd9b13301b23181" Mar 13 14:57:07 crc kubenswrapper[4898]: E0313 14:57:07.807465 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2222921e4aa6a19737a0e506e57480b9be09544407d113c7cd9b13301b23181\": container with ID starting with c2222921e4aa6a19737a0e506e57480b9be09544407d113c7cd9b13301b23181 not found: ID does not exist" containerID="c2222921e4aa6a19737a0e506e57480b9be09544407d113c7cd9b13301b23181" Mar 13 14:57:07 crc kubenswrapper[4898]: I0313 14:57:07.807522 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2222921e4aa6a19737a0e506e57480b9be09544407d113c7cd9b13301b23181"} err="failed to get container status \"c2222921e4aa6a19737a0e506e57480b9be09544407d113c7cd9b13301b23181\": rpc error: code = NotFound desc = could not find container \"c2222921e4aa6a19737a0e506e57480b9be09544407d113c7cd9b13301b23181\": container with ID starting with c2222921e4aa6a19737a0e506e57480b9be09544407d113c7cd9b13301b23181 not found: ID does not exist" Mar 13 14:57:07 crc kubenswrapper[4898]: I0313 14:57:07.807540 4898 scope.go:117] "RemoveContainer" containerID="c895d748e4986f8c69701e3f78590d274b25b62eab1c689d80a555d48d6e360a" Mar 13 14:57:07 crc kubenswrapper[4898]: E0313 
14:57:07.807976 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c895d748e4986f8c69701e3f78590d274b25b62eab1c689d80a555d48d6e360a\": container with ID starting with c895d748e4986f8c69701e3f78590d274b25b62eab1c689d80a555d48d6e360a not found: ID does not exist" containerID="c895d748e4986f8c69701e3f78590d274b25b62eab1c689d80a555d48d6e360a" Mar 13 14:57:07 crc kubenswrapper[4898]: I0313 14:57:07.808044 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c895d748e4986f8c69701e3f78590d274b25b62eab1c689d80a555d48d6e360a"} err="failed to get container status \"c895d748e4986f8c69701e3f78590d274b25b62eab1c689d80a555d48d6e360a\": rpc error: code = NotFound desc = could not find container \"c895d748e4986f8c69701e3f78590d274b25b62eab1c689d80a555d48d6e360a\": container with ID starting with c895d748e4986f8c69701e3f78590d274b25b62eab1c689d80a555d48d6e360a not found: ID does not exist" Mar 13 14:57:09 crc kubenswrapper[4898]: I0313 14:57:09.759218 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="363c1a15-d4ba-4ed1-bb98-74e3998bc48a" path="/var/lib/kubelet/pods/363c1a15-d4ba-4ed1-bb98-74e3998bc48a/volumes" Mar 13 14:57:18 crc kubenswrapper[4898]: I0313 14:57:18.739475 4898 scope.go:117] "RemoveContainer" containerID="e2755f8bb613c4ca067aa4bb55241b92843fd6eb20b3661f1bc3e09f0a4a33da" Mar 13 14:57:18 crc kubenswrapper[4898]: E0313 14:57:18.740119 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:57:29 crc kubenswrapper[4898]: I0313 14:57:29.740736 
4898 scope.go:117] "RemoveContainer" containerID="e2755f8bb613c4ca067aa4bb55241b92843fd6eb20b3661f1bc3e09f0a4a33da" Mar 13 14:57:29 crc kubenswrapper[4898]: E0313 14:57:29.741788 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:57:44 crc kubenswrapper[4898]: I0313 14:57:44.740689 4898 scope.go:117] "RemoveContainer" containerID="e2755f8bb613c4ca067aa4bb55241b92843fd6eb20b3661f1bc3e09f0a4a33da" Mar 13 14:57:44 crc kubenswrapper[4898]: E0313 14:57:44.742063 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:57:55 crc kubenswrapper[4898]: I0313 14:57:55.747157 4898 scope.go:117] "RemoveContainer" containerID="e2755f8bb613c4ca067aa4bb55241b92843fd6eb20b3661f1bc3e09f0a4a33da" Mar 13 14:57:55 crc kubenswrapper[4898]: E0313 14:57:55.747938 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:58:00 crc kubenswrapper[4898]: I0313 
14:58:00.158616 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556898-zbc7v"] Mar 13 14:58:00 crc kubenswrapper[4898]: E0313 14:58:00.160364 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="363c1a15-d4ba-4ed1-bb98-74e3998bc48a" containerName="extract-content" Mar 13 14:58:00 crc kubenswrapper[4898]: I0313 14:58:00.160395 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="363c1a15-d4ba-4ed1-bb98-74e3998bc48a" containerName="extract-content" Mar 13 14:58:00 crc kubenswrapper[4898]: E0313 14:58:00.160430 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="363c1a15-d4ba-4ed1-bb98-74e3998bc48a" containerName="registry-server" Mar 13 14:58:00 crc kubenswrapper[4898]: I0313 14:58:00.160465 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="363c1a15-d4ba-4ed1-bb98-74e3998bc48a" containerName="registry-server" Mar 13 14:58:00 crc kubenswrapper[4898]: E0313 14:58:00.160576 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="363c1a15-d4ba-4ed1-bb98-74e3998bc48a" containerName="extract-utilities" Mar 13 14:58:00 crc kubenswrapper[4898]: I0313 14:58:00.160595 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="363c1a15-d4ba-4ed1-bb98-74e3998bc48a" containerName="extract-utilities" Mar 13 14:58:00 crc kubenswrapper[4898]: I0313 14:58:00.161298 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="363c1a15-d4ba-4ed1-bb98-74e3998bc48a" containerName="registry-server" Mar 13 14:58:00 crc kubenswrapper[4898]: I0313 14:58:00.163136 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556898-zbc7v" Mar 13 14:58:00 crc kubenswrapper[4898]: I0313 14:58:00.165788 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 14:58:00 crc kubenswrapper[4898]: I0313 14:58:00.165944 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps" Mar 13 14:58:00 crc kubenswrapper[4898]: I0313 14:58:00.167192 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 14:58:00 crc kubenswrapper[4898]: I0313 14:58:00.174505 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556898-zbc7v"] Mar 13 14:58:00 crc kubenswrapper[4898]: I0313 14:58:00.295285 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25r7k\" (UniqueName: \"kubernetes.io/projected/2033726f-d64f-4989-8837-cec9738c8491-kube-api-access-25r7k\") pod \"auto-csr-approver-29556898-zbc7v\" (UID: \"2033726f-d64f-4989-8837-cec9738c8491\") " pod="openshift-infra/auto-csr-approver-29556898-zbc7v" Mar 13 14:58:00 crc kubenswrapper[4898]: I0313 14:58:00.398306 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25r7k\" (UniqueName: \"kubernetes.io/projected/2033726f-d64f-4989-8837-cec9738c8491-kube-api-access-25r7k\") pod \"auto-csr-approver-29556898-zbc7v\" (UID: \"2033726f-d64f-4989-8837-cec9738c8491\") " pod="openshift-infra/auto-csr-approver-29556898-zbc7v" Mar 13 14:58:00 crc kubenswrapper[4898]: I0313 14:58:00.423221 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25r7k\" (UniqueName: \"kubernetes.io/projected/2033726f-d64f-4989-8837-cec9738c8491-kube-api-access-25r7k\") pod \"auto-csr-approver-29556898-zbc7v\" (UID: \"2033726f-d64f-4989-8837-cec9738c8491\") " 
pod="openshift-infra/auto-csr-approver-29556898-zbc7v" Mar 13 14:58:00 crc kubenswrapper[4898]: I0313 14:58:00.489288 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556898-zbc7v" Mar 13 14:58:01 crc kubenswrapper[4898]: I0313 14:58:01.004057 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556898-zbc7v"] Mar 13 14:58:01 crc kubenswrapper[4898]: I0313 14:58:01.337194 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556898-zbc7v" event={"ID":"2033726f-d64f-4989-8837-cec9738c8491","Type":"ContainerStarted","Data":"4ea7cf1fedb5b1dbfe5701f105fc54d08b3deae9c2a5bd6e70725abdf840b05d"} Mar 13 14:58:03 crc kubenswrapper[4898]: I0313 14:58:03.360725 4898 generic.go:334] "Generic (PLEG): container finished" podID="2033726f-d64f-4989-8837-cec9738c8491" containerID="e1ed7a0b1ccbf119e01b0fbbab72ef967cfc5a8fef5c4bc80afb9d7eff1e70f1" exitCode=0 Mar 13 14:58:03 crc kubenswrapper[4898]: I0313 14:58:03.360835 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556898-zbc7v" event={"ID":"2033726f-d64f-4989-8837-cec9738c8491","Type":"ContainerDied","Data":"e1ed7a0b1ccbf119e01b0fbbab72ef967cfc5a8fef5c4bc80afb9d7eff1e70f1"} Mar 13 14:58:04 crc kubenswrapper[4898]: I0313 14:58:04.803030 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556898-zbc7v" Mar 13 14:58:04 crc kubenswrapper[4898]: I0313 14:58:04.815125 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25r7k\" (UniqueName: \"kubernetes.io/projected/2033726f-d64f-4989-8837-cec9738c8491-kube-api-access-25r7k\") pod \"2033726f-d64f-4989-8837-cec9738c8491\" (UID: \"2033726f-d64f-4989-8837-cec9738c8491\") " Mar 13 14:58:04 crc kubenswrapper[4898]: I0313 14:58:04.827671 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2033726f-d64f-4989-8837-cec9738c8491-kube-api-access-25r7k" (OuterVolumeSpecName: "kube-api-access-25r7k") pod "2033726f-d64f-4989-8837-cec9738c8491" (UID: "2033726f-d64f-4989-8837-cec9738c8491"). InnerVolumeSpecName "kube-api-access-25r7k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:58:04 crc kubenswrapper[4898]: I0313 14:58:04.928443 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25r7k\" (UniqueName: \"kubernetes.io/projected/2033726f-d64f-4989-8837-cec9738c8491-kube-api-access-25r7k\") on node \"crc\" DevicePath \"\"" Mar 13 14:58:05 crc kubenswrapper[4898]: I0313 14:58:05.383704 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556898-zbc7v" event={"ID":"2033726f-d64f-4989-8837-cec9738c8491","Type":"ContainerDied","Data":"4ea7cf1fedb5b1dbfe5701f105fc54d08b3deae9c2a5bd6e70725abdf840b05d"} Mar 13 14:58:05 crc kubenswrapper[4898]: I0313 14:58:05.384223 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ea7cf1fedb5b1dbfe5701f105fc54d08b3deae9c2a5bd6e70725abdf840b05d" Mar 13 14:58:05 crc kubenswrapper[4898]: I0313 14:58:05.383761 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556898-zbc7v" Mar 13 14:58:05 crc kubenswrapper[4898]: I0313 14:58:05.916846 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556892-gg8mm"] Mar 13 14:58:05 crc kubenswrapper[4898]: I0313 14:58:05.934888 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556892-gg8mm"] Mar 13 14:58:06 crc kubenswrapper[4898]: I0313 14:58:06.828630 4898 scope.go:117] "RemoveContainer" containerID="e2755f8bb613c4ca067aa4bb55241b92843fd6eb20b3661f1bc3e09f0a4a33da" Mar 13 14:58:06 crc kubenswrapper[4898]: E0313 14:58:06.829228 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:58:07 crc kubenswrapper[4898]: I0313 14:58:07.756260 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bba89630-e09c-4d6d-b7c3-89aecad3889f" path="/var/lib/kubelet/pods/bba89630-e09c-4d6d-b7c3-89aecad3889f/volumes" Mar 13 14:58:17 crc kubenswrapper[4898]: I0313 14:58:17.740081 4898 scope.go:117] "RemoveContainer" containerID="e2755f8bb613c4ca067aa4bb55241b92843fd6eb20b3661f1bc3e09f0a4a33da" Mar 13 14:58:17 crc kubenswrapper[4898]: E0313 14:58:17.740821 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" 
podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:58:30 crc kubenswrapper[4898]: I0313 14:58:30.740328 4898 scope.go:117] "RemoveContainer" containerID="e2755f8bb613c4ca067aa4bb55241b92843fd6eb20b3661f1bc3e09f0a4a33da" Mar 13 14:58:30 crc kubenswrapper[4898]: E0313 14:58:30.741134 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:58:42 crc kubenswrapper[4898]: I0313 14:58:42.456839 4898 scope.go:117] "RemoveContainer" containerID="168730dbf03823de4c3d331997904d8f68ac2eebf4c5ae0fee8108df4bd5aa88" Mar 13 14:58:45 crc kubenswrapper[4898]: I0313 14:58:45.746936 4898 scope.go:117] "RemoveContainer" containerID="e2755f8bb613c4ca067aa4bb55241b92843fd6eb20b3661f1bc3e09f0a4a33da" Mar 13 14:58:45 crc kubenswrapper[4898]: E0313 14:58:45.747697 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:58:57 crc kubenswrapper[4898]: I0313 14:58:57.740131 4898 scope.go:117] "RemoveContainer" containerID="e2755f8bb613c4ca067aa4bb55241b92843fd6eb20b3661f1bc3e09f0a4a33da" Mar 13 14:58:57 crc kubenswrapper[4898]: E0313 14:58:57.741064 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:59:11 crc kubenswrapper[4898]: I0313 14:59:11.739784 4898 scope.go:117] "RemoveContainer" containerID="e2755f8bb613c4ca067aa4bb55241b92843fd6eb20b3661f1bc3e09f0a4a33da" Mar 13 14:59:11 crc kubenswrapper[4898]: E0313 14:59:11.740673 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:59:24 crc kubenswrapper[4898]: I0313 14:59:24.740496 4898 scope.go:117] "RemoveContainer" containerID="e2755f8bb613c4ca067aa4bb55241b92843fd6eb20b3661f1bc3e09f0a4a33da" Mar 13 14:59:24 crc kubenswrapper[4898]: E0313 14:59:24.741554 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:59:38 crc kubenswrapper[4898]: I0313 14:59:38.740391 4898 scope.go:117] "RemoveContainer" containerID="e2755f8bb613c4ca067aa4bb55241b92843fd6eb20b3661f1bc3e09f0a4a33da" Mar 13 14:59:38 crc kubenswrapper[4898]: E0313 14:59:38.741314 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:59:49 crc kubenswrapper[4898]: I0313 14:59:49.740828 4898 scope.go:117] "RemoveContainer" containerID="e2755f8bb613c4ca067aa4bb55241b92843fd6eb20b3661f1bc3e09f0a4a33da" Mar 13 14:59:49 crc kubenswrapper[4898]: E0313 14:59:49.742545 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:00:00 crc kubenswrapper[4898]: I0313 15:00:00.157921 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556900-qqdxv"] Mar 13 15:00:00 crc kubenswrapper[4898]: E0313 15:00:00.159325 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2033726f-d64f-4989-8837-cec9738c8491" containerName="oc" Mar 13 15:00:00 crc kubenswrapper[4898]: I0313 15:00:00.159351 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="2033726f-d64f-4989-8837-cec9738c8491" containerName="oc" Mar 13 15:00:00 crc kubenswrapper[4898]: I0313 15:00:00.159812 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="2033726f-d64f-4989-8837-cec9738c8491" containerName="oc" Mar 13 15:00:00 crc kubenswrapper[4898]: I0313 15:00:00.161289 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556900-qqdxv" Mar 13 15:00:00 crc kubenswrapper[4898]: I0313 15:00:00.167594 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 15:00:00 crc kubenswrapper[4898]: I0313 15:00:00.167642 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 15:00:00 crc kubenswrapper[4898]: I0313 15:00:00.169509 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps" Mar 13 15:00:00 crc kubenswrapper[4898]: I0313 15:00:00.169830 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556900-gxlck"] Mar 13 15:00:00 crc kubenswrapper[4898]: I0313 15:00:00.171295 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556900-gxlck" Mar 13 15:00:00 crc kubenswrapper[4898]: I0313 15:00:00.172577 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 13 15:00:00 crc kubenswrapper[4898]: I0313 15:00:00.173208 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 13 15:00:00 crc kubenswrapper[4898]: I0313 15:00:00.228324 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556900-qqdxv"] Mar 13 15:00:00 crc kubenswrapper[4898]: I0313 15:00:00.234277 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8a2c20e5-035e-42e8-9767-4caf8f6381f3-secret-volume\") pod \"collect-profiles-29556900-gxlck\" (UID: \"8a2c20e5-035e-42e8-9767-4caf8f6381f3\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29556900-gxlck" Mar 13 15:00:00 crc kubenswrapper[4898]: I0313 15:00:00.234348 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vx8tk\" (UniqueName: \"kubernetes.io/projected/8a2c20e5-035e-42e8-9767-4caf8f6381f3-kube-api-access-vx8tk\") pod \"collect-profiles-29556900-gxlck\" (UID: \"8a2c20e5-035e-42e8-9767-4caf8f6381f3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556900-gxlck" Mar 13 15:00:00 crc kubenswrapper[4898]: I0313 15:00:00.234410 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xxdp\" (UniqueName: \"kubernetes.io/projected/b0efa686-df70-493a-92dc-90db2ee67205-kube-api-access-6xxdp\") pod \"auto-csr-approver-29556900-qqdxv\" (UID: \"b0efa686-df70-493a-92dc-90db2ee67205\") " pod="openshift-infra/auto-csr-approver-29556900-qqdxv" Mar 13 15:00:00 crc kubenswrapper[4898]: I0313 15:00:00.234429 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8a2c20e5-035e-42e8-9767-4caf8f6381f3-config-volume\") pod \"collect-profiles-29556900-gxlck\" (UID: \"8a2c20e5-035e-42e8-9767-4caf8f6381f3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556900-gxlck" Mar 13 15:00:00 crc kubenswrapper[4898]: I0313 15:00:00.245371 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556900-gxlck"] Mar 13 15:00:00 crc kubenswrapper[4898]: I0313 15:00:00.336263 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xxdp\" (UniqueName: \"kubernetes.io/projected/b0efa686-df70-493a-92dc-90db2ee67205-kube-api-access-6xxdp\") pod \"auto-csr-approver-29556900-qqdxv\" (UID: \"b0efa686-df70-493a-92dc-90db2ee67205\") " 
pod="openshift-infra/auto-csr-approver-29556900-qqdxv" Mar 13 15:00:00 crc kubenswrapper[4898]: I0313 15:00:00.336310 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8a2c20e5-035e-42e8-9767-4caf8f6381f3-config-volume\") pod \"collect-profiles-29556900-gxlck\" (UID: \"8a2c20e5-035e-42e8-9767-4caf8f6381f3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556900-gxlck" Mar 13 15:00:00 crc kubenswrapper[4898]: I0313 15:00:00.336473 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8a2c20e5-035e-42e8-9767-4caf8f6381f3-secret-volume\") pod \"collect-profiles-29556900-gxlck\" (UID: \"8a2c20e5-035e-42e8-9767-4caf8f6381f3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556900-gxlck" Mar 13 15:00:00 crc kubenswrapper[4898]: I0313 15:00:00.336535 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vx8tk\" (UniqueName: \"kubernetes.io/projected/8a2c20e5-035e-42e8-9767-4caf8f6381f3-kube-api-access-vx8tk\") pod \"collect-profiles-29556900-gxlck\" (UID: \"8a2c20e5-035e-42e8-9767-4caf8f6381f3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556900-gxlck" Mar 13 15:00:00 crc kubenswrapper[4898]: I0313 15:00:00.338310 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8a2c20e5-035e-42e8-9767-4caf8f6381f3-config-volume\") pod \"collect-profiles-29556900-gxlck\" (UID: \"8a2c20e5-035e-42e8-9767-4caf8f6381f3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556900-gxlck" Mar 13 15:00:00 crc kubenswrapper[4898]: I0313 15:00:00.342360 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8a2c20e5-035e-42e8-9767-4caf8f6381f3-secret-volume\") pod 
\"collect-profiles-29556900-gxlck\" (UID: \"8a2c20e5-035e-42e8-9767-4caf8f6381f3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556900-gxlck" Mar 13 15:00:00 crc kubenswrapper[4898]: I0313 15:00:00.357533 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vx8tk\" (UniqueName: \"kubernetes.io/projected/8a2c20e5-035e-42e8-9767-4caf8f6381f3-kube-api-access-vx8tk\") pod \"collect-profiles-29556900-gxlck\" (UID: \"8a2c20e5-035e-42e8-9767-4caf8f6381f3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556900-gxlck" Mar 13 15:00:00 crc kubenswrapper[4898]: I0313 15:00:00.362914 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xxdp\" (UniqueName: \"kubernetes.io/projected/b0efa686-df70-493a-92dc-90db2ee67205-kube-api-access-6xxdp\") pod \"auto-csr-approver-29556900-qqdxv\" (UID: \"b0efa686-df70-493a-92dc-90db2ee67205\") " pod="openshift-infra/auto-csr-approver-29556900-qqdxv" Mar 13 15:00:00 crc kubenswrapper[4898]: I0313 15:00:00.498817 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556900-qqdxv" Mar 13 15:00:00 crc kubenswrapper[4898]: I0313 15:00:00.513726 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556900-gxlck" Mar 13 15:00:01 crc kubenswrapper[4898]: I0313 15:00:01.017416 4898 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 15:00:01 crc kubenswrapper[4898]: I0313 15:00:01.018261 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556900-qqdxv"] Mar 13 15:00:01 crc kubenswrapper[4898]: I0313 15:00:01.168533 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556900-gxlck"] Mar 13 15:00:01 crc kubenswrapper[4898]: I0313 15:00:01.740764 4898 scope.go:117] "RemoveContainer" containerID="e2755f8bb613c4ca067aa4bb55241b92843fd6eb20b3661f1bc3e09f0a4a33da" Mar 13 15:00:01 crc kubenswrapper[4898]: E0313 15:00:01.741600 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:00:01 crc kubenswrapper[4898]: I0313 15:00:01.880297 4898 generic.go:334] "Generic (PLEG): container finished" podID="8a2c20e5-035e-42e8-9767-4caf8f6381f3" containerID="f85839693cc5404eb8f27038e41ba6dbfc5580a2e1e80c25856c6a29ee2aad6a" exitCode=0 Mar 13 15:00:01 crc kubenswrapper[4898]: I0313 15:00:01.880408 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556900-gxlck" event={"ID":"8a2c20e5-035e-42e8-9767-4caf8f6381f3","Type":"ContainerDied","Data":"f85839693cc5404eb8f27038e41ba6dbfc5580a2e1e80c25856c6a29ee2aad6a"} Mar 13 15:00:01 crc kubenswrapper[4898]: I0313 15:00:01.880724 4898 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556900-gxlck" event={"ID":"8a2c20e5-035e-42e8-9767-4caf8f6381f3","Type":"ContainerStarted","Data":"c5820a7dca51c230241d9431be06a9a8a7d219e0fca2c08870b1e7964c87000b"} Mar 13 15:00:01 crc kubenswrapper[4898]: I0313 15:00:01.882405 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556900-qqdxv" event={"ID":"b0efa686-df70-493a-92dc-90db2ee67205","Type":"ContainerStarted","Data":"95ca6919ce9ba093b6a1dac4b4c3eccd147c30c4300aae3001ff16e6ba188e48"} Mar 13 15:00:03 crc kubenswrapper[4898]: I0313 15:00:03.336362 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556900-gxlck" Mar 13 15:00:03 crc kubenswrapper[4898]: I0313 15:00:03.417309 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8a2c20e5-035e-42e8-9767-4caf8f6381f3-config-volume\") pod \"8a2c20e5-035e-42e8-9767-4caf8f6381f3\" (UID: \"8a2c20e5-035e-42e8-9767-4caf8f6381f3\") " Mar 13 15:00:03 crc kubenswrapper[4898]: I0313 15:00:03.417380 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8a2c20e5-035e-42e8-9767-4caf8f6381f3-secret-volume\") pod \"8a2c20e5-035e-42e8-9767-4caf8f6381f3\" (UID: \"8a2c20e5-035e-42e8-9767-4caf8f6381f3\") " Mar 13 15:00:03 crc kubenswrapper[4898]: I0313 15:00:03.417503 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vx8tk\" (UniqueName: \"kubernetes.io/projected/8a2c20e5-035e-42e8-9767-4caf8f6381f3-kube-api-access-vx8tk\") pod \"8a2c20e5-035e-42e8-9767-4caf8f6381f3\" (UID: \"8a2c20e5-035e-42e8-9767-4caf8f6381f3\") " Mar 13 15:00:03 crc kubenswrapper[4898]: I0313 15:00:03.418586 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/configmap/8a2c20e5-035e-42e8-9767-4caf8f6381f3-config-volume" (OuterVolumeSpecName: "config-volume") pod "8a2c20e5-035e-42e8-9767-4caf8f6381f3" (UID: "8a2c20e5-035e-42e8-9767-4caf8f6381f3"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:00:03 crc kubenswrapper[4898]: I0313 15:00:03.424051 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a2c20e5-035e-42e8-9767-4caf8f6381f3-kube-api-access-vx8tk" (OuterVolumeSpecName: "kube-api-access-vx8tk") pod "8a2c20e5-035e-42e8-9767-4caf8f6381f3" (UID: "8a2c20e5-035e-42e8-9767-4caf8f6381f3"). InnerVolumeSpecName "kube-api-access-vx8tk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:00:03 crc kubenswrapper[4898]: I0313 15:00:03.425036 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a2c20e5-035e-42e8-9767-4caf8f6381f3-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "8a2c20e5-035e-42e8-9767-4caf8f6381f3" (UID: "8a2c20e5-035e-42e8-9767-4caf8f6381f3"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:00:03 crc kubenswrapper[4898]: I0313 15:00:03.520764 4898 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8a2c20e5-035e-42e8-9767-4caf8f6381f3-config-volume\") on node \"crc\" DevicePath \"\"" Mar 13 15:00:03 crc kubenswrapper[4898]: I0313 15:00:03.521064 4898 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8a2c20e5-035e-42e8-9767-4caf8f6381f3-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 13 15:00:03 crc kubenswrapper[4898]: I0313 15:00:03.521077 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vx8tk\" (UniqueName: \"kubernetes.io/projected/8a2c20e5-035e-42e8-9767-4caf8f6381f3-kube-api-access-vx8tk\") on node \"crc\" DevicePath \"\"" Mar 13 15:00:03 crc kubenswrapper[4898]: I0313 15:00:03.909226 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556900-gxlck" event={"ID":"8a2c20e5-035e-42e8-9767-4caf8f6381f3","Type":"ContainerDied","Data":"c5820a7dca51c230241d9431be06a9a8a7d219e0fca2c08870b1e7964c87000b"} Mar 13 15:00:03 crc kubenswrapper[4898]: I0313 15:00:03.909264 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c5820a7dca51c230241d9431be06a9a8a7d219e0fca2c08870b1e7964c87000b" Mar 13 15:00:03 crc kubenswrapper[4898]: I0313 15:00:03.909273 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556900-gxlck" Mar 13 15:00:04 crc kubenswrapper[4898]: I0313 15:00:04.438019 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556855-r5t9l"] Mar 13 15:00:04 crc kubenswrapper[4898]: I0313 15:00:04.452301 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556855-r5t9l"] Mar 13 15:00:04 crc kubenswrapper[4898]: I0313 15:00:04.934372 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556900-qqdxv" event={"ID":"b0efa686-df70-493a-92dc-90db2ee67205","Type":"ContainerStarted","Data":"cc698733d50d55655a41f78a9335173ab8803d13e8dc8d8d6d62ee958bbac18b"} Mar 13 15:00:04 crc kubenswrapper[4898]: I0313 15:00:04.959146 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556900-qqdxv" podStartSLOduration=1.389308172 podStartE2EDuration="4.959126091s" podCreationTimestamp="2026-03-13 15:00:00 +0000 UTC" firstStartedPulling="2026-03-13 15:00:01.017206305 +0000 UTC m=+3836.018794534" lastFinishedPulling="2026-03-13 15:00:04.587024194 +0000 UTC m=+3839.588612453" observedRunningTime="2026-03-13 15:00:04.9457776 +0000 UTC m=+3839.947365849" watchObservedRunningTime="2026-03-13 15:00:04.959126091 +0000 UTC m=+3839.960714340" Mar 13 15:00:05 crc kubenswrapper[4898]: I0313 15:00:05.758816 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9b296c2-5046-40b3-9fca-be350cf5de3e" path="/var/lib/kubelet/pods/d9b296c2-5046-40b3-9fca-be350cf5de3e/volumes" Mar 13 15:00:05 crc kubenswrapper[4898]: I0313 15:00:05.948250 4898 generic.go:334] "Generic (PLEG): container finished" podID="b0efa686-df70-493a-92dc-90db2ee67205" containerID="cc698733d50d55655a41f78a9335173ab8803d13e8dc8d8d6d62ee958bbac18b" exitCode=0 Mar 13 15:00:05 crc kubenswrapper[4898]: I0313 
15:00:05.948342 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556900-qqdxv" event={"ID":"b0efa686-df70-493a-92dc-90db2ee67205","Type":"ContainerDied","Data":"cc698733d50d55655a41f78a9335173ab8803d13e8dc8d8d6d62ee958bbac18b"} Mar 13 15:00:07 crc kubenswrapper[4898]: I0313 15:00:07.378610 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556900-qqdxv" Mar 13 15:00:07 crc kubenswrapper[4898]: I0313 15:00:07.430163 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xxdp\" (UniqueName: \"kubernetes.io/projected/b0efa686-df70-493a-92dc-90db2ee67205-kube-api-access-6xxdp\") pod \"b0efa686-df70-493a-92dc-90db2ee67205\" (UID: \"b0efa686-df70-493a-92dc-90db2ee67205\") " Mar 13 15:00:07 crc kubenswrapper[4898]: I0313 15:00:07.453252 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0efa686-df70-493a-92dc-90db2ee67205-kube-api-access-6xxdp" (OuterVolumeSpecName: "kube-api-access-6xxdp") pod "b0efa686-df70-493a-92dc-90db2ee67205" (UID: "b0efa686-df70-493a-92dc-90db2ee67205"). InnerVolumeSpecName "kube-api-access-6xxdp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:00:07 crc kubenswrapper[4898]: I0313 15:00:07.533093 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xxdp\" (UniqueName: \"kubernetes.io/projected/b0efa686-df70-493a-92dc-90db2ee67205-kube-api-access-6xxdp\") on node \"crc\" DevicePath \"\"" Mar 13 15:00:07 crc kubenswrapper[4898]: I0313 15:00:07.976838 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556900-qqdxv" event={"ID":"b0efa686-df70-493a-92dc-90db2ee67205","Type":"ContainerDied","Data":"95ca6919ce9ba093b6a1dac4b4c3eccd147c30c4300aae3001ff16e6ba188e48"} Mar 13 15:00:07 crc kubenswrapper[4898]: I0313 15:00:07.977144 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="95ca6919ce9ba093b6a1dac4b4c3eccd147c30c4300aae3001ff16e6ba188e48" Mar 13 15:00:07 crc kubenswrapper[4898]: I0313 15:00:07.976929 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556900-qqdxv" Mar 13 15:00:08 crc kubenswrapper[4898]: I0313 15:00:08.030698 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556894-4f5n2"] Mar 13 15:00:08 crc kubenswrapper[4898]: I0313 15:00:08.044097 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556894-4f5n2"] Mar 13 15:00:09 crc kubenswrapper[4898]: I0313 15:00:09.756258 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9bfc1e4-be1f-4495-a7de-2b4f94e901d8" path="/var/lib/kubelet/pods/d9bfc1e4-be1f-4495-a7de-2b4f94e901d8/volumes" Mar 13 15:00:14 crc kubenswrapper[4898]: I0313 15:00:14.742182 4898 scope.go:117] "RemoveContainer" containerID="e2755f8bb613c4ca067aa4bb55241b92843fd6eb20b3661f1bc3e09f0a4a33da" Mar 13 15:00:14 crc kubenswrapper[4898]: E0313 15:00:14.743312 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:00:29 crc kubenswrapper[4898]: I0313 15:00:29.740618 4898 scope.go:117] "RemoveContainer" containerID="e2755f8bb613c4ca067aa4bb55241b92843fd6eb20b3661f1bc3e09f0a4a33da" Mar 13 15:00:29 crc kubenswrapper[4898]: E0313 15:00:29.742197 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:00:40 crc kubenswrapper[4898]: I0313 15:00:40.739616 4898 scope.go:117] "RemoveContainer" containerID="e2755f8bb613c4ca067aa4bb55241b92843fd6eb20b3661f1bc3e09f0a4a33da" Mar 13 15:00:40 crc kubenswrapper[4898]: E0313 15:00:40.740928 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:00:42 crc kubenswrapper[4898]: I0313 15:00:42.564179 4898 scope.go:117] "RemoveContainer" containerID="508333f96d89e6fb34c9fb0fe392b0bcdb91535cabc45655a886e6d88f90fef5" Mar 13 15:00:42 crc kubenswrapper[4898]: I0313 15:00:42.636255 4898 scope.go:117] "RemoveContainer" 
containerID="183f0c268935ae6820699911fc0be58b4d0e93db5e614c9661b0f4b96dcc6afd" Mar 13 15:00:51 crc kubenswrapper[4898]: I0313 15:00:51.739843 4898 scope.go:117] "RemoveContainer" containerID="e2755f8bb613c4ca067aa4bb55241b92843fd6eb20b3661f1bc3e09f0a4a33da" Mar 13 15:00:51 crc kubenswrapper[4898]: E0313 15:00:51.740952 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:01:00 crc kubenswrapper[4898]: I0313 15:01:00.188270 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29556901-6pnrd"] Mar 13 15:01:00 crc kubenswrapper[4898]: E0313 15:01:00.190015 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a2c20e5-035e-42e8-9767-4caf8f6381f3" containerName="collect-profiles" Mar 13 15:01:00 crc kubenswrapper[4898]: I0313 15:01:00.190052 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a2c20e5-035e-42e8-9767-4caf8f6381f3" containerName="collect-profiles" Mar 13 15:01:00 crc kubenswrapper[4898]: E0313 15:01:00.190113 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0efa686-df70-493a-92dc-90db2ee67205" containerName="oc" Mar 13 15:01:00 crc kubenswrapper[4898]: I0313 15:01:00.190132 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0efa686-df70-493a-92dc-90db2ee67205" containerName="oc" Mar 13 15:01:00 crc kubenswrapper[4898]: I0313 15:01:00.190651 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a2c20e5-035e-42e8-9767-4caf8f6381f3" containerName="collect-profiles" Mar 13 15:01:00 crc kubenswrapper[4898]: I0313 15:01:00.190713 4898 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="b0efa686-df70-493a-92dc-90db2ee67205" containerName="oc" Mar 13 15:01:00 crc kubenswrapper[4898]: I0313 15:01:00.192781 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29556901-6pnrd" Mar 13 15:01:00 crc kubenswrapper[4898]: I0313 15:01:00.210189 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29556901-6pnrd"] Mar 13 15:01:00 crc kubenswrapper[4898]: I0313 15:01:00.298405 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3d71da57-c929-47d7-89bd-8e4e3c7f3ca0-fernet-keys\") pod \"keystone-cron-29556901-6pnrd\" (UID: \"3d71da57-c929-47d7-89bd-8e4e3c7f3ca0\") " pod="openstack/keystone-cron-29556901-6pnrd" Mar 13 15:01:00 crc kubenswrapper[4898]: I0313 15:01:00.298461 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d71da57-c929-47d7-89bd-8e4e3c7f3ca0-config-data\") pod \"keystone-cron-29556901-6pnrd\" (UID: \"3d71da57-c929-47d7-89bd-8e4e3c7f3ca0\") " pod="openstack/keystone-cron-29556901-6pnrd" Mar 13 15:01:00 crc kubenswrapper[4898]: I0313 15:01:00.298494 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d71da57-c929-47d7-89bd-8e4e3c7f3ca0-combined-ca-bundle\") pod \"keystone-cron-29556901-6pnrd\" (UID: \"3d71da57-c929-47d7-89bd-8e4e3c7f3ca0\") " pod="openstack/keystone-cron-29556901-6pnrd" Mar 13 15:01:00 crc kubenswrapper[4898]: I0313 15:01:00.298939 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6png\" (UniqueName: \"kubernetes.io/projected/3d71da57-c929-47d7-89bd-8e4e3c7f3ca0-kube-api-access-r6png\") pod \"keystone-cron-29556901-6pnrd\" (UID: 
\"3d71da57-c929-47d7-89bd-8e4e3c7f3ca0\") " pod="openstack/keystone-cron-29556901-6pnrd" Mar 13 15:01:00 crc kubenswrapper[4898]: I0313 15:01:00.401277 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3d71da57-c929-47d7-89bd-8e4e3c7f3ca0-fernet-keys\") pod \"keystone-cron-29556901-6pnrd\" (UID: \"3d71da57-c929-47d7-89bd-8e4e3c7f3ca0\") " pod="openstack/keystone-cron-29556901-6pnrd" Mar 13 15:01:00 crc kubenswrapper[4898]: I0313 15:01:00.401327 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d71da57-c929-47d7-89bd-8e4e3c7f3ca0-config-data\") pod \"keystone-cron-29556901-6pnrd\" (UID: \"3d71da57-c929-47d7-89bd-8e4e3c7f3ca0\") " pod="openstack/keystone-cron-29556901-6pnrd" Mar 13 15:01:00 crc kubenswrapper[4898]: I0313 15:01:00.401352 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d71da57-c929-47d7-89bd-8e4e3c7f3ca0-combined-ca-bundle\") pod \"keystone-cron-29556901-6pnrd\" (UID: \"3d71da57-c929-47d7-89bd-8e4e3c7f3ca0\") " pod="openstack/keystone-cron-29556901-6pnrd" Mar 13 15:01:00 crc kubenswrapper[4898]: I0313 15:01:00.401462 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6png\" (UniqueName: \"kubernetes.io/projected/3d71da57-c929-47d7-89bd-8e4e3c7f3ca0-kube-api-access-r6png\") pod \"keystone-cron-29556901-6pnrd\" (UID: \"3d71da57-c929-47d7-89bd-8e4e3c7f3ca0\") " pod="openstack/keystone-cron-29556901-6pnrd" Mar 13 15:01:00 crc kubenswrapper[4898]: I0313 15:01:00.410113 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d71da57-c929-47d7-89bd-8e4e3c7f3ca0-combined-ca-bundle\") pod \"keystone-cron-29556901-6pnrd\" (UID: \"3d71da57-c929-47d7-89bd-8e4e3c7f3ca0\") " 
pod="openstack/keystone-cron-29556901-6pnrd" Mar 13 15:01:00 crc kubenswrapper[4898]: I0313 15:01:00.410144 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d71da57-c929-47d7-89bd-8e4e3c7f3ca0-config-data\") pod \"keystone-cron-29556901-6pnrd\" (UID: \"3d71da57-c929-47d7-89bd-8e4e3c7f3ca0\") " pod="openstack/keystone-cron-29556901-6pnrd" Mar 13 15:01:00 crc kubenswrapper[4898]: I0313 15:01:00.410814 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3d71da57-c929-47d7-89bd-8e4e3c7f3ca0-fernet-keys\") pod \"keystone-cron-29556901-6pnrd\" (UID: \"3d71da57-c929-47d7-89bd-8e4e3c7f3ca0\") " pod="openstack/keystone-cron-29556901-6pnrd" Mar 13 15:01:00 crc kubenswrapper[4898]: I0313 15:01:00.424296 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6png\" (UniqueName: \"kubernetes.io/projected/3d71da57-c929-47d7-89bd-8e4e3c7f3ca0-kube-api-access-r6png\") pod \"keystone-cron-29556901-6pnrd\" (UID: \"3d71da57-c929-47d7-89bd-8e4e3c7f3ca0\") " pod="openstack/keystone-cron-29556901-6pnrd" Mar 13 15:01:00 crc kubenswrapper[4898]: I0313 15:01:00.528829 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29556901-6pnrd" Mar 13 15:01:01 crc kubenswrapper[4898]: I0313 15:01:01.078614 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29556901-6pnrd"] Mar 13 15:01:01 crc kubenswrapper[4898]: I0313 15:01:01.697440 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29556901-6pnrd" event={"ID":"3d71da57-c929-47d7-89bd-8e4e3c7f3ca0","Type":"ContainerStarted","Data":"79a68b94993c8c73e536b1353fccc0391854becd695923f33f632c92842c2cda"} Mar 13 15:01:01 crc kubenswrapper[4898]: I0313 15:01:01.697749 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29556901-6pnrd" event={"ID":"3d71da57-c929-47d7-89bd-8e4e3c7f3ca0","Type":"ContainerStarted","Data":"940d6aadcfa81d36e29a2fddf2d6c3158db5b07defb4474691f44c67a62308c8"} Mar 13 15:01:01 crc kubenswrapper[4898]: I0313 15:01:01.771071 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29556901-6pnrd" podStartSLOduration=1.771047975 podStartE2EDuration="1.771047975s" podCreationTimestamp="2026-03-13 15:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:01:01.711774885 +0000 UTC m=+3896.713363144" watchObservedRunningTime="2026-03-13 15:01:01.771047975 +0000 UTC m=+3896.772636224" Mar 13 15:01:04 crc kubenswrapper[4898]: I0313 15:01:04.735612 4898 generic.go:334] "Generic (PLEG): container finished" podID="3d71da57-c929-47d7-89bd-8e4e3c7f3ca0" containerID="79a68b94993c8c73e536b1353fccc0391854becd695923f33f632c92842c2cda" exitCode=0 Mar 13 15:01:04 crc kubenswrapper[4898]: I0313 15:01:04.735661 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29556901-6pnrd" 
event={"ID":"3d71da57-c929-47d7-89bd-8e4e3c7f3ca0","Type":"ContainerDied","Data":"79a68b94993c8c73e536b1353fccc0391854becd695923f33f632c92842c2cda"} Mar 13 15:01:06 crc kubenswrapper[4898]: I0313 15:01:06.410418 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29556901-6pnrd" Mar 13 15:01:06 crc kubenswrapper[4898]: I0313 15:01:06.585745 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3d71da57-c929-47d7-89bd-8e4e3c7f3ca0-fernet-keys\") pod \"3d71da57-c929-47d7-89bd-8e4e3c7f3ca0\" (UID: \"3d71da57-c929-47d7-89bd-8e4e3c7f3ca0\") " Mar 13 15:01:06 crc kubenswrapper[4898]: I0313 15:01:06.585825 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d71da57-c929-47d7-89bd-8e4e3c7f3ca0-config-data\") pod \"3d71da57-c929-47d7-89bd-8e4e3c7f3ca0\" (UID: \"3d71da57-c929-47d7-89bd-8e4e3c7f3ca0\") " Mar 13 15:01:06 crc kubenswrapper[4898]: I0313 15:01:06.585917 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6png\" (UniqueName: \"kubernetes.io/projected/3d71da57-c929-47d7-89bd-8e4e3c7f3ca0-kube-api-access-r6png\") pod \"3d71da57-c929-47d7-89bd-8e4e3c7f3ca0\" (UID: \"3d71da57-c929-47d7-89bd-8e4e3c7f3ca0\") " Mar 13 15:01:06 crc kubenswrapper[4898]: I0313 15:01:06.585951 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d71da57-c929-47d7-89bd-8e4e3c7f3ca0-combined-ca-bundle\") pod \"3d71da57-c929-47d7-89bd-8e4e3c7f3ca0\" (UID: \"3d71da57-c929-47d7-89bd-8e4e3c7f3ca0\") " Mar 13 15:01:06 crc kubenswrapper[4898]: I0313 15:01:06.592171 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d71da57-c929-47d7-89bd-8e4e3c7f3ca0-kube-api-access-r6png" 
(OuterVolumeSpecName: "kube-api-access-r6png") pod "3d71da57-c929-47d7-89bd-8e4e3c7f3ca0" (UID: "3d71da57-c929-47d7-89bd-8e4e3c7f3ca0"). InnerVolumeSpecName "kube-api-access-r6png". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:01:06 crc kubenswrapper[4898]: I0313 15:01:06.598037 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d71da57-c929-47d7-89bd-8e4e3c7f3ca0-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "3d71da57-c929-47d7-89bd-8e4e3c7f3ca0" (UID: "3d71da57-c929-47d7-89bd-8e4e3c7f3ca0"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:01:06 crc kubenswrapper[4898]: I0313 15:01:06.625458 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d71da57-c929-47d7-89bd-8e4e3c7f3ca0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3d71da57-c929-47d7-89bd-8e4e3c7f3ca0" (UID: "3d71da57-c929-47d7-89bd-8e4e3c7f3ca0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:01:06 crc kubenswrapper[4898]: I0313 15:01:06.674511 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d71da57-c929-47d7-89bd-8e4e3c7f3ca0-config-data" (OuterVolumeSpecName: "config-data") pod "3d71da57-c929-47d7-89bd-8e4e3c7f3ca0" (UID: "3d71da57-c929-47d7-89bd-8e4e3c7f3ca0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:01:06 crc kubenswrapper[4898]: I0313 15:01:06.689099 4898 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3d71da57-c929-47d7-89bd-8e4e3c7f3ca0-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 13 15:01:06 crc kubenswrapper[4898]: I0313 15:01:06.689135 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d71da57-c929-47d7-89bd-8e4e3c7f3ca0-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 15:01:06 crc kubenswrapper[4898]: I0313 15:01:06.689145 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6png\" (UniqueName: \"kubernetes.io/projected/3d71da57-c929-47d7-89bd-8e4e3c7f3ca0-kube-api-access-r6png\") on node \"crc\" DevicePath \"\"" Mar 13 15:01:06 crc kubenswrapper[4898]: I0313 15:01:06.689154 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d71da57-c929-47d7-89bd-8e4e3c7f3ca0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 15:01:06 crc kubenswrapper[4898]: I0313 15:01:06.739675 4898 scope.go:117] "RemoveContainer" containerID="e2755f8bb613c4ca067aa4bb55241b92843fd6eb20b3661f1bc3e09f0a4a33da" Mar 13 15:01:06 crc kubenswrapper[4898]: E0313 15:01:06.740215 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:01:06 crc kubenswrapper[4898]: I0313 15:01:06.786189 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29556901-6pnrd" 
event={"ID":"3d71da57-c929-47d7-89bd-8e4e3c7f3ca0","Type":"ContainerDied","Data":"940d6aadcfa81d36e29a2fddf2d6c3158db5b07defb4474691f44c67a62308c8"} Mar 13 15:01:06 crc kubenswrapper[4898]: I0313 15:01:06.786290 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="940d6aadcfa81d36e29a2fddf2d6c3158db5b07defb4474691f44c67a62308c8" Mar 13 15:01:06 crc kubenswrapper[4898]: I0313 15:01:06.786348 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29556901-6pnrd" Mar 13 15:01:17 crc kubenswrapper[4898]: I0313 15:01:17.741262 4898 scope.go:117] "RemoveContainer" containerID="e2755f8bb613c4ca067aa4bb55241b92843fd6eb20b3661f1bc3e09f0a4a33da" Mar 13 15:01:17 crc kubenswrapper[4898]: E0313 15:01:17.742163 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:01:29 crc kubenswrapper[4898]: I0313 15:01:29.741151 4898 scope.go:117] "RemoveContainer" containerID="e2755f8bb613c4ca067aa4bb55241b92843fd6eb20b3661f1bc3e09f0a4a33da" Mar 13 15:01:29 crc kubenswrapper[4898]: E0313 15:01:29.742492 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:01:43 crc kubenswrapper[4898]: I0313 15:01:43.740153 4898 scope.go:117] 
"RemoveContainer" containerID="e2755f8bb613c4ca067aa4bb55241b92843fd6eb20b3661f1bc3e09f0a4a33da" Mar 13 15:01:43 crc kubenswrapper[4898]: E0313 15:01:43.741274 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:01:56 crc kubenswrapper[4898]: I0313 15:01:56.739640 4898 scope.go:117] "RemoveContainer" containerID="e2755f8bb613c4ca067aa4bb55241b92843fd6eb20b3661f1bc3e09f0a4a33da" Mar 13 15:01:57 crc kubenswrapper[4898]: I0313 15:01:57.479926 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerStarted","Data":"45a214eae7c7c7859e650d708f47b93bb385cc743f9e3e627bde30ab582e5ff5"} Mar 13 15:02:00 crc kubenswrapper[4898]: I0313 15:02:00.151776 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556902-s4cg2"] Mar 13 15:02:00 crc kubenswrapper[4898]: E0313 15:02:00.153147 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d71da57-c929-47d7-89bd-8e4e3c7f3ca0" containerName="keystone-cron" Mar 13 15:02:00 crc kubenswrapper[4898]: I0313 15:02:00.153171 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d71da57-c929-47d7-89bd-8e4e3c7f3ca0" containerName="keystone-cron" Mar 13 15:02:00 crc kubenswrapper[4898]: I0313 15:02:00.153470 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d71da57-c929-47d7-89bd-8e4e3c7f3ca0" containerName="keystone-cron" Mar 13 15:02:00 crc kubenswrapper[4898]: I0313 15:02:00.154726 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556902-s4cg2" Mar 13 15:02:00 crc kubenswrapper[4898]: I0313 15:02:00.159205 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps" Mar 13 15:02:00 crc kubenswrapper[4898]: I0313 15:02:00.159451 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 15:02:00 crc kubenswrapper[4898]: I0313 15:02:00.160842 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 15:02:00 crc kubenswrapper[4898]: I0313 15:02:00.171113 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556902-s4cg2"] Mar 13 15:02:00 crc kubenswrapper[4898]: I0313 15:02:00.331238 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zbzm\" (UniqueName: \"kubernetes.io/projected/6d9d3379-4b7b-4263-aec1-10c06dc087e6-kube-api-access-2zbzm\") pod \"auto-csr-approver-29556902-s4cg2\" (UID: \"6d9d3379-4b7b-4263-aec1-10c06dc087e6\") " pod="openshift-infra/auto-csr-approver-29556902-s4cg2" Mar 13 15:02:00 crc kubenswrapper[4898]: I0313 15:02:00.433782 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zbzm\" (UniqueName: \"kubernetes.io/projected/6d9d3379-4b7b-4263-aec1-10c06dc087e6-kube-api-access-2zbzm\") pod \"auto-csr-approver-29556902-s4cg2\" (UID: \"6d9d3379-4b7b-4263-aec1-10c06dc087e6\") " pod="openshift-infra/auto-csr-approver-29556902-s4cg2" Mar 13 15:02:00 crc kubenswrapper[4898]: I0313 15:02:00.451956 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zbzm\" (UniqueName: \"kubernetes.io/projected/6d9d3379-4b7b-4263-aec1-10c06dc087e6-kube-api-access-2zbzm\") pod \"auto-csr-approver-29556902-s4cg2\" (UID: \"6d9d3379-4b7b-4263-aec1-10c06dc087e6\") " 
pod="openshift-infra/auto-csr-approver-29556902-s4cg2" Mar 13 15:02:00 crc kubenswrapper[4898]: I0313 15:02:00.488594 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556902-s4cg2" Mar 13 15:02:01 crc kubenswrapper[4898]: I0313 15:02:01.050129 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556902-s4cg2"] Mar 13 15:02:01 crc kubenswrapper[4898]: W0313 15:02:01.055026 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d9d3379_4b7b_4263_aec1_10c06dc087e6.slice/crio-36f61f81391e8819a4f39904fa835f78cd2cec2cabe581ad8c356235ff92f1e6 WatchSource:0}: Error finding container 36f61f81391e8819a4f39904fa835f78cd2cec2cabe581ad8c356235ff92f1e6: Status 404 returned error can't find the container with id 36f61f81391e8819a4f39904fa835f78cd2cec2cabe581ad8c356235ff92f1e6 Mar 13 15:02:01 crc kubenswrapper[4898]: I0313 15:02:01.520417 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556902-s4cg2" event={"ID":"6d9d3379-4b7b-4263-aec1-10c06dc087e6","Type":"ContainerStarted","Data":"36f61f81391e8819a4f39904fa835f78cd2cec2cabe581ad8c356235ff92f1e6"} Mar 13 15:02:03 crc kubenswrapper[4898]: I0313 15:02:03.547168 4898 generic.go:334] "Generic (PLEG): container finished" podID="6d9d3379-4b7b-4263-aec1-10c06dc087e6" containerID="e906ab6657ba9367ad2d982719bc7c2ae3c8f44a0ea2bd20f34b397b25a0a265" exitCode=0 Mar 13 15:02:03 crc kubenswrapper[4898]: I0313 15:02:03.547245 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556902-s4cg2" event={"ID":"6d9d3379-4b7b-4263-aec1-10c06dc087e6","Type":"ContainerDied","Data":"e906ab6657ba9367ad2d982719bc7c2ae3c8f44a0ea2bd20f34b397b25a0a265"} Mar 13 15:02:05 crc kubenswrapper[4898]: I0313 15:02:05.078260 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556902-s4cg2" Mar 13 15:02:05 crc kubenswrapper[4898]: I0313 15:02:05.170257 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2zbzm\" (UniqueName: \"kubernetes.io/projected/6d9d3379-4b7b-4263-aec1-10c06dc087e6-kube-api-access-2zbzm\") pod \"6d9d3379-4b7b-4263-aec1-10c06dc087e6\" (UID: \"6d9d3379-4b7b-4263-aec1-10c06dc087e6\") " Mar 13 15:02:05 crc kubenswrapper[4898]: I0313 15:02:05.196042 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d9d3379-4b7b-4263-aec1-10c06dc087e6-kube-api-access-2zbzm" (OuterVolumeSpecName: "kube-api-access-2zbzm") pod "6d9d3379-4b7b-4263-aec1-10c06dc087e6" (UID: "6d9d3379-4b7b-4263-aec1-10c06dc087e6"). InnerVolumeSpecName "kube-api-access-2zbzm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:02:05 crc kubenswrapper[4898]: I0313 15:02:05.273775 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2zbzm\" (UniqueName: \"kubernetes.io/projected/6d9d3379-4b7b-4263-aec1-10c06dc087e6-kube-api-access-2zbzm\") on node \"crc\" DevicePath \"\"" Mar 13 15:02:05 crc kubenswrapper[4898]: I0313 15:02:05.576152 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556902-s4cg2" event={"ID":"6d9d3379-4b7b-4263-aec1-10c06dc087e6","Type":"ContainerDied","Data":"36f61f81391e8819a4f39904fa835f78cd2cec2cabe581ad8c356235ff92f1e6"} Mar 13 15:02:05 crc kubenswrapper[4898]: I0313 15:02:05.576195 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="36f61f81391e8819a4f39904fa835f78cd2cec2cabe581ad8c356235ff92f1e6" Mar 13 15:02:05 crc kubenswrapper[4898]: I0313 15:02:05.576223 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556902-s4cg2" Mar 13 15:02:06 crc kubenswrapper[4898]: I0313 15:02:06.163419 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556896-tbs8b"] Mar 13 15:02:06 crc kubenswrapper[4898]: I0313 15:02:06.174074 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556896-tbs8b"] Mar 13 15:02:07 crc kubenswrapper[4898]: I0313 15:02:07.766027 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce9a8272-18eb-4001-a998-8e24fbe84593" path="/var/lib/kubelet/pods/ce9a8272-18eb-4001-a998-8e24fbe84593/volumes" Mar 13 15:02:42 crc kubenswrapper[4898]: I0313 15:02:42.767074 4898 scope.go:117] "RemoveContainer" containerID="9eb62522f49fe5dd1b7c8d52fa260ebf6b039aa00a9cd1719cb7535e8637b27e" Mar 13 15:04:00 crc kubenswrapper[4898]: I0313 15:04:00.162680 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556904-69dtq"] Mar 13 15:04:00 crc kubenswrapper[4898]: E0313 15:04:00.164688 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d9d3379-4b7b-4263-aec1-10c06dc087e6" containerName="oc" Mar 13 15:04:00 crc kubenswrapper[4898]: I0313 15:04:00.164782 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d9d3379-4b7b-4263-aec1-10c06dc087e6" containerName="oc" Mar 13 15:04:00 crc kubenswrapper[4898]: I0313 15:04:00.165186 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d9d3379-4b7b-4263-aec1-10c06dc087e6" containerName="oc" Mar 13 15:04:00 crc kubenswrapper[4898]: I0313 15:04:00.166227 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556904-69dtq" Mar 13 15:04:00 crc kubenswrapper[4898]: I0313 15:04:00.169400 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 15:04:00 crc kubenswrapper[4898]: I0313 15:04:00.169774 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps" Mar 13 15:04:00 crc kubenswrapper[4898]: I0313 15:04:00.170029 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 15:04:00 crc kubenswrapper[4898]: I0313 15:04:00.181155 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556904-69dtq"] Mar 13 15:04:00 crc kubenswrapper[4898]: I0313 15:04:00.288083 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsz88\" (UniqueName: \"kubernetes.io/projected/a124d846-aa21-4e8a-bb0e-57cc2aa7a3e0-kube-api-access-qsz88\") pod \"auto-csr-approver-29556904-69dtq\" (UID: \"a124d846-aa21-4e8a-bb0e-57cc2aa7a3e0\") " pod="openshift-infra/auto-csr-approver-29556904-69dtq" Mar 13 15:04:00 crc kubenswrapper[4898]: I0313 15:04:00.390518 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsz88\" (UniqueName: \"kubernetes.io/projected/a124d846-aa21-4e8a-bb0e-57cc2aa7a3e0-kube-api-access-qsz88\") pod \"auto-csr-approver-29556904-69dtq\" (UID: \"a124d846-aa21-4e8a-bb0e-57cc2aa7a3e0\") " pod="openshift-infra/auto-csr-approver-29556904-69dtq" Mar 13 15:04:00 crc kubenswrapper[4898]: I0313 15:04:00.411145 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsz88\" (UniqueName: \"kubernetes.io/projected/a124d846-aa21-4e8a-bb0e-57cc2aa7a3e0-kube-api-access-qsz88\") pod \"auto-csr-approver-29556904-69dtq\" (UID: \"a124d846-aa21-4e8a-bb0e-57cc2aa7a3e0\") " 
pod="openshift-infra/auto-csr-approver-29556904-69dtq" Mar 13 15:04:00 crc kubenswrapper[4898]: I0313 15:04:00.491676 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556904-69dtq" Mar 13 15:04:01 crc kubenswrapper[4898]: I0313 15:04:01.045174 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556904-69dtq"] Mar 13 15:04:01 crc kubenswrapper[4898]: I0313 15:04:01.939877 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556904-69dtq" event={"ID":"a124d846-aa21-4e8a-bb0e-57cc2aa7a3e0","Type":"ContainerStarted","Data":"008b1c64d4d8ea8f8815f261be80b2d3ddebe9bba1a289857558e6a036470c73"} Mar 13 15:04:02 crc kubenswrapper[4898]: I0313 15:04:02.951450 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556904-69dtq" event={"ID":"a124d846-aa21-4e8a-bb0e-57cc2aa7a3e0","Type":"ContainerStarted","Data":"cd5d0aba2324832ebc654e623ddb6b3b9935920ad75023c9232ebc2ff78ae2d3"} Mar 13 15:04:02 crc kubenswrapper[4898]: I0313 15:04:02.976905 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556904-69dtq" podStartSLOduration=1.910269677 podStartE2EDuration="2.976879095s" podCreationTimestamp="2026-03-13 15:04:00 +0000 UTC" firstStartedPulling="2026-03-13 15:04:01.046379605 +0000 UTC m=+4076.047967854" lastFinishedPulling="2026-03-13 15:04:02.112989033 +0000 UTC m=+4077.114577272" observedRunningTime="2026-03-13 15:04:02.969954254 +0000 UTC m=+4077.971542493" watchObservedRunningTime="2026-03-13 15:04:02.976879095 +0000 UTC m=+4077.978467334" Mar 13 15:04:03 crc kubenswrapper[4898]: I0313 15:04:03.965592 4898 generic.go:334] "Generic (PLEG): container finished" podID="a124d846-aa21-4e8a-bb0e-57cc2aa7a3e0" containerID="cd5d0aba2324832ebc654e623ddb6b3b9935920ad75023c9232ebc2ff78ae2d3" exitCode=0 Mar 13 15:04:03 crc 
kubenswrapper[4898]: I0313 15:04:03.965823 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556904-69dtq" event={"ID":"a124d846-aa21-4e8a-bb0e-57cc2aa7a3e0","Type":"ContainerDied","Data":"cd5d0aba2324832ebc654e623ddb6b3b9935920ad75023c9232ebc2ff78ae2d3"} Mar 13 15:04:05 crc kubenswrapper[4898]: I0313 15:04:05.476365 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556904-69dtq" Mar 13 15:04:05 crc kubenswrapper[4898]: I0313 15:04:05.526353 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qsz88\" (UniqueName: \"kubernetes.io/projected/a124d846-aa21-4e8a-bb0e-57cc2aa7a3e0-kube-api-access-qsz88\") pod \"a124d846-aa21-4e8a-bb0e-57cc2aa7a3e0\" (UID: \"a124d846-aa21-4e8a-bb0e-57cc2aa7a3e0\") " Mar 13 15:04:05 crc kubenswrapper[4898]: I0313 15:04:05.539504 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a124d846-aa21-4e8a-bb0e-57cc2aa7a3e0-kube-api-access-qsz88" (OuterVolumeSpecName: "kube-api-access-qsz88") pod "a124d846-aa21-4e8a-bb0e-57cc2aa7a3e0" (UID: "a124d846-aa21-4e8a-bb0e-57cc2aa7a3e0"). InnerVolumeSpecName "kube-api-access-qsz88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:04:05 crc kubenswrapper[4898]: I0313 15:04:05.630320 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qsz88\" (UniqueName: \"kubernetes.io/projected/a124d846-aa21-4e8a-bb0e-57cc2aa7a3e0-kube-api-access-qsz88\") on node \"crc\" DevicePath \"\"" Mar 13 15:04:06 crc kubenswrapper[4898]: I0313 15:04:06.021988 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556904-69dtq" event={"ID":"a124d846-aa21-4e8a-bb0e-57cc2aa7a3e0","Type":"ContainerDied","Data":"008b1c64d4d8ea8f8815f261be80b2d3ddebe9bba1a289857558e6a036470c73"} Mar 13 15:04:06 crc kubenswrapper[4898]: I0313 15:04:06.022058 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="008b1c64d4d8ea8f8815f261be80b2d3ddebe9bba1a289857558e6a036470c73" Mar 13 15:04:06 crc kubenswrapper[4898]: I0313 15:04:06.022145 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556904-69dtq" Mar 13 15:04:06 crc kubenswrapper[4898]: I0313 15:04:06.067008 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556898-zbc7v"] Mar 13 15:04:06 crc kubenswrapper[4898]: I0313 15:04:06.079733 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556898-zbc7v"] Mar 13 15:04:07 crc kubenswrapper[4898]: I0313 15:04:07.761355 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2033726f-d64f-4989-8837-cec9738c8491" path="/var/lib/kubelet/pods/2033726f-d64f-4989-8837-cec9738c8491/volumes" Mar 13 15:04:19 crc kubenswrapper[4898]: I0313 15:04:19.134676 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 13 15:04:19 crc kubenswrapper[4898]: I0313 15:04:19.135320 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 15:04:42 crc kubenswrapper[4898]: I0313 15:04:42.362490 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8hn92"] Mar 13 15:04:42 crc kubenswrapper[4898]: E0313 15:04:42.363463 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a124d846-aa21-4e8a-bb0e-57cc2aa7a3e0" containerName="oc" Mar 13 15:04:42 crc kubenswrapper[4898]: I0313 15:04:42.363479 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="a124d846-aa21-4e8a-bb0e-57cc2aa7a3e0" containerName="oc" Mar 13 15:04:42 crc kubenswrapper[4898]: I0313 15:04:42.363777 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="a124d846-aa21-4e8a-bb0e-57cc2aa7a3e0" containerName="oc" Mar 13 15:04:42 crc kubenswrapper[4898]: I0313 15:04:42.365799 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8hn92" Mar 13 15:04:42 crc kubenswrapper[4898]: I0313 15:04:42.377630 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8hn92"] Mar 13 15:04:42 crc kubenswrapper[4898]: I0313 15:04:42.478104 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqlmp\" (UniqueName: \"kubernetes.io/projected/c2015d1c-2da3-472c-b07f-3544037bda7b-kube-api-access-rqlmp\") pod \"redhat-operators-8hn92\" (UID: \"c2015d1c-2da3-472c-b07f-3544037bda7b\") " pod="openshift-marketplace/redhat-operators-8hn92" Mar 13 15:04:42 crc kubenswrapper[4898]: I0313 15:04:42.478189 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2015d1c-2da3-472c-b07f-3544037bda7b-catalog-content\") pod \"redhat-operators-8hn92\" (UID: \"c2015d1c-2da3-472c-b07f-3544037bda7b\") " pod="openshift-marketplace/redhat-operators-8hn92" Mar 13 15:04:42 crc kubenswrapper[4898]: I0313 15:04:42.478209 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2015d1c-2da3-472c-b07f-3544037bda7b-utilities\") pod \"redhat-operators-8hn92\" (UID: \"c2015d1c-2da3-472c-b07f-3544037bda7b\") " pod="openshift-marketplace/redhat-operators-8hn92" Mar 13 15:04:42 crc kubenswrapper[4898]: I0313 15:04:42.579977 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqlmp\" (UniqueName: \"kubernetes.io/projected/c2015d1c-2da3-472c-b07f-3544037bda7b-kube-api-access-rqlmp\") pod \"redhat-operators-8hn92\" (UID: \"c2015d1c-2da3-472c-b07f-3544037bda7b\") " pod="openshift-marketplace/redhat-operators-8hn92" Mar 13 15:04:42 crc kubenswrapper[4898]: I0313 15:04:42.580064 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2015d1c-2da3-472c-b07f-3544037bda7b-catalog-content\") pod \"redhat-operators-8hn92\" (UID: \"c2015d1c-2da3-472c-b07f-3544037bda7b\") " pod="openshift-marketplace/redhat-operators-8hn92" Mar 13 15:04:42 crc kubenswrapper[4898]: I0313 15:04:42.580085 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2015d1c-2da3-472c-b07f-3544037bda7b-utilities\") pod \"redhat-operators-8hn92\" (UID: \"c2015d1c-2da3-472c-b07f-3544037bda7b\") " pod="openshift-marketplace/redhat-operators-8hn92" Mar 13 15:04:42 crc kubenswrapper[4898]: I0313 15:04:42.581849 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2015d1c-2da3-472c-b07f-3544037bda7b-catalog-content\") pod \"redhat-operators-8hn92\" (UID: \"c2015d1c-2da3-472c-b07f-3544037bda7b\") " pod="openshift-marketplace/redhat-operators-8hn92" Mar 13 15:04:42 crc kubenswrapper[4898]: I0313 15:04:42.581884 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2015d1c-2da3-472c-b07f-3544037bda7b-utilities\") pod \"redhat-operators-8hn92\" (UID: \"c2015d1c-2da3-472c-b07f-3544037bda7b\") " pod="openshift-marketplace/redhat-operators-8hn92" Mar 13 15:04:42 crc kubenswrapper[4898]: I0313 15:04:42.607087 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqlmp\" (UniqueName: \"kubernetes.io/projected/c2015d1c-2da3-472c-b07f-3544037bda7b-kube-api-access-rqlmp\") pod \"redhat-operators-8hn92\" (UID: \"c2015d1c-2da3-472c-b07f-3544037bda7b\") " pod="openshift-marketplace/redhat-operators-8hn92" Mar 13 15:04:42 crc kubenswrapper[4898]: I0313 15:04:42.703452 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8hn92" Mar 13 15:04:42 crc kubenswrapper[4898]: I0313 15:04:42.903940 4898 scope.go:117] "RemoveContainer" containerID="e1ed7a0b1ccbf119e01b0fbbab72ef967cfc5a8fef5c4bc80afb9d7eff1e70f1" Mar 13 15:04:43 crc kubenswrapper[4898]: I0313 15:04:43.161915 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8hn92"] Mar 13 15:04:43 crc kubenswrapper[4898]: I0313 15:04:43.497999 4898 generic.go:334] "Generic (PLEG): container finished" podID="c2015d1c-2da3-472c-b07f-3544037bda7b" containerID="db7ae25c88961957319f284a4e8d7d972e2cbf22c41613242dfe639a76a9147c" exitCode=0 Mar 13 15:04:43 crc kubenswrapper[4898]: I0313 15:04:43.498044 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8hn92" event={"ID":"c2015d1c-2da3-472c-b07f-3544037bda7b","Type":"ContainerDied","Data":"db7ae25c88961957319f284a4e8d7d972e2cbf22c41613242dfe639a76a9147c"} Mar 13 15:04:43 crc kubenswrapper[4898]: I0313 15:04:43.498076 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8hn92" event={"ID":"c2015d1c-2da3-472c-b07f-3544037bda7b","Type":"ContainerStarted","Data":"6879d80bc6035cd951f8f9969f1be4bdabbd3f8ba652e1cb44bfc180caa2b879"} Mar 13 15:04:44 crc kubenswrapper[4898]: I0313 15:04:44.509430 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8hn92" event={"ID":"c2015d1c-2da3-472c-b07f-3544037bda7b","Type":"ContainerStarted","Data":"7da033f322da5fa1fa86a5370629e3354569e0540a2c73e0b1b083de2cd620cf"} Mar 13 15:04:49 crc kubenswrapper[4898]: I0313 15:04:49.135399 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 
13 15:04:49 crc kubenswrapper[4898]: I0313 15:04:49.135992 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 15:04:50 crc kubenswrapper[4898]: I0313 15:04:50.574312 4898 generic.go:334] "Generic (PLEG): container finished" podID="c2015d1c-2da3-472c-b07f-3544037bda7b" containerID="7da033f322da5fa1fa86a5370629e3354569e0540a2c73e0b1b083de2cd620cf" exitCode=0 Mar 13 15:04:50 crc kubenswrapper[4898]: I0313 15:04:50.574442 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8hn92" event={"ID":"c2015d1c-2da3-472c-b07f-3544037bda7b","Type":"ContainerDied","Data":"7da033f322da5fa1fa86a5370629e3354569e0540a2c73e0b1b083de2cd620cf"} Mar 13 15:04:51 crc kubenswrapper[4898]: I0313 15:04:51.589615 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8hn92" event={"ID":"c2015d1c-2da3-472c-b07f-3544037bda7b","Type":"ContainerStarted","Data":"3cb68a76c0bb798950b1b0e18aeabaf2ca9ce4df1058a9db56fbaa44a5f14d1e"} Mar 13 15:04:51 crc kubenswrapper[4898]: I0313 15:04:51.627757 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8hn92" podStartSLOduration=2.017243809 podStartE2EDuration="9.627729488s" podCreationTimestamp="2026-03-13 15:04:42 +0000 UTC" firstStartedPulling="2026-03-13 15:04:43.499892184 +0000 UTC m=+4118.501480423" lastFinishedPulling="2026-03-13 15:04:51.110377863 +0000 UTC m=+4126.111966102" observedRunningTime="2026-03-13 15:04:51.609018966 +0000 UTC m=+4126.610607215" watchObservedRunningTime="2026-03-13 15:04:51.627729488 +0000 UTC m=+4126.629317747" Mar 13 15:04:52 crc kubenswrapper[4898]: I0313 15:04:52.703983 4898 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8hn92" Mar 13 15:04:52 crc kubenswrapper[4898]: I0313 15:04:52.704250 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8hn92" Mar 13 15:04:53 crc kubenswrapper[4898]: I0313 15:04:53.761794 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8hn92" podUID="c2015d1c-2da3-472c-b07f-3544037bda7b" containerName="registry-server" probeResult="failure" output=< Mar 13 15:04:53 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 15:04:53 crc kubenswrapper[4898]: > Mar 13 15:05:02 crc kubenswrapper[4898]: I0313 15:05:02.594280 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vqvc5"] Mar 13 15:05:02 crc kubenswrapper[4898]: I0313 15:05:02.597690 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vqvc5" Mar 13 15:05:02 crc kubenswrapper[4898]: I0313 15:05:02.619486 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vqvc5"] Mar 13 15:05:02 crc kubenswrapper[4898]: I0313 15:05:02.677777 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b810f672-a1b5-434f-a031-0044957eebda-utilities\") pod \"community-operators-vqvc5\" (UID: \"b810f672-a1b5-434f-a031-0044957eebda\") " pod="openshift-marketplace/community-operators-vqvc5" Mar 13 15:05:02 crc kubenswrapper[4898]: I0313 15:05:02.678220 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b810f672-a1b5-434f-a031-0044957eebda-catalog-content\") pod \"community-operators-vqvc5\" (UID: \"b810f672-a1b5-434f-a031-0044957eebda\") " pod="openshift-marketplace/community-operators-vqvc5" Mar 13 15:05:02 crc kubenswrapper[4898]: I0313 15:05:02.678799 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxv6z\" (UniqueName: \"kubernetes.io/projected/b810f672-a1b5-434f-a031-0044957eebda-kube-api-access-lxv6z\") pod \"community-operators-vqvc5\" (UID: \"b810f672-a1b5-434f-a031-0044957eebda\") " pod="openshift-marketplace/community-operators-vqvc5" Mar 13 15:05:02 crc kubenswrapper[4898]: I0313 15:05:02.759199 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8hn92" Mar 13 15:05:02 crc kubenswrapper[4898]: I0313 15:05:02.788985 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b810f672-a1b5-434f-a031-0044957eebda-catalog-content\") pod \"community-operators-vqvc5\" (UID: 
\"b810f672-a1b5-434f-a031-0044957eebda\") " pod="openshift-marketplace/community-operators-vqvc5" Mar 13 15:05:02 crc kubenswrapper[4898]: I0313 15:05:02.789474 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxv6z\" (UniqueName: \"kubernetes.io/projected/b810f672-a1b5-434f-a031-0044957eebda-kube-api-access-lxv6z\") pod \"community-operators-vqvc5\" (UID: \"b810f672-a1b5-434f-a031-0044957eebda\") " pod="openshift-marketplace/community-operators-vqvc5" Mar 13 15:05:02 crc kubenswrapper[4898]: I0313 15:05:02.789867 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b810f672-a1b5-434f-a031-0044957eebda-utilities\") pod \"community-operators-vqvc5\" (UID: \"b810f672-a1b5-434f-a031-0044957eebda\") " pod="openshift-marketplace/community-operators-vqvc5" Mar 13 15:05:02 crc kubenswrapper[4898]: I0313 15:05:02.794431 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b810f672-a1b5-434f-a031-0044957eebda-catalog-content\") pod \"community-operators-vqvc5\" (UID: \"b810f672-a1b5-434f-a031-0044957eebda\") " pod="openshift-marketplace/community-operators-vqvc5" Mar 13 15:05:02 crc kubenswrapper[4898]: I0313 15:05:02.794585 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b810f672-a1b5-434f-a031-0044957eebda-utilities\") pod \"community-operators-vqvc5\" (UID: \"b810f672-a1b5-434f-a031-0044957eebda\") " pod="openshift-marketplace/community-operators-vqvc5" Mar 13 15:05:02 crc kubenswrapper[4898]: I0313 15:05:02.836333 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8hn92" Mar 13 15:05:02 crc kubenswrapper[4898]: I0313 15:05:02.849016 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-lxv6z\" (UniqueName: \"kubernetes.io/projected/b810f672-a1b5-434f-a031-0044957eebda-kube-api-access-lxv6z\") pod \"community-operators-vqvc5\" (UID: \"b810f672-a1b5-434f-a031-0044957eebda\") " pod="openshift-marketplace/community-operators-vqvc5" Mar 13 15:05:02 crc kubenswrapper[4898]: I0313 15:05:02.946210 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vqvc5" Mar 13 15:05:03 crc kubenswrapper[4898]: I0313 15:05:03.699220 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vqvc5"] Mar 13 15:05:03 crc kubenswrapper[4898]: I0313 15:05:03.724561 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vqvc5" event={"ID":"b810f672-a1b5-434f-a031-0044957eebda","Type":"ContainerStarted","Data":"736696e3e2832f751d0808f40db57e7195351948569adaeaac99ff9a6c9bc2af"} Mar 13 15:05:04 crc kubenswrapper[4898]: I0313 15:05:04.734568 4898 generic.go:334] "Generic (PLEG): container finished" podID="b810f672-a1b5-434f-a031-0044957eebda" containerID="047cf2dc69ea2109aa3c486de3d364b7f183dce862049d2fa79df29daa715c7f" exitCode=0 Mar 13 15:05:04 crc kubenswrapper[4898]: I0313 15:05:04.734740 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vqvc5" event={"ID":"b810f672-a1b5-434f-a031-0044957eebda","Type":"ContainerDied","Data":"047cf2dc69ea2109aa3c486de3d364b7f183dce862049d2fa79df29daa715c7f"} Mar 13 15:05:04 crc kubenswrapper[4898]: I0313 15:05:04.736822 4898 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 15:05:05 crc kubenswrapper[4898]: I0313 15:05:05.154931 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8hn92"] Mar 13 15:05:05 crc kubenswrapper[4898]: I0313 15:05:05.155420 4898 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-marketplace/redhat-operators-8hn92" podUID="c2015d1c-2da3-472c-b07f-3544037bda7b" containerName="registry-server" containerID="cri-o://3cb68a76c0bb798950b1b0e18aeabaf2ca9ce4df1058a9db56fbaa44a5f14d1e" gracePeriod=2 Mar 13 15:05:05 crc kubenswrapper[4898]: I0313 15:05:05.758324 4898 generic.go:334] "Generic (PLEG): container finished" podID="c2015d1c-2da3-472c-b07f-3544037bda7b" containerID="3cb68a76c0bb798950b1b0e18aeabaf2ca9ce4df1058a9db56fbaa44a5f14d1e" exitCode=0 Mar 13 15:05:05 crc kubenswrapper[4898]: I0313 15:05:05.760563 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8hn92" event={"ID":"c2015d1c-2da3-472c-b07f-3544037bda7b","Type":"ContainerDied","Data":"3cb68a76c0bb798950b1b0e18aeabaf2ca9ce4df1058a9db56fbaa44a5f14d1e"} Mar 13 15:05:06 crc kubenswrapper[4898]: I0313 15:05:06.383419 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8hn92" Mar 13 15:05:06 crc kubenswrapper[4898]: I0313 15:05:06.478956 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqlmp\" (UniqueName: \"kubernetes.io/projected/c2015d1c-2da3-472c-b07f-3544037bda7b-kube-api-access-rqlmp\") pod \"c2015d1c-2da3-472c-b07f-3544037bda7b\" (UID: \"c2015d1c-2da3-472c-b07f-3544037bda7b\") " Mar 13 15:05:06 crc kubenswrapper[4898]: I0313 15:05:06.479059 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2015d1c-2da3-472c-b07f-3544037bda7b-utilities\") pod \"c2015d1c-2da3-472c-b07f-3544037bda7b\" (UID: \"c2015d1c-2da3-472c-b07f-3544037bda7b\") " Mar 13 15:05:06 crc kubenswrapper[4898]: I0313 15:05:06.479122 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2015d1c-2da3-472c-b07f-3544037bda7b-catalog-content\") pod 
\"c2015d1c-2da3-472c-b07f-3544037bda7b\" (UID: \"c2015d1c-2da3-472c-b07f-3544037bda7b\") " Mar 13 15:05:06 crc kubenswrapper[4898]: I0313 15:05:06.480937 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2015d1c-2da3-472c-b07f-3544037bda7b-utilities" (OuterVolumeSpecName: "utilities") pod "c2015d1c-2da3-472c-b07f-3544037bda7b" (UID: "c2015d1c-2da3-472c-b07f-3544037bda7b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:05:06 crc kubenswrapper[4898]: I0313 15:05:06.485851 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2015d1c-2da3-472c-b07f-3544037bda7b-kube-api-access-rqlmp" (OuterVolumeSpecName: "kube-api-access-rqlmp") pod "c2015d1c-2da3-472c-b07f-3544037bda7b" (UID: "c2015d1c-2da3-472c-b07f-3544037bda7b"). InnerVolumeSpecName "kube-api-access-rqlmp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:05:06 crc kubenswrapper[4898]: I0313 15:05:06.581168 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqlmp\" (UniqueName: \"kubernetes.io/projected/c2015d1c-2da3-472c-b07f-3544037bda7b-kube-api-access-rqlmp\") on node \"crc\" DevicePath \"\"" Mar 13 15:05:06 crc kubenswrapper[4898]: I0313 15:05:06.581421 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2015d1c-2da3-472c-b07f-3544037bda7b-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 15:05:06 crc kubenswrapper[4898]: I0313 15:05:06.620092 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2015d1c-2da3-472c-b07f-3544037bda7b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c2015d1c-2da3-472c-b07f-3544037bda7b" (UID: "c2015d1c-2da3-472c-b07f-3544037bda7b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:05:06 crc kubenswrapper[4898]: I0313 15:05:06.685009 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2015d1c-2da3-472c-b07f-3544037bda7b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 15:05:06 crc kubenswrapper[4898]: I0313 15:05:06.772073 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8hn92" event={"ID":"c2015d1c-2da3-472c-b07f-3544037bda7b","Type":"ContainerDied","Data":"6879d80bc6035cd951f8f9969f1be4bdabbd3f8ba652e1cb44bfc180caa2b879"} Mar 13 15:05:06 crc kubenswrapper[4898]: I0313 15:05:06.772131 4898 scope.go:117] "RemoveContainer" containerID="3cb68a76c0bb798950b1b0e18aeabaf2ca9ce4df1058a9db56fbaa44a5f14d1e" Mar 13 15:05:06 crc kubenswrapper[4898]: I0313 15:05:06.772290 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8hn92" Mar 13 15:05:06 crc kubenswrapper[4898]: I0313 15:05:06.830062 4898 scope.go:117] "RemoveContainer" containerID="7da033f322da5fa1fa86a5370629e3354569e0540a2c73e0b1b083de2cd620cf" Mar 13 15:05:06 crc kubenswrapper[4898]: I0313 15:05:06.843840 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8hn92"] Mar 13 15:05:06 crc kubenswrapper[4898]: I0313 15:05:06.858876 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8hn92"] Mar 13 15:05:06 crc kubenswrapper[4898]: I0313 15:05:06.937070 4898 scope.go:117] "RemoveContainer" containerID="db7ae25c88961957319f284a4e8d7d972e2cbf22c41613242dfe639a76a9147c" Mar 13 15:05:07 crc kubenswrapper[4898]: I0313 15:05:07.763451 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2015d1c-2da3-472c-b07f-3544037bda7b" path="/var/lib/kubelet/pods/c2015d1c-2da3-472c-b07f-3544037bda7b/volumes" Mar 13 15:05:08 crc 
kubenswrapper[4898]: I0313 15:05:08.809250 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vqvc5" event={"ID":"b810f672-a1b5-434f-a031-0044957eebda","Type":"ContainerStarted","Data":"fc635bce94b6959b5176764a3e3eed24166f3f7903704ae4383b8bd39ab0ccdc"} Mar 13 15:05:11 crc kubenswrapper[4898]: I0313 15:05:11.842999 4898 generic.go:334] "Generic (PLEG): container finished" podID="b810f672-a1b5-434f-a031-0044957eebda" containerID="fc635bce94b6959b5176764a3e3eed24166f3f7903704ae4383b8bd39ab0ccdc" exitCode=0 Mar 13 15:05:11 crc kubenswrapper[4898]: I0313 15:05:11.843605 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vqvc5" event={"ID":"b810f672-a1b5-434f-a031-0044957eebda","Type":"ContainerDied","Data":"fc635bce94b6959b5176764a3e3eed24166f3f7903704ae4383b8bd39ab0ccdc"} Mar 13 15:05:13 crc kubenswrapper[4898]: I0313 15:05:13.866439 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vqvc5" event={"ID":"b810f672-a1b5-434f-a031-0044957eebda","Type":"ContainerStarted","Data":"fa0139e782cd3c4bbf091849cf8ad406e30cf1ffee2cf32b66a02f6f8c182df0"} Mar 13 15:05:13 crc kubenswrapper[4898]: I0313 15:05:13.890217 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vqvc5" podStartSLOduration=4.073507514 podStartE2EDuration="11.890189844s" podCreationTimestamp="2026-03-13 15:05:02 +0000 UTC" firstStartedPulling="2026-03-13 15:05:04.736502268 +0000 UTC m=+4139.738090517" lastFinishedPulling="2026-03-13 15:05:12.553184598 +0000 UTC m=+4147.554772847" observedRunningTime="2026-03-13 15:05:13.885844817 +0000 UTC m=+4148.887433096" watchObservedRunningTime="2026-03-13 15:05:13.890189844 +0000 UTC m=+4148.891778113" Mar 13 15:05:19 crc kubenswrapper[4898]: I0313 15:05:19.134060 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 15:05:19 crc kubenswrapper[4898]: I0313 15:05:19.134735 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 15:05:19 crc kubenswrapper[4898]: I0313 15:05:19.134802 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj"
Mar 13 15:05:19 crc kubenswrapper[4898]: I0313 15:05:19.136195 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"45a214eae7c7c7859e650d708f47b93bb385cc743f9e3e627bde30ab582e5ff5"} pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 13 15:05:19 crc kubenswrapper[4898]: I0313 15:05:19.136306 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" containerID="cri-o://45a214eae7c7c7859e650d708f47b93bb385cc743f9e3e627bde30ab582e5ff5" gracePeriod=600
Mar 13 15:05:20 crc kubenswrapper[4898]: I0313 15:05:20.959295 4898 generic.go:334] "Generic (PLEG): container finished" podID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerID="45a214eae7c7c7859e650d708f47b93bb385cc743f9e3e627bde30ab582e5ff5" exitCode=0
Mar 13 15:05:20 crc kubenswrapper[4898]: I0313 15:05:20.959363 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerDied","Data":"45a214eae7c7c7859e650d708f47b93bb385cc743f9e3e627bde30ab582e5ff5"}
Mar 13 15:05:20 crc kubenswrapper[4898]: I0313 15:05:20.959757 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerStarted","Data":"23592f71b1b6a0588b20dac6d06bd2510a7ea0a477247e02fd4eeae75f83ee8a"}
Mar 13 15:05:20 crc kubenswrapper[4898]: I0313 15:05:20.959781 4898 scope.go:117] "RemoveContainer" containerID="e2755f8bb613c4ca067aa4bb55241b92843fd6eb20b3661f1bc3e09f0a4a33da"
Mar 13 15:05:22 crc kubenswrapper[4898]: I0313 15:05:22.947112 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vqvc5"
Mar 13 15:05:22 crc kubenswrapper[4898]: I0313 15:05:22.947671 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vqvc5"
Mar 13 15:05:23 crc kubenswrapper[4898]: I0313 15:05:23.391372 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vqvc5"
Mar 13 15:05:23 crc kubenswrapper[4898]: I0313 15:05:23.475505 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vqvc5"
Mar 13 15:05:23 crc kubenswrapper[4898]: I0313 15:05:23.657364 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vqvc5"]
Mar 13 15:05:25 crc kubenswrapper[4898]: I0313 15:05:25.008245 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vqvc5" podUID="b810f672-a1b5-434f-a031-0044957eebda" containerName="registry-server" containerID="cri-o://fa0139e782cd3c4bbf091849cf8ad406e30cf1ffee2cf32b66a02f6f8c182df0" gracePeriod=2
Mar 13 15:05:26 crc kubenswrapper[4898]: I0313 15:05:26.031450 4898 generic.go:334] "Generic (PLEG): container finished" podID="b810f672-a1b5-434f-a031-0044957eebda" containerID="fa0139e782cd3c4bbf091849cf8ad406e30cf1ffee2cf32b66a02f6f8c182df0" exitCode=0
Mar 13 15:05:26 crc kubenswrapper[4898]: I0313 15:05:26.031546 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vqvc5" event={"ID":"b810f672-a1b5-434f-a031-0044957eebda","Type":"ContainerDied","Data":"fa0139e782cd3c4bbf091849cf8ad406e30cf1ffee2cf32b66a02f6f8c182df0"}
Mar 13 15:05:26 crc kubenswrapper[4898]: I0313 15:05:26.321421 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vqvc5"
Mar 13 15:05:26 crc kubenswrapper[4898]: I0313 15:05:26.439577 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b810f672-a1b5-434f-a031-0044957eebda-utilities\") pod \"b810f672-a1b5-434f-a031-0044957eebda\" (UID: \"b810f672-a1b5-434f-a031-0044957eebda\") "
Mar 13 15:05:26 crc kubenswrapper[4898]: I0313 15:05:26.440061 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b810f672-a1b5-434f-a031-0044957eebda-catalog-content\") pod \"b810f672-a1b5-434f-a031-0044957eebda\" (UID: \"b810f672-a1b5-434f-a031-0044957eebda\") "
Mar 13 15:05:26 crc kubenswrapper[4898]: I0313 15:05:26.440095 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxv6z\" (UniqueName: \"kubernetes.io/projected/b810f672-a1b5-434f-a031-0044957eebda-kube-api-access-lxv6z\") pod \"b810f672-a1b5-434f-a031-0044957eebda\" (UID: \"b810f672-a1b5-434f-a031-0044957eebda\") "
Mar 13 15:05:26 crc kubenswrapper[4898]: I0313 15:05:26.440439 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b810f672-a1b5-434f-a031-0044957eebda-utilities" (OuterVolumeSpecName: "utilities") pod "b810f672-a1b5-434f-a031-0044957eebda" (UID: "b810f672-a1b5-434f-a031-0044957eebda"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 15:05:26 crc kubenswrapper[4898]: I0313 15:05:26.442201 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b810f672-a1b5-434f-a031-0044957eebda-utilities\") on node \"crc\" DevicePath \"\""
Mar 13 15:05:26 crc kubenswrapper[4898]: I0313 15:05:26.447439 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b810f672-a1b5-434f-a031-0044957eebda-kube-api-access-lxv6z" (OuterVolumeSpecName: "kube-api-access-lxv6z") pod "b810f672-a1b5-434f-a031-0044957eebda" (UID: "b810f672-a1b5-434f-a031-0044957eebda"). InnerVolumeSpecName "kube-api-access-lxv6z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 15:05:26 crc kubenswrapper[4898]: I0313 15:05:26.504227 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b810f672-a1b5-434f-a031-0044957eebda-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b810f672-a1b5-434f-a031-0044957eebda" (UID: "b810f672-a1b5-434f-a031-0044957eebda"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 15:05:26 crc kubenswrapper[4898]: I0313 15:05:26.543812 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b810f672-a1b5-434f-a031-0044957eebda-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 13 15:05:26 crc kubenswrapper[4898]: I0313 15:05:26.543851 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxv6z\" (UniqueName: \"kubernetes.io/projected/b810f672-a1b5-434f-a031-0044957eebda-kube-api-access-lxv6z\") on node \"crc\" DevicePath \"\""
Mar 13 15:05:27 crc kubenswrapper[4898]: I0313 15:05:27.048336 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vqvc5" event={"ID":"b810f672-a1b5-434f-a031-0044957eebda","Type":"ContainerDied","Data":"736696e3e2832f751d0808f40db57e7195351948569adaeaac99ff9a6c9bc2af"}
Mar 13 15:05:27 crc kubenswrapper[4898]: I0313 15:05:27.048409 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vqvc5"
Mar 13 15:05:27 crc kubenswrapper[4898]: I0313 15:05:27.048677 4898 scope.go:117] "RemoveContainer" containerID="fa0139e782cd3c4bbf091849cf8ad406e30cf1ffee2cf32b66a02f6f8c182df0"
Mar 13 15:05:27 crc kubenswrapper[4898]: I0313 15:05:27.079054 4898 scope.go:117] "RemoveContainer" containerID="fc635bce94b6959b5176764a3e3eed24166f3f7903704ae4383b8bd39ab0ccdc"
Mar 13 15:05:27 crc kubenswrapper[4898]: I0313 15:05:27.112616 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vqvc5"]
Mar 13 15:05:27 crc kubenswrapper[4898]: I0313 15:05:27.121295 4898 scope.go:117] "RemoveContainer" containerID="047cf2dc69ea2109aa3c486de3d364b7f183dce862049d2fa79df29daa715c7f"
Mar 13 15:05:27 crc kubenswrapper[4898]: I0313 15:05:27.141009 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vqvc5"]
Mar 13 15:05:27 crc kubenswrapper[4898]: I0313 15:05:27.760308 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b810f672-a1b5-434f-a031-0044957eebda" path="/var/lib/kubelet/pods/b810f672-a1b5-434f-a031-0044957eebda/volumes"
Mar 13 15:06:00 crc kubenswrapper[4898]: I0313 15:06:00.162270 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556906-m4tq8"]
Mar 13 15:06:00 crc kubenswrapper[4898]: E0313 15:06:00.163646 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2015d1c-2da3-472c-b07f-3544037bda7b" containerName="extract-utilities"
Mar 13 15:06:00 crc kubenswrapper[4898]: I0313 15:06:00.163670 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2015d1c-2da3-472c-b07f-3544037bda7b" containerName="extract-utilities"
Mar 13 15:06:00 crc kubenswrapper[4898]: E0313 15:06:00.163720 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b810f672-a1b5-434f-a031-0044957eebda" containerName="extract-utilities"
Mar 13 15:06:00 crc kubenswrapper[4898]: I0313 15:06:00.163733 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="b810f672-a1b5-434f-a031-0044957eebda" containerName="extract-utilities"
Mar 13 15:06:00 crc kubenswrapper[4898]: E0313 15:06:00.163759 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b810f672-a1b5-434f-a031-0044957eebda" containerName="extract-content"
Mar 13 15:06:00 crc kubenswrapper[4898]: I0313 15:06:00.163769 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="b810f672-a1b5-434f-a031-0044957eebda" containerName="extract-content"
Mar 13 15:06:00 crc kubenswrapper[4898]: E0313 15:06:00.163801 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2015d1c-2da3-472c-b07f-3544037bda7b" containerName="extract-content"
Mar 13 15:06:00 crc kubenswrapper[4898]: I0313 15:06:00.163811 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2015d1c-2da3-472c-b07f-3544037bda7b" containerName="extract-content"
Mar 13 15:06:00 crc kubenswrapper[4898]: E0313 15:06:00.163834 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2015d1c-2da3-472c-b07f-3544037bda7b" containerName="registry-server"
Mar 13 15:06:00 crc kubenswrapper[4898]: I0313 15:06:00.163844 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2015d1c-2da3-472c-b07f-3544037bda7b" containerName="registry-server"
Mar 13 15:06:00 crc kubenswrapper[4898]: E0313 15:06:00.163859 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b810f672-a1b5-434f-a031-0044957eebda" containerName="registry-server"
Mar 13 15:06:00 crc kubenswrapper[4898]: I0313 15:06:00.163869 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="b810f672-a1b5-434f-a031-0044957eebda" containerName="registry-server"
Mar 13 15:06:00 crc kubenswrapper[4898]: I0313 15:06:00.164287 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="b810f672-a1b5-434f-a031-0044957eebda" containerName="registry-server"
Mar 13 15:06:00 crc kubenswrapper[4898]: I0313 15:06:00.164310 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2015d1c-2da3-472c-b07f-3544037bda7b" containerName="registry-server"
Mar 13 15:06:00 crc kubenswrapper[4898]: I0313 15:06:00.165654 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556906-m4tq8"
Mar 13 15:06:00 crc kubenswrapper[4898]: I0313 15:06:00.169242 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps"
Mar 13 15:06:00 crc kubenswrapper[4898]: I0313 15:06:00.172617 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 13 15:06:00 crc kubenswrapper[4898]: I0313 15:06:00.172686 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 13 15:06:00 crc kubenswrapper[4898]: I0313 15:06:00.181145 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556906-m4tq8"]
Mar 13 15:06:00 crc kubenswrapper[4898]: I0313 15:06:00.283524 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wv6zm\" (UniqueName: \"kubernetes.io/projected/3ee94077-8dd9-4144-bab5-2abd9744fa01-kube-api-access-wv6zm\") pod \"auto-csr-approver-29556906-m4tq8\" (UID: \"3ee94077-8dd9-4144-bab5-2abd9744fa01\") " pod="openshift-infra/auto-csr-approver-29556906-m4tq8"
Mar 13 15:06:00 crc kubenswrapper[4898]: I0313 15:06:00.385683 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wv6zm\" (UniqueName: \"kubernetes.io/projected/3ee94077-8dd9-4144-bab5-2abd9744fa01-kube-api-access-wv6zm\") pod \"auto-csr-approver-29556906-m4tq8\" (UID: \"3ee94077-8dd9-4144-bab5-2abd9744fa01\") " pod="openshift-infra/auto-csr-approver-29556906-m4tq8"
Mar 13 15:06:01 crc kubenswrapper[4898]: I0313 15:06:01.031803 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wv6zm\" (UniqueName: \"kubernetes.io/projected/3ee94077-8dd9-4144-bab5-2abd9744fa01-kube-api-access-wv6zm\") pod \"auto-csr-approver-29556906-m4tq8\" (UID: \"3ee94077-8dd9-4144-bab5-2abd9744fa01\") " pod="openshift-infra/auto-csr-approver-29556906-m4tq8"
Mar 13 15:06:01 crc kubenswrapper[4898]: I0313 15:06:01.097597 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556906-m4tq8"
Mar 13 15:06:01 crc kubenswrapper[4898]: I0313 15:06:01.656517 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556906-m4tq8"]
Mar 13 15:06:02 crc kubenswrapper[4898]: I0313 15:06:02.506132 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556906-m4tq8" event={"ID":"3ee94077-8dd9-4144-bab5-2abd9744fa01","Type":"ContainerStarted","Data":"e8a5dc0ee47b3e7543d44305fe662f785fbd35a6c3446f7da2d5bd45d5d5b1a4"}
Mar 13 15:06:03 crc kubenswrapper[4898]: I0313 15:06:03.518486 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556906-m4tq8" event={"ID":"3ee94077-8dd9-4144-bab5-2abd9744fa01","Type":"ContainerStarted","Data":"64c72ba59c55becc3958f8ce26ee068bc7e3f6141dbbe5fe4984f9a120884dca"}
Mar 13 15:06:03 crc kubenswrapper[4898]: I0313 15:06:03.548128 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556906-m4tq8" podStartSLOduration=2.653404021 podStartE2EDuration="3.548105435s" podCreationTimestamp="2026-03-13 15:06:00 +0000 UTC" firstStartedPulling="2026-03-13 15:06:01.676047617 +0000 UTC m=+4196.677635856" lastFinishedPulling="2026-03-13 15:06:02.570749031 +0000 UTC m=+4197.572337270" observedRunningTime="2026-03-13 15:06:03.535007802 +0000 UTC m=+4198.536596061" watchObservedRunningTime="2026-03-13 15:06:03.548105435 +0000 UTC m=+4198.549693684"
Mar 13 15:06:04 crc kubenswrapper[4898]: I0313 15:06:04.533262 4898 generic.go:334] "Generic (PLEG): container finished" podID="3ee94077-8dd9-4144-bab5-2abd9744fa01" containerID="64c72ba59c55becc3958f8ce26ee068bc7e3f6141dbbe5fe4984f9a120884dca" exitCode=0
Mar 13 15:06:04 crc kubenswrapper[4898]: I0313 15:06:04.533350 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556906-m4tq8" event={"ID":"3ee94077-8dd9-4144-bab5-2abd9744fa01","Type":"ContainerDied","Data":"64c72ba59c55becc3958f8ce26ee068bc7e3f6141dbbe5fe4984f9a120884dca"}
Mar 13 15:06:05 crc kubenswrapper[4898]: I0313 15:06:05.963830 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556906-m4tq8"
Mar 13 15:06:06 crc kubenswrapper[4898]: I0313 15:06:06.018050 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wv6zm\" (UniqueName: \"kubernetes.io/projected/3ee94077-8dd9-4144-bab5-2abd9744fa01-kube-api-access-wv6zm\") pod \"3ee94077-8dd9-4144-bab5-2abd9744fa01\" (UID: \"3ee94077-8dd9-4144-bab5-2abd9744fa01\") "
Mar 13 15:06:06 crc kubenswrapper[4898]: I0313 15:06:06.023453 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ee94077-8dd9-4144-bab5-2abd9744fa01-kube-api-access-wv6zm" (OuterVolumeSpecName: "kube-api-access-wv6zm") pod "3ee94077-8dd9-4144-bab5-2abd9744fa01" (UID: "3ee94077-8dd9-4144-bab5-2abd9744fa01"). InnerVolumeSpecName "kube-api-access-wv6zm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 15:06:06 crc kubenswrapper[4898]: I0313 15:06:06.120662 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wv6zm\" (UniqueName: \"kubernetes.io/projected/3ee94077-8dd9-4144-bab5-2abd9744fa01-kube-api-access-wv6zm\") on node \"crc\" DevicePath \"\""
Mar 13 15:06:06 crc kubenswrapper[4898]: I0313 15:06:06.559225 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556906-m4tq8" event={"ID":"3ee94077-8dd9-4144-bab5-2abd9744fa01","Type":"ContainerDied","Data":"e8a5dc0ee47b3e7543d44305fe662f785fbd35a6c3446f7da2d5bd45d5d5b1a4"}
Mar 13 15:06:06 crc kubenswrapper[4898]: I0313 15:06:06.559579 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e8a5dc0ee47b3e7543d44305fe662f785fbd35a6c3446f7da2d5bd45d5d5b1a4"
Mar 13 15:06:06 crc kubenswrapper[4898]: I0313 15:06:06.559284 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556906-m4tq8"
Mar 13 15:06:06 crc kubenswrapper[4898]: I0313 15:06:06.617747 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556900-qqdxv"]
Mar 13 15:06:06 crc kubenswrapper[4898]: I0313 15:06:06.632215 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556900-qqdxv"]
Mar 13 15:06:07 crc kubenswrapper[4898]: I0313 15:06:07.756115 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0efa686-df70-493a-92dc-90db2ee67205" path="/var/lib/kubelet/pods/b0efa686-df70-493a-92dc-90db2ee67205/volumes"
Mar 13 15:06:26 crc kubenswrapper[4898]: E0313 15:06:26.480934 4898 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.201:59418->38.102.83.201:43395: write tcp 38.102.83.201:59418->38.102.83.201:43395: write: connection reset by peer
Mar 13 15:06:43 crc kubenswrapper[4898]: I0313 15:06:43.114369 4898 scope.go:117] "RemoveContainer" containerID="cc698733d50d55655a41f78a9335173ab8803d13e8dc8d8d6d62ee958bbac18b"
Mar 13 15:07:03 crc kubenswrapper[4898]: E0313 15:07:03.315837 4898 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.201:49748->38.102.83.201:43395: write tcp 38.102.83.201:49748->38.102.83.201:43395: write: broken pipe
Mar 13 15:07:49 crc kubenswrapper[4898]: I0313 15:07:49.134304 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 15:07:49 crc kubenswrapper[4898]: I0313 15:07:49.135112 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 15:08:00 crc kubenswrapper[4898]: I0313 15:08:00.144751 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556908-vplt7"]
Mar 13 15:08:00 crc kubenswrapper[4898]: E0313 15:08:00.146235 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ee94077-8dd9-4144-bab5-2abd9744fa01" containerName="oc"
Mar 13 15:08:00 crc kubenswrapper[4898]: I0313 15:08:00.146258 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ee94077-8dd9-4144-bab5-2abd9744fa01" containerName="oc"
Mar 13 15:08:00 crc kubenswrapper[4898]: I0313 15:08:00.146716 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ee94077-8dd9-4144-bab5-2abd9744fa01" containerName="oc"
Mar 13 15:08:00 crc kubenswrapper[4898]: I0313 15:08:00.148068 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556908-vplt7"
Mar 13 15:08:00 crc kubenswrapper[4898]: I0313 15:08:00.151073 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 13 15:08:00 crc kubenswrapper[4898]: I0313 15:08:00.151076 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 13 15:08:00 crc kubenswrapper[4898]: I0313 15:08:00.151085 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps"
Mar 13 15:08:00 crc kubenswrapper[4898]: I0313 15:08:00.155832 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556908-vplt7"]
Mar 13 15:08:00 crc kubenswrapper[4898]: I0313 15:08:00.305386 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4p8sw\" (UniqueName: \"kubernetes.io/projected/df9e42bc-c4a2-4ccc-ad85-5ca077abfd88-kube-api-access-4p8sw\") pod \"auto-csr-approver-29556908-vplt7\" (UID: \"df9e42bc-c4a2-4ccc-ad85-5ca077abfd88\") " pod="openshift-infra/auto-csr-approver-29556908-vplt7"
Mar 13 15:08:00 crc kubenswrapper[4898]: I0313 15:08:00.407320 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4p8sw\" (UniqueName: \"kubernetes.io/projected/df9e42bc-c4a2-4ccc-ad85-5ca077abfd88-kube-api-access-4p8sw\") pod \"auto-csr-approver-29556908-vplt7\" (UID: \"df9e42bc-c4a2-4ccc-ad85-5ca077abfd88\") " pod="openshift-infra/auto-csr-approver-29556908-vplt7"
Mar 13 15:08:00 crc kubenswrapper[4898]: I0313 15:08:00.436389 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4p8sw\" (UniqueName: \"kubernetes.io/projected/df9e42bc-c4a2-4ccc-ad85-5ca077abfd88-kube-api-access-4p8sw\") pod \"auto-csr-approver-29556908-vplt7\" (UID: \"df9e42bc-c4a2-4ccc-ad85-5ca077abfd88\") " pod="openshift-infra/auto-csr-approver-29556908-vplt7"
Mar 13 15:08:00 crc kubenswrapper[4898]: I0313 15:08:00.483067 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556908-vplt7"
Mar 13 15:08:00 crc kubenswrapper[4898]: I0313 15:08:00.993911 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556908-vplt7"]
Mar 13 15:08:01 crc kubenswrapper[4898]: W0313 15:08:01.010510 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf9e42bc_c4a2_4ccc_ad85_5ca077abfd88.slice/crio-b3b0e247f2a8ff2707210a72a28118243a8690048dfda2f2b07df45c3e2a2ff4 WatchSource:0}: Error finding container b3b0e247f2a8ff2707210a72a28118243a8690048dfda2f2b07df45c3e2a2ff4: Status 404 returned error can't find the container with id b3b0e247f2a8ff2707210a72a28118243a8690048dfda2f2b07df45c3e2a2ff4
Mar 13 15:08:01 crc kubenswrapper[4898]: I0313 15:08:01.044213 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556908-vplt7" event={"ID":"df9e42bc-c4a2-4ccc-ad85-5ca077abfd88","Type":"ContainerStarted","Data":"b3b0e247f2a8ff2707210a72a28118243a8690048dfda2f2b07df45c3e2a2ff4"}
Mar 13 15:08:02 crc kubenswrapper[4898]: I0313 15:08:02.287660 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-44hx4"]
Mar 13 15:08:02 crc kubenswrapper[4898]: I0313 15:08:02.290345 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-44hx4"
Mar 13 15:08:02 crc kubenswrapper[4898]: I0313 15:08:02.311024 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-44hx4"]
Mar 13 15:08:02 crc kubenswrapper[4898]: I0313 15:08:02.459014 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4efa3f00-c382-4542-b865-48ff26f025ca-catalog-content\") pod \"certified-operators-44hx4\" (UID: \"4efa3f00-c382-4542-b865-48ff26f025ca\") " pod="openshift-marketplace/certified-operators-44hx4"
Mar 13 15:08:02 crc kubenswrapper[4898]: I0313 15:08:02.459202 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4efa3f00-c382-4542-b865-48ff26f025ca-utilities\") pod \"certified-operators-44hx4\" (UID: \"4efa3f00-c382-4542-b865-48ff26f025ca\") " pod="openshift-marketplace/certified-operators-44hx4"
Mar 13 15:08:02 crc kubenswrapper[4898]: I0313 15:08:02.459258 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjkvv\" (UniqueName: \"kubernetes.io/projected/4efa3f00-c382-4542-b865-48ff26f025ca-kube-api-access-sjkvv\") pod \"certified-operators-44hx4\" (UID: \"4efa3f00-c382-4542-b865-48ff26f025ca\") " pod="openshift-marketplace/certified-operators-44hx4"
Mar 13 15:08:02 crc kubenswrapper[4898]: I0313 15:08:02.561264 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4efa3f00-c382-4542-b865-48ff26f025ca-catalog-content\") pod \"certified-operators-44hx4\" (UID: \"4efa3f00-c382-4542-b865-48ff26f025ca\") " pod="openshift-marketplace/certified-operators-44hx4"
Mar 13 15:08:02 crc kubenswrapper[4898]: I0313 15:08:02.561348 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4efa3f00-c382-4542-b865-48ff26f025ca-utilities\") pod \"certified-operators-44hx4\" (UID: \"4efa3f00-c382-4542-b865-48ff26f025ca\") " pod="openshift-marketplace/certified-operators-44hx4"
Mar 13 15:08:02 crc kubenswrapper[4898]: I0313 15:08:02.561401 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjkvv\" (UniqueName: \"kubernetes.io/projected/4efa3f00-c382-4542-b865-48ff26f025ca-kube-api-access-sjkvv\") pod \"certified-operators-44hx4\" (UID: \"4efa3f00-c382-4542-b865-48ff26f025ca\") " pod="openshift-marketplace/certified-operators-44hx4"
Mar 13 15:08:02 crc kubenswrapper[4898]: I0313 15:08:02.564128 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4efa3f00-c382-4542-b865-48ff26f025ca-catalog-content\") pod \"certified-operators-44hx4\" (UID: \"4efa3f00-c382-4542-b865-48ff26f025ca\") " pod="openshift-marketplace/certified-operators-44hx4"
Mar 13 15:08:02 crc kubenswrapper[4898]: I0313 15:08:02.564528 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4efa3f00-c382-4542-b865-48ff26f025ca-utilities\") pod \"certified-operators-44hx4\" (UID: \"4efa3f00-c382-4542-b865-48ff26f025ca\") " pod="openshift-marketplace/certified-operators-44hx4"
Mar 13 15:08:02 crc kubenswrapper[4898]: I0313 15:08:02.584806 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjkvv\" (UniqueName: \"kubernetes.io/projected/4efa3f00-c382-4542-b865-48ff26f025ca-kube-api-access-sjkvv\") pod \"certified-operators-44hx4\" (UID: \"4efa3f00-c382-4542-b865-48ff26f025ca\") " pod="openshift-marketplace/certified-operators-44hx4"
Mar 13 15:08:02 crc kubenswrapper[4898]: I0313 15:08:02.614684 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-44hx4"
Mar 13 15:08:03 crc kubenswrapper[4898]: I0313 15:08:03.174566 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-44hx4"]
Mar 13 15:08:04 crc kubenswrapper[4898]: I0313 15:08:04.084647 4898 generic.go:334] "Generic (PLEG): container finished" podID="4efa3f00-c382-4542-b865-48ff26f025ca" containerID="03c63c680d08b74688a5886d67a5bd28ade094556ecdf764bb1054af23202d0e" exitCode=0
Mar 13 15:08:04 crc kubenswrapper[4898]: I0313 15:08:04.084709 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-44hx4" event={"ID":"4efa3f00-c382-4542-b865-48ff26f025ca","Type":"ContainerDied","Data":"03c63c680d08b74688a5886d67a5bd28ade094556ecdf764bb1054af23202d0e"}
Mar 13 15:08:04 crc kubenswrapper[4898]: I0313 15:08:04.084991 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-44hx4" event={"ID":"4efa3f00-c382-4542-b865-48ff26f025ca","Type":"ContainerStarted","Data":"5640eeafae0e26e70bf1dca59cc0e1213919ac82acbc06ec1ff1facea138314b"}
Mar 13 15:08:04 crc kubenswrapper[4898]: I0313 15:08:04.089246 4898 generic.go:334] "Generic (PLEG): container finished" podID="df9e42bc-c4a2-4ccc-ad85-5ca077abfd88" containerID="020c072c01677481578a21e99a6c39f8522847520765e8695da295955dd3e290" exitCode=0
Mar 13 15:08:04 crc kubenswrapper[4898]: I0313 15:08:04.089320 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556908-vplt7" event={"ID":"df9e42bc-c4a2-4ccc-ad85-5ca077abfd88","Type":"ContainerDied","Data":"020c072c01677481578a21e99a6c39f8522847520765e8695da295955dd3e290"}
Mar 13 15:08:05 crc kubenswrapper[4898]: I0313 15:08:05.103618 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-44hx4" event={"ID":"4efa3f00-c382-4542-b865-48ff26f025ca","Type":"ContainerStarted","Data":"ee44e53cd73fb17cb1edb64dad8d39faf007a6f2df230042e5915e9f66b123b3"}
Mar 13 15:08:05 crc kubenswrapper[4898]: I0313 15:08:05.517020 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556908-vplt7"
Mar 13 15:08:05 crc kubenswrapper[4898]: I0313 15:08:05.707893 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4p8sw\" (UniqueName: \"kubernetes.io/projected/df9e42bc-c4a2-4ccc-ad85-5ca077abfd88-kube-api-access-4p8sw\") pod \"df9e42bc-c4a2-4ccc-ad85-5ca077abfd88\" (UID: \"df9e42bc-c4a2-4ccc-ad85-5ca077abfd88\") "
Mar 13 15:08:05 crc kubenswrapper[4898]: I0313 15:08:05.728762 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df9e42bc-c4a2-4ccc-ad85-5ca077abfd88-kube-api-access-4p8sw" (OuterVolumeSpecName: "kube-api-access-4p8sw") pod "df9e42bc-c4a2-4ccc-ad85-5ca077abfd88" (UID: "df9e42bc-c4a2-4ccc-ad85-5ca077abfd88"). InnerVolumeSpecName "kube-api-access-4p8sw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 15:08:05 crc kubenswrapper[4898]: I0313 15:08:05.812698 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4p8sw\" (UniqueName: \"kubernetes.io/projected/df9e42bc-c4a2-4ccc-ad85-5ca077abfd88-kube-api-access-4p8sw\") on node \"crc\" DevicePath \"\""
Mar 13 15:08:06 crc kubenswrapper[4898]: I0313 15:08:06.120313 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556908-vplt7" event={"ID":"df9e42bc-c4a2-4ccc-ad85-5ca077abfd88","Type":"ContainerDied","Data":"b3b0e247f2a8ff2707210a72a28118243a8690048dfda2f2b07df45c3e2a2ff4"}
Mar 13 15:08:06 crc kubenswrapper[4898]: I0313 15:08:06.120600 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3b0e247f2a8ff2707210a72a28118243a8690048dfda2f2b07df45c3e2a2ff4"
Mar 13 15:08:06 crc kubenswrapper[4898]: I0313 15:08:06.120365 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556908-vplt7"
Mar 13 15:08:06 crc kubenswrapper[4898]: I0313 15:08:06.628855 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556902-s4cg2"]
Mar 13 15:08:06 crc kubenswrapper[4898]: I0313 15:08:06.638619 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556902-s4cg2"]
Mar 13 15:08:07 crc kubenswrapper[4898]: I0313 15:08:07.136346 4898 generic.go:334] "Generic (PLEG): container finished" podID="4efa3f00-c382-4542-b865-48ff26f025ca" containerID="ee44e53cd73fb17cb1edb64dad8d39faf007a6f2df230042e5915e9f66b123b3" exitCode=0
Mar 13 15:08:07 crc kubenswrapper[4898]: I0313 15:08:07.136653 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-44hx4" event={"ID":"4efa3f00-c382-4542-b865-48ff26f025ca","Type":"ContainerDied","Data":"ee44e53cd73fb17cb1edb64dad8d39faf007a6f2df230042e5915e9f66b123b3"}
Mar 13 15:08:07 crc kubenswrapper[4898]: I0313 15:08:07.766094 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d9d3379-4b7b-4263-aec1-10c06dc087e6" path="/var/lib/kubelet/pods/6d9d3379-4b7b-4263-aec1-10c06dc087e6/volumes"
Mar 13 15:08:08 crc kubenswrapper[4898]: I0313 15:08:08.162100 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-44hx4" event={"ID":"4efa3f00-c382-4542-b865-48ff26f025ca","Type":"ContainerStarted","Data":"9778dc3a48d82eaf089015870cffa7f0eb568e8ab0d5030f6c4a2425e91df1fe"}
Mar 13 15:08:08 crc kubenswrapper[4898]: I0313 15:08:08.199827 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-44hx4" podStartSLOduration=2.6529622330000002 podStartE2EDuration="6.199800517s" podCreationTimestamp="2026-03-13 15:08:02 +0000 UTC" firstStartedPulling="2026-03-13 15:08:04.087385007 +0000 UTC m=+4319.088973246" lastFinishedPulling="2026-03-13 15:08:07.634223261 +0000 UTC m=+4322.635811530" observedRunningTime="2026-03-13 15:08:08.191319628 +0000 UTC m=+4323.192907887" watchObservedRunningTime="2026-03-13 15:08:08.199800517 +0000 UTC m=+4323.201388786"
Mar 13 15:08:12 crc kubenswrapper[4898]: I0313 15:08:12.615154 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-44hx4"
Mar 13 15:08:12 crc kubenswrapper[4898]: I0313 15:08:12.617039 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-44hx4"
Mar 13 15:08:12 crc kubenswrapper[4898]: I0313 15:08:12.706695 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-44hx4"
Mar 13 15:08:13 crc kubenswrapper[4898]: I0313 15:08:13.318575 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-44hx4"
Mar 13 15:08:13 crc kubenswrapper[4898]: I0313 15:08:13.408435 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-44hx4"]
Mar 13 15:08:15 crc kubenswrapper[4898]: I0313 15:08:15.283270 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-44hx4" podUID="4efa3f00-c382-4542-b865-48ff26f025ca" containerName="registry-server" containerID="cri-o://9778dc3a48d82eaf089015870cffa7f0eb568e8ab0d5030f6c4a2425e91df1fe" gracePeriod=2
Mar 13 15:08:15 crc kubenswrapper[4898]: I0313 15:08:15.881769 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-44hx4"
Mar 13 15:08:15 crc kubenswrapper[4898]: I0313 15:08:15.999605 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjkvv\" (UniqueName: \"kubernetes.io/projected/4efa3f00-c382-4542-b865-48ff26f025ca-kube-api-access-sjkvv\") pod \"4efa3f00-c382-4542-b865-48ff26f025ca\" (UID: \"4efa3f00-c382-4542-b865-48ff26f025ca\") "
Mar 13 15:08:16 crc kubenswrapper[4898]: I0313 15:08:15.999979 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4efa3f00-c382-4542-b865-48ff26f025ca-catalog-content\") pod \"4efa3f00-c382-4542-b865-48ff26f025ca\" (UID: \"4efa3f00-c382-4542-b865-48ff26f025ca\") "
Mar 13 15:08:16 crc kubenswrapper[4898]: I0313 15:08:16.000159 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4efa3f00-c382-4542-b865-48ff26f025ca-utilities\") pod \"4efa3f00-c382-4542-b865-48ff26f025ca\" (UID: \"4efa3f00-c382-4542-b865-48ff26f025ca\") "
Mar 13 15:08:16 crc kubenswrapper[4898]: I0313 15:08:16.000938 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4efa3f00-c382-4542-b865-48ff26f025ca-utilities" (OuterVolumeSpecName: "utilities") pod "4efa3f00-c382-4542-b865-48ff26f025ca" (UID: "4efa3f00-c382-4542-b865-48ff26f025ca"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 15:08:16 crc kubenswrapper[4898]: I0313 15:08:16.021737 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4efa3f00-c382-4542-b865-48ff26f025ca-kube-api-access-sjkvv" (OuterVolumeSpecName: "kube-api-access-sjkvv") pod "4efa3f00-c382-4542-b865-48ff26f025ca" (UID: "4efa3f00-c382-4542-b865-48ff26f025ca"). InnerVolumeSpecName "kube-api-access-sjkvv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 15:08:16 crc kubenswrapper[4898]: I0313 15:08:16.082585 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4efa3f00-c382-4542-b865-48ff26f025ca-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4efa3f00-c382-4542-b865-48ff26f025ca" (UID: "4efa3f00-c382-4542-b865-48ff26f025ca"). InnerVolumeSpecName "catalog-content".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:08:16 crc kubenswrapper[4898]: I0313 15:08:16.108065 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4efa3f00-c382-4542-b865-48ff26f025ca-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 15:08:16 crc kubenswrapper[4898]: I0313 15:08:16.108118 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sjkvv\" (UniqueName: \"kubernetes.io/projected/4efa3f00-c382-4542-b865-48ff26f025ca-kube-api-access-sjkvv\") on node \"crc\" DevicePath \"\"" Mar 13 15:08:16 crc kubenswrapper[4898]: I0313 15:08:16.108140 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4efa3f00-c382-4542-b865-48ff26f025ca-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 15:08:16 crc kubenswrapper[4898]: I0313 15:08:16.300070 4898 generic.go:334] "Generic (PLEG): container finished" podID="4efa3f00-c382-4542-b865-48ff26f025ca" containerID="9778dc3a48d82eaf089015870cffa7f0eb568e8ab0d5030f6c4a2425e91df1fe" exitCode=0 Mar 13 15:08:16 crc kubenswrapper[4898]: I0313 15:08:16.300148 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-44hx4" event={"ID":"4efa3f00-c382-4542-b865-48ff26f025ca","Type":"ContainerDied","Data":"9778dc3a48d82eaf089015870cffa7f0eb568e8ab0d5030f6c4a2425e91df1fe"} Mar 13 15:08:16 crc kubenswrapper[4898]: I0313 15:08:16.300274 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-44hx4" Mar 13 15:08:16 crc kubenswrapper[4898]: I0313 15:08:16.300318 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-44hx4" event={"ID":"4efa3f00-c382-4542-b865-48ff26f025ca","Type":"ContainerDied","Data":"5640eeafae0e26e70bf1dca59cc0e1213919ac82acbc06ec1ff1facea138314b"} Mar 13 15:08:16 crc kubenswrapper[4898]: I0313 15:08:16.300340 4898 scope.go:117] "RemoveContainer" containerID="9778dc3a48d82eaf089015870cffa7f0eb568e8ab0d5030f6c4a2425e91df1fe" Mar 13 15:08:16 crc kubenswrapper[4898]: I0313 15:08:16.337047 4898 scope.go:117] "RemoveContainer" containerID="ee44e53cd73fb17cb1edb64dad8d39faf007a6f2df230042e5915e9f66b123b3" Mar 13 15:08:16 crc kubenswrapper[4898]: I0313 15:08:16.358621 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-44hx4"] Mar 13 15:08:16 crc kubenswrapper[4898]: I0313 15:08:16.364549 4898 scope.go:117] "RemoveContainer" containerID="03c63c680d08b74688a5886d67a5bd28ade094556ecdf764bb1054af23202d0e" Mar 13 15:08:16 crc kubenswrapper[4898]: I0313 15:08:16.374453 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-44hx4"] Mar 13 15:08:16 crc kubenswrapper[4898]: I0313 15:08:16.443341 4898 scope.go:117] "RemoveContainer" containerID="9778dc3a48d82eaf089015870cffa7f0eb568e8ab0d5030f6c4a2425e91df1fe" Mar 13 15:08:16 crc kubenswrapper[4898]: E0313 15:08:16.444088 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9778dc3a48d82eaf089015870cffa7f0eb568e8ab0d5030f6c4a2425e91df1fe\": container with ID starting with 9778dc3a48d82eaf089015870cffa7f0eb568e8ab0d5030f6c4a2425e91df1fe not found: ID does not exist" containerID="9778dc3a48d82eaf089015870cffa7f0eb568e8ab0d5030f6c4a2425e91df1fe" Mar 13 15:08:16 crc kubenswrapper[4898]: I0313 15:08:16.444128 4898 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9778dc3a48d82eaf089015870cffa7f0eb568e8ab0d5030f6c4a2425e91df1fe"} err="failed to get container status \"9778dc3a48d82eaf089015870cffa7f0eb568e8ab0d5030f6c4a2425e91df1fe\": rpc error: code = NotFound desc = could not find container \"9778dc3a48d82eaf089015870cffa7f0eb568e8ab0d5030f6c4a2425e91df1fe\": container with ID starting with 9778dc3a48d82eaf089015870cffa7f0eb568e8ab0d5030f6c4a2425e91df1fe not found: ID does not exist" Mar 13 15:08:16 crc kubenswrapper[4898]: I0313 15:08:16.444154 4898 scope.go:117] "RemoveContainer" containerID="ee44e53cd73fb17cb1edb64dad8d39faf007a6f2df230042e5915e9f66b123b3" Mar 13 15:08:16 crc kubenswrapper[4898]: E0313 15:08:16.444583 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee44e53cd73fb17cb1edb64dad8d39faf007a6f2df230042e5915e9f66b123b3\": container with ID starting with ee44e53cd73fb17cb1edb64dad8d39faf007a6f2df230042e5915e9f66b123b3 not found: ID does not exist" containerID="ee44e53cd73fb17cb1edb64dad8d39faf007a6f2df230042e5915e9f66b123b3" Mar 13 15:08:16 crc kubenswrapper[4898]: I0313 15:08:16.444606 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee44e53cd73fb17cb1edb64dad8d39faf007a6f2df230042e5915e9f66b123b3"} err="failed to get container status \"ee44e53cd73fb17cb1edb64dad8d39faf007a6f2df230042e5915e9f66b123b3\": rpc error: code = NotFound desc = could not find container \"ee44e53cd73fb17cb1edb64dad8d39faf007a6f2df230042e5915e9f66b123b3\": container with ID starting with ee44e53cd73fb17cb1edb64dad8d39faf007a6f2df230042e5915e9f66b123b3 not found: ID does not exist" Mar 13 15:08:16 crc kubenswrapper[4898]: I0313 15:08:16.444619 4898 scope.go:117] "RemoveContainer" containerID="03c63c680d08b74688a5886d67a5bd28ade094556ecdf764bb1054af23202d0e" Mar 13 15:08:16 crc kubenswrapper[4898]: E0313 
15:08:16.445037 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03c63c680d08b74688a5886d67a5bd28ade094556ecdf764bb1054af23202d0e\": container with ID starting with 03c63c680d08b74688a5886d67a5bd28ade094556ecdf764bb1054af23202d0e not found: ID does not exist" containerID="03c63c680d08b74688a5886d67a5bd28ade094556ecdf764bb1054af23202d0e" Mar 13 15:08:16 crc kubenswrapper[4898]: I0313 15:08:16.445070 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03c63c680d08b74688a5886d67a5bd28ade094556ecdf764bb1054af23202d0e"} err="failed to get container status \"03c63c680d08b74688a5886d67a5bd28ade094556ecdf764bb1054af23202d0e\": rpc error: code = NotFound desc = could not find container \"03c63c680d08b74688a5886d67a5bd28ade094556ecdf764bb1054af23202d0e\": container with ID starting with 03c63c680d08b74688a5886d67a5bd28ade094556ecdf764bb1054af23202d0e not found: ID does not exist" Mar 13 15:08:17 crc kubenswrapper[4898]: I0313 15:08:17.762856 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4efa3f00-c382-4542-b865-48ff26f025ca" path="/var/lib/kubelet/pods/4efa3f00-c382-4542-b865-48ff26f025ca/volumes" Mar 13 15:08:19 crc kubenswrapper[4898]: I0313 15:08:19.134438 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 15:08:19 crc kubenswrapper[4898]: I0313 15:08:19.134866 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Mar 13 15:08:43 crc kubenswrapper[4898]: I0313 15:08:43.263797 4898 scope.go:117] "RemoveContainer" containerID="e906ab6657ba9367ad2d982719bc7c2ae3c8f44a0ea2bd20f34b397b25a0a265" Mar 13 15:08:49 crc kubenswrapper[4898]: I0313 15:08:49.134680 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 15:08:49 crc kubenswrapper[4898]: I0313 15:08:49.135332 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 15:08:49 crc kubenswrapper[4898]: I0313 15:08:49.135395 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" Mar 13 15:08:49 crc kubenswrapper[4898]: I0313 15:08:49.136548 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"23592f71b1b6a0588b20dac6d06bd2510a7ea0a477247e02fd4eeae75f83ee8a"} pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 15:08:49 crc kubenswrapper[4898]: I0313 15:08:49.136650 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" containerID="cri-o://23592f71b1b6a0588b20dac6d06bd2510a7ea0a477247e02fd4eeae75f83ee8a" gracePeriod=600 Mar 13 15:08:49 crc 
kubenswrapper[4898]: E0313 15:08:49.270185 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:08:49 crc kubenswrapper[4898]: I0313 15:08:49.769864 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerDied","Data":"23592f71b1b6a0588b20dac6d06bd2510a7ea0a477247e02fd4eeae75f83ee8a"} Mar 13 15:08:49 crc kubenswrapper[4898]: I0313 15:08:49.771209 4898 scope.go:117] "RemoveContainer" containerID="45a214eae7c7c7859e650d708f47b93bb385cc743f9e3e627bde30ab582e5ff5" Mar 13 15:08:49 crc kubenswrapper[4898]: I0313 15:08:49.771649 4898 generic.go:334] "Generic (PLEG): container finished" podID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerID="23592f71b1b6a0588b20dac6d06bd2510a7ea0a477247e02fd4eeae75f83ee8a" exitCode=0 Mar 13 15:08:49 crc kubenswrapper[4898]: I0313 15:08:49.772111 4898 scope.go:117] "RemoveContainer" containerID="23592f71b1b6a0588b20dac6d06bd2510a7ea0a477247e02fd4eeae75f83ee8a" Mar 13 15:08:49 crc kubenswrapper[4898]: E0313 15:08:49.773312 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:09:03 crc kubenswrapper[4898]: I0313 15:09:03.739823 4898 scope.go:117] 
"RemoveContainer" containerID="23592f71b1b6a0588b20dac6d06bd2510a7ea0a477247e02fd4eeae75f83ee8a" Mar 13 15:09:03 crc kubenswrapper[4898]: E0313 15:09:03.740836 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:09:14 crc kubenswrapper[4898]: I0313 15:09:14.739858 4898 scope.go:117] "RemoveContainer" containerID="23592f71b1b6a0588b20dac6d06bd2510a7ea0a477247e02fd4eeae75f83ee8a" Mar 13 15:09:14 crc kubenswrapper[4898]: E0313 15:09:14.740881 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:09:25 crc kubenswrapper[4898]: I0313 15:09:25.751968 4898 scope.go:117] "RemoveContainer" containerID="23592f71b1b6a0588b20dac6d06bd2510a7ea0a477247e02fd4eeae75f83ee8a" Mar 13 15:09:25 crc kubenswrapper[4898]: E0313 15:09:25.752979 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:09:38 crc kubenswrapper[4898]: I0313 15:09:38.740005 
4898 scope.go:117] "RemoveContainer" containerID="23592f71b1b6a0588b20dac6d06bd2510a7ea0a477247e02fd4eeae75f83ee8a" Mar 13 15:09:38 crc kubenswrapper[4898]: E0313 15:09:38.741398 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:09:50 crc kubenswrapper[4898]: I0313 15:09:50.740364 4898 scope.go:117] "RemoveContainer" containerID="23592f71b1b6a0588b20dac6d06bd2510a7ea0a477247e02fd4eeae75f83ee8a" Mar 13 15:09:50 crc kubenswrapper[4898]: E0313 15:09:50.742672 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:10:00 crc kubenswrapper[4898]: I0313 15:10:00.149494 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556910-lcc5d"] Mar 13 15:10:00 crc kubenswrapper[4898]: E0313 15:10:00.150539 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4efa3f00-c382-4542-b865-48ff26f025ca" containerName="registry-server" Mar 13 15:10:00 crc kubenswrapper[4898]: I0313 15:10:00.150554 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="4efa3f00-c382-4542-b865-48ff26f025ca" containerName="registry-server" Mar 13 15:10:00 crc kubenswrapper[4898]: E0313 15:10:00.150577 4898 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="4efa3f00-c382-4542-b865-48ff26f025ca" containerName="extract-content" Mar 13 15:10:00 crc kubenswrapper[4898]: I0313 15:10:00.150583 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="4efa3f00-c382-4542-b865-48ff26f025ca" containerName="extract-content" Mar 13 15:10:00 crc kubenswrapper[4898]: E0313 15:10:00.150625 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4efa3f00-c382-4542-b865-48ff26f025ca" containerName="extract-utilities" Mar 13 15:10:00 crc kubenswrapper[4898]: I0313 15:10:00.150632 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="4efa3f00-c382-4542-b865-48ff26f025ca" containerName="extract-utilities" Mar 13 15:10:00 crc kubenswrapper[4898]: E0313 15:10:00.150639 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df9e42bc-c4a2-4ccc-ad85-5ca077abfd88" containerName="oc" Mar 13 15:10:00 crc kubenswrapper[4898]: I0313 15:10:00.150644 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="df9e42bc-c4a2-4ccc-ad85-5ca077abfd88" containerName="oc" Mar 13 15:10:00 crc kubenswrapper[4898]: I0313 15:10:00.150849 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="4efa3f00-c382-4542-b865-48ff26f025ca" containerName="registry-server" Mar 13 15:10:00 crc kubenswrapper[4898]: I0313 15:10:00.150867 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="df9e42bc-c4a2-4ccc-ad85-5ca077abfd88" containerName="oc" Mar 13 15:10:00 crc kubenswrapper[4898]: I0313 15:10:00.151693 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556910-lcc5d" Mar 13 15:10:00 crc kubenswrapper[4898]: I0313 15:10:00.154129 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 15:10:00 crc kubenswrapper[4898]: I0313 15:10:00.154529 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps" Mar 13 15:10:00 crc kubenswrapper[4898]: I0313 15:10:00.160066 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 15:10:00 crc kubenswrapper[4898]: I0313 15:10:00.162281 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556910-lcc5d"] Mar 13 15:10:00 crc kubenswrapper[4898]: I0313 15:10:00.258363 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqhd4\" (UniqueName: \"kubernetes.io/projected/0a36f55a-ce22-4339-967f-906f473ddad5-kube-api-access-zqhd4\") pod \"auto-csr-approver-29556910-lcc5d\" (UID: \"0a36f55a-ce22-4339-967f-906f473ddad5\") " pod="openshift-infra/auto-csr-approver-29556910-lcc5d" Mar 13 15:10:00 crc kubenswrapper[4898]: I0313 15:10:00.360995 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqhd4\" (UniqueName: \"kubernetes.io/projected/0a36f55a-ce22-4339-967f-906f473ddad5-kube-api-access-zqhd4\") pod \"auto-csr-approver-29556910-lcc5d\" (UID: \"0a36f55a-ce22-4339-967f-906f473ddad5\") " pod="openshift-infra/auto-csr-approver-29556910-lcc5d" Mar 13 15:10:00 crc kubenswrapper[4898]: I0313 15:10:00.381725 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqhd4\" (UniqueName: \"kubernetes.io/projected/0a36f55a-ce22-4339-967f-906f473ddad5-kube-api-access-zqhd4\") pod \"auto-csr-approver-29556910-lcc5d\" (UID: \"0a36f55a-ce22-4339-967f-906f473ddad5\") " 
pod="openshift-infra/auto-csr-approver-29556910-lcc5d" Mar 13 15:10:00 crc kubenswrapper[4898]: I0313 15:10:00.470017 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556910-lcc5d" Mar 13 15:10:00 crc kubenswrapper[4898]: I0313 15:10:00.970628 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556910-lcc5d"] Mar 13 15:10:01 crc kubenswrapper[4898]: I0313 15:10:01.699577 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556910-lcc5d" event={"ID":"0a36f55a-ce22-4339-967f-906f473ddad5","Type":"ContainerStarted","Data":"64be457f6495c2066ecea207638f74ac02dc99b45865b08dd645709fbe9adcb6"} Mar 13 15:10:02 crc kubenswrapper[4898]: I0313 15:10:02.740558 4898 scope.go:117] "RemoveContainer" containerID="23592f71b1b6a0588b20dac6d06bd2510a7ea0a477247e02fd4eeae75f83ee8a" Mar 13 15:10:02 crc kubenswrapper[4898]: E0313 15:10:02.741501 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:10:03 crc kubenswrapper[4898]: I0313 15:10:03.727608 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556910-lcc5d" event={"ID":"0a36f55a-ce22-4339-967f-906f473ddad5","Type":"ContainerStarted","Data":"ba349dae26dd37d5b178e79f9ed4076346dab25c90569fb520b86b35b588e387"} Mar 13 15:10:03 crc kubenswrapper[4898]: I0313 15:10:03.748652 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556910-lcc5d" podStartSLOduration=2.169812902 
podStartE2EDuration="3.748627989s" podCreationTimestamp="2026-03-13 15:10:00 +0000 UTC" firstStartedPulling="2026-03-13 15:10:00.979025317 +0000 UTC m=+4435.980613556" lastFinishedPulling="2026-03-13 15:10:02.557840404 +0000 UTC m=+4437.559428643" observedRunningTime="2026-03-13 15:10:03.744684561 +0000 UTC m=+4438.746272800" watchObservedRunningTime="2026-03-13 15:10:03.748627989 +0000 UTC m=+4438.750216238" Mar 13 15:10:04 crc kubenswrapper[4898]: I0313 15:10:04.737244 4898 generic.go:334] "Generic (PLEG): container finished" podID="0a36f55a-ce22-4339-967f-906f473ddad5" containerID="ba349dae26dd37d5b178e79f9ed4076346dab25c90569fb520b86b35b588e387" exitCode=0 Mar 13 15:10:04 crc kubenswrapper[4898]: I0313 15:10:04.737289 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556910-lcc5d" event={"ID":"0a36f55a-ce22-4339-967f-906f473ddad5","Type":"ContainerDied","Data":"ba349dae26dd37d5b178e79f9ed4076346dab25c90569fb520b86b35b588e387"} Mar 13 15:10:06 crc kubenswrapper[4898]: I0313 15:10:06.170455 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556910-lcc5d" Mar 13 15:10:06 crc kubenswrapper[4898]: I0313 15:10:06.203979 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqhd4\" (UniqueName: \"kubernetes.io/projected/0a36f55a-ce22-4339-967f-906f473ddad5-kube-api-access-zqhd4\") pod \"0a36f55a-ce22-4339-967f-906f473ddad5\" (UID: \"0a36f55a-ce22-4339-967f-906f473ddad5\") " Mar 13 15:10:06 crc kubenswrapper[4898]: I0313 15:10:06.210004 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a36f55a-ce22-4339-967f-906f473ddad5-kube-api-access-zqhd4" (OuterVolumeSpecName: "kube-api-access-zqhd4") pod "0a36f55a-ce22-4339-967f-906f473ddad5" (UID: "0a36f55a-ce22-4339-967f-906f473ddad5"). InnerVolumeSpecName "kube-api-access-zqhd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:10:06 crc kubenswrapper[4898]: I0313 15:10:06.308293 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqhd4\" (UniqueName: \"kubernetes.io/projected/0a36f55a-ce22-4339-967f-906f473ddad5-kube-api-access-zqhd4\") on node \"crc\" DevicePath \"\"" Mar 13 15:10:06 crc kubenswrapper[4898]: I0313 15:10:06.771017 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556910-lcc5d" event={"ID":"0a36f55a-ce22-4339-967f-906f473ddad5","Type":"ContainerDied","Data":"64be457f6495c2066ecea207638f74ac02dc99b45865b08dd645709fbe9adcb6"} Mar 13 15:10:06 crc kubenswrapper[4898]: I0313 15:10:06.771058 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="64be457f6495c2066ecea207638f74ac02dc99b45865b08dd645709fbe9adcb6" Mar 13 15:10:06 crc kubenswrapper[4898]: I0313 15:10:06.771127 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556910-lcc5d" Mar 13 15:10:06 crc kubenswrapper[4898]: I0313 15:10:06.843883 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556904-69dtq"] Mar 13 15:10:06 crc kubenswrapper[4898]: I0313 15:10:06.854388 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556904-69dtq"] Mar 13 15:10:07 crc kubenswrapper[4898]: I0313 15:10:07.753465 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a124d846-aa21-4e8a-bb0e-57cc2aa7a3e0" path="/var/lib/kubelet/pods/a124d846-aa21-4e8a-bb0e-57cc2aa7a3e0/volumes" Mar 13 15:10:15 crc kubenswrapper[4898]: I0313 15:10:15.748141 4898 scope.go:117] "RemoveContainer" containerID="23592f71b1b6a0588b20dac6d06bd2510a7ea0a477247e02fd4eeae75f83ee8a" Mar 13 15:10:15 crc kubenswrapper[4898]: E0313 15:10:15.749123 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:10:23 crc kubenswrapper[4898]: I0313 15:10:23.701961 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dpqz2"] Mar 13 15:10:23 crc kubenswrapper[4898]: E0313 15:10:23.703402 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a36f55a-ce22-4339-967f-906f473ddad5" containerName="oc" Mar 13 15:10:23 crc kubenswrapper[4898]: I0313 15:10:23.703426 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a36f55a-ce22-4339-967f-906f473ddad5" containerName="oc" Mar 13 15:10:23 crc kubenswrapper[4898]: I0313 15:10:23.703811 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a36f55a-ce22-4339-967f-906f473ddad5" containerName="oc" Mar 13 15:10:23 crc kubenswrapper[4898]: I0313 15:10:23.706619 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dpqz2" Mar 13 15:10:23 crc kubenswrapper[4898]: I0313 15:10:23.712551 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dpqz2"] Mar 13 15:10:23 crc kubenswrapper[4898]: I0313 15:10:23.860229 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ae9befa-ba34-402d-9c68-4aad13ad380a-catalog-content\") pod \"redhat-marketplace-dpqz2\" (UID: \"6ae9befa-ba34-402d-9c68-4aad13ad380a\") " pod="openshift-marketplace/redhat-marketplace-dpqz2" Mar 13 15:10:23 crc kubenswrapper[4898]: I0313 15:10:23.860275 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcvz8\" (UniqueName: \"kubernetes.io/projected/6ae9befa-ba34-402d-9c68-4aad13ad380a-kube-api-access-wcvz8\") pod \"redhat-marketplace-dpqz2\" (UID: \"6ae9befa-ba34-402d-9c68-4aad13ad380a\") " pod="openshift-marketplace/redhat-marketplace-dpqz2" Mar 13 15:10:23 crc kubenswrapper[4898]: I0313 15:10:23.860355 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ae9befa-ba34-402d-9c68-4aad13ad380a-utilities\") pod \"redhat-marketplace-dpqz2\" (UID: \"6ae9befa-ba34-402d-9c68-4aad13ad380a\") " pod="openshift-marketplace/redhat-marketplace-dpqz2" Mar 13 15:10:23 crc kubenswrapper[4898]: I0313 15:10:23.962251 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ae9befa-ba34-402d-9c68-4aad13ad380a-utilities\") pod \"redhat-marketplace-dpqz2\" (UID: \"6ae9befa-ba34-402d-9c68-4aad13ad380a\") " pod="openshift-marketplace/redhat-marketplace-dpqz2" Mar 13 15:10:23 crc kubenswrapper[4898]: I0313 15:10:23.962452 4898 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ae9befa-ba34-402d-9c68-4aad13ad380a-catalog-content\") pod \"redhat-marketplace-dpqz2\" (UID: \"6ae9befa-ba34-402d-9c68-4aad13ad380a\") " pod="openshift-marketplace/redhat-marketplace-dpqz2" Mar 13 15:10:23 crc kubenswrapper[4898]: I0313 15:10:23.962481 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcvz8\" (UniqueName: \"kubernetes.io/projected/6ae9befa-ba34-402d-9c68-4aad13ad380a-kube-api-access-wcvz8\") pod \"redhat-marketplace-dpqz2\" (UID: \"6ae9befa-ba34-402d-9c68-4aad13ad380a\") " pod="openshift-marketplace/redhat-marketplace-dpqz2" Mar 13 15:10:23 crc kubenswrapper[4898]: I0313 15:10:23.963266 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ae9befa-ba34-402d-9c68-4aad13ad380a-utilities\") pod \"redhat-marketplace-dpqz2\" (UID: \"6ae9befa-ba34-402d-9c68-4aad13ad380a\") " pod="openshift-marketplace/redhat-marketplace-dpqz2" Mar 13 15:10:23 crc kubenswrapper[4898]: I0313 15:10:23.963545 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ae9befa-ba34-402d-9c68-4aad13ad380a-catalog-content\") pod \"redhat-marketplace-dpqz2\" (UID: \"6ae9befa-ba34-402d-9c68-4aad13ad380a\") " pod="openshift-marketplace/redhat-marketplace-dpqz2" Mar 13 15:10:23 crc kubenswrapper[4898]: I0313 15:10:23.987853 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcvz8\" (UniqueName: \"kubernetes.io/projected/6ae9befa-ba34-402d-9c68-4aad13ad380a-kube-api-access-wcvz8\") pod \"redhat-marketplace-dpqz2\" (UID: \"6ae9befa-ba34-402d-9c68-4aad13ad380a\") " pod="openshift-marketplace/redhat-marketplace-dpqz2" Mar 13 15:10:24 crc kubenswrapper[4898]: I0313 15:10:24.035674 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dpqz2" Mar 13 15:10:24 crc kubenswrapper[4898]: I0313 15:10:24.552225 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dpqz2"] Mar 13 15:10:25 crc kubenswrapper[4898]: I0313 15:10:25.002960 4898 generic.go:334] "Generic (PLEG): container finished" podID="6ae9befa-ba34-402d-9c68-4aad13ad380a" containerID="26a0efcf86b49360d3ca0f6db51f8be8241064695e81797fdbbea93b417ae346" exitCode=0 Mar 13 15:10:25 crc kubenswrapper[4898]: I0313 15:10:25.003271 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dpqz2" event={"ID":"6ae9befa-ba34-402d-9c68-4aad13ad380a","Type":"ContainerDied","Data":"26a0efcf86b49360d3ca0f6db51f8be8241064695e81797fdbbea93b417ae346"} Mar 13 15:10:25 crc kubenswrapper[4898]: I0313 15:10:25.003300 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dpqz2" event={"ID":"6ae9befa-ba34-402d-9c68-4aad13ad380a","Type":"ContainerStarted","Data":"6d58b2a80b4f25b60cebd9b2997bdbd7ee5cfa0f69c0a56fd8d0e8800a868b77"} Mar 13 15:10:25 crc kubenswrapper[4898]: I0313 15:10:25.005197 4898 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 15:10:26 crc kubenswrapper[4898]: I0313 15:10:26.017382 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dpqz2" event={"ID":"6ae9befa-ba34-402d-9c68-4aad13ad380a","Type":"ContainerStarted","Data":"1131b0202620c67ed5c6fc2a5f10369017ac4f0c291aa0a69276d20706cb8627"} Mar 13 15:10:27 crc kubenswrapper[4898]: I0313 15:10:27.032079 4898 generic.go:334] "Generic (PLEG): container finished" podID="6ae9befa-ba34-402d-9c68-4aad13ad380a" containerID="1131b0202620c67ed5c6fc2a5f10369017ac4f0c291aa0a69276d20706cb8627" exitCode=0 Mar 13 15:10:27 crc kubenswrapper[4898]: I0313 15:10:27.032376 4898 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-dpqz2" event={"ID":"6ae9befa-ba34-402d-9c68-4aad13ad380a","Type":"ContainerDied","Data":"1131b0202620c67ed5c6fc2a5f10369017ac4f0c291aa0a69276d20706cb8627"} Mar 13 15:10:29 crc kubenswrapper[4898]: I0313 15:10:29.055961 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dpqz2" event={"ID":"6ae9befa-ba34-402d-9c68-4aad13ad380a","Type":"ContainerStarted","Data":"ef58acf8d4ae6204788d3fdc59aecb3462e5ceb48474e990c3e92ca73408a0f8"} Mar 13 15:10:29 crc kubenswrapper[4898]: I0313 15:10:29.080944 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dpqz2" podStartSLOduration=3.608435736 podStartE2EDuration="6.08092118s" podCreationTimestamp="2026-03-13 15:10:23 +0000 UTC" firstStartedPulling="2026-03-13 15:10:25.004951201 +0000 UTC m=+4460.006539440" lastFinishedPulling="2026-03-13 15:10:27.477436615 +0000 UTC m=+4462.479024884" observedRunningTime="2026-03-13 15:10:29.070617386 +0000 UTC m=+4464.072205645" watchObservedRunningTime="2026-03-13 15:10:29.08092118 +0000 UTC m=+4464.082509419" Mar 13 15:10:30 crc kubenswrapper[4898]: I0313 15:10:30.739834 4898 scope.go:117] "RemoveContainer" containerID="23592f71b1b6a0588b20dac6d06bd2510a7ea0a477247e02fd4eeae75f83ee8a" Mar 13 15:10:30 crc kubenswrapper[4898]: E0313 15:10:30.740444 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:10:34 crc kubenswrapper[4898]: I0313 15:10:34.045636 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-dpqz2" Mar 13 15:10:34 crc kubenswrapper[4898]: I0313 15:10:34.046096 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dpqz2" Mar 13 15:10:34 crc kubenswrapper[4898]: I0313 15:10:34.103917 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dpqz2" Mar 13 15:10:34 crc kubenswrapper[4898]: I0313 15:10:34.170429 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dpqz2" Mar 13 15:10:34 crc kubenswrapper[4898]: I0313 15:10:34.675004 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dpqz2"] Mar 13 15:10:36 crc kubenswrapper[4898]: I0313 15:10:36.131435 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-dpqz2" podUID="6ae9befa-ba34-402d-9c68-4aad13ad380a" containerName="registry-server" containerID="cri-o://ef58acf8d4ae6204788d3fdc59aecb3462e5ceb48474e990c3e92ca73408a0f8" gracePeriod=2 Mar 13 15:10:37 crc kubenswrapper[4898]: I0313 15:10:37.144642 4898 generic.go:334] "Generic (PLEG): container finished" podID="6ae9befa-ba34-402d-9c68-4aad13ad380a" containerID="ef58acf8d4ae6204788d3fdc59aecb3462e5ceb48474e990c3e92ca73408a0f8" exitCode=0 Mar 13 15:10:37 crc kubenswrapper[4898]: I0313 15:10:37.144870 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dpqz2" event={"ID":"6ae9befa-ba34-402d-9c68-4aad13ad380a","Type":"ContainerDied","Data":"ef58acf8d4ae6204788d3fdc59aecb3462e5ceb48474e990c3e92ca73408a0f8"} Mar 13 15:10:37 crc kubenswrapper[4898]: I0313 15:10:37.144958 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dpqz2" 
event={"ID":"6ae9befa-ba34-402d-9c68-4aad13ad380a","Type":"ContainerDied","Data":"6d58b2a80b4f25b60cebd9b2997bdbd7ee5cfa0f69c0a56fd8d0e8800a868b77"} Mar 13 15:10:37 crc kubenswrapper[4898]: I0313 15:10:37.144979 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d58b2a80b4f25b60cebd9b2997bdbd7ee5cfa0f69c0a56fd8d0e8800a868b77" Mar 13 15:10:37 crc kubenswrapper[4898]: I0313 15:10:37.177661 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dpqz2" Mar 13 15:10:37 crc kubenswrapper[4898]: I0313 15:10:37.287964 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ae9befa-ba34-402d-9c68-4aad13ad380a-catalog-content\") pod \"6ae9befa-ba34-402d-9c68-4aad13ad380a\" (UID: \"6ae9befa-ba34-402d-9c68-4aad13ad380a\") " Mar 13 15:10:37 crc kubenswrapper[4898]: I0313 15:10:37.288284 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ae9befa-ba34-402d-9c68-4aad13ad380a-utilities\") pod \"6ae9befa-ba34-402d-9c68-4aad13ad380a\" (UID: \"6ae9befa-ba34-402d-9c68-4aad13ad380a\") " Mar 13 15:10:37 crc kubenswrapper[4898]: I0313 15:10:37.288428 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wcvz8\" (UniqueName: \"kubernetes.io/projected/6ae9befa-ba34-402d-9c68-4aad13ad380a-kube-api-access-wcvz8\") pod \"6ae9befa-ba34-402d-9c68-4aad13ad380a\" (UID: \"6ae9befa-ba34-402d-9c68-4aad13ad380a\") " Mar 13 15:10:37 crc kubenswrapper[4898]: I0313 15:10:37.289667 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ae9befa-ba34-402d-9c68-4aad13ad380a-utilities" (OuterVolumeSpecName: "utilities") pod "6ae9befa-ba34-402d-9c68-4aad13ad380a" (UID: "6ae9befa-ba34-402d-9c68-4aad13ad380a"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:10:37 crc kubenswrapper[4898]: I0313 15:10:37.300520 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ae9befa-ba34-402d-9c68-4aad13ad380a-kube-api-access-wcvz8" (OuterVolumeSpecName: "kube-api-access-wcvz8") pod "6ae9befa-ba34-402d-9c68-4aad13ad380a" (UID: "6ae9befa-ba34-402d-9c68-4aad13ad380a"). InnerVolumeSpecName "kube-api-access-wcvz8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:10:37 crc kubenswrapper[4898]: I0313 15:10:37.312522 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ae9befa-ba34-402d-9c68-4aad13ad380a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6ae9befa-ba34-402d-9c68-4aad13ad380a" (UID: "6ae9befa-ba34-402d-9c68-4aad13ad380a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:10:37 crc kubenswrapper[4898]: I0313 15:10:37.391026 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ae9befa-ba34-402d-9c68-4aad13ad380a-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 15:10:37 crc kubenswrapper[4898]: I0313 15:10:37.391073 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wcvz8\" (UniqueName: \"kubernetes.io/projected/6ae9befa-ba34-402d-9c68-4aad13ad380a-kube-api-access-wcvz8\") on node \"crc\" DevicePath \"\"" Mar 13 15:10:37 crc kubenswrapper[4898]: I0313 15:10:37.391090 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ae9befa-ba34-402d-9c68-4aad13ad380a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 15:10:38 crc kubenswrapper[4898]: I0313 15:10:38.155036 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dpqz2" Mar 13 15:10:38 crc kubenswrapper[4898]: I0313 15:10:38.184849 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dpqz2"] Mar 13 15:10:38 crc kubenswrapper[4898]: I0313 15:10:38.197365 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-dpqz2"] Mar 13 15:10:39 crc kubenswrapper[4898]: I0313 15:10:39.759990 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ae9befa-ba34-402d-9c68-4aad13ad380a" path="/var/lib/kubelet/pods/6ae9befa-ba34-402d-9c68-4aad13ad380a/volumes" Mar 13 15:10:43 crc kubenswrapper[4898]: I0313 15:10:43.422973 4898 scope.go:117] "RemoveContainer" containerID="cd5d0aba2324832ebc654e623ddb6b3b9935920ad75023c9232ebc2ff78ae2d3" Mar 13 15:10:45 crc kubenswrapper[4898]: I0313 15:10:45.761088 4898 scope.go:117] "RemoveContainer" containerID="23592f71b1b6a0588b20dac6d06bd2510a7ea0a477247e02fd4eeae75f83ee8a" Mar 13 15:10:45 crc kubenswrapper[4898]: E0313 15:10:45.762415 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:11:00 crc kubenswrapper[4898]: I0313 15:11:00.739251 4898 scope.go:117] "RemoveContainer" containerID="23592f71b1b6a0588b20dac6d06bd2510a7ea0a477247e02fd4eeae75f83ee8a" Mar 13 15:11:00 crc kubenswrapper[4898]: E0313 15:11:00.740005 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:11:11 crc kubenswrapper[4898]: I0313 15:11:11.739455 4898 scope.go:117] "RemoveContainer" containerID="23592f71b1b6a0588b20dac6d06bd2510a7ea0a477247e02fd4eeae75f83ee8a" Mar 13 15:11:11 crc kubenswrapper[4898]: E0313 15:11:11.740180 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:11:24 crc kubenswrapper[4898]: I0313 15:11:24.741618 4898 scope.go:117] "RemoveContainer" containerID="23592f71b1b6a0588b20dac6d06bd2510a7ea0a477247e02fd4eeae75f83ee8a" Mar 13 15:11:24 crc kubenswrapper[4898]: E0313 15:11:24.743103 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:11:36 crc kubenswrapper[4898]: I0313 15:11:36.739419 4898 scope.go:117] "RemoveContainer" containerID="23592f71b1b6a0588b20dac6d06bd2510a7ea0a477247e02fd4eeae75f83ee8a" Mar 13 15:11:36 crc kubenswrapper[4898]: E0313 15:11:36.740326 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:11:51 crc kubenswrapper[4898]: I0313 15:11:51.743604 4898 scope.go:117] "RemoveContainer" containerID="23592f71b1b6a0588b20dac6d06bd2510a7ea0a477247e02fd4eeae75f83ee8a" Mar 13 15:11:51 crc kubenswrapper[4898]: E0313 15:11:51.744578 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:12:00 crc kubenswrapper[4898]: I0313 15:12:00.154356 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556912-5fmmb"] Mar 13 15:12:00 crc kubenswrapper[4898]: E0313 15:12:00.155576 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ae9befa-ba34-402d-9c68-4aad13ad380a" containerName="extract-utilities" Mar 13 15:12:00 crc kubenswrapper[4898]: I0313 15:12:00.155596 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ae9befa-ba34-402d-9c68-4aad13ad380a" containerName="extract-utilities" Mar 13 15:12:00 crc kubenswrapper[4898]: E0313 15:12:00.155611 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ae9befa-ba34-402d-9c68-4aad13ad380a" containerName="extract-content" Mar 13 15:12:00 crc kubenswrapper[4898]: I0313 15:12:00.155620 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ae9befa-ba34-402d-9c68-4aad13ad380a" containerName="extract-content" Mar 13 15:12:00 crc kubenswrapper[4898]: E0313 15:12:00.155634 4898 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="6ae9befa-ba34-402d-9c68-4aad13ad380a" containerName="registry-server" Mar 13 15:12:00 crc kubenswrapper[4898]: I0313 15:12:00.155641 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ae9befa-ba34-402d-9c68-4aad13ad380a" containerName="registry-server" Mar 13 15:12:00 crc kubenswrapper[4898]: I0313 15:12:00.155868 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ae9befa-ba34-402d-9c68-4aad13ad380a" containerName="registry-server" Mar 13 15:12:00 crc kubenswrapper[4898]: I0313 15:12:00.156650 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556912-5fmmb" Mar 13 15:12:00 crc kubenswrapper[4898]: I0313 15:12:00.167865 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556912-5fmmb"] Mar 13 15:12:00 crc kubenswrapper[4898]: I0313 15:12:00.174130 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 15:12:00 crc kubenswrapper[4898]: I0313 15:12:00.174659 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps" Mar 13 15:12:00 crc kubenswrapper[4898]: I0313 15:12:00.174949 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 15:12:00 crc kubenswrapper[4898]: I0313 15:12:00.208448 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cs292\" (UniqueName: \"kubernetes.io/projected/ada9e0ac-777e-4e64-aade-d729b4481edf-kube-api-access-cs292\") pod \"auto-csr-approver-29556912-5fmmb\" (UID: \"ada9e0ac-777e-4e64-aade-d729b4481edf\") " pod="openshift-infra/auto-csr-approver-29556912-5fmmb" Mar 13 15:12:00 crc kubenswrapper[4898]: I0313 15:12:00.313361 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-cs292\" (UniqueName: \"kubernetes.io/projected/ada9e0ac-777e-4e64-aade-d729b4481edf-kube-api-access-cs292\") pod \"auto-csr-approver-29556912-5fmmb\" (UID: \"ada9e0ac-777e-4e64-aade-d729b4481edf\") " pod="openshift-infra/auto-csr-approver-29556912-5fmmb" Mar 13 15:12:00 crc kubenswrapper[4898]: I0313 15:12:00.330522 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cs292\" (UniqueName: \"kubernetes.io/projected/ada9e0ac-777e-4e64-aade-d729b4481edf-kube-api-access-cs292\") pod \"auto-csr-approver-29556912-5fmmb\" (UID: \"ada9e0ac-777e-4e64-aade-d729b4481edf\") " pod="openshift-infra/auto-csr-approver-29556912-5fmmb" Mar 13 15:12:00 crc kubenswrapper[4898]: I0313 15:12:00.494803 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556912-5fmmb" Mar 13 15:12:01 crc kubenswrapper[4898]: I0313 15:12:01.035382 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556912-5fmmb"] Mar 13 15:12:01 crc kubenswrapper[4898]: I0313 15:12:01.124089 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556912-5fmmb" event={"ID":"ada9e0ac-777e-4e64-aade-d729b4481edf","Type":"ContainerStarted","Data":"334b07d5a1c8b89023427a21780b8ae4851543bace4ad6dbdd28114bdfd3e287"} Mar 13 15:12:04 crc kubenswrapper[4898]: I0313 15:12:04.160000 4898 generic.go:334] "Generic (PLEG): container finished" podID="ada9e0ac-777e-4e64-aade-d729b4481edf" containerID="ac390920dd30738022bc9982651d5e7fa6b628c845272ba7744c86bf9f8444e9" exitCode=0 Mar 13 15:12:04 crc kubenswrapper[4898]: I0313 15:12:04.160116 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556912-5fmmb" event={"ID":"ada9e0ac-777e-4e64-aade-d729b4481edf","Type":"ContainerDied","Data":"ac390920dd30738022bc9982651d5e7fa6b628c845272ba7744c86bf9f8444e9"} Mar 13 15:12:05 crc kubenswrapper[4898]: I0313 
15:12:05.722104 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556912-5fmmb" Mar 13 15:12:05 crc kubenswrapper[4898]: I0313 15:12:05.754052 4898 scope.go:117] "RemoveContainer" containerID="23592f71b1b6a0588b20dac6d06bd2510a7ea0a477247e02fd4eeae75f83ee8a" Mar 13 15:12:05 crc kubenswrapper[4898]: E0313 15:12:05.754409 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:12:05 crc kubenswrapper[4898]: I0313 15:12:05.798978 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cs292\" (UniqueName: \"kubernetes.io/projected/ada9e0ac-777e-4e64-aade-d729b4481edf-kube-api-access-cs292\") pod \"ada9e0ac-777e-4e64-aade-d729b4481edf\" (UID: \"ada9e0ac-777e-4e64-aade-d729b4481edf\") " Mar 13 15:12:05 crc kubenswrapper[4898]: I0313 15:12:05.806495 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ada9e0ac-777e-4e64-aade-d729b4481edf-kube-api-access-cs292" (OuterVolumeSpecName: "kube-api-access-cs292") pod "ada9e0ac-777e-4e64-aade-d729b4481edf" (UID: "ada9e0ac-777e-4e64-aade-d729b4481edf"). InnerVolumeSpecName "kube-api-access-cs292". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:12:05 crc kubenswrapper[4898]: I0313 15:12:05.902655 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cs292\" (UniqueName: \"kubernetes.io/projected/ada9e0ac-777e-4e64-aade-d729b4481edf-kube-api-access-cs292\") on node \"crc\" DevicePath \"\"" Mar 13 15:12:06 crc kubenswrapper[4898]: I0313 15:12:06.189312 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556912-5fmmb" event={"ID":"ada9e0ac-777e-4e64-aade-d729b4481edf","Type":"ContainerDied","Data":"334b07d5a1c8b89023427a21780b8ae4851543bace4ad6dbdd28114bdfd3e287"} Mar 13 15:12:06 crc kubenswrapper[4898]: I0313 15:12:06.189348 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="334b07d5a1c8b89023427a21780b8ae4851543bace4ad6dbdd28114bdfd3e287" Mar 13 15:12:06 crc kubenswrapper[4898]: I0313 15:12:06.189408 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556912-5fmmb" Mar 13 15:12:06 crc kubenswrapper[4898]: I0313 15:12:06.826492 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556906-m4tq8"] Mar 13 15:12:06 crc kubenswrapper[4898]: I0313 15:12:06.849070 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556906-m4tq8"] Mar 13 15:12:07 crc kubenswrapper[4898]: I0313 15:12:07.764884 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ee94077-8dd9-4144-bab5-2abd9744fa01" path="/var/lib/kubelet/pods/3ee94077-8dd9-4144-bab5-2abd9744fa01/volumes" Mar 13 15:12:16 crc kubenswrapper[4898]: I0313 15:12:16.739889 4898 scope.go:117] "RemoveContainer" containerID="23592f71b1b6a0588b20dac6d06bd2510a7ea0a477247e02fd4eeae75f83ee8a" Mar 13 15:12:16 crc kubenswrapper[4898]: E0313 15:12:16.741153 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:12:22 crc kubenswrapper[4898]: I0313 15:12:22.787454 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Mar 13 15:12:22 crc kubenswrapper[4898]: E0313 15:12:22.788887 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ada9e0ac-777e-4e64-aade-d729b4481edf" containerName="oc" Mar 13 15:12:22 crc kubenswrapper[4898]: I0313 15:12:22.788932 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="ada9e0ac-777e-4e64-aade-d729b4481edf" containerName="oc" Mar 13 15:12:22 crc kubenswrapper[4898]: I0313 15:12:22.789202 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="ada9e0ac-777e-4e64-aade-d729b4481edf" containerName="oc" Mar 13 15:12:22 crc kubenswrapper[4898]: I0313 15:12:22.790220 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 13 15:12:22 crc kubenswrapper[4898]: I0313 15:12:22.792782 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-8v5gs" Mar 13 15:12:22 crc kubenswrapper[4898]: I0313 15:12:22.793011 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 13 15:12:22 crc kubenswrapper[4898]: I0313 15:12:22.794126 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Mar 13 15:12:22 crc kubenswrapper[4898]: I0313 15:12:22.794159 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Mar 13 15:12:22 crc kubenswrapper[4898]: I0313 15:12:22.823040 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 13 15:12:22 crc kubenswrapper[4898]: I0313 15:12:22.836646 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"tempest-tests-tempest\" (UID: \"d19e8770-f0c1-491e-96c9-f737386ab3b0\") " pod="openstack/tempest-tests-tempest" Mar 13 15:12:22 crc kubenswrapper[4898]: I0313 15:12:22.836737 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d19e8770-f0c1-491e-96c9-f737386ab3b0-config-data\") pod \"tempest-tests-tempest\" (UID: \"d19e8770-f0c1-491e-96c9-f737386ab3b0\") " pod="openstack/tempest-tests-tempest" Mar 13 15:12:22 crc kubenswrapper[4898]: I0313 15:12:22.836770 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d19e8770-f0c1-491e-96c9-f737386ab3b0-ssh-key\") pod \"tempest-tests-tempest\" (UID: 
\"d19e8770-f0c1-491e-96c9-f737386ab3b0\") " pod="openstack/tempest-tests-tempest" Mar 13 15:12:22 crc kubenswrapper[4898]: I0313 15:12:22.836876 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d19e8770-f0c1-491e-96c9-f737386ab3b0-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"d19e8770-f0c1-491e-96c9-f737386ab3b0\") " pod="openstack/tempest-tests-tempest" Mar 13 15:12:22 crc kubenswrapper[4898]: I0313 15:12:22.837889 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d19e8770-f0c1-491e-96c9-f737386ab3b0-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"d19e8770-f0c1-491e-96c9-f737386ab3b0\") " pod="openstack/tempest-tests-tempest" Mar 13 15:12:22 crc kubenswrapper[4898]: I0313 15:12:22.838123 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/d19e8770-f0c1-491e-96c9-f737386ab3b0-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"d19e8770-f0c1-491e-96c9-f737386ab3b0\") " pod="openstack/tempest-tests-tempest" Mar 13 15:12:22 crc kubenswrapper[4898]: I0313 15:12:22.838213 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/d19e8770-f0c1-491e-96c9-f737386ab3b0-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"d19e8770-f0c1-491e-96c9-f737386ab3b0\") " pod="openstack/tempest-tests-tempest" Mar 13 15:12:22 crc kubenswrapper[4898]: I0313 15:12:22.838263 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: 
\"kubernetes.io/secret/d19e8770-f0c1-491e-96c9-f737386ab3b0-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"d19e8770-f0c1-491e-96c9-f737386ab3b0\") " pod="openstack/tempest-tests-tempest" Mar 13 15:12:22 crc kubenswrapper[4898]: I0313 15:12:22.838349 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqc45\" (UniqueName: \"kubernetes.io/projected/d19e8770-f0c1-491e-96c9-f737386ab3b0-kube-api-access-mqc45\") pod \"tempest-tests-tempest\" (UID: \"d19e8770-f0c1-491e-96c9-f737386ab3b0\") " pod="openstack/tempest-tests-tempest" Mar 13 15:12:22 crc kubenswrapper[4898]: I0313 15:12:22.940370 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"tempest-tests-tempest\" (UID: \"d19e8770-f0c1-491e-96c9-f737386ab3b0\") " pod="openstack/tempest-tests-tempest" Mar 13 15:12:22 crc kubenswrapper[4898]: I0313 15:12:22.940431 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d19e8770-f0c1-491e-96c9-f737386ab3b0-config-data\") pod \"tempest-tests-tempest\" (UID: \"d19e8770-f0c1-491e-96c9-f737386ab3b0\") " pod="openstack/tempest-tests-tempest" Mar 13 15:12:22 crc kubenswrapper[4898]: I0313 15:12:22.940449 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d19e8770-f0c1-491e-96c9-f737386ab3b0-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"d19e8770-f0c1-491e-96c9-f737386ab3b0\") " pod="openstack/tempest-tests-tempest" Mar 13 15:12:22 crc kubenswrapper[4898]: I0313 15:12:22.940513 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d19e8770-f0c1-491e-96c9-f737386ab3b0-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: 
\"d19e8770-f0c1-491e-96c9-f737386ab3b0\") " pod="openstack/tempest-tests-tempest" Mar 13 15:12:22 crc kubenswrapper[4898]: I0313 15:12:22.940591 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d19e8770-f0c1-491e-96c9-f737386ab3b0-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"d19e8770-f0c1-491e-96c9-f737386ab3b0\") " pod="openstack/tempest-tests-tempest" Mar 13 15:12:22 crc kubenswrapper[4898]: I0313 15:12:22.940647 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/d19e8770-f0c1-491e-96c9-f737386ab3b0-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"d19e8770-f0c1-491e-96c9-f737386ab3b0\") " pod="openstack/tempest-tests-tempest" Mar 13 15:12:22 crc kubenswrapper[4898]: I0313 15:12:22.940679 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/d19e8770-f0c1-491e-96c9-f737386ab3b0-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"d19e8770-f0c1-491e-96c9-f737386ab3b0\") " pod="openstack/tempest-tests-tempest" Mar 13 15:12:22 crc kubenswrapper[4898]: I0313 15:12:22.940712 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/d19e8770-f0c1-491e-96c9-f737386ab3b0-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"d19e8770-f0c1-491e-96c9-f737386ab3b0\") " pod="openstack/tempest-tests-tempest" Mar 13 15:12:22 crc kubenswrapper[4898]: I0313 15:12:22.940743 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqc45\" (UniqueName: \"kubernetes.io/projected/d19e8770-f0c1-491e-96c9-f737386ab3b0-kube-api-access-mqc45\") pod \"tempest-tests-tempest\" (UID: \"d19e8770-f0c1-491e-96c9-f737386ab3b0\") " 
pod="openstack/tempest-tests-tempest" Mar 13 15:12:22 crc kubenswrapper[4898]: I0313 15:12:22.941757 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/d19e8770-f0c1-491e-96c9-f737386ab3b0-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"d19e8770-f0c1-491e-96c9-f737386ab3b0\") " pod="openstack/tempest-tests-tempest" Mar 13 15:12:22 crc kubenswrapper[4898]: I0313 15:12:22.942662 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/d19e8770-f0c1-491e-96c9-f737386ab3b0-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"d19e8770-f0c1-491e-96c9-f737386ab3b0\") " pod="openstack/tempest-tests-tempest" Mar 13 15:12:22 crc kubenswrapper[4898]: I0313 15:12:22.942773 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d19e8770-f0c1-491e-96c9-f737386ab3b0-config-data\") pod \"tempest-tests-tempest\" (UID: \"d19e8770-f0c1-491e-96c9-f737386ab3b0\") " pod="openstack/tempest-tests-tempest" Mar 13 15:12:22 crc kubenswrapper[4898]: I0313 15:12:22.943278 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d19e8770-f0c1-491e-96c9-f737386ab3b0-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"d19e8770-f0c1-491e-96c9-f737386ab3b0\") " pod="openstack/tempest-tests-tempest" Mar 13 15:12:22 crc kubenswrapper[4898]: I0313 15:12:22.945612 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"tempest-tests-tempest\" (UID: \"d19e8770-f0c1-491e-96c9-f737386ab3b0\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/tempest-tests-tempest" Mar 13 15:12:22 crc 
kubenswrapper[4898]: I0313 15:12:22.947800 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d19e8770-f0c1-491e-96c9-f737386ab3b0-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"d19e8770-f0c1-491e-96c9-f737386ab3b0\") " pod="openstack/tempest-tests-tempest" Mar 13 15:12:22 crc kubenswrapper[4898]: I0313 15:12:22.947801 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/d19e8770-f0c1-491e-96c9-f737386ab3b0-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"d19e8770-f0c1-491e-96c9-f737386ab3b0\") " pod="openstack/tempest-tests-tempest" Mar 13 15:12:22 crc kubenswrapper[4898]: I0313 15:12:22.948630 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d19e8770-f0c1-491e-96c9-f737386ab3b0-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"d19e8770-f0c1-491e-96c9-f737386ab3b0\") " pod="openstack/tempest-tests-tempest" Mar 13 15:12:22 crc kubenswrapper[4898]: I0313 15:12:22.978088 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqc45\" (UniqueName: \"kubernetes.io/projected/d19e8770-f0c1-491e-96c9-f737386ab3b0-kube-api-access-mqc45\") pod \"tempest-tests-tempest\" (UID: \"d19e8770-f0c1-491e-96c9-f737386ab3b0\") " pod="openstack/tempest-tests-tempest" Mar 13 15:12:22 crc kubenswrapper[4898]: I0313 15:12:22.996863 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"tempest-tests-tempest\" (UID: \"d19e8770-f0c1-491e-96c9-f737386ab3b0\") " pod="openstack/tempest-tests-tempest" Mar 13 15:12:23 crc kubenswrapper[4898]: I0313 15:12:23.156961 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 13 15:12:23 crc kubenswrapper[4898]: I0313 15:12:23.677312 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 13 15:12:24 crc kubenswrapper[4898]: I0313 15:12:24.439725 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"d19e8770-f0c1-491e-96c9-f737386ab3b0","Type":"ContainerStarted","Data":"afe41cfa21ca0ff15752a7557715bbecbe54855edbe2061b3b072582d6fab3b3"} Mar 13 15:12:25 crc kubenswrapper[4898]: E0313 15:12:25.203869 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = parsing image configuration: Get \"https://cdn01.quay.io/quayio-production-s3/sha256/8b/8b47586b9dc9859845a0009766c3842adb98f7625670e094cb04f6c938ff2e60?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIATAAF2YHTGR23ZTE6%2F20260313%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20260313T151224Z&X-Amz-Expires=600&X-Amz-SignedHeaders=host&X-Amz-Signature=b388b6763fa77f376478a9daff15dddc4e86f9a95b88eb81ecf62c8aceedcbcd®ion=us-east-1&namespace=podified-antelope-centos9&username=openshift-release-dev+ocm_access_1b89217552bc42d1be3fb06a1aed001a&repo_name=openstack-tempest-all&akamai_signature=exp=1773415644~hmac=0abc8e338520ac5b8bf28d10fb829d398c4466399fe27974a04e18a24f6702d9\": remote error: tls: internal error" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Mar 13 15:12:25 crc kubenswrapper[4898]: E0313 15:12:25.204454 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mqc45,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:n
il,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(d19e8770-f0c1-491e-96c9-f737386ab3b0): ErrImagePull: parsing image configuration: Get \"https://cdn01.quay.io/quayio-production-s3/sha256/8b/8b47586b9dc9859845a0009766c3842adb98f7625670e094cb04f6c938ff2e60?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIATAAF2YHTGR23ZTE6%2F20260313%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20260313T151224Z&X-Amz-Expires=600&X-Amz-SignedHeaders=host&X-Amz-Signature=b388b6763fa77f376478a9daff15dddc4e86f9a95b88eb81ecf62c8aceedcbcd®ion=us-east-1&namespace=podified-antelope-centos9&username=openshift-release-dev+ocm_access_1b89217552bc42d1be3fb06a1aed001a&repo_name=openstack-tempest-all&akamai_signature=exp=1773415644~hmac=0abc8e338520ac5b8bf28d10fb829d398c4466399fe27974a04e18a24f6702d9\": remote error: tls: internal error" logger="UnhandledError" Mar 13 15:12:25 crc kubenswrapper[4898]: E0313 15:12:25.205836 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"parsing image configuration: Get \\\"https://cdn01.quay.io/quayio-production-s3/sha256/8b/8b47586b9dc9859845a0009766c3842adb98f7625670e094cb04f6c938ff2e60?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIATAAF2YHTGR23ZTE6%2F20260313%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20260313T151224Z&X-Amz-Expires=600&X-Amz-SignedHeaders=host&X-Amz-Signature=b388b6763fa77f376478a9daff15dddc4e86f9a95b88eb81ecf62c8aceedcbcd®ion=us-east-1&namespace=podified-antelope-centos9&username=openshift-release-dev+ocm_access_1b89217552bc42d1be3fb06a1aed001a&repo_name=openstack-tempest-all&akamai_signature=exp=1773415644~hmac=0abc8e338520ac5b8bf28d10fb829d398c4466399fe27974a04e18a24f6702d9\\\": remote error: tls: internal error\"" pod="openstack/tempest-tests-tempest" podUID="d19e8770-f0c1-491e-96c9-f737386ab3b0" Mar 13 15:12:25 crc kubenswrapper[4898]: E0313 15:12:25.460151 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="d19e8770-f0c1-491e-96c9-f737386ab3b0" Mar 13 15:12:31 crc kubenswrapper[4898]: I0313 15:12:31.739882 4898 scope.go:117] "RemoveContainer" containerID="23592f71b1b6a0588b20dac6d06bd2510a7ea0a477247e02fd4eeae75f83ee8a" Mar 13 15:12:31 crc kubenswrapper[4898]: E0313 15:12:31.740967 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:12:43 crc kubenswrapper[4898]: I0313 
15:12:43.556166 4898 scope.go:117] "RemoveContainer" containerID="64c72ba59c55becc3958f8ce26ee068bc7e3f6141dbbe5fe4984f9a120884dca" Mar 13 15:12:43 crc kubenswrapper[4898]: I0313 15:12:43.740363 4898 scope.go:117] "RemoveContainer" containerID="23592f71b1b6a0588b20dac6d06bd2510a7ea0a477247e02fd4eeae75f83ee8a" Mar 13 15:12:43 crc kubenswrapper[4898]: E0313 15:12:43.741239 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:12:56 crc kubenswrapper[4898]: I0313 15:12:56.739869 4898 scope.go:117] "RemoveContainer" containerID="23592f71b1b6a0588b20dac6d06bd2510a7ea0a477247e02fd4eeae75f83ee8a" Mar 13 15:12:56 crc kubenswrapper[4898]: E0313 15:12:56.740760 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:13:10 crc kubenswrapper[4898]: I0313 15:13:10.740534 4898 scope.go:117] "RemoveContainer" containerID="23592f71b1b6a0588b20dac6d06bd2510a7ea0a477247e02fd4eeae75f83ee8a" Mar 13 15:13:10 crc kubenswrapper[4898]: E0313 15:13:10.742698 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:13:16 crc kubenswrapper[4898]: E0313 15:13:16.053589 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Mar 13 15:13:16 crc kubenswrapper[4898]: E0313 15:13:16.054431 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,Su
bPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mqc45,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(d19e8770-f0c1-491e-96c9-f737386ab3b0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 15:13:16 crc kubenswrapper[4898]: E0313 15:13:16.055719 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="d19e8770-f0c1-491e-96c9-f737386ab3b0" Mar 13 15:13:23 crc kubenswrapper[4898]: I0313 15:13:23.742169 4898 scope.go:117] "RemoveContainer" containerID="23592f71b1b6a0588b20dac6d06bd2510a7ea0a477247e02fd4eeae75f83ee8a" Mar 13 15:13:23 crc kubenswrapper[4898]: E0313 15:13:23.744341 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:13:30 crc kubenswrapper[4898]: E0313 15:13:30.742513 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="d19e8770-f0c1-491e-96c9-f737386ab3b0" Mar 13 15:13:37 crc kubenswrapper[4898]: I0313 15:13:37.740957 4898 scope.go:117] "RemoveContainer" containerID="23592f71b1b6a0588b20dac6d06bd2510a7ea0a477247e02fd4eeae75f83ee8a" Mar 13 15:13:37 crc kubenswrapper[4898]: E0313 15:13:37.742434 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:13:43 crc 
kubenswrapper[4898]: I0313 15:13:43.207620 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 13 15:13:45 crc kubenswrapper[4898]: I0313 15:13:45.460317 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"d19e8770-f0c1-491e-96c9-f737386ab3b0","Type":"ContainerStarted","Data":"f6562a9a91d72757d77dbf107969b6ab33c4a3a7219f9b57d0df0ee10184af60"} Mar 13 15:13:45 crc kubenswrapper[4898]: I0313 15:13:45.485391 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.965285636 podStartE2EDuration="1m24.485371067s" podCreationTimestamp="2026-03-13 15:12:21 +0000 UTC" firstStartedPulling="2026-03-13 15:12:23.684464892 +0000 UTC m=+4578.686053131" lastFinishedPulling="2026-03-13 15:13:43.204550313 +0000 UTC m=+4658.206138562" observedRunningTime="2026-03-13 15:13:45.48025744 +0000 UTC m=+4660.481845689" watchObservedRunningTime="2026-03-13 15:13:45.485371067 +0000 UTC m=+4660.486959306" Mar 13 15:13:50 crc kubenswrapper[4898]: I0313 15:13:50.739602 4898 scope.go:117] "RemoveContainer" containerID="23592f71b1b6a0588b20dac6d06bd2510a7ea0a477247e02fd4eeae75f83ee8a" Mar 13 15:13:51 crc kubenswrapper[4898]: I0313 15:13:51.548352 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerStarted","Data":"bf9fa5bb76f8bd5a010d026caf62189a87b342669ddb0345c62f785750fd30c1"} Mar 13 15:14:00 crc kubenswrapper[4898]: I0313 15:14:00.158409 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556914-52q7k"] Mar 13 15:14:00 crc kubenswrapper[4898]: I0313 15:14:00.162887 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556914-52q7k" Mar 13 15:14:00 crc kubenswrapper[4898]: I0313 15:14:00.166614 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps" Mar 13 15:14:00 crc kubenswrapper[4898]: I0313 15:14:00.166708 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 15:14:00 crc kubenswrapper[4898]: I0313 15:14:00.167207 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 15:14:00 crc kubenswrapper[4898]: I0313 15:14:00.176552 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556914-52q7k"] Mar 13 15:14:00 crc kubenswrapper[4898]: I0313 15:14:00.308225 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5pcz\" (UniqueName: \"kubernetes.io/projected/ad898ac1-9e95-4eb8-a88b-927e3d6364f6-kube-api-access-c5pcz\") pod \"auto-csr-approver-29556914-52q7k\" (UID: \"ad898ac1-9e95-4eb8-a88b-927e3d6364f6\") " pod="openshift-infra/auto-csr-approver-29556914-52q7k" Mar 13 15:14:00 crc kubenswrapper[4898]: I0313 15:14:00.411861 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5pcz\" (UniqueName: \"kubernetes.io/projected/ad898ac1-9e95-4eb8-a88b-927e3d6364f6-kube-api-access-c5pcz\") pod \"auto-csr-approver-29556914-52q7k\" (UID: \"ad898ac1-9e95-4eb8-a88b-927e3d6364f6\") " pod="openshift-infra/auto-csr-approver-29556914-52q7k" Mar 13 15:14:00 crc kubenswrapper[4898]: I0313 15:14:00.433682 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5pcz\" (UniqueName: \"kubernetes.io/projected/ad898ac1-9e95-4eb8-a88b-927e3d6364f6-kube-api-access-c5pcz\") pod \"auto-csr-approver-29556914-52q7k\" (UID: \"ad898ac1-9e95-4eb8-a88b-927e3d6364f6\") " 
pod="openshift-infra/auto-csr-approver-29556914-52q7k" Mar 13 15:14:00 crc kubenswrapper[4898]: I0313 15:14:00.500362 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556914-52q7k" Mar 13 15:14:01 crc kubenswrapper[4898]: I0313 15:14:01.057075 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556914-52q7k"] Mar 13 15:14:01 crc kubenswrapper[4898]: I0313 15:14:01.687056 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556914-52q7k" event={"ID":"ad898ac1-9e95-4eb8-a88b-927e3d6364f6","Type":"ContainerStarted","Data":"b6828d627192e989a1b2a59091f7130b2e5b82359ed7df4b9b0ac989f18cc295"} Mar 13 15:14:04 crc kubenswrapper[4898]: I0313 15:14:04.723434 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556914-52q7k" event={"ID":"ad898ac1-9e95-4eb8-a88b-927e3d6364f6","Type":"ContainerStarted","Data":"7e1660e6d2126df6f52c14a1146a22533f711db09e61120239f48e0f06547dd2"} Mar 13 15:14:04 crc kubenswrapper[4898]: I0313 15:14:04.743831 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556914-52q7k" podStartSLOduration=3.502775568 podStartE2EDuration="4.743812719s" podCreationTimestamp="2026-03-13 15:14:00 +0000 UTC" firstStartedPulling="2026-03-13 15:14:01.061820743 +0000 UTC m=+4676.063408982" lastFinishedPulling="2026-03-13 15:14:02.302857894 +0000 UTC m=+4677.304446133" observedRunningTime="2026-03-13 15:14:04.742123111 +0000 UTC m=+4679.743711370" watchObservedRunningTime="2026-03-13 15:14:04.743812719 +0000 UTC m=+4679.745400958" Mar 13 15:14:05 crc kubenswrapper[4898]: I0313 15:14:05.738251 4898 generic.go:334] "Generic (PLEG): container finished" podID="ad898ac1-9e95-4eb8-a88b-927e3d6364f6" containerID="7e1660e6d2126df6f52c14a1146a22533f711db09e61120239f48e0f06547dd2" exitCode=0 Mar 13 15:14:05 crc 
kubenswrapper[4898]: I0313 15:14:05.738303 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556914-52q7k" event={"ID":"ad898ac1-9e95-4eb8-a88b-927e3d6364f6","Type":"ContainerDied","Data":"7e1660e6d2126df6f52c14a1146a22533f711db09e61120239f48e0f06547dd2"} Mar 13 15:14:07 crc kubenswrapper[4898]: I0313 15:14:07.196472 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556914-52q7k" Mar 13 15:14:07 crc kubenswrapper[4898]: I0313 15:14:07.288292 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5pcz\" (UniqueName: \"kubernetes.io/projected/ad898ac1-9e95-4eb8-a88b-927e3d6364f6-kube-api-access-c5pcz\") pod \"ad898ac1-9e95-4eb8-a88b-927e3d6364f6\" (UID: \"ad898ac1-9e95-4eb8-a88b-927e3d6364f6\") " Mar 13 15:14:07 crc kubenswrapper[4898]: I0313 15:14:07.298952 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad898ac1-9e95-4eb8-a88b-927e3d6364f6-kube-api-access-c5pcz" (OuterVolumeSpecName: "kube-api-access-c5pcz") pod "ad898ac1-9e95-4eb8-a88b-927e3d6364f6" (UID: "ad898ac1-9e95-4eb8-a88b-927e3d6364f6"). InnerVolumeSpecName "kube-api-access-c5pcz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:14:07 crc kubenswrapper[4898]: I0313 15:14:07.391686 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5pcz\" (UniqueName: \"kubernetes.io/projected/ad898ac1-9e95-4eb8-a88b-927e3d6364f6-kube-api-access-c5pcz\") on node \"crc\" DevicePath \"\"" Mar 13 15:14:07 crc kubenswrapper[4898]: I0313 15:14:07.762423 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556914-52q7k" event={"ID":"ad898ac1-9e95-4eb8-a88b-927e3d6364f6","Type":"ContainerDied","Data":"b6828d627192e989a1b2a59091f7130b2e5b82359ed7df4b9b0ac989f18cc295"} Mar 13 15:14:07 crc kubenswrapper[4898]: I0313 15:14:07.762736 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6828d627192e989a1b2a59091f7130b2e5b82359ed7df4b9b0ac989f18cc295" Mar 13 15:14:07 crc kubenswrapper[4898]: I0313 15:14:07.762510 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556914-52q7k" Mar 13 15:14:07 crc kubenswrapper[4898]: I0313 15:14:07.818617 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556908-vplt7"] Mar 13 15:14:07 crc kubenswrapper[4898]: I0313 15:14:07.828873 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556908-vplt7"] Mar 13 15:14:09 crc kubenswrapper[4898]: I0313 15:14:09.778027 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df9e42bc-c4a2-4ccc-ad85-5ca077abfd88" path="/var/lib/kubelet/pods/df9e42bc-c4a2-4ccc-ad85-5ca077abfd88/volumes" Mar 13 15:14:46 crc kubenswrapper[4898]: I0313 15:14:46.064668 4898 scope.go:117] "RemoveContainer" containerID="020c072c01677481578a21e99a6c39f8522847520765e8695da295955dd3e290" Mar 13 15:15:00 crc kubenswrapper[4898]: I0313 15:15:00.487638 4898 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29556915-mwp4c"] Mar 13 15:15:00 crc kubenswrapper[4898]: E0313 15:15:00.492204 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad898ac1-9e95-4eb8-a88b-927e3d6364f6" containerName="oc" Mar 13 15:15:00 crc kubenswrapper[4898]: I0313 15:15:00.492233 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad898ac1-9e95-4eb8-a88b-927e3d6364f6" containerName="oc" Mar 13 15:15:00 crc kubenswrapper[4898]: I0313 15:15:00.494383 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad898ac1-9e95-4eb8-a88b-927e3d6364f6" containerName="oc" Mar 13 15:15:00 crc kubenswrapper[4898]: I0313 15:15:00.501144 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556915-mwp4c" Mar 13 15:15:00 crc kubenswrapper[4898]: I0313 15:15:00.517197 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 13 15:15:00 crc kubenswrapper[4898]: I0313 15:15:00.517203 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 13 15:15:00 crc kubenswrapper[4898]: I0313 15:15:00.603747 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556915-mwp4c"] Mar 13 15:15:00 crc kubenswrapper[4898]: I0313 15:15:00.678919 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34-secret-volume\") pod \"collect-profiles-29556915-mwp4c\" (UID: \"9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556915-mwp4c" Mar 13 15:15:00 crc kubenswrapper[4898]: I0313 15:15:00.679050 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mc9xx\" (UniqueName: \"kubernetes.io/projected/9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34-kube-api-access-mc9xx\") pod \"collect-profiles-29556915-mwp4c\" (UID: \"9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556915-mwp4c" Mar 13 15:15:00 crc kubenswrapper[4898]: I0313 15:15:00.679083 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34-config-volume\") pod \"collect-profiles-29556915-mwp4c\" (UID: \"9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556915-mwp4c" Mar 13 15:15:00 crc kubenswrapper[4898]: I0313 15:15:00.804990 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34-secret-volume\") pod \"collect-profiles-29556915-mwp4c\" (UID: \"9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556915-mwp4c" Mar 13 15:15:00 crc kubenswrapper[4898]: I0313 15:15:00.805234 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mc9xx\" (UniqueName: \"kubernetes.io/projected/9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34-kube-api-access-mc9xx\") pod \"collect-profiles-29556915-mwp4c\" (UID: \"9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556915-mwp4c" Mar 13 15:15:00 crc kubenswrapper[4898]: I0313 15:15:00.805284 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34-config-volume\") pod \"collect-profiles-29556915-mwp4c\" (UID: \"9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29556915-mwp4c" Mar 13 15:15:00 crc kubenswrapper[4898]: I0313 15:15:00.825140 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34-config-volume\") pod \"collect-profiles-29556915-mwp4c\" (UID: \"9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556915-mwp4c" Mar 13 15:15:00 crc kubenswrapper[4898]: I0313 15:15:00.845588 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34-secret-volume\") pod \"collect-profiles-29556915-mwp4c\" (UID: \"9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556915-mwp4c" Mar 13 15:15:00 crc kubenswrapper[4898]: I0313 15:15:00.846954 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mc9xx\" (UniqueName: \"kubernetes.io/projected/9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34-kube-api-access-mc9xx\") pod \"collect-profiles-29556915-mwp4c\" (UID: \"9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556915-mwp4c" Mar 13 15:15:00 crc kubenswrapper[4898]: I0313 15:15:00.870870 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556915-mwp4c" Mar 13 15:15:02 crc kubenswrapper[4898]: I0313 15:15:02.355942 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556915-mwp4c"] Mar 13 15:15:02 crc kubenswrapper[4898]: W0313 15:15:02.401929 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ed7bf25_9e0b_4f13_9ff6_797cd1e6eb34.slice/crio-52306737f5569e6320dc7aa267aaca043c1d2a750fcb4919faad76c0c1ac15fe WatchSource:0}: Error finding container 52306737f5569e6320dc7aa267aaca043c1d2a750fcb4919faad76c0c1ac15fe: Status 404 returned error can't find the container with id 52306737f5569e6320dc7aa267aaca043c1d2a750fcb4919faad76c0c1ac15fe Mar 13 15:15:03 crc kubenswrapper[4898]: I0313 15:15:03.397944 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556915-mwp4c" event={"ID":"9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34","Type":"ContainerStarted","Data":"ba0567c7d801e11f8ad2230e5cbed9394ee6b7a10fbe11e35e978ddc27d03fd5"} Mar 13 15:15:03 crc kubenswrapper[4898]: I0313 15:15:03.398304 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556915-mwp4c" event={"ID":"9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34","Type":"ContainerStarted","Data":"52306737f5569e6320dc7aa267aaca043c1d2a750fcb4919faad76c0c1ac15fe"} Mar 13 15:15:03 crc kubenswrapper[4898]: I0313 15:15:03.433397 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29556915-mwp4c" podStartSLOduration=3.431864052 podStartE2EDuration="3.431864052s" podCreationTimestamp="2026-03-13 15:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 
15:15:03.414885976 +0000 UTC m=+4738.416474235" watchObservedRunningTime="2026-03-13 15:15:03.431864052 +0000 UTC m=+4738.433452291" Mar 13 15:15:04 crc kubenswrapper[4898]: I0313 15:15:04.410991 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556915-mwp4c" event={"ID":"9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34","Type":"ContainerDied","Data":"ba0567c7d801e11f8ad2230e5cbed9394ee6b7a10fbe11e35e978ddc27d03fd5"} Mar 13 15:15:04 crc kubenswrapper[4898]: I0313 15:15:04.410852 4898 generic.go:334] "Generic (PLEG): container finished" podID="9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34" containerID="ba0567c7d801e11f8ad2230e5cbed9394ee6b7a10fbe11e35e978ddc27d03fd5" exitCode=0 Mar 13 15:15:07 crc kubenswrapper[4898]: I0313 15:15:07.005026 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556915-mwp4c" Mar 13 15:15:07 crc kubenswrapper[4898]: I0313 15:15:07.111849 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34-secret-volume\") pod \"9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34\" (UID: \"9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34\") " Mar 13 15:15:07 crc kubenswrapper[4898]: I0313 15:15:07.111958 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34-config-volume\") pod \"9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34\" (UID: \"9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34\") " Mar 13 15:15:07 crc kubenswrapper[4898]: I0313 15:15:07.112230 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mc9xx\" (UniqueName: \"kubernetes.io/projected/9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34-kube-api-access-mc9xx\") pod \"9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34\" (UID: 
\"9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34\") " Mar 13 15:15:07 crc kubenswrapper[4898]: I0313 15:15:07.119025 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34-config-volume" (OuterVolumeSpecName: "config-volume") pod "9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34" (UID: "9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:15:07 crc kubenswrapper[4898]: I0313 15:15:07.140968 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34" (UID: "9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:15:07 crc kubenswrapper[4898]: I0313 15:15:07.141132 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34-kube-api-access-mc9xx" (OuterVolumeSpecName: "kube-api-access-mc9xx") pod "9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34" (UID: "9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34"). InnerVolumeSpecName "kube-api-access-mc9xx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:15:07 crc kubenswrapper[4898]: I0313 15:15:07.215831 4898 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 13 15:15:07 crc kubenswrapper[4898]: I0313 15:15:07.215874 4898 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34-config-volume\") on node \"crc\" DevicePath \"\"" Mar 13 15:15:07 crc kubenswrapper[4898]: I0313 15:15:07.215887 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mc9xx\" (UniqueName: \"kubernetes.io/projected/9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34-kube-api-access-mc9xx\") on node \"crc\" DevicePath \"\"" Mar 13 15:15:07 crc kubenswrapper[4898]: I0313 15:15:07.449216 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556915-mwp4c" event={"ID":"9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34","Type":"ContainerDied","Data":"52306737f5569e6320dc7aa267aaca043c1d2a750fcb4919faad76c0c1ac15fe"} Mar 13 15:15:07 crc kubenswrapper[4898]: I0313 15:15:07.449272 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="52306737f5569e6320dc7aa267aaca043c1d2a750fcb4919faad76c0c1ac15fe" Mar 13 15:15:07 crc kubenswrapper[4898]: I0313 15:15:07.449333 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556915-mwp4c" Mar 13 15:15:08 crc kubenswrapper[4898]: I0313 15:15:08.145876 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556870-v7r2p"] Mar 13 15:15:08 crc kubenswrapper[4898]: I0313 15:15:08.158211 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556870-v7r2p"] Mar 13 15:15:09 crc kubenswrapper[4898]: I0313 15:15:09.763629 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19" path="/var/lib/kubelet/pods/924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19/volumes" Mar 13 15:15:23 crc kubenswrapper[4898]: I0313 15:15:23.210958 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dnh48"] Mar 13 15:15:23 crc kubenswrapper[4898]: E0313 15:15:23.216351 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34" containerName="collect-profiles" Mar 13 15:15:23 crc kubenswrapper[4898]: I0313 15:15:23.216961 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34" containerName="collect-profiles" Mar 13 15:15:23 crc kubenswrapper[4898]: I0313 15:15:23.221928 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34" containerName="collect-profiles" Mar 13 15:15:23 crc kubenswrapper[4898]: I0313 15:15:23.230997 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dnh48" Mar 13 15:15:23 crc kubenswrapper[4898]: I0313 15:15:23.362522 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dnh48"] Mar 13 15:15:23 crc kubenswrapper[4898]: I0313 15:15:23.416782 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb46c8b0-a6a9-4b6d-86a1-8408793887e5-catalog-content\") pod \"redhat-operators-dnh48\" (UID: \"cb46c8b0-a6a9-4b6d-86a1-8408793887e5\") " pod="openshift-marketplace/redhat-operators-dnh48" Mar 13 15:15:23 crc kubenswrapper[4898]: I0313 15:15:23.417386 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnffg\" (UniqueName: \"kubernetes.io/projected/cb46c8b0-a6a9-4b6d-86a1-8408793887e5-kube-api-access-mnffg\") pod \"redhat-operators-dnh48\" (UID: \"cb46c8b0-a6a9-4b6d-86a1-8408793887e5\") " pod="openshift-marketplace/redhat-operators-dnh48" Mar 13 15:15:23 crc kubenswrapper[4898]: I0313 15:15:23.417606 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb46c8b0-a6a9-4b6d-86a1-8408793887e5-utilities\") pod \"redhat-operators-dnh48\" (UID: \"cb46c8b0-a6a9-4b6d-86a1-8408793887e5\") " pod="openshift-marketplace/redhat-operators-dnh48" Mar 13 15:15:23 crc kubenswrapper[4898]: I0313 15:15:23.519955 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb46c8b0-a6a9-4b6d-86a1-8408793887e5-catalog-content\") pod \"redhat-operators-dnh48\" (UID: \"cb46c8b0-a6a9-4b6d-86a1-8408793887e5\") " pod="openshift-marketplace/redhat-operators-dnh48" Mar 13 15:15:23 crc kubenswrapper[4898]: I0313 15:15:23.520095 4898 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-mnffg\" (UniqueName: \"kubernetes.io/projected/cb46c8b0-a6a9-4b6d-86a1-8408793887e5-kube-api-access-mnffg\") pod \"redhat-operators-dnh48\" (UID: \"cb46c8b0-a6a9-4b6d-86a1-8408793887e5\") " pod="openshift-marketplace/redhat-operators-dnh48" Mar 13 15:15:23 crc kubenswrapper[4898]: I0313 15:15:23.520163 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb46c8b0-a6a9-4b6d-86a1-8408793887e5-utilities\") pod \"redhat-operators-dnh48\" (UID: \"cb46c8b0-a6a9-4b6d-86a1-8408793887e5\") " pod="openshift-marketplace/redhat-operators-dnh48" Mar 13 15:15:23 crc kubenswrapper[4898]: I0313 15:15:23.532096 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb46c8b0-a6a9-4b6d-86a1-8408793887e5-utilities\") pod \"redhat-operators-dnh48\" (UID: \"cb46c8b0-a6a9-4b6d-86a1-8408793887e5\") " pod="openshift-marketplace/redhat-operators-dnh48" Mar 13 15:15:23 crc kubenswrapper[4898]: I0313 15:15:23.533257 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb46c8b0-a6a9-4b6d-86a1-8408793887e5-catalog-content\") pod \"redhat-operators-dnh48\" (UID: \"cb46c8b0-a6a9-4b6d-86a1-8408793887e5\") " pod="openshift-marketplace/redhat-operators-dnh48" Mar 13 15:15:23 crc kubenswrapper[4898]: I0313 15:15:23.588350 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnffg\" (UniqueName: \"kubernetes.io/projected/cb46c8b0-a6a9-4b6d-86a1-8408793887e5-kube-api-access-mnffg\") pod \"redhat-operators-dnh48\" (UID: \"cb46c8b0-a6a9-4b6d-86a1-8408793887e5\") " pod="openshift-marketplace/redhat-operators-dnh48" Mar 13 15:15:23 crc kubenswrapper[4898]: I0313 15:15:23.882074 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dnh48" Mar 13 15:15:25 crc kubenswrapper[4898]: I0313 15:15:25.441732 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dnh48"] Mar 13 15:15:25 crc kubenswrapper[4898]: I0313 15:15:25.650282 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dnh48" event={"ID":"cb46c8b0-a6a9-4b6d-86a1-8408793887e5","Type":"ContainerStarted","Data":"ef80ef6757d050f773b7a3c8ba863e9ba495da2f0964e3cb0f243834ede62c6d"} Mar 13 15:15:26 crc kubenswrapper[4898]: I0313 15:15:26.665647 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dnh48" event={"ID":"cb46c8b0-a6a9-4b6d-86a1-8408793887e5","Type":"ContainerDied","Data":"044e1dbede2644956e1b9f4f9606342f16b9eba6e865673613710b9380c20e93"} Mar 13 15:15:26 crc kubenswrapper[4898]: I0313 15:15:26.667522 4898 generic.go:334] "Generic (PLEG): container finished" podID="cb46c8b0-a6a9-4b6d-86a1-8408793887e5" containerID="044e1dbede2644956e1b9f4f9606342f16b9eba6e865673613710b9380c20e93" exitCode=0 Mar 13 15:15:26 crc kubenswrapper[4898]: I0313 15:15:26.675025 4898 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 15:15:28 crc kubenswrapper[4898]: E0313 15:15:28.145339 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: parsing image configuration: Get 
\"https://cdn01.quay.io/quayio-production-s3/sha256/10/108b2a0c594729551b6547de5641f59a94e5b0352f4a4d63f0b44f8449d70766?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIATAAF2YHTGR23ZTE6%2F20260313%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20260313T151527Z&X-Amz-Expires=600&X-Amz-SignedHeaders=host&X-Amz-Signature=b09ad762917e163a6364894c5fc8dcbe38c3902f85afae09b73a1acab6f6d5c9®ion=us-east-1&namespace=redhat-prod&username=redhat-prod+registry_proxy&repo_name=redhat----redhat-operator-index&akamai_signature=exp=1773415827~hmac=a37d420feaa2bf0534a876f9537b59caa67bd4af10f90723d14594be2e64147a\": remote error: tls: internal error" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 13 15:15:28 crc kubenswrapper[4898]: E0313 15:15:28.147766 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mnffg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-dnh48_openshift-marketplace(cb46c8b0-a6a9-4b6d-86a1-8408793887e5): ErrImagePull: copying system image from manifest list: parsing image configuration: Get 
\"https://cdn01.quay.io/quayio-production-s3/sha256/10/108b2a0c594729551b6547de5641f59a94e5b0352f4a4d63f0b44f8449d70766?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIATAAF2YHTGR23ZTE6%2F20260313%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20260313T151527Z&X-Amz-Expires=600&X-Amz-SignedHeaders=host&X-Amz-Signature=b09ad762917e163a6364894c5fc8dcbe38c3902f85afae09b73a1acab6f6d5c9®ion=us-east-1&namespace=redhat-prod&username=redhat-prod+registry_proxy&repo_name=redhat----redhat-operator-index&akamai_signature=exp=1773415827~hmac=a37d420feaa2bf0534a876f9537b59caa67bd4af10f90723d14594be2e64147a\": remote error: tls: internal error" logger="UnhandledError" Mar 13 15:15:28 crc kubenswrapper[4898]: E0313 15:15:28.149552 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"copying system image from manifest list: parsing image configuration: Get \\\"https://cdn01.quay.io/quayio-production-s3/sha256/10/108b2a0c594729551b6547de5641f59a94e5b0352f4a4d63f0b44f8449d70766?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIATAAF2YHTGR23ZTE6%2F20260313%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20260313T151527Z&X-Amz-Expires=600&X-Amz-SignedHeaders=host&X-Amz-Signature=b09ad762917e163a6364894c5fc8dcbe38c3902f85afae09b73a1acab6f6d5c9®ion=us-east-1&namespace=redhat-prod&username=redhat-prod+registry_proxy&repo_name=redhat----redhat-operator-index&akamai_signature=exp=1773415827~hmac=a37d420feaa2bf0534a876f9537b59caa67bd4af10f90723d14594be2e64147a\\\": remote error: tls: internal error\"" pod="openshift-marketplace/redhat-operators-dnh48" podUID="cb46c8b0-a6a9-4b6d-86a1-8408793887e5" Mar 13 15:15:28 crc kubenswrapper[4898]: E0313 15:15:28.687456 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" 
pod="openshift-marketplace/redhat-operators-dnh48" podUID="cb46c8b0-a6a9-4b6d-86a1-8408793887e5" Mar 13 15:15:29 crc kubenswrapper[4898]: I0313 15:15:29.724824 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jndb5"] Mar 13 15:15:29 crc kubenswrapper[4898]: I0313 15:15:29.732768 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jndb5" Mar 13 15:15:29 crc kubenswrapper[4898]: I0313 15:15:29.793049 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2157e8bf-88a5-4e48-b621-1744dcf0fcdb-catalog-content\") pod \"community-operators-jndb5\" (UID: \"2157e8bf-88a5-4e48-b621-1744dcf0fcdb\") " pod="openshift-marketplace/community-operators-jndb5" Mar 13 15:15:29 crc kubenswrapper[4898]: I0313 15:15:29.793202 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jdgf\" (UniqueName: \"kubernetes.io/projected/2157e8bf-88a5-4e48-b621-1744dcf0fcdb-kube-api-access-9jdgf\") pod \"community-operators-jndb5\" (UID: \"2157e8bf-88a5-4e48-b621-1744dcf0fcdb\") " pod="openshift-marketplace/community-operators-jndb5" Mar 13 15:15:29 crc kubenswrapper[4898]: I0313 15:15:29.793387 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2157e8bf-88a5-4e48-b621-1744dcf0fcdb-utilities\") pod \"community-operators-jndb5\" (UID: \"2157e8bf-88a5-4e48-b621-1744dcf0fcdb\") " pod="openshift-marketplace/community-operators-jndb5" Mar 13 15:15:29 crc kubenswrapper[4898]: I0313 15:15:29.826200 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jndb5"] Mar 13 15:15:29 crc kubenswrapper[4898]: I0313 15:15:29.895239 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-9jdgf\" (UniqueName: \"kubernetes.io/projected/2157e8bf-88a5-4e48-b621-1744dcf0fcdb-kube-api-access-9jdgf\") pod \"community-operators-jndb5\" (UID: \"2157e8bf-88a5-4e48-b621-1744dcf0fcdb\") " pod="openshift-marketplace/community-operators-jndb5" Mar 13 15:15:29 crc kubenswrapper[4898]: I0313 15:15:29.895428 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2157e8bf-88a5-4e48-b621-1744dcf0fcdb-utilities\") pod \"community-operators-jndb5\" (UID: \"2157e8bf-88a5-4e48-b621-1744dcf0fcdb\") " pod="openshift-marketplace/community-operators-jndb5" Mar 13 15:15:29 crc kubenswrapper[4898]: I0313 15:15:29.895552 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2157e8bf-88a5-4e48-b621-1744dcf0fcdb-catalog-content\") pod \"community-operators-jndb5\" (UID: \"2157e8bf-88a5-4e48-b621-1744dcf0fcdb\") " pod="openshift-marketplace/community-operators-jndb5" Mar 13 15:15:29 crc kubenswrapper[4898]: I0313 15:15:29.905019 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2157e8bf-88a5-4e48-b621-1744dcf0fcdb-catalog-content\") pod \"community-operators-jndb5\" (UID: \"2157e8bf-88a5-4e48-b621-1744dcf0fcdb\") " pod="openshift-marketplace/community-operators-jndb5" Mar 13 15:15:29 crc kubenswrapper[4898]: I0313 15:15:29.907758 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2157e8bf-88a5-4e48-b621-1744dcf0fcdb-utilities\") pod \"community-operators-jndb5\" (UID: \"2157e8bf-88a5-4e48-b621-1744dcf0fcdb\") " pod="openshift-marketplace/community-operators-jndb5" Mar 13 15:15:29 crc kubenswrapper[4898]: I0313 15:15:29.942742 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-9jdgf\" (UniqueName: \"kubernetes.io/projected/2157e8bf-88a5-4e48-b621-1744dcf0fcdb-kube-api-access-9jdgf\") pod \"community-operators-jndb5\" (UID: \"2157e8bf-88a5-4e48-b621-1744dcf0fcdb\") " pod="openshift-marketplace/community-operators-jndb5" Mar 13 15:15:30 crc kubenswrapper[4898]: I0313 15:15:30.069401 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jndb5" Mar 13 15:15:30 crc kubenswrapper[4898]: I0313 15:15:30.818176 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jndb5"] Mar 13 15:15:31 crc kubenswrapper[4898]: I0313 15:15:31.720557 4898 generic.go:334] "Generic (PLEG): container finished" podID="2157e8bf-88a5-4e48-b621-1744dcf0fcdb" containerID="8fdeec86e0943d0ef27683a7197e100bc92b99eb865ac5c8b1d099f233220e22" exitCode=0 Mar 13 15:15:31 crc kubenswrapper[4898]: I0313 15:15:31.720778 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jndb5" event={"ID":"2157e8bf-88a5-4e48-b621-1744dcf0fcdb","Type":"ContainerDied","Data":"8fdeec86e0943d0ef27683a7197e100bc92b99eb865ac5c8b1d099f233220e22"} Mar 13 15:15:31 crc kubenswrapper[4898]: I0313 15:15:31.720979 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jndb5" event={"ID":"2157e8bf-88a5-4e48-b621-1744dcf0fcdb","Type":"ContainerStarted","Data":"ad1f1863b9e188c98c1dfb8a88d0df8d02f680dd682644703708970a9e0dc172"} Mar 13 15:15:33 crc kubenswrapper[4898]: I0313 15:15:33.758999 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jndb5" event={"ID":"2157e8bf-88a5-4e48-b621-1744dcf0fcdb","Type":"ContainerStarted","Data":"3efffca3639dac775d206dcb096dfcdfa29405be3f4a7da8f99d544be88ffc43"} Mar 13 15:15:38 crc kubenswrapper[4898]: I0313 15:15:38.536883 4898 patch_prober.go:28] interesting pod/logging-loki-gateway-c6d797ccf-9qh4r 
container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.55:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:15:38 crc kubenswrapper[4898]: I0313 15:15:38.541980 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" podUID="077fcbe8-c497-44b4-82f9-ff8e317cbe83" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.55:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 15:15:38 crc kubenswrapper[4898]: I0313 15:15:38.622178 4898 patch_prober.go:28] interesting pod/logging-loki-gateway-c6d797ccf-8ng9x container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:15:38 crc kubenswrapper[4898]: I0313 15:15:38.622219 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" podUID="13ee53e6-2549-4dd8-91ac-80e4ef2c9d99" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 15:15:38 crc kubenswrapper[4898]: I0313 15:15:38.726078 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/speaker-g5gqr" podUID="edfd91ee-1246-43b2-84a0-95ea069de402" containerName="speaker" probeResult="failure" output="Get \"http://localhost:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:15:38 crc kubenswrapper[4898]: I0313 15:15:38.726087 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/speaker-g5gqr" podUID="edfd91ee-1246-43b2-84a0-95ea069de402" containerName="speaker" probeResult="failure" output="Get 
\"http://localhost:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:15:41 crc kubenswrapper[4898]: I0313 15:15:41.885970 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jndb5" event={"ID":"2157e8bf-88a5-4e48-b621-1744dcf0fcdb","Type":"ContainerDied","Data":"3efffca3639dac775d206dcb096dfcdfa29405be3f4a7da8f99d544be88ffc43"} Mar 13 15:15:41 crc kubenswrapper[4898]: I0313 15:15:41.883355 4898 generic.go:334] "Generic (PLEG): container finished" podID="2157e8bf-88a5-4e48-b621-1744dcf0fcdb" containerID="3efffca3639dac775d206dcb096dfcdfa29405be3f4a7da8f99d544be88ffc43" exitCode=0 Mar 13 15:15:43 crc kubenswrapper[4898]: I0313 15:15:43.938375 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jndb5" event={"ID":"2157e8bf-88a5-4e48-b621-1744dcf0fcdb","Type":"ContainerStarted","Data":"d653e06a922f0f9a98f01d37cd3c761f9af6e7d070c7637908c781d22c20c1cd"} Mar 13 15:15:43 crc kubenswrapper[4898]: I0313 15:15:43.992828 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jndb5" podStartSLOduration=3.705539478 podStartE2EDuration="14.979633626s" podCreationTimestamp="2026-03-13 15:15:29 +0000 UTC" firstStartedPulling="2026-03-13 15:15:31.722142087 +0000 UTC m=+4766.723730326" lastFinishedPulling="2026-03-13 15:15:42.996236235 +0000 UTC m=+4777.997824474" observedRunningTime="2026-03-13 15:15:43.975147514 +0000 UTC m=+4778.976735763" watchObservedRunningTime="2026-03-13 15:15:43.979633626 +0000 UTC m=+4778.981221865" Mar 13 15:15:46 crc kubenswrapper[4898]: I0313 15:15:46.038986 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dnh48" event={"ID":"cb46c8b0-a6a9-4b6d-86a1-8408793887e5","Type":"ContainerStarted","Data":"aebe307d7887ba7796f4fad329402cc88b0d4332f82fab0c321325f23bf1adea"} Mar 13 15:15:46 crc 
kubenswrapper[4898]: I0313 15:15:46.351740 4898 scope.go:117] "RemoveContainer" containerID="64e5f9a40c7865406038ca1466eaa2acb4b0149c1cbab8a03ef694cc78da5ccc" Mar 13 15:15:47 crc kubenswrapper[4898]: I0313 15:15:47.101833 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-bqmxg" podUID="1a7fcb96-7168-4049-8c28-d3f740599e48" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:15:50 crc kubenswrapper[4898]: I0313 15:15:50.087835 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jndb5" Mar 13 15:15:50 crc kubenswrapper[4898]: I0313 15:15:50.088430 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jndb5" Mar 13 15:15:50 crc kubenswrapper[4898]: I0313 15:15:50.624574 4898 patch_prober.go:28] interesting pod/monitoring-plugin-595dc77696-pft4c container/monitoring-plugin namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.90:9443/health\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:15:50 crc kubenswrapper[4898]: I0313 15:15:50.628820 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/monitoring-plugin-595dc77696-pft4c" podUID="10c7ab08-2341-4e85-ad67-8495e038afa2" containerName="monitoring-plugin" probeResult="failure" output="Get \"https://10.217.0.90:9443/health\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 15:15:51 crc kubenswrapper[4898]: I0313 15:15:51.434131 4898 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-x85vd container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.29:5443/healthz\": net/http: request canceled 
(Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:15:51 crc kubenswrapper[4898]: I0313 15:15:51.434486 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd" podUID="7667c5a1-aecb-4ccd-b8fd-e20c2c049472" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.29:5443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 15:15:51 crc kubenswrapper[4898]: I0313 15:15:51.434158 4898 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-x85vd container/packageserver namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.29:5443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:15:51 crc kubenswrapper[4898]: I0313 15:15:51.434559 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd" podUID="7667c5a1-aecb-4ccd-b8fd-e20c2c049472" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.29:5443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 15:15:52 crc kubenswrapper[4898]: I0313 15:15:52.169838 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-jndb5" podUID="2157e8bf-88a5-4e48-b621-1744dcf0fcdb" containerName="registry-server" probeResult="failure" output=< Mar 13 15:15:52 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 15:15:52 crc kubenswrapper[4898]: > Mar 13 15:15:56 crc kubenswrapper[4898]: I0313 15:15:56.486105 4898 patch_prober.go:28] interesting pod/route-controller-manager-7b756f97f-wjsf2 container/route-controller-manager namespace/openshift-route-controller-manager: Liveness probe status=failure output="Get 
\"https://10.217.0.74:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:15:56 crc kubenswrapper[4898]: I0313 15:15:56.486619 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-route-controller-manager/route-controller-manager-7b756f97f-wjsf2" podUID="0376a3d3-f3a2-4674-a7f9-b06a9e62836e" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.74:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 15:15:56 crc kubenswrapper[4898]: I0313 15:15:56.486174 4898 patch_prober.go:28] interesting pod/route-controller-manager-7b756f97f-wjsf2 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.74:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:15:56 crc kubenswrapper[4898]: I0313 15:15:56.486767 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-7b756f97f-wjsf2" podUID="0376a3d3-f3a2-4674-a7f9-b06a9e62836e" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.74:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 15:15:57 crc kubenswrapper[4898]: I0313 15:15:57.098656 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-bqmxg" podUID="1a7fcb96-7168-4049-8c28-d3f740599e48" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:15:59 crc kubenswrapper[4898]: I0313 15:15:59.831288 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-galera-0" podUID="e5d53cf3-113e-4391-b3a9-4e1f81836e26" containerName="galera" 
probeResult="failure" output="command timed out" Mar 13 15:15:59 crc kubenswrapper[4898]: I0313 15:15:59.832001 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="e5d53cf3-113e-4391-b3a9-4e1f81836e26" containerName="galera" probeResult="failure" output="command timed out" Mar 13 15:16:00 crc kubenswrapper[4898]: I0313 15:16:00.123133 4898 patch_prober.go:28] interesting pod/metrics-server-7b77fdd7dd-vwwfr container/metrics-server namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.89:10250/livez\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:00 crc kubenswrapper[4898]: I0313 15:16:00.123220 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" podUID="823ccfb8-89eb-409e-9c6c-579bacb35ea1" containerName="metrics-server" probeResult="failure" output="Get \"https://10.217.0.89:10250/livez\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:00 crc kubenswrapper[4898]: I0313 15:16:00.252086 4898 patch_prober.go:28] interesting pod/observability-operator-59bdc8b94-ljrtz container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.16:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:00 crc kubenswrapper[4898]: I0313 15:16:00.252123 4898 patch_prober.go:28] interesting pod/observability-operator-59bdc8b94-ljrtz container/operator namespace/openshift-operators: Liveness probe status=failure output="Get \"http://10.217.0.16:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:00 crc kubenswrapper[4898]: I0313 15:16:00.252226 4898 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-operators/observability-operator-59bdc8b94-ljrtz" podUID="3bfc0332-bb59-42bf-bb70-462efa225c81" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.16:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:00 crc kubenswrapper[4898]: I0313 15:16:00.252247 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operators/observability-operator-59bdc8b94-ljrtz" podUID="3bfc0332-bb59-42bf-bb70-462efa225c81" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.16:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:00 crc kubenswrapper[4898]: I0313 15:16:00.353881 4898 patch_prober.go:28] interesting pod/perses-operator-5bf474d74f-nkt76 container/perses-operator namespace/openshift-operators: Liveness probe status=failure output="Get \"http://10.217.0.28:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:00 crc kubenswrapper[4898]: I0313 15:16:00.354250 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operators/perses-operator-5bf474d74f-nkt76" podUID="79ead8ee-67ba-4831-b5d4-a1f128e94334" containerName="perses-operator" probeResult="failure" output="Get \"http://10.217.0.28:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:00 crc kubenswrapper[4898]: I0313 15:16:00.353924 4898 patch_prober.go:28] interesting pod/perses-operator-5bf474d74f-nkt76 container/perses-operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.28:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:00 crc kubenswrapper[4898]: I0313 15:16:00.354385 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/perses-operator-5bf474d74f-nkt76" 
podUID="79ead8ee-67ba-4831-b5d4-a1f128e94334" containerName="perses-operator" probeResult="failure" output="Get \"http://10.217.0.28:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:00 crc kubenswrapper[4898]: I0313 15:16:00.498135 4898 patch_prober.go:28] interesting pod/authentication-operator-69f744f599-v9lxv container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:00 crc kubenswrapper[4898]: I0313 15:16:00.498197 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-69f744f599-v9lxv" podUID="6f12557e-02f5-4445-988f-b19f16672e3b" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:00 crc kubenswrapper[4898]: I0313 15:16:00.692188 4898 patch_prober.go:28] interesting pod/monitoring-plugin-595dc77696-pft4c container/monitoring-plugin namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.90:9443/health\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:00 crc kubenswrapper[4898]: I0313 15:16:00.692191 4898 patch_prober.go:28] interesting pod/router-default-5444994796-6plhg container/router namespace/openshift-ingress: Liveness probe status=failure output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:00 crc kubenswrapper[4898]: I0313 15:16:00.692258 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/monitoring-plugin-595dc77696-pft4c" 
podUID="10c7ab08-2341-4e85-ad67-8495e038afa2" containerName="monitoring-plugin" probeResult="failure" output="Get \"https://10.217.0.90:9443/health\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:00 crc kubenswrapper[4898]: I0313 15:16:00.692322 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-ingress/router-default-5444994796-6plhg" podUID="18e5c8bf-9fe0-465e-af8f-9e7ec7400be8" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:00 crc kubenswrapper[4898]: I0313 15:16:00.692228 4898 patch_prober.go:28] interesting pod/router-default-5444994796-6plhg container/router namespace/openshift-ingress: Readiness probe status=failure output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:00 crc kubenswrapper[4898]: I0313 15:16:00.692441 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ingress/router-default-5444994796-6plhg" podUID="18e5c8bf-9fe0-465e-af8f-9e7ec7400be8" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:00 crc kubenswrapper[4898]: I0313 15:16:00.831572 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f" containerName="galera" probeResult="failure" output="command timed out" Mar 13 15:16:00 crc kubenswrapper[4898]: I0313 15:16:00.831689 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-cell1-galera-0" podUID="6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f" containerName="galera" probeResult="failure" output="command timed out" Mar 13 15:16:01 crc kubenswrapper[4898]: I0313 15:16:01.404646 4898 
patch_prober.go:28] interesting pod/catalog-operator-68c6474976-qtdtw container/catalog-operator namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.20:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:01 crc kubenswrapper[4898]: I0313 15:16:01.404731 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qtdtw" podUID="7d27543e-df10-41f7-be85-dfe319aaec8a" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.20:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:01 crc kubenswrapper[4898]: I0313 15:16:01.404646 4898 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-qtdtw container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:01 crc kubenswrapper[4898]: I0313 15:16:01.404816 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qtdtw" podUID="7d27543e-df10-41f7-be85-dfe319aaec8a" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.20:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:01 crc kubenswrapper[4898]: I0313 15:16:01.433994 4898 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-x85vd container/packageserver namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.29:5443/healthz\": net/http: request canceled while waiting for 
connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:01 crc kubenswrapper[4898]: I0313 15:16:01.434074 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd" podUID="7667c5a1-aecb-4ccd-b8fd-e20c2c049472" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.29:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:01 crc kubenswrapper[4898]: I0313 15:16:01.434009 4898 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-k6lrz container/olm-operator namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.21:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:01 crc kubenswrapper[4898]: I0313 15:16:01.434210 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6lrz" podUID="c8b0b1cf-022c-4181-a957-2f7e172a3294" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.21:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:01 crc kubenswrapper[4898]: I0313 15:16:01.434230 4898 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-x85vd container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.29:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:01 crc kubenswrapper[4898]: I0313 15:16:01.434284 4898 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-k6lrz container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness 
probe status=failure output="Get \"https://10.217.0.21:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:01 crc kubenswrapper[4898]: I0313 15:16:01.434321 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6lrz" podUID="c8b0b1cf-022c-4181-a957-2f7e172a3294" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.21:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:01 crc kubenswrapper[4898]: I0313 15:16:01.434277 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd" podUID="7667c5a1-aecb-4ccd-b8fd-e20c2c049472" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.29:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:01 crc kubenswrapper[4898]: I0313 15:16:01.834479 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="02f7d483-aecb-4a39-babc-6d9598090c4b" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out" Mar 13 15:16:01 crc kubenswrapper[4898]: I0313 15:16:01.859014 4898 patch_prober.go:28] interesting pod/thanos-querier-7467c7fcf7-hsxhp container/kube-rbac-proxy-web namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.87:9091/-/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:01 crc kubenswrapper[4898]: I0313 15:16:01.859072 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp" podUID="05b901e7-b9fc-4403-bcc2-8eeb2731c66f" containerName="kube-rbac-proxy-web" probeResult="failure" output="Get 
\"https://10.217.0.87:9091/-/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:02 crc kubenswrapper[4898]: I0313 15:16:02.255462 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="cert-manager/cert-manager-webhook-687f57d79b-cx9pw" podUID="7c1fa9c0-bb2e-4806-95fd-07fba426bdc8" containerName="cert-manager-webhook" probeResult="failure" output="Get \"http://10.217.0.45:6080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:02 crc kubenswrapper[4898]: I0313 15:16:02.255876 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="cert-manager/cert-manager-webhook-687f57d79b-cx9pw" podUID="7c1fa9c0-bb2e-4806-95fd-07fba426bdc8" containerName="cert-manager-webhook" probeResult="failure" output="Get \"http://10.217.0.45:6080/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:02 crc kubenswrapper[4898]: I0313 15:16:02.339721 4898 patch_prober.go:28] interesting pod/nmstate-webhook-5f558f5558-m8j8d container/nmstate-webhook namespace/openshift-nmstate: Readiness probe status=failure output="Get \"https://10.217.0.70:9443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:02 crc kubenswrapper[4898]: I0313 15:16:02.339792 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-nmstate/nmstate-webhook-5f558f5558-m8j8d" podUID="a9193e72-6911-4df4-8b26-04b2537f68a9" containerName="nmstate-webhook" probeResult="failure" output="Get \"https://10.217.0.70:9443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:02 crc kubenswrapper[4898]: I0313 15:16:02.740956 4898 patch_prober.go:28] interesting pod/oauth-openshift-6757584b5b-nct75 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get 
\"https://10.217.0.68:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:02 crc kubenswrapper[4898]: I0313 15:16:02.741394 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" podUID="3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.68:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:02 crc kubenswrapper[4898]: I0313 15:16:02.741075 4898 patch_prober.go:28] interesting pod/oauth-openshift-6757584b5b-nct75 container/oauth-openshift namespace/openshift-authentication: Liveness probe status=failure output="Get \"https://10.217.0.68:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:02 crc kubenswrapper[4898]: I0313 15:16:02.741501 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" podUID="3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.68:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:03 crc kubenswrapper[4898]: I0313 15:16:03.532501 4898 patch_prober.go:28] interesting pod/logging-loki-gateway-c6d797ccf-9qh4r container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.55:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:03 crc kubenswrapper[4898]: I0313 15:16:03.533635 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" 
podUID="077fcbe8-c497-44b4-82f9-ff8e317cbe83" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.55:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:03 crc kubenswrapper[4898]: I0313 15:16:03.602295 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/certified-operators-hkbng" podUID="89abe4ad-dd62-4a70-a1d1-fdf97448ada5" containerName="registry-server" probeResult="failure" output=< Mar 13 15:16:03 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 15:16:03 crc kubenswrapper[4898]: > Mar 13 15:16:03 crc kubenswrapper[4898]: I0313 15:16:03.602344 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/redhat-marketplace-zs42q" podUID="0182307e-bc7f-415e-a0f9-0eff9902384c" containerName="registry-server" probeResult="failure" output=< Mar 13 15:16:03 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 15:16:03 crc kubenswrapper[4898]: > Mar 13 15:16:03 crc kubenswrapper[4898]: I0313 15:16:03.602478 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-jndb5" podUID="2157e8bf-88a5-4e48-b621-1744dcf0fcdb" containerName="registry-server" probeResult="failure" output=< Mar 13 15:16:03 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 15:16:03 crc kubenswrapper[4898]: > Mar 13 15:16:03 crc kubenswrapper[4898]: I0313 15:16:03.602481 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/community-operators-fcfmz" podUID="2cd5c4b0-1bfe-479d-b800-ad19c56f3c9e" containerName="registry-server" probeResult="failure" output=< Mar 13 15:16:03 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 15:16:03 crc kubenswrapper[4898]: > Mar 13 15:16:03 crc kubenswrapper[4898]: I0313 15:16:03.602518 4898 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-marketplace/certified-operators-hkbng" podUID="89abe4ad-dd62-4a70-a1d1-fdf97448ada5" containerName="registry-server" probeResult="failure" output=< Mar 13 15:16:03 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 15:16:03 crc kubenswrapper[4898]: > Mar 13 15:16:03 crc kubenswrapper[4898]: I0313 15:16:03.602976 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-zs42q" podUID="0182307e-bc7f-415e-a0f9-0eff9902384c" containerName="registry-server" probeResult="failure" output=< Mar 13 15:16:03 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 15:16:03 crc kubenswrapper[4898]: > Mar 13 15:16:03 crc kubenswrapper[4898]: I0313 15:16:03.605294 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/community-operators-fcfmz" podUID="2cd5c4b0-1bfe-479d-b800-ad19c56f3c9e" containerName="registry-server" probeResult="failure" output=< Mar 13 15:16:03 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 15:16:03 crc kubenswrapper[4898]: > Mar 13 15:16:03 crc kubenswrapper[4898]: I0313 15:16:03.622186 4898 patch_prober.go:28] interesting pod/logging-loki-gateway-c6d797ccf-8ng9x container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:03 crc kubenswrapper[4898]: I0313 15:16:03.622265 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" podUID="13ee53e6-2549-4dd8-91ac-80e4ef2c9d99" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:03 crc kubenswrapper[4898]: I0313 15:16:03.830687 4898 prober.go:107] 
"Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-k8s-0" podUID="1b691eb6-70f2-4fce-b18a-1d7712fddcac" containerName="prometheus" probeResult="failure" output="command timed out" Mar 13 15:16:03 crc kubenswrapper[4898]: I0313 15:16:03.830712 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-k8s-0" podUID="1b691eb6-70f2-4fce-b18a-1d7712fddcac" containerName="prometheus" probeResult="failure" output="command timed out" Mar 13 15:16:04 crc kubenswrapper[4898]: I0313 15:16:04.002742 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/prometheus-metric-storage-0" podUID="d555bd54-f4d5-4b06-9517-32b4fe687f4b" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.177:9090/-/healthy\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:04 crc kubenswrapper[4898]: I0313 15:16:04.004040 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="d555bd54-f4d5-4b06-9517-32b4fe687f4b" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.177:9090/-/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:04 crc kubenswrapper[4898]: I0313 15:16:04.285069 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/barbican-operator-controller-manager-d47688694-gtlps" podUID="45efd8ce-26db-4511-bd88-2e7467d02bbb" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.105:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:04 crc kubenswrapper[4898]: I0313 15:16:04.285195 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-4n5rx" podUID="3c955ebc-98fd-4921-9923-6151a50e8eec" 
containerName="manager" probeResult="failure" output="Get \"http://10.217.0.107:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:04 crc kubenswrapper[4898]: I0313 15:16:04.361927 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/redhat-operators-zgjzn" podUID="cbb51f06-0778-4b18-82b5-c5ce91e0a613" containerName="registry-server" probeResult="failure" output=< Mar 13 15:16:04 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 15:16:04 crc kubenswrapper[4898]: > Mar 13 15:16:04 crc kubenswrapper[4898]: I0313 15:16:04.367130 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-p9d5v" podUID="0d88a5d2-a852-409e-b4bd-939d1c2b9090" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.106:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:04 crc kubenswrapper[4898]: I0313 15:16:04.367189 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-mf8h6" podUID="fb7b2f97-fca8-41d2-9be7-d40fac94c171" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.109:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:04 crc kubenswrapper[4898]: I0313 15:16:04.368061 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/redhat-operators-zgjzn" podUID="cbb51f06-0778-4b18-82b5-c5ce91e0a613" containerName="registry-server" probeResult="failure" output=< Mar 13 15:16:04 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 15:16:04 crc kubenswrapper[4898]: > Mar 13 15:16:04 crc kubenswrapper[4898]: I0313 15:16:04.449091 4898 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-p9d5v" podUID="0d88a5d2-a852-409e-b4bd-939d1c2b9090" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.106:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:04 crc kubenswrapper[4898]: I0313 15:16:04.449108 4898 patch_prober.go:28] interesting pod/loki-operator-controller-manager-5fb555ff84-j52b8 container/manager namespace/openshift-operators-redhat: Liveness probe status=failure output="Get \"http://10.217.0.50:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:04 crc kubenswrapper[4898]: I0313 15:16:04.449195 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operators-redhat/loki-operator-controller-manager-5fb555ff84-j52b8" podUID="2cd05b5b-32da-4560-a761-72221b99e2c6" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.50:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:04 crc kubenswrapper[4898]: I0313 15:16:04.534093 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/barbican-operator-controller-manager-d47688694-gtlps" podUID="45efd8ce-26db-4511-bd88-2e7467d02bbb" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.105:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:04 crc kubenswrapper[4898]: I0313 15:16:04.534180 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-tqp4b" podUID="ea0ad033-9a48-4e42-a237-f27cacf03adc" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.108:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:04 crc kubenswrapper[4898]: I0313 15:16:04.617117 4898 
prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-jngrl" podUID="a80d01d5-0201-4b2e-974c-ac5b42ac8df4" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.110:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:04 crc kubenswrapper[4898]: I0313 15:16:04.617335 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-mf8h6" podUID="fb7b2f97-fca8-41d2-9be7-d40fac94c171" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.109:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:04 crc kubenswrapper[4898]: I0313 15:16:04.617374 4898 patch_prober.go:28] interesting pod/loki-operator-controller-manager-5fb555ff84-j52b8 container/manager namespace/openshift-operators-redhat: Readiness probe status=failure output="Get \"http://10.217.0.50:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:04 crc kubenswrapper[4898]: I0313 15:16:04.617400 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators-redhat/loki-operator-controller-manager-5fb555ff84-j52b8" podUID="2cd05b5b-32da-4560-a761-72221b99e2c6" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.50:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:04 crc kubenswrapper[4898]: I0313 15:16:04.617438 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-4n5rx" podUID="3c955ebc-98fd-4921-9923-6151a50e8eec" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.107:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:04 crc 
kubenswrapper[4898]: I0313 15:16:04.700211 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-tqp4b" podUID="ea0ad033-9a48-4e42-a237-f27cacf03adc" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.108:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:04 crc kubenswrapper[4898]: I0313 15:16:04.831361 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" podUID="4a0b9ad6-156f-418b-8eae-1d762f8161dd" containerName="ovnkube-controller" probeResult="failure" output="command timed out" Mar 13 15:16:04 crc kubenswrapper[4898]: I0313 15:16:04.907597 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-z2gd2" podUID="1df4a7d6-b0c2-4b00-b591-1a612bd319b6" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.114:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:05 crc kubenswrapper[4898]: I0313 15:16:05.031163 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-jngrl" podUID="a80d01d5-0201-4b2e-974c-ac5b42ac8df4" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.110:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:05 crc kubenswrapper[4898]: I0313 15:16:05.031171 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-ntlw6" podUID="d71982c0-a3d0-4da8-84cd-7494301f589f" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.117:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:05 crc kubenswrapper[4898]: I0313 
15:16:05.031384 4898 patch_prober.go:28] interesting pod/console-699d95d586-ds75f container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.142:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:05 crc kubenswrapper[4898]: I0313 15:16:05.031530 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-2lnlc" podUID="ba56f415-73d5-4301-a25d-0e5d1ba4e3b1" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.115:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:05 crc kubenswrapper[4898]: I0313 15:16:05.031955 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-699d95d586-ds75f" podUID="ab8664f8-1960-4442-9fdd-9711ec963e1f" containerName="console" probeResult="failure" output="Get \"https://10.217.0.142:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:05 crc kubenswrapper[4898]: I0313 15:16:05.031492 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-2lnlc" podUID="ba56f415-73d5-4301-a25d-0e5d1ba4e3b1" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.115:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:05 crc kubenswrapper[4898]: I0313 15:16:05.031574 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-v99bm" podUID="32b5ebfd-38d9-456e-bb21-7332323239d1" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.112:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 
13 15:16:05 crc kubenswrapper[4898]: I0313 15:16:05.031653 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-ntlw6" podUID="d71982c0-a3d0-4da8-84cd-7494301f589f" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.117:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:05 crc kubenswrapper[4898]: I0313 15:16:05.031683 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-z2gd2" podUID="1df4a7d6-b0c2-4b00-b591-1a612bd319b6" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.114:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:05 crc kubenswrapper[4898]: I0313 15:16:05.031550 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-v99bm" podUID="32b5ebfd-38d9-456e-bb21-7332323239d1" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.112:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:05 crc kubenswrapper[4898]: I0313 15:16:05.195154 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-s5zh6" podUID="d24bb749-0b71-456b-80e4-fdf6dd23ba30" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.113:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:05 crc kubenswrapper[4898]: I0313 15:16:05.195243 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-wdmrh" podUID="da3795a7-363f-4637-afe2-77cb77248f9a" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.120:8081/readyz\": 
context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:05 crc kubenswrapper[4898]: I0313 15:16:05.360230 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-s5zh6" podUID="d24bb749-0b71-456b-80e4-fdf6dd23ba30" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.113:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:05 crc kubenswrapper[4898]: I0313 15:16:05.360240 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-f2t6t" podUID="66a86c31-9ff3-439a-a0f8-96c981014b6f" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.122:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:05 crc kubenswrapper[4898]: I0313 15:16:05.401142 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/telemetry-operator-controller-manager-5b9fbd87f-s2k96" podUID="9ff6f89a-7110-42fb-96b9-8611f280bebe" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.123:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:05 crc kubenswrapper[4898]: I0313 15:16:05.524131 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-smdkt" podUID="19a0f4de-5258-4f2b-9587-71293459378e" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.124:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:05 crc kubenswrapper[4898]: I0313 15:16:05.524313 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-s2rdh" podUID="d29ce3ee-3d5a-4801-abf9-dfef5b641a74" 
containerName="manager" probeResult="failure" output="Get \"http://10.217.0.118:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:05 crc kubenswrapper[4898]: I0313 15:16:05.524366 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="hostpath-provisioner/csi-hostpathplugin-rd22p" podUID="41000ce4-1a84-44de-b283-1fe0350b1c17" containerName="hostpath-provisioner" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 15:16:05 crc kubenswrapper[4898]: I0313 15:16:05.607171 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-jwrd2" podUID="919747b8-a031-4654-999f-3c3928f981b4" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.125:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:05 crc kubenswrapper[4898]: I0313 15:16:05.607181 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-s2rdh" podUID="d29ce3ee-3d5a-4801-abf9-dfef5b641a74" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.118:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:05 crc kubenswrapper[4898]: I0313 15:16:05.607211 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-wdmrh" podUID="da3795a7-363f-4637-afe2-77cb77248f9a" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.120:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:05 crc kubenswrapper[4898]: I0313 15:16:05.607432 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-njsvh" podUID="0d7c657b-a701-41fe-9b23-d5bba3302c4f" 
containerName="manager" probeResult="failure" output="Get \"http://10.217.0.121:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:05 crc kubenswrapper[4898]: I0313 15:16:05.607632 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-njsvh" podUID="0d7c657b-a701-41fe-9b23-d5bba3302c4f" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.121:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:05 crc kubenswrapper[4898]: I0313 15:16:05.607751 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-f2t6t" podUID="66a86c31-9ff3-439a-a0f8-96c981014b6f" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.122:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:05 crc kubenswrapper[4898]: I0313 15:16:05.607867 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/telemetry-operator-controller-manager-5b9fbd87f-s2k96" podUID="9ff6f89a-7110-42fb-96b9-8611f280bebe" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.123:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:05 crc kubenswrapper[4898]: I0313 15:16:05.607943 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-smdkt" podUID="19a0f4de-5258-4f2b-9587-71293459378e" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.124:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:05 crc kubenswrapper[4898]: I0313 15:16:05.607991 4898 patch_prober.go:28] interesting pod/openshift-kube-scheduler-crc container/kube-scheduler 
namespace/openshift-kube-scheduler: Liveness probe status=failure output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:05 crc kubenswrapper[4898]: I0313 15:16:05.608602 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:05 crc kubenswrapper[4898]: I0313 15:16:05.608332 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-jwrd2" podUID="919747b8-a031-4654-999f-3c3928f981b4" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.125:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:06 crc kubenswrapper[4898]: I0313 15:16:05.692089 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-controller-init-6b8c6b5df9-kk2gn" podUID="7bae49ab-1146-43a2-b436-69838c923f1a" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.104:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:06 crc kubenswrapper[4898]: I0313 15:16:05.692336 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-operator-controller-init-6b8c6b5df9-kk2gn" podUID="7bae49ab-1146-43a2-b436-69838c923f1a" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.104:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:06 crc kubenswrapper[4898]: I0313 15:16:05.843204 4898 
prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-controller-manager-cf7c75c99-qxdbx" podUID="e000d86e-e7a8-49ed-9184-fdd67dfe797d" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.95:8080/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:06 crc kubenswrapper[4898]: I0313 15:16:06.138098 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-8kcsw" podUID="c35de09d-7f21-47d3-aac5-a26b15b0a496" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.111:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:06 crc kubenswrapper[4898]: I0313 15:16:06.245178 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/metallb-operator-webhook-server-67c6f6c5cb-d26qw" podUID="34b4f98c-a87c-4a97-9ac4-286afeb9e4bc" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.96:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:06 crc kubenswrapper[4898]: I0313 15:16:06.327157 4898 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-7zpw2 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.27:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:06 crc kubenswrapper[4898]: I0313 15:16:06.327202 4898 patch_prober.go:28] interesting pod/controller-manager-797d9c85c-m5jdj container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.72:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:06 crc 
kubenswrapper[4898]: I0313 15:16:06.327251 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-797d9c85c-m5jdj" podUID="2ec8c09e-475e-4c4b-86ec-38388754240f" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.72:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:06 crc kubenswrapper[4898]: I0313 15:16:06.327253 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7zpw2" podUID="9c7e70de-de85-421c-aaeb-476450d8e0ee" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.27:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:06 crc kubenswrapper[4898]: I0313 15:16:06.327297 4898 patch_prober.go:28] interesting pod/controller-manager-797d9c85c-m5jdj container/controller-manager namespace/openshift-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.72:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:06 crc kubenswrapper[4898]: I0313 15:16:06.327152 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-webhook-server-67c6f6c5cb-d26qw" podUID="34b4f98c-a87c-4a97-9ac4-286afeb9e4bc" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.96:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:06 crc kubenswrapper[4898]: I0313 15:16:06.327315 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-controller-manager/controller-manager-797d9c85c-m5jdj" podUID="2ec8c09e-475e-4c4b-86ec-38388754240f" containerName="controller-manager" probeResult="failure" output="Get 
\"https://10.217.0.72:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:06 crc kubenswrapper[4898]: I0313 15:16:06.327358 4898 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-7zpw2 container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.27:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:06 crc kubenswrapper[4898]: I0313 15:16:06.327374 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7zpw2" podUID="9c7e70de-de85-421c-aaeb-476450d8e0ee" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.27:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:06 crc kubenswrapper[4898]: I0313 15:16:06.327472 4898 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-4p7wt container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.82:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:06 crc kubenswrapper[4898]: I0313 15:16:06.327560 4898 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-4p7wt container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.82:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:06 crc kubenswrapper[4898]: I0313 15:16:06.327556 4898 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-4p7wt" podUID="802396a8-633d-4f86-b77b-c25e9c76cc7a" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.82:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:06 crc kubenswrapper[4898]: I0313 15:16:06.327625 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-4p7wt" podUID="802396a8-633d-4f86-b77b-c25e9c76cc7a" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.82:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:06 crc kubenswrapper[4898]: I0313 15:16:06.403671 4898 patch_prober.go:28] interesting pod/route-controller-manager-7b756f97f-wjsf2 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.74:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:06 crc kubenswrapper[4898]: I0313 15:16:06.403722 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-7b756f97f-wjsf2" podUID="0376a3d3-f3a2-4674-a7f9-b06a9e62836e" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.74:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:06 crc kubenswrapper[4898]: I0313 15:16:06.403776 4898 patch_prober.go:28] interesting pod/route-controller-manager-7b756f97f-wjsf2 container/route-controller-manager namespace/openshift-route-controller-manager: Liveness probe status=failure output="Get 
\"https://10.217.0.74:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:06 crc kubenswrapper[4898]: I0313 15:16:06.403789 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-route-controller-manager/route-controller-manager-7b756f97f-wjsf2" podUID="0376a3d3-f3a2-4674-a7f9-b06a9e62836e" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.74:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:06 crc kubenswrapper[4898]: I0313 15:16:06.609409 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="hostpath-provisioner/csi-hostpathplugin-rd22p" podUID="41000ce4-1a84-44de-b283-1fe0350b1c17" containerName="hostpath-provisioner" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 15:16:06 crc kubenswrapper[4898]: I0313 15:16:06.834338 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="02f7d483-aecb-4a39-babc-6d9598090c4b" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out" Mar 13 15:16:06 crc kubenswrapper[4898]: I0313 15:16:06.850220 4898 patch_prober.go:28] interesting pod/openshift-kube-scheduler-crc container/kube-scheduler namespace/openshift-kube-scheduler: Readiness probe status=failure output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:06 crc kubenswrapper[4898]: I0313 15:16:06.850308 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while 
waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:07 crc kubenswrapper[4898]: I0313 15:16:06.858609 4898 patch_prober.go:28] interesting pod/thanos-querier-7467c7fcf7-hsxhp container/kube-rbac-proxy-web namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.87:9091/-/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:07 crc kubenswrapper[4898]: I0313 15:16:06.858655 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp" podUID="05b901e7-b9fc-4403-bcc2-8eeb2731c66f" containerName="kube-rbac-proxy-web" probeResult="failure" output="Get \"https://10.217.0.87:9091/-/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:07 crc kubenswrapper[4898]: I0313 15:16:06.858713 4898 patch_prober.go:28] interesting pod/thanos-querier-7467c7fcf7-hsxhp container/kube-rbac-proxy-web namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.87:9091/-/healthy\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:07 crc kubenswrapper[4898]: I0313 15:16:06.858727 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp" podUID="05b901e7-b9fc-4403-bcc2-8eeb2731c66f" containerName="kube-rbac-proxy-web" probeResult="failure" output="Get \"https://10.217.0.87:9091/-/healthy\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:07 crc kubenswrapper[4898]: I0313 15:16:07.031119 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-controller-manager-5f7dc44db6-9nsrh" podUID="3a26728d-85c2-465c-bce4-c74045ea9e0d" 
containerName="manager" probeResult="failure" output="Get \"http://10.217.0.126:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:07 crc kubenswrapper[4898]: I0313 15:16:07.263048 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-bqmxg" podUID="1a7fcb96-7168-4049-8c28-d3f740599e48" containerName="controller" probeResult="failure" output="Get \"http://127.0.0.1:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:07 crc kubenswrapper[4898]: I0313 15:16:07.263132 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-5p4w5" podUID="604b9c21-3e85-4c2e-9faf-962f44236911" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.98:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:07 crc kubenswrapper[4898]: I0313 15:16:07.263164 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-bqmxg" podUID="1a7fcb96-7168-4049-8c28-d3f740599e48" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:07 crc kubenswrapper[4898]: I0313 15:16:07.387080 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="cert-manager/cert-manager-webhook-687f57d79b-cx9pw" podUID="7c1fa9c0-bb2e-4806-95fd-07fba426bdc8" containerName="cert-manager-webhook" probeResult="failure" output="Get \"http://10.217.0.45:6080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:07 crc kubenswrapper[4898]: I0313 15:16:07.387114 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/controller-7bb4cc7c98-cx422" podUID="b231c7db-5056-4ec6-a64c-0aa8bdff336b" containerName="controller" probeResult="failure" 
output="Get \"http://10.217.0.99:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:07 crc kubenswrapper[4898]: I0313 15:16:07.387156 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-bqmxg" podUID="1a7fcb96-7168-4049-8c28-d3f740599e48" containerName="controller" probeResult="failure" output="Get \"http://127.0.0.1:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:07 crc kubenswrapper[4898]: I0313 15:16:07.387124 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/controller-7bb4cc7c98-cx422" podUID="b231c7db-5056-4ec6-a64c-0aa8bdff336b" containerName="controller" probeResult="failure" output="Get \"http://10.217.0.99:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:07 crc kubenswrapper[4898]: I0313 15:16:07.387095 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-5p4w5" podUID="604b9c21-3e85-4c2e-9faf-962f44236911" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.98:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:07 crc kubenswrapper[4898]: I0313 15:16:07.455825 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="metallb-system/frr-k8s-bqmxg" Mar 13 15:16:07 crc kubenswrapper[4898]: I0313 15:16:07.520781 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="frr" containerStatusID={"Type":"cri-o","ID":"bea18a0b4bb5bbad650ae6e8e67ba0820f4376de6521e05324b889bd9cd86809"} pod="metallb-system/frr-k8s-bqmxg" containerMessage="Container frr failed liveness probe, will be restarted" Mar 13 15:16:07 crc kubenswrapper[4898]: I0313 15:16:07.522214 4898 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="metallb-system/frr-k8s-bqmxg" podUID="1a7fcb96-7168-4049-8c28-d3f740599e48" containerName="frr" containerID="cri-o://bea18a0b4bb5bbad650ae6e8e67ba0820f4376de6521e05324b889bd9cd86809" gracePeriod=2 Mar 13 15:16:07 crc kubenswrapper[4898]: I0313 15:16:07.607049 4898 patch_prober.go:28] interesting pod/logging-loki-distributor-9c6b6d984-vvj56 container/loki-distributor namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.52:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:07 crc kubenswrapper[4898]: I0313 15:16:07.607098 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-distributor-9c6b6d984-vvj56" podUID="510657b4-32e2-4fa5-9c09-17869a295736" containerName="loki-distributor" probeResult="failure" output="Get \"https://10.217.0.52:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:07 crc kubenswrapper[4898]: I0313 15:16:07.611975 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="hostpath-provisioner/csi-hostpathplugin-rd22p" podUID="41000ce4-1a84-44de-b283-1fe0350b1c17" containerName="hostpath-provisioner" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 15:16:07 crc kubenswrapper[4898]: I0313 15:16:07.738309 4898 patch_prober.go:28] interesting pod/logging-loki-querier-6dcbdf8bb8-qr6bw container/loki-querier namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.53:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:07 crc kubenswrapper[4898]: I0313 15:16:07.738420 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qr6bw" podUID="5e81d88f-c63b-4f0c-ba17-f1171350c28d" 
containerName="loki-querier" probeResult="failure" output="Get \"https://10.217.0.53:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:07 crc kubenswrapper[4898]: I0313 15:16:07.808842 4898 patch_prober.go:28] interesting pod/logging-loki-query-frontend-ff66c4dc9-mwqzz container/loki-query-frontend namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.54:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:07 crc kubenswrapper[4898]: I0313 15:16:07.808912 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-mwqzz" podUID="e519fed6-a687-4a01-a979-598e81122ad1" containerName="loki-query-frontend" probeResult="failure" output="Get \"https://10.217.0.54:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:07 crc kubenswrapper[4898]: I0313 15:16:07.830745 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-nmstate/nmstate-handler-fpgr7" podUID="e4761153-ed4e-4264-8f21-b4de31a4bbb8" containerName="nmstate-handler" probeResult="failure" output="command timed out" Mar 13 15:16:08 crc kubenswrapper[4898]: I0313 15:16:08.531677 4898 patch_prober.go:28] interesting pod/logging-loki-gateway-c6d797ccf-9qh4r container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.55:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:08 crc kubenswrapper[4898]: I0313 15:16:08.532062 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" 
podUID="077fcbe8-c497-44b4-82f9-ff8e317cbe83" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.55:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:08 crc kubenswrapper[4898]: I0313 15:16:08.531742 4898 patch_prober.go:28] interesting pod/logging-loki-gateway-c6d797ccf-9qh4r container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.55:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:08 crc kubenswrapper[4898]: I0313 15:16:08.532427 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" podUID="077fcbe8-c497-44b4-82f9-ff8e317cbe83" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.55:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:08 crc kubenswrapper[4898]: I0313 15:16:08.607574 4898 patch_prober.go:28] interesting pod/logging-loki-distributor-9c6b6d984-vvj56 container/loki-distributor namespace/openshift-logging: Liveness probe status=failure output="Get \"https://10.217.0.52:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:08 crc kubenswrapper[4898]: I0313 15:16:08.607724 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-logging/logging-loki-distributor-9c6b6d984-vvj56" podUID="510657b4-32e2-4fa5-9c09-17869a295736" containerName="loki-distributor" probeResult="failure" output="Get \"https://10.217.0.52:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:08 crc kubenswrapper[4898]: I0313 15:16:08.621864 4898 patch_prober.go:28] interesting 
pod/logging-loki-gateway-c6d797ccf-8ng9x container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8081/ready\": context deadline exceeded" start-of-body= Mar 13 15:16:08 crc kubenswrapper[4898]: I0313 15:16:08.621951 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" podUID="13ee53e6-2549-4dd8-91ac-80e4ef2c9d99" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.56:8081/ready\": context deadline exceeded" Mar 13 15:16:08 crc kubenswrapper[4898]: I0313 15:16:08.622227 4898 patch_prober.go:28] interesting pod/logging-loki-gateway-c6d797ccf-8ng9x container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:08 crc kubenswrapper[4898]: I0313 15:16:08.622313 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" podUID="13ee53e6-2549-4dd8-91ac-80e4ef2c9d99" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:08 crc kubenswrapper[4898]: I0313 15:16:08.726030 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/speaker-g5gqr" podUID="edfd91ee-1246-43b2-84a0-95ea069de402" containerName="speaker" probeResult="failure" output="Get \"http://localhost:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:08 crc kubenswrapper[4898]: I0313 15:16:08.726043 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/speaker-g5gqr" podUID="edfd91ee-1246-43b2-84a0-95ea069de402" containerName="speaker" probeResult="failure" output="Get 
\"http://localhost:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:08 crc kubenswrapper[4898]: I0313 15:16:08.738515 4898 patch_prober.go:28] interesting pod/logging-loki-querier-6dcbdf8bb8-qr6bw container/loki-querier namespace/openshift-logging: Liveness probe status=failure output="Get \"https://10.217.0.53:3101/loki/api/v1/status/buildinfo\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:08 crc kubenswrapper[4898]: I0313 15:16:08.738570 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qr6bw" podUID="5e81d88f-c63b-4f0c-ba17-f1171350c28d" containerName="loki-querier" probeResult="failure" output="Get \"https://10.217.0.53:3101/loki/api/v1/status/buildinfo\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:08 crc kubenswrapper[4898]: I0313 15:16:08.808257 4898 patch_prober.go:28] interesting pod/logging-loki-query-frontend-ff66c4dc9-mwqzz container/loki-query-frontend namespace/openshift-logging: Liveness probe status=failure output="Get \"https://10.217.0.54:3101/loki/api/v1/status/buildinfo\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:08 crc kubenswrapper[4898]: I0313 15:16:08.808351 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-mwqzz" podUID="e519fed6-a687-4a01-a979-598e81122ad1" containerName="loki-query-frontend" probeResult="failure" output="Get \"https://10.217.0.54:3101/loki/api/v1/status/buildinfo\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:08 crc kubenswrapper[4898]: I0313 15:16:08.835110 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-k8s-0" podUID="1b691eb6-70f2-4fce-b18a-1d7712fddcac" 
containerName="prometheus" probeResult="failure" output="command timed out" Mar 13 15:16:08 crc kubenswrapper[4898]: I0313 15:16:08.835390 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" podUID="4a0b9ad6-156f-418b-8eae-1d762f8161dd" containerName="sbdb" probeResult="failure" output="command timed out" Mar 13 15:16:08 crc kubenswrapper[4898]: I0313 15:16:08.835443 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-k8s-0" podUID="1b691eb6-70f2-4fce-b18a-1d7712fddcac" containerName="prometheus" probeResult="failure" output="command timed out" Mar 13 15:16:08 crc kubenswrapper[4898]: I0313 15:16:08.835479 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" podUID="4a0b9ad6-156f-418b-8eae-1d762f8161dd" containerName="nbdb" probeResult="failure" output="command timed out" Mar 13 15:16:08 crc kubenswrapper[4898]: I0313 15:16:08.942966 4898 patch_prober.go:28] interesting pod/logging-loki-index-gateway-0 container/loki-index-gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.60:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:08 crc kubenswrapper[4898]: I0313 15:16:08.943046 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-index-gateway-0" podUID="6a1df267-1145-4fe1-9455-57df3d043e3a" containerName="loki-index-gateway" probeResult="failure" output="Get \"https://10.217.0.60:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:08 crc kubenswrapper[4898]: I0313 15:16:08.992858 4898 trace.go:236] Trace[1350076615]: "Calculate volume metrics of catalog-content for pod openshift-marketplace/redhat-operators-dnh48" (13-Mar-2026 15:16:07.273) (total time: 1713ms): Mar 13 15:16:08 crc kubenswrapper[4898]: 
Trace[1350076615]: [1.71358368s] [1.71358368s] END Mar 13 15:16:08 crc kubenswrapper[4898]: I0313 15:16:08.993819 4898 trace.go:236] Trace[1179237134]: "Calculate volume metrics of catalog-content for pod openshift-marketplace/community-operators-jndb5" (13-Mar-2026 15:16:07.266) (total time: 1711ms): Mar 13 15:16:08 crc kubenswrapper[4898]: Trace[1179237134]: [1.711972973s] [1.711972973s] END Mar 13 15:16:09 crc kubenswrapper[4898]: I0313 15:16:09.003122 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/prometheus-metric-storage-0" podUID="d555bd54-f4d5-4b06-9517-32b4fe687f4b" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.177:9090/-/healthy\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:09 crc kubenswrapper[4898]: I0313 15:16:09.003141 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="d555bd54-f4d5-4b06-9517-32b4fe687f4b" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.177:9090/-/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:09 crc kubenswrapper[4898]: I0313 15:16:09.033294 4898 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.57:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:09 crc kubenswrapper[4898]: I0313 15:16:09.033365 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="2194d847-4858-4f46-ab8b-c2d78cf5677e" containerName="loki-ingester" probeResult="failure" output="Get \"https://10.217.0.57:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:09 crc 
kubenswrapper[4898]: I0313 15:16:09.089419 4898 patch_prober.go:28] interesting pod/logging-loki-compactor-0 container/loki-compactor namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.58:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:09 crc kubenswrapper[4898]: I0313 15:16:09.089503 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-compactor-0" podUID="9c5fee8d-2246-4e34-8ddd-ce710e155d73" containerName="loki-compactor" probeResult="failure" output="Get \"https://10.217.0.58:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:09 crc kubenswrapper[4898]: I0313 15:16:09.197034 4898 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-7zpw2 container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.27:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:09 crc kubenswrapper[4898]: I0313 15:16:09.197203 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7zpw2" podUID="9c7e70de-de85-421c-aaeb-476450d8e0ee" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.27:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:09 crc kubenswrapper[4898]: I0313 15:16:09.197084 4898 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-7zpw2 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.27:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:09 crc kubenswrapper[4898]: I0313 
15:16:09.197512 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7zpw2" podUID="9c7e70de-de85-421c-aaeb-476450d8e0ee" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.27:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:09 crc kubenswrapper[4898]: I0313 15:16:09.345346 4898 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-z7ng7 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.75:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:09 crc kubenswrapper[4898]: I0313 15:16:09.345440 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-z7ng7" podUID="b8942bb7-1cd2-49b9-8d98-5ba4c5f6c320" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.75:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:09 crc kubenswrapper[4898]: I0313 15:16:09.404118 4898 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-z7ng7 container/marketplace-operator namespace/openshift-marketplace: Liveness probe status=failure output="Get \"http://10.217.0.75:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:09 crc kubenswrapper[4898]: I0313 15:16:09.404374 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/marketplace-operator-79b997595-z7ng7" podUID="b8942bb7-1cd2-49b9-8d98-5ba4c5f6c320" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.75:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:09 crc 
kubenswrapper[4898]: I0313 15:16:09.532338 4898 patch_prober.go:28] interesting pod/logging-loki-gateway-c6d797ccf-9qh4r container/opa namespace/openshift-logging: Liveness probe status=failure output="Get \"https://10.217.0.55:8083/live\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:09 crc kubenswrapper[4898]: I0313 15:16:09.532402 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" podUID="077fcbe8-c497-44b4-82f9-ff8e317cbe83" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.55:8083/live\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:09 crc kubenswrapper[4898]: I0313 15:16:09.532486 4898 patch_prober.go:28] interesting pod/logging-loki-gateway-c6d797ccf-9qh4r container/gateway namespace/openshift-logging: Liveness probe status=failure output="Get \"https://10.217.0.55:8081/live\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:09 crc kubenswrapper[4898]: I0313 15:16:09.532507 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" podUID="077fcbe8-c497-44b4-82f9-ff8e317cbe83" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.55:8081/live\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:09 crc kubenswrapper[4898]: I0313 15:16:09.622347 4898 patch_prober.go:28] interesting pod/logging-loki-gateway-c6d797ccf-8ng9x container/gateway namespace/openshift-logging: Liveness probe status=failure output="Get \"https://10.217.0.56:8081/live\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:09 crc kubenswrapper[4898]: I0313 15:16:09.622450 4898 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" podUID="13ee53e6-2549-4dd8-91ac-80e4ef2c9d99" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.56:8081/live\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:09 crc kubenswrapper[4898]: I0313 15:16:09.622358 4898 patch_prober.go:28] interesting pod/logging-loki-gateway-c6d797ccf-8ng9x container/opa namespace/openshift-logging: Liveness probe status=failure output="Get \"https://10.217.0.56:8083/live\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:09 crc kubenswrapper[4898]: I0313 15:16:09.622554 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" podUID="13ee53e6-2549-4dd8-91ac-80e4ef2c9d99" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.56:8083/live\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:09 crc kubenswrapper[4898]: I0313 15:16:09.830994 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="e5d53cf3-113e-4391-b3a9-4e1f81836e26" containerName="galera" probeResult="failure" output="command timed out" Mar 13 15:16:09 crc kubenswrapper[4898]: I0313 15:16:09.831108 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-galera-0" podUID="e5d53cf3-113e-4391-b3a9-4e1f81836e26" containerName="galera" probeResult="failure" output="command timed out" Mar 13 15:16:10 crc kubenswrapper[4898]: I0313 15:16:09.986697 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bqmxg" event={"ID":"1a7fcb96-7168-4049-8c28-d3f740599e48","Type":"ContainerDied","Data":"bea18a0b4bb5bbad650ae6e8e67ba0820f4376de6521e05324b889bd9cd86809"} Mar 13 15:16:10 crc kubenswrapper[4898]: I0313 15:16:09.989514 4898 generic.go:334] "Generic (PLEG): container finished" 
podID="1a7fcb96-7168-4049-8c28-d3f740599e48" containerID="bea18a0b4bb5bbad650ae6e8e67ba0820f4376de6521e05324b889bd9cd86809" exitCode=143 Mar 13 15:16:10 crc kubenswrapper[4898]: I0313 15:16:10.082663 4898 patch_prober.go:28] interesting pod/metrics-server-7b77fdd7dd-vwwfr container/metrics-server namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.89:10250/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:10 crc kubenswrapper[4898]: I0313 15:16:10.082728 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" podUID="823ccfb8-89eb-409e-9c6c-579bacb35ea1" containerName="metrics-server" probeResult="failure" output="Get \"https://10.217.0.89:10250/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:10 crc kubenswrapper[4898]: I0313 15:16:10.083845 4898 patch_prober.go:28] interesting pod/metrics-server-7b77fdd7dd-vwwfr container/metrics-server namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.89:10250/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:10 crc kubenswrapper[4898]: I0313 15:16:10.083907 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" podUID="823ccfb8-89eb-409e-9c6c-579bacb35ea1" containerName="metrics-server" probeResult="failure" output="Get \"https://10.217.0.89:10250/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:10 crc kubenswrapper[4898]: I0313 15:16:10.152062 4898 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-g559r container/oauth-apiserver namespace/openshift-oauth-apiserver: Readiness probe 
status=failure output="Get \"https://10.217.0.22:8443/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:10 crc kubenswrapper[4898]: I0313 15:16:10.152138 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g559r" podUID="1a01ab05-7178-48c7-892b-b91cf60432f8" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.22:8443/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:10 crc kubenswrapper[4898]: I0313 15:16:10.152614 4898 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-g559r container/oauth-apiserver namespace/openshift-oauth-apiserver: Liveness probe status=failure output="Get \"https://10.217.0.22:8443/livez?exclude=etcd\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:10 crc kubenswrapper[4898]: I0313 15:16:10.152671 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g559r" podUID="1a01ab05-7178-48c7-892b-b91cf60432f8" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.22:8443/livez?exclude=etcd\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:10 crc kubenswrapper[4898]: I0313 15:16:10.243149 4898 patch_prober.go:28] interesting pod/observability-operator-59bdc8b94-ljrtz container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.16:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:10 crc kubenswrapper[4898]: I0313 15:16:10.243179 4898 patch_prober.go:28] interesting pod/observability-operator-59bdc8b94-ljrtz container/operator namespace/openshift-operators: Liveness probe status=failure output="Get \"http://10.217.0.16:8081/healthz\": context 
deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:10 crc kubenswrapper[4898]: I0313 15:16:10.243217 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-59bdc8b94-ljrtz" podUID="3bfc0332-bb59-42bf-bb70-462efa225c81" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.16:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:10 crc kubenswrapper[4898]: I0313 15:16:10.243241 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operators/observability-operator-59bdc8b94-ljrtz" podUID="3bfc0332-bb59-42bf-bb70-462efa225c81" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.16:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:10 crc kubenswrapper[4898]: I0313 15:16:10.382101 4898 patch_prober.go:28] interesting pod/perses-operator-5bf474d74f-nkt76 container/perses-operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.28:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:10 crc kubenswrapper[4898]: I0313 15:16:10.382496 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/perses-operator-5bf474d74f-nkt76" podUID="79ead8ee-67ba-4831-b5d4-a1f128e94334" containerName="perses-operator" probeResult="failure" output="Get \"http://10.217.0.28:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:10 crc kubenswrapper[4898]: I0313 15:16:10.453703 4898 patch_prober.go:28] interesting pod/authentication-operator-69f744f599-v9lxv container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": net/http: request canceled while waiting 
for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:10 crc kubenswrapper[4898]: I0313 15:16:10.453781 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-69f744f599-v9lxv" podUID="6f12557e-02f5-4445-988f-b19f16672e3b" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:10 crc kubenswrapper[4898]: I0313 15:16:10.570126 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv" podUID="0ab852e1-fd26-4f76-b758-77896f8e236b" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.119:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:10 crc kubenswrapper[4898]: I0313 15:16:10.693501 4898 patch_prober.go:28] interesting pod/router-default-5444994796-6plhg container/router namespace/openshift-ingress: Liveness probe status=failure output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:10 crc kubenswrapper[4898]: I0313 15:16:10.693547 4898 patch_prober.go:28] interesting pod/monitoring-plugin-595dc77696-pft4c container/monitoring-plugin namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.90:9443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:10 crc kubenswrapper[4898]: I0313 15:16:10.693575 4898 patch_prober.go:28] interesting pod/console-operator-58897d9998-kgnxj container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get 
\"https://10.217.0.13:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:10 crc kubenswrapper[4898]: I0313 15:16:10.693590 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/monitoring-plugin-595dc77696-pft4c" podUID="10c7ab08-2341-4e85-ad67-8495e038afa2" containerName="monitoring-plugin" probeResult="failure" output="Get \"https://10.217.0.90:9443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:10 crc kubenswrapper[4898]: I0313 15:16:10.693631 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-kgnxj" podUID="eedd2260-f339-4e2f-83e8-13a56cee2ce6" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.13:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:10 crc kubenswrapper[4898]: I0313 15:16:10.693565 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-ingress/router-default-5444994796-6plhg" podUID="18e5c8bf-9fe0-465e-af8f-9e7ec7400be8" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:10 crc kubenswrapper[4898]: I0313 15:16:10.693553 4898 patch_prober.go:28] interesting pod/console-operator-58897d9998-kgnxj container/console-operator namespace/openshift-console-operator: Liveness probe status=failure output="Get \"https://10.217.0.13:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:10 crc kubenswrapper[4898]: I0313 15:16:10.693693 4898 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-console-operator/console-operator-58897d9998-kgnxj" podUID="eedd2260-f339-4e2f-83e8-13a56cee2ce6" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.13:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:10 crc kubenswrapper[4898]: I0313 15:16:10.693743 4898 patch_prober.go:28] interesting pod/router-default-5444994796-6plhg container/router namespace/openshift-ingress: Readiness probe status=failure output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:10 crc kubenswrapper[4898]: I0313 15:16:10.693758 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ingress/router-default-5444994796-6plhg" podUID="18e5c8bf-9fe0-465e-af8f-9e7ec7400be8" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:10 crc kubenswrapper[4898]: I0313 15:16:10.726102 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/monitoring-plugin-595dc77696-pft4c" Mar 13 15:16:10 crc kubenswrapper[4898]: I0313 15:16:10.846576 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f" containerName="galera" probeResult="failure" output="command timed out" Mar 13 15:16:10 crc kubenswrapper[4898]: I0313 15:16:10.846762 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-operator-index-9k7p6" podUID="478795f5-c2f6-4e9b-9ed6-e2c743c3f3b8" containerName="registry-server" probeResult="failure" output="command timed out" Mar 13 15:16:10 crc kubenswrapper[4898]: I0313 15:16:10.847268 4898 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openstack/openstack-cell1-galera-0" podUID="6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f" containerName="galera" probeResult="failure" output="command timed out" Mar 13 15:16:10 crc kubenswrapper[4898]: I0313 15:16:10.850343 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-index-9k7p6" podUID="478795f5-c2f6-4e9b-9ed6-e2c743c3f3b8" containerName="registry-server" probeResult="failure" output="command timed out" Mar 13 15:16:11 crc kubenswrapper[4898]: I0313 15:16:11.182135 4898 patch_prober.go:28] interesting pod/downloads-7954f5f757-cx59b container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.10:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:11 crc kubenswrapper[4898]: I0313 15:16:11.182147 4898 patch_prober.go:28] interesting pod/downloads-7954f5f757-cx59b container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:11 crc kubenswrapper[4898]: I0313 15:16:11.182473 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-cx59b" podUID="f4f26c0f-992a-4eb4-86d2-58e42a5b2b68" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:11 crc kubenswrapper[4898]: I0313 15:16:11.182521 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-cx59b" podUID="f4f26c0f-992a-4eb4-86d2-58e42a5b2b68" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:11 crc kubenswrapper[4898]: I0313 
15:16:11.403829 4898 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-qtdtw container/catalog-operator namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.20:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:11 crc kubenswrapper[4898]: I0313 15:16:11.403888 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qtdtw" podUID="7d27543e-df10-41f7-be85-dfe319aaec8a" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.20:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:11 crc kubenswrapper[4898]: I0313 15:16:11.403958 4898 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-qtdtw container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:11 crc kubenswrapper[4898]: I0313 15:16:11.403972 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qtdtw" podUID="7d27543e-df10-41f7-be85-dfe319aaec8a" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.20:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:11 crc kubenswrapper[4898]: I0313 15:16:11.433430 4898 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-x85vd container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.29:5443/healthz\": net/http: request 
canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:11 crc kubenswrapper[4898]: I0313 15:16:11.433627 4898 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-k6lrz container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.21:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:11 crc kubenswrapper[4898]: I0313 15:16:11.433685 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6lrz" podUID="c8b0b1cf-022c-4181-a957-2f7e172a3294" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.21:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:11 crc kubenswrapper[4898]: I0313 15:16:11.433627 4898 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-k6lrz container/olm-operator namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.21:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:11 crc kubenswrapper[4898]: I0313 15:16:11.433731 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6lrz" podUID="c8b0b1cf-022c-4181-a957-2f7e172a3294" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.21:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:11 crc kubenswrapper[4898]: I0313 15:16:11.433634 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd" 
podUID="7667c5a1-aecb-4ccd-b8fd-e20c2c049472" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.29:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:11 crc kubenswrapper[4898]: I0313 15:16:11.433538 4898 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-x85vd container/packageserver namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.29:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:11 crc kubenswrapper[4898]: I0313 15:16:11.433948 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd" podUID="7667c5a1-aecb-4ccd-b8fd-e20c2c049472" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.29:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:11 crc kubenswrapper[4898]: I0313 15:16:11.459727 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd" Mar 13 15:16:11 crc kubenswrapper[4898]: I0313 15:16:11.459797 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd" Mar 13 15:16:11 crc kubenswrapper[4898]: I0313 15:16:11.477214 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="packageserver" containerStatusID={"Type":"cri-o","ID":"6ce16b962cdf6095fe075410dd0e3e4b0df623a44b72871189f0b1d5d8146085"} pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd" containerMessage="Container packageserver failed liveness probe, will be restarted" Mar 13 15:16:11 crc kubenswrapper[4898]: I0313 
15:16:11.489820 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd" podUID="7667c5a1-aecb-4ccd-b8fd-e20c2c049472" containerName="packageserver" containerID="cri-o://6ce16b962cdf6095fe075410dd0e3e4b0df623a44b72871189f0b1d5d8146085" gracePeriod=30 Mar 13 15:16:11 crc kubenswrapper[4898]: I0313 15:16:11.768096 4898 patch_prober.go:28] interesting pod/monitoring-plugin-595dc77696-pft4c container/monitoring-plugin namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.90:9443/health\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:11 crc kubenswrapper[4898]: I0313 15:16:11.768458 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/monitoring-plugin-595dc77696-pft4c" podUID="10c7ab08-2341-4e85-ad67-8495e038afa2" containerName="monitoring-plugin" probeResult="failure" output="Get \"https://10.217.0.90:9443/health\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:11 crc kubenswrapper[4898]: I0313 15:16:11.809066 4898 patch_prober.go:28] interesting pod/package-server-manager-789f6589d5-x4nxn container/package-server-manager namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"http://10.217.0.39:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:11 crc kubenswrapper[4898]: I0313 15:16:11.809131 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x4nxn" podUID="8675f0f0-7d3b-41d9-959e-e73f78f32c5c" containerName="package-server-manager" probeResult="failure" output="Get \"http://10.217.0.39:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:11 crc kubenswrapper[4898]: I0313 
15:16:11.809181 4898 patch_prober.go:28] interesting pod/package-server-manager-789f6589d5-x4nxn container/package-server-manager namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"http://10.217.0.39:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:11 crc kubenswrapper[4898]: I0313 15:16:11.809250 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x4nxn" podUID="8675f0f0-7d3b-41d9-959e-e73f78f32c5c" containerName="package-server-manager" probeResult="failure" output="Get \"http://10.217.0.39:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:11 crc kubenswrapper[4898]: I0313 15:16:11.825808 4898 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 2.825406021s: [/var/lib/containers/storage/overlay/a1b358f0312b11061f478e8cafc04d37350c0ce47e98958a390f7017efbcafab/diff /var/log/pods/openstack_openstackclient_124bd4ee-d9f0-408f-a46e-4d143e8ab02a/openstackclient/0.log]; will not log again for this container unless duration exceeds 2s Mar 13 15:16:11 crc kubenswrapper[4898]: I0313 15:16:11.835485 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="02f7d483-aecb-4a39-babc-6d9598090c4b" containerName="ceilometer-notification-agent" probeResult="failure" output="command timed out" Mar 13 15:16:11 crc kubenswrapper[4898]: I0313 15:16:11.858227 4898 patch_prober.go:28] interesting pod/thanos-querier-7467c7fcf7-hsxhp container/kube-rbac-proxy-web namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.87:9091/-/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:11 crc kubenswrapper[4898]: I0313 15:16:11.858299 4898 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp" podUID="05b901e7-b9fc-4403-bcc2-8eeb2731c66f" containerName="kube-rbac-proxy-web" probeResult="failure" output="Get \"https://10.217.0.87:9091/-/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:12 crc kubenswrapper[4898]: I0313 15:16:12.212043 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="cert-manager/cert-manager-webhook-687f57d79b-cx9pw" podUID="7c1fa9c0-bb2e-4806-95fd-07fba426bdc8" containerName="cert-manager-webhook" probeResult="failure" output="Get \"http://10.217.0.45:6080/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:12 crc kubenswrapper[4898]: I0313 15:16:12.253156 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="cert-manager/cert-manager-webhook-687f57d79b-cx9pw" podUID="7c1fa9c0-bb2e-4806-95fd-07fba426bdc8" containerName="cert-manager-webhook" probeResult="failure" output="Get \"http://10.217.0.45:6080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:12 crc kubenswrapper[4898]: I0313 15:16:12.253288 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-cx9pw" Mar 13 15:16:12 crc kubenswrapper[4898]: I0313 15:16:12.340464 4898 patch_prober.go:28] interesting pod/nmstate-webhook-5f558f5558-m8j8d container/nmstate-webhook namespace/openshift-nmstate: Readiness probe status=failure output="Get \"https://10.217.0.70:9443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:12 crc kubenswrapper[4898]: I0313 15:16:12.340754 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-nmstate/nmstate-webhook-5f558f5558-m8j8d" podUID="a9193e72-6911-4df4-8b26-04b2537f68a9" containerName="nmstate-webhook" probeResult="failure" output="Get 
\"https://10.217.0.70:9443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:12 crc kubenswrapper[4898]: I0313 15:16:12.461245 4898 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-x85vd container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.29:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:12 crc kubenswrapper[4898]: I0313 15:16:12.461306 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd" podUID="7667c5a1-aecb-4ccd-b8fd-e20c2c049472" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.29:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:12 crc kubenswrapper[4898]: I0313 15:16:12.668067 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-cx9pw" Mar 13 15:16:12 crc kubenswrapper[4898]: I0313 15:16:12.743397 4898 patch_prober.go:28] interesting pod/oauth-openshift-6757584b5b-nct75 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.68:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:12 crc kubenswrapper[4898]: I0313 15:16:12.743558 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" podUID="3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.68:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout 
exceeded while awaiting headers)" Mar 13 15:16:12 crc kubenswrapper[4898]: I0313 15:16:12.743409 4898 patch_prober.go:28] interesting pod/oauth-openshift-6757584b5b-nct75 container/oauth-openshift namespace/openshift-authentication: Liveness probe status=failure output="Get \"https://10.217.0.68:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:12 crc kubenswrapper[4898]: I0313 15:16:12.743683 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" podUID="3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.68:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:12 crc kubenswrapper[4898]: I0313 15:16:12.888950 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-nmstate/nmstate-handler-fpgr7" podUID="e4761153-ed4e-4264-8f21-b4de31a4bbb8" containerName="nmstate-handler" probeResult="failure" output="command timed out" Mar 13 15:16:12 crc kubenswrapper[4898]: I0313 15:16:12.889250 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="02f7d483-aecb-4a39-babc-6d9598090c4b" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out" Mar 13 15:16:12 crc kubenswrapper[4898]: I0313 15:16:12.889324 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/ceilometer-0" Mar 13 15:16:12 crc kubenswrapper[4898]: I0313 15:16:12.890859 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="ceilometer-central-agent" containerStatusID={"Type":"cri-o","ID":"7ce3438fd9d4e3db5da0a65bc2744dcbc537f4c6c98b27b1433e8ed964fd1ed3"} pod="openstack/ceilometer-0" containerMessage="Container ceilometer-central-agent failed liveness probe, 
will be restarted" Mar 13 15:16:12 crc kubenswrapper[4898]: I0313 15:16:12.890958 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="02f7d483-aecb-4a39-babc-6d9598090c4b" containerName="ceilometer-central-agent" containerID="cri-o://7ce3438fd9d4e3db5da0a65bc2744dcbc537f4c6c98b27b1433e8ed964fd1ed3" gracePeriod=30 Mar 13 15:16:13 crc kubenswrapper[4898]: I0313 15:16:13.531756 4898 patch_prober.go:28] interesting pod/logging-loki-gateway-c6d797ccf-9qh4r container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.55:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:13 crc kubenswrapper[4898]: I0313 15:16:13.531831 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" podUID="077fcbe8-c497-44b4-82f9-ff8e317cbe83" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.55:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:13 crc kubenswrapper[4898]: I0313 15:16:13.621307 4898 patch_prober.go:28] interesting pod/logging-loki-gateway-c6d797ccf-8ng9x container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:13 crc kubenswrapper[4898]: I0313 15:16:13.621571 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" podUID="13ee53e6-2549-4dd8-91ac-80e4ef2c9d99" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 
15:16:13 crc kubenswrapper[4898]: I0313 15:16:13.832020 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-k8s-0" podUID="1b691eb6-70f2-4fce-b18a-1d7712fddcac" containerName="prometheus" probeResult="failure" output="command timed out" Mar 13 15:16:13 crc kubenswrapper[4898]: I0313 15:16:13.832170 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-k8s-0" podUID="1b691eb6-70f2-4fce-b18a-1d7712fddcac" containerName="prometheus" probeResult="failure" output="command timed out" Mar 13 15:16:13 crc kubenswrapper[4898]: I0313 15:16:13.832716 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-k8s-0" Mar 13 15:16:13 crc kubenswrapper[4898]: E0313 15:16:13.978770 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:16:03Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:16:03Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:16:03Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:16:03Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while 
awaiting headers)" Mar 13 15:16:14 crc kubenswrapper[4898]: I0313 15:16:14.021531 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="d555bd54-f4d5-4b06-9517-32b4fe687f4b" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.177:9090/-/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:14 crc kubenswrapper[4898]: I0313 15:16:14.021656 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Mar 13 15:16:14 crc kubenswrapper[4898]: I0313 15:16:14.021535 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/prometheus-metric-storage-0" podUID="d555bd54-f4d5-4b06-9517-32b4fe687f4b" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.177:9090/-/healthy\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:14 crc kubenswrapper[4898]: I0313 15:16:14.099796 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bqmxg" event={"ID":"1a7fcb96-7168-4049-8c28-d3f740599e48","Type":"ContainerStarted","Data":"dff248ea143392c1ca27792bdf6dcdbd888bce77e921d4a7a8ca6f0d73c34304"} Mar 13 15:16:14 crc kubenswrapper[4898]: I0313 15:16:14.158075 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-4n5rx" podUID="3c955ebc-98fd-4921-9923-6151a50e8eec" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.107:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:14 crc kubenswrapper[4898]: I0313 15:16:14.158075 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-p9d5v" podUID="0d88a5d2-a852-409e-b4bd-939d1c2b9090" containerName="manager" probeResult="failure" output="Get 
\"http://10.217.0.106:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:14 crc kubenswrapper[4898]: I0313 15:16:14.200096 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/barbican-operator-controller-manager-d47688694-gtlps" podUID="45efd8ce-26db-4511-bd88-2e7467d02bbb" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.105:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:14 crc kubenswrapper[4898]: I0313 15:16:14.200154 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-mf8h6" podUID="fb7b2f97-fca8-41d2-9be7-d40fac94c171" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.109:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:14 crc kubenswrapper[4898]: I0313 15:16:14.241142 4898 patch_prober.go:28] interesting pod/loki-operator-controller-manager-5fb555ff84-j52b8 container/manager namespace/openshift-operators-redhat: Readiness probe status=failure output="Get \"http://10.217.0.50:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:14 crc kubenswrapper[4898]: I0313 15:16:14.241402 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators-redhat/loki-operator-controller-manager-5fb555ff84-j52b8" podUID="2cd05b5b-32da-4560-a761-72221b99e2c6" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.50:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:14 crc kubenswrapper[4898]: I0313 15:16:14.283363 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-tqp4b" podUID="ea0ad033-9a48-4e42-a237-f27cacf03adc" 
containerName="manager" probeResult="failure" output="Get \"http://10.217.0.108:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:14 crc kubenswrapper[4898]: I0313 15:16:14.338116 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-jngrl" podUID="a80d01d5-0201-4b2e-974c-ac5b42ac8df4" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.110:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:14 crc kubenswrapper[4898]: I0313 15:16:14.385869 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Mar 13 15:16:14 crc kubenswrapper[4898]: I0313 15:16:14.502206 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-2lnlc" podUID="ba56f415-73d5-4301-a25d-0e5d1ba4e3b1" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.115:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:14 crc kubenswrapper[4898]: I0313 15:16:14.543217 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-v99bm" podUID="32b5ebfd-38d9-456e-bb21-7332323239d1" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.112:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:14 crc kubenswrapper[4898]: I0313 15:16:14.584115 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-ntlw6" podUID="d71982c0-a3d0-4da8-84cd-7494301f589f" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.117:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while 
awaiting headers)" Mar 13 15:16:14 crc kubenswrapper[4898]: I0313 15:16:14.709100 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/nova-operator-controller-manager-7f84474648-mr4wv" podUID="52959483-daae-423a-a3bf-8e3fa7810074" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.116:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:14 crc kubenswrapper[4898]: I0313 15:16:14.709449 4898 patch_prober.go:28] interesting pod/console-699d95d586-ds75f container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.142:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:14 crc kubenswrapper[4898]: I0313 15:16:14.709484 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-s5zh6" podUID="d24bb749-0b71-456b-80e4-fdf6dd23ba30" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.113:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:14 crc kubenswrapper[4898]: I0313 15:16:14.709522 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-z2gd2" podUID="1df4a7d6-b0c2-4b00-b591-1a612bd319b6" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.114:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:14 crc kubenswrapper[4898]: I0313 15:16:14.709518 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-699d95d586-ds75f" podUID="ab8664f8-1960-4442-9fdd-9711ec963e1f" containerName="console" probeResult="failure" output="Get \"https://10.217.0.142:8443/health\": net/http: request canceled while waiting for 
connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:14 crc kubenswrapper[4898]: I0313 15:16:14.770455 4898 trace.go:236] Trace[1749974672]: "Calculate volume metrics of persistence for pod openstack/rabbitmq-server-0" (13-Mar-2026 15:16:08.509) (total time: 6252ms): Mar 13 15:16:14 crc kubenswrapper[4898]: Trace[1749974672]: [6.252274294s] [6.252274294s] END Mar 13 15:16:14 crc kubenswrapper[4898]: I0313 15:16:14.770489 4898 trace.go:236] Trace[718665416]: "Calculate volume metrics of mysql-db for pod openstack/openstack-galera-0" (13-Mar-2026 15:16:11.572) (total time: 3192ms): Mar 13 15:16:14 crc kubenswrapper[4898]: Trace[718665416]: [3.192789928s] [3.192789928s] END Mar 13 15:16:14 crc kubenswrapper[4898]: I0313 15:16:14.773271 4898 trace.go:236] Trace[1920003612]: "Calculate volume metrics of swift for pod openstack/swift-storage-0" (13-Mar-2026 15:16:11.841) (total time: 2924ms): Mar 13 15:16:14 crc kubenswrapper[4898]: Trace[1920003612]: [2.924072815s] [2.924072815s] END Mar 13 15:16:14 crc kubenswrapper[4898]: I0313 15:16:14.773920 4898 trace.go:236] Trace[575714101]: "Calculate volume metrics of glance for pod openstack/glance-default-internal-api-0" (13-Mar-2026 15:16:10.652) (total time: 4108ms): Mar 13 15:16:14 crc kubenswrapper[4898]: Trace[575714101]: [4.108760465s] [4.108760465s] END Mar 13 15:16:14 crc kubenswrapper[4898]: I0313 15:16:14.912947 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/kube-state-metrics-0" podUID="b7452a36-0169-4cfe-9ede-ef4d0ef072d9" containerName="kube-state-metrics" probeResult="failure" output="Get \"https://10.217.1.23:8081/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:14 crc kubenswrapper[4898]: I0313 15:16:14.913275 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/kube-state-metrics-0" podUID="b7452a36-0169-4cfe-9ede-ef4d0ef072d9" containerName="kube-state-metrics" 
probeResult="failure" output="Get \"https://10.217.1.23:8080/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:15 crc kubenswrapper[4898]: I0313 15:16:15.059201 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-njsvh" podUID="0d7c657b-a701-41fe-9b23-d5bba3302c4f" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.121:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:15 crc kubenswrapper[4898]: I0313 15:16:15.100169 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-s2rdh" podUID="d29ce3ee-3d5a-4801-abf9-dfef5b641a74" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.118:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:15 crc kubenswrapper[4898]: I0313 15:16:15.100181 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-f2t6t" podUID="66a86c31-9ff3-439a-a0f8-96c981014b6f" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.122:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:15 crc kubenswrapper[4898]: I0313 15:16:15.142126 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-wdmrh" podUID="da3795a7-363f-4637-afe2-77cb77248f9a" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.120:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:15 crc kubenswrapper[4898]: I0313 15:16:15.142146 4898 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack-operators/telemetry-operator-controller-manager-5b9fbd87f-s2k96" podUID="9ff6f89a-7110-42fb-96b9-8611f280bebe" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.123:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:15 crc kubenswrapper[4898]: I0313 15:16:15.224175 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-jwrd2" podUID="919747b8-a031-4654-999f-3c3928f981b4" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.125:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:15 crc kubenswrapper[4898]: I0313 15:16:15.224660 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-smdkt" podUID="19a0f4de-5258-4f2b-9587-71293459378e" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.124:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:15 crc kubenswrapper[4898]: I0313 15:16:15.308088 4898 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-7zpw2 container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.27:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:15 crc kubenswrapper[4898]: I0313 15:16:15.308166 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7zpw2" podUID="9c7e70de-de85-421c-aaeb-476450d8e0ee" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.27:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:15 crc kubenswrapper[4898]: I0313 
15:16:15.309230 4898 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-7zpw2 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.27:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:15 crc kubenswrapper[4898]: I0313 15:16:15.309276 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7zpw2" podUID="9c7e70de-de85-421c-aaeb-476450d8e0ee" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.27:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:15 crc kubenswrapper[4898]: I0313 15:16:15.461385 4898 patch_prober.go:28] interesting pod/image-registry-66df7c8f76-7zcdz container/registry namespace/openshift-image-registry: Readiness probe status=failure output="Get \"https://10.217.0.73:5000/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:15 crc kubenswrapper[4898]: I0313 15:16:15.461450 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-66df7c8f76-7zcdz" podUID="ba480ebb-f079-4888-857b-d917e4a9b13b" containerName="registry" probeResult="failure" output="Get \"https://10.217.0.73:5000/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:15 crc kubenswrapper[4898]: I0313 15:16:15.462609 4898 patch_prober.go:28] interesting pod/image-registry-66df7c8f76-7zcdz container/registry namespace/openshift-image-registry: Liveness probe status=failure output="Get \"https://10.217.0.73:5000/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 
13 15:16:15 crc kubenswrapper[4898]: I0313 15:16:15.462641 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-66df7c8f76-7zcdz" podUID="ba480ebb-f079-4888-857b-d917e4a9b13b" containerName="registry" probeResult="failure" output="Get \"https://10.217.0.73:5000/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:15 crc kubenswrapper[4898]: I0313 15:16:15.489969 4898 patch_prober.go:28] interesting pod/openshift-kube-scheduler-crc container/kube-scheduler namespace/openshift-kube-scheduler: Liveness probe status=failure output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:15 crc kubenswrapper[4898]: I0313 15:16:15.490030 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:15 crc kubenswrapper[4898]: I0313 15:16:15.647152 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-controller-init-6b8c6b5df9-kk2gn" podUID="7bae49ab-1146-43a2-b436-69838c923f1a" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.104:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:15 crc kubenswrapper[4898]: I0313 15:16:15.836173 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/redhat-marketplace-zs42q" podUID="0182307e-bc7f-415e-a0f9-0eff9902384c" containerName="registry-server" probeResult="failure" output="command timed out" Mar 13 15:16:15 crc 
kubenswrapper[4898]: I0313 15:16:15.836201 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/certified-operators-hkbng" podUID="89abe4ad-dd62-4a70-a1d1-fdf97448ada5" containerName="registry-server" probeResult="failure" output="command timed out" Mar 13 15:16:15 crc kubenswrapper[4898]: I0313 15:16:15.836206 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/community-operators-fcfmz" podUID="2cd5c4b0-1bfe-479d-b800-ad19c56f3c9e" containerName="registry-server" probeResult="failure" output="command timed out" Mar 13 15:16:15 crc kubenswrapper[4898]: I0313 15:16:15.836235 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/community-operators-fcfmz" podUID="2cd5c4b0-1bfe-479d-b800-ad19c56f3c9e" containerName="registry-server" probeResult="failure" output="command timed out" Mar 13 15:16:15 crc kubenswrapper[4898]: I0313 15:16:15.836339 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-jndb5" podUID="2157e8bf-88a5-4e48-b621-1744dcf0fcdb" containerName="registry-server" probeResult="failure" output="command timed out" Mar 13 15:16:15 crc kubenswrapper[4898]: I0313 15:16:15.836458 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/certified-operators-hkbng" podUID="89abe4ad-dd62-4a70-a1d1-fdf97448ada5" containerName="registry-server" probeResult="failure" output="command timed out" Mar 13 15:16:15 crc kubenswrapper[4898]: I0313 15:16:15.836726 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-zs42q" podUID="0182307e-bc7f-415e-a0f9-0eff9902384c" containerName="registry-server" probeResult="failure" output="command timed out" Mar 13 15:16:15 crc kubenswrapper[4898]: I0313 15:16:15.843067 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-controller-manager-cf7c75c99-qxdbx" 
podUID="e000d86e-e7a8-49ed-9184-fdd67dfe797d" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.95:8080/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:16 crc kubenswrapper[4898]: I0313 15:16:16.057466 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-bqmxg" Mar 13 15:16:16 crc kubenswrapper[4898]: I0313 15:16:16.122263 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd" event={"ID":"7667c5a1-aecb-4ccd-b8fd-e20c2c049472","Type":"ContainerDied","Data":"6ce16b962cdf6095fe075410dd0e3e4b0df623a44b72871189f0b1d5d8146085"} Mar 13 15:16:16 crc kubenswrapper[4898]: I0313 15:16:16.122678 4898 generic.go:334] "Generic (PLEG): container finished" podID="7667c5a1-aecb-4ccd-b8fd-e20c2c049472" containerID="6ce16b962cdf6095fe075410dd0e3e4b0df623a44b72871189f0b1d5d8146085" exitCode=0 Mar 13 15:16:16 crc kubenswrapper[4898]: I0313 15:16:16.178142 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-8kcsw" podUID="c35de09d-7f21-47d3-aac5-a26b15b0a496" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.111:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:16 crc kubenswrapper[4898]: I0313 15:16:16.219090 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/metallb-operator-webhook-server-67c6f6c5cb-d26qw" podUID="34b4f98c-a87c-4a97-9ac4-286afeb9e4bc" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.96:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:16 crc kubenswrapper[4898]: I0313 15:16:16.262310 4898 prober.go:107] "Probe failed" probeType="Readiness" 
pod="metallb-system/metallb-operator-webhook-server-67c6f6c5cb-d26qw" podUID="34b4f98c-a87c-4a97-9ac4-286afeb9e4bc" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.96:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:16 crc kubenswrapper[4898]: I0313 15:16:16.262713 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-8kcsw" podUID="c35de09d-7f21-47d3-aac5-a26b15b0a496" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.111:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:16 crc kubenswrapper[4898]: I0313 15:16:16.263194 4898 patch_prober.go:28] interesting pod/controller-manager-797d9c85c-m5jdj container/controller-manager namespace/openshift-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.72:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:16 crc kubenswrapper[4898]: I0313 15:16:16.263268 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-controller-manager/controller-manager-797d9c85c-m5jdj" podUID="2ec8c09e-475e-4c4b-86ec-38388754240f" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.72:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:16 crc kubenswrapper[4898]: I0313 15:16:16.270456 4898 patch_prober.go:28] interesting pod/controller-manager-797d9c85c-m5jdj container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.72:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:16 crc 
kubenswrapper[4898]: I0313 15:16:16.270548 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-797d9c85c-m5jdj" podUID="2ec8c09e-475e-4c4b-86ec-38388754240f" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.72:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:16 crc kubenswrapper[4898]: I0313 15:16:16.316758 4898 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-4p7wt container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.82:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:16 crc kubenswrapper[4898]: I0313 15:16:16.317308 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-4p7wt" podUID="802396a8-633d-4f86-b77b-c25e9c76cc7a" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.82:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:16 crc kubenswrapper[4898]: I0313 15:16:16.316798 4898 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-4p7wt container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.82:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:16 crc kubenswrapper[4898]: I0313 15:16:16.317554 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-4p7wt" 
podUID="802396a8-633d-4f86-b77b-c25e9c76cc7a" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.82:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:16 crc kubenswrapper[4898]: I0313 15:16:16.405036 4898 patch_prober.go:28] interesting pod/route-controller-manager-7b756f97f-wjsf2 container/route-controller-manager namespace/openshift-route-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.74:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:16 crc kubenswrapper[4898]: I0313 15:16:16.405100 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-route-controller-manager/route-controller-manager-7b756f97f-wjsf2" podUID="0376a3d3-f3a2-4674-a7f9-b06a9e62836e" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.74:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:16 crc kubenswrapper[4898]: I0313 15:16:16.405147 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-route-controller-manager/route-controller-manager-7b756f97f-wjsf2" Mar 13 15:16:16 crc kubenswrapper[4898]: I0313 15:16:16.405140 4898 patch_prober.go:28] interesting pod/route-controller-manager-7b756f97f-wjsf2 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.74:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:16 crc kubenswrapper[4898]: I0313 15:16:16.405204 4898 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-route-controller-manager/route-controller-manager-7b756f97f-wjsf2" podUID="0376a3d3-f3a2-4674-a7f9-b06a9e62836e" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.74:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:16 crc kubenswrapper[4898]: I0313 15:16:16.406564 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="route-controller-manager" containerStatusID={"Type":"cri-o","ID":"0292f4db3076b00a4ff8a067c3c4184374170598a35c90418442c7305e929d8e"} pod="openshift-route-controller-manager/route-controller-manager-7b756f97f-wjsf2" containerMessage="Container route-controller-manager failed liveness probe, will be restarted" Mar 13 15:16:16 crc kubenswrapper[4898]: I0313 15:16:16.406617 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7b756f97f-wjsf2" podUID="0376a3d3-f3a2-4674-a7f9-b06a9e62836e" containerName="route-controller-manager" containerID="cri-o://0292f4db3076b00a4ff8a067c3c4184374170598a35c90418442c7305e929d8e" gracePeriod=30 Mar 13 15:16:16 crc kubenswrapper[4898]: I0313 15:16:16.849860 4898 patch_prober.go:28] interesting pod/openshift-kube-scheduler-crc container/kube-scheduler namespace/openshift-kube-scheduler: Readiness probe status=failure output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:16 crc kubenswrapper[4898]: I0313 15:16:16.849998 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection 
(Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:17 crc kubenswrapper[4898]: I0313 15:16:17.072067 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-controller-manager-5f7dc44db6-9nsrh" podUID="3a26728d-85c2-465c-bce4-c74045ea9e0d" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.126:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:17 crc kubenswrapper[4898]: I0313 15:16:17.113046 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-bqmxg" podUID="1a7fcb96-7168-4049-8c28-d3f740599e48" containerName="controller" probeResult="failure" output="Get \"http://127.0.0.1:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:17 crc kubenswrapper[4898]: I0313 15:16:17.195099 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-operator-controller-manager-5f7dc44db6-9nsrh" podUID="3a26728d-85c2-465c-bce4-c74045ea9e0d" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.126:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:17 crc kubenswrapper[4898]: I0313 15:16:17.195190 4898 prober.go:107] "Probe failed" probeType="Startup" pod="metallb-system/frr-k8s-bqmxg" podUID="1a7fcb96-7168-4049-8c28-d3f740599e48" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:17 crc kubenswrapper[4898]: I0313 15:16:17.277118 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-bqmxg" podUID="1a7fcb96-7168-4049-8c28-d3f740599e48" containerName="controller" probeResult="failure" output="Get \"http://127.0.0.1:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 
15:16:17 crc kubenswrapper[4898]: I0313 15:16:17.277140 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-5p4w5" podUID="604b9c21-3e85-4c2e-9faf-962f44236911" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.98:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:17 crc kubenswrapper[4898]: I0313 15:16:17.359543 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-5p4w5" podUID="604b9c21-3e85-4c2e-9faf-962f44236911" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.98:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:17 crc kubenswrapper[4898]: I0313 15:16:17.359631 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/controller-7bb4cc7c98-cx422" podUID="b231c7db-5056-4ec6-a64c-0aa8bdff336b" containerName="controller" probeResult="failure" output="Get \"http://10.217.0.99:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:17 crc kubenswrapper[4898]: I0313 15:16:17.359994 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/controller-7bb4cc7c98-cx422" podUID="b231c7db-5056-4ec6-a64c-0aa8bdff336b" containerName="controller" probeResult="failure" output="Get \"http://10.217.0.99:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:17 crc kubenswrapper[4898]: I0313 15:16:17.607577 4898 patch_prober.go:28] interesting pod/logging-loki-distributor-9c6b6d984-vvj56 container/loki-distributor namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.52:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 
13 15:16:17 crc kubenswrapper[4898]: I0313 15:16:17.607633 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-distributor-9c6b6d984-vvj56" podUID="510657b4-32e2-4fa5-9c09-17869a295736" containerName="loki-distributor" probeResult="failure" output="Get \"https://10.217.0.52:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:17 crc kubenswrapper[4898]: I0313 15:16:17.738235 4898 patch_prober.go:28] interesting pod/logging-loki-querier-6dcbdf8bb8-qr6bw container/loki-querier namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.53:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:17 crc kubenswrapper[4898]: I0313 15:16:17.738339 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qr6bw" podUID="5e81d88f-c63b-4f0c-ba17-f1171350c28d" containerName="loki-querier" probeResult="failure" output="Get \"https://10.217.0.53:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:17 crc kubenswrapper[4898]: I0313 15:16:17.807273 4898 patch_prober.go:28] interesting pod/logging-loki-query-frontend-ff66c4dc9-mwqzz container/loki-query-frontend namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.54:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:17 crc kubenswrapper[4898]: I0313 15:16:17.807342 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-mwqzz" podUID="e519fed6-a687-4a01-a979-598e81122ad1" containerName="loki-query-frontend" probeResult="failure" output="Get 
\"https://10.217.0.54:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:17 crc kubenswrapper[4898]: I0313 15:16:17.830919 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-k8s-0" podUID="1b691eb6-70f2-4fce-b18a-1d7712fddcac" containerName="prometheus" probeResult="failure" output="command timed out" Mar 13 15:16:17 crc kubenswrapper[4898]: I0313 15:16:17.834228 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-k8s-0" podUID="1b691eb6-70f2-4fce-b18a-1d7712fddcac" containerName="prometheus" probeResult="failure" output="command timed out" Mar 13 15:16:17 crc kubenswrapper[4898]: I0313 15:16:17.835998 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/redhat-operators-zgjzn" podUID="cbb51f06-0778-4b18-82b5-c5ce91e0a613" containerName="registry-server" probeResult="failure" output="command timed out" Mar 13 15:16:17 crc kubenswrapper[4898]: I0313 15:16:17.838299 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/redhat-operators-zgjzn" podUID="cbb51f06-0778-4b18-82b5-c5ce91e0a613" containerName="registry-server" probeResult="failure" output="command timed out" Mar 13 15:16:18 crc kubenswrapper[4898]: I0313 15:16:18.532615 4898 patch_prober.go:28] interesting pod/logging-loki-gateway-c6d797ccf-9qh4r container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.55:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:18 crc kubenswrapper[4898]: I0313 15:16:18.532664 4898 patch_prober.go:28] interesting pod/logging-loki-gateway-c6d797ccf-9qh4r container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.55:8083/ready\": net/http: request canceled while waiting for 
connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:18 crc kubenswrapper[4898]: I0313 15:16:18.535232 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" podUID="077fcbe8-c497-44b4-82f9-ff8e317cbe83" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.55:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:18 crc kubenswrapper[4898]: I0313 15:16:18.535106 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" podUID="077fcbe8-c497-44b4-82f9-ff8e317cbe83" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.55:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:18 crc kubenswrapper[4898]: I0313 15:16:18.622439 4898 patch_prober.go:28] interesting pod/logging-loki-gateway-c6d797ccf-8ng9x container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:18 crc kubenswrapper[4898]: I0313 15:16:18.623334 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" podUID="13ee53e6-2549-4dd8-91ac-80e4ef2c9d99" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:18 crc kubenswrapper[4898]: I0313 15:16:18.623402 4898 patch_prober.go:28] interesting pod/logging-loki-gateway-c6d797ccf-8ng9x container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8081/ready\": net/http: request canceled 
(Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:18 crc kubenswrapper[4898]: I0313 15:16:18.623417 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" podUID="13ee53e6-2549-4dd8-91ac-80e4ef2c9d99" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.56:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:18 crc kubenswrapper[4898]: I0313 15:16:18.726231 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/speaker-g5gqr" podUID="edfd91ee-1246-43b2-84a0-95ea069de402" containerName="speaker" probeResult="failure" output="Get \"http://localhost:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:18 crc kubenswrapper[4898]: I0313 15:16:18.726285 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/speaker-g5gqr" podUID="edfd91ee-1246-43b2-84a0-95ea069de402" containerName="speaker" probeResult="failure" output="Get \"http://localhost:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:18 crc kubenswrapper[4898]: I0313 15:16:18.942978 4898 patch_prober.go:28] interesting pod/logging-loki-index-gateway-0 container/loki-index-gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.60:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:18 crc kubenswrapper[4898]: I0313 15:16:18.943102 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-index-gateway-0" podUID="6a1df267-1145-4fe1-9455-57df3d043e3a" containerName="loki-index-gateway" probeResult="failure" output="Get \"https://10.217.0.60:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:19 crc 
kubenswrapper[4898]: I0313 15:16:19.033070 4898 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.57:3101/ready\": context deadline exceeded" start-of-body= Mar 13 15:16:19 crc kubenswrapper[4898]: I0313 15:16:19.033115 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="2194d847-4858-4f46-ab8b-c2d78cf5677e" containerName="loki-ingester" probeResult="failure" output="Get \"https://10.217.0.57:3101/ready\": context deadline exceeded" Mar 13 15:16:19 crc kubenswrapper[4898]: I0313 15:16:19.090298 4898 patch_prober.go:28] interesting pod/logging-loki-compactor-0 container/loki-compactor namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.58:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:19 crc kubenswrapper[4898]: I0313 15:16:19.090357 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-compactor-0" podUID="9c5fee8d-2246-4e34-8ddd-ce710e155d73" containerName="loki-compactor" probeResult="failure" output="Get \"https://10.217.0.58:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:19 crc kubenswrapper[4898]: I0313 15:16:19.133951 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 15:16:19 crc kubenswrapper[4898]: I0313 15:16:19.134017 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 15:16:19 crc kubenswrapper[4898]: I0313 15:16:19.168023 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd" event={"ID":"7667c5a1-aecb-4ccd-b8fd-e20c2c049472","Type":"ContainerStarted","Data":"88aa7f357e8dd41207b3d690f070bb48ea1ac7bd413234067776040026c3e1df"} Mar 13 15:16:19 crc kubenswrapper[4898]: I0313 15:16:19.385156 4898 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-z7ng7 container/marketplace-operator namespace/openshift-marketplace: Liveness probe status=failure output="Get \"http://10.217.0.75:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:19 crc kubenswrapper[4898]: I0313 15:16:19.385393 4898 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-z7ng7 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.75:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:19 crc kubenswrapper[4898]: I0313 15:16:19.385410 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/marketplace-operator-79b997595-z7ng7" podUID="b8942bb7-1cd2-49b9-8d98-5ba4c5f6c320" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.75:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:19 crc kubenswrapper[4898]: I0313 15:16:19.385468 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-z7ng7" podUID="b8942bb7-1cd2-49b9-8d98-5ba4c5f6c320" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.75:8080/healthz\": context deadline exceeded 
(Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:19 crc kubenswrapper[4898]: I0313 15:16:19.830708 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="e5d53cf3-113e-4391-b3a9-4e1f81836e26" containerName="galera" probeResult="failure" output="command timed out" Mar 13 15:16:19 crc kubenswrapper[4898]: I0313 15:16:19.830840 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 13 15:16:19 crc kubenswrapper[4898]: I0313 15:16:19.833891 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-galera-0" podUID="e5d53cf3-113e-4391-b3a9-4e1f81836e26" containerName="galera" probeResult="failure" output="command timed out" Mar 13 15:16:19 crc kubenswrapper[4898]: I0313 15:16:19.833976 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/openstack-galera-0" Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.085075 4898 patch_prober.go:28] interesting pod/metrics-server-7b77fdd7dd-vwwfr container/metrics-server namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.89:10250/livez\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.085136 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" podUID="823ccfb8-89eb-409e-9c6c-579bacb35ea1" containerName="metrics-server" probeResult="failure" output="Get \"https://10.217.0.89:10250/livez\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.085207 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.094147 4898 kuberuntime_manager.go:1027] 
"Message for Container of pod" containerName="metrics-server" containerStatusID={"Type":"cri-o","ID":"7046ebdd84c06a54a3ad07946403d79aec9442bfc8bcd95a1021d0c00e48bec6"} pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" containerMessage="Container metrics-server failed liveness probe, will be restarted" Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.094203 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" podUID="823ccfb8-89eb-409e-9c6c-579bacb35ea1" containerName="metrics-server" containerID="cri-o://7046ebdd84c06a54a3ad07946403d79aec9442bfc8bcd95a1021d0c00e48bec6" gracePeriod=170 Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.178201 4898 generic.go:334] "Generic (PLEG): container finished" podID="19a0f4de-5258-4f2b-9587-71293459378e" containerID="490b131fcdcfc3c4a0f9eaefa6f19529f359253225ede8f7a9b1add8a23964b5" exitCode=1 Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.178589 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-smdkt" event={"ID":"19a0f4de-5258-4f2b-9587-71293459378e","Type":"ContainerDied","Data":"490b131fcdcfc3c4a0f9eaefa6f19529f359253225ede8f7a9b1add8a23964b5"} Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.181135 4898 generic.go:334] "Generic (PLEG): container finished" podID="32b5ebfd-38d9-456e-bb21-7332323239d1" containerID="cf80e2daa36139cd410b53ece19d163d97bf6b041aba2184079936d40fe56543" exitCode=1 Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.181298 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-v99bm" event={"ID":"32b5ebfd-38d9-456e-bb21-7332323239d1","Type":"ContainerDied","Data":"cf80e2daa36139cd410b53ece19d163d97bf6b041aba2184079936d40fe56543"} Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.181875 4898 kuberuntime_manager.go:1027] "Message for 
Container of pod" containerName="galera" containerStatusID={"Type":"cri-o","ID":"754d039d0115d17f3272d5d60732b4f70ed1599a25b6aa63fdbeafe8a92c023c"} pod="openstack/openstack-galera-0" containerMessage="Container galera failed liveness probe, will be restarted"
Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.185501 4898 scope.go:117] "RemoveContainer" containerID="490b131fcdcfc3c4a0f9eaefa6f19529f359253225ede8f7a9b1add8a23964b5"
Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.186561 4898 scope.go:117] "RemoveContainer" containerID="cf80e2daa36139cd410b53ece19d163d97bf6b041aba2184079936d40fe56543"
Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.243119 4898 patch_prober.go:28] interesting pod/observability-operator-59bdc8b94-ljrtz container/operator namespace/openshift-operators: Liveness probe status=failure output="Get \"http://10.217.0.16:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.243221 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operators/observability-operator-59bdc8b94-ljrtz" podUID="3bfc0332-bb59-42bf-bb70-462efa225c81" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.16:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.243268 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-operators/observability-operator-59bdc8b94-ljrtz"
Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.243497 4898 patch_prober.go:28] interesting pod/observability-operator-59bdc8b94-ljrtz container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.16:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.243542 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-59bdc8b94-ljrtz" podUID="3bfc0332-bb59-42bf-bb70-462efa225c81" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.16:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.243711 4898 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-x85vd container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.29:5443/healthz\": dial tcp 10.217.0.29:5443: connect: connection refused" start-of-body=
Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.243769 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd" podUID="7667c5a1-aecb-4ccd-b8fd-e20c2c049472" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.29:5443/healthz\": dial tcp 10.217.0.29:5443: connect: connection refused"
Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.243964 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-ljrtz"
Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.245189 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="operator" containerStatusID={"Type":"cri-o","ID":"00f83bd17f1c6c8f7be365f8c10dd0d1fa4e2714540181da78b69f463649b54d"} pod="openshift-operators/observability-operator-59bdc8b94-ljrtz" containerMessage="Container operator failed liveness probe, will be restarted"
Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.245279 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-operators/observability-operator-59bdc8b94-ljrtz" podUID="3bfc0332-bb59-42bf-bb70-462efa225c81" containerName="operator" containerID="cri-o://00f83bd17f1c6c8f7be365f8c10dd0d1fa4e2714540181da78b69f463649b54d" gracePeriod=30
Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.424203 4898 patch_prober.go:28] interesting pod/perses-operator-5bf474d74f-nkt76 container/perses-operator namespace/openshift-operators: Liveness probe status=failure output="Get \"http://10.217.0.28:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.424269 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operators/perses-operator-5bf474d74f-nkt76" podUID="79ead8ee-67ba-4831-b5d4-a1f128e94334" containerName="perses-operator" probeResult="failure" output="Get \"http://10.217.0.28:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.424321 4898 patch_prober.go:28] interesting pod/perses-operator-5bf474d74f-nkt76 container/perses-operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.28:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.424334 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/perses-operator-5bf474d74f-nkt76" podUID="79ead8ee-67ba-4831-b5d4-a1f128e94334" containerName="perses-operator" probeResult="failure" output="Get \"http://10.217.0.28:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.424390 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-nkt76"
Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.433097 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd"
Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.433296 4898 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-x85vd container/packageserver namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.29:5443/healthz\": dial tcp 10.217.0.29:5443: connect: connection refused" start-of-body=
Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.433304 4898 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-x85vd container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.29:5443/healthz\": dial tcp 10.217.0.29:5443: connect: connection refused" start-of-body=
Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.433346 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd" podUID="7667c5a1-aecb-4ccd-b8fd-e20c2c049472" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.29:5443/healthz\": dial tcp 10.217.0.29:5443: connect: connection refused"
Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.433407 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd" podUID="7667c5a1-aecb-4ccd-b8fd-e20c2c049472" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.29:5443/healthz\": dial tcp 10.217.0.29:5443: connect: connection refused"
Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.452863 4898 patch_prober.go:28] interesting pod/authentication-operator-69f744f599-v9lxv container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.452976 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-69f744f599-v9lxv" podUID="6f12557e-02f5-4445-988f-b19f16672e3b" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.453042 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-authentication-operator/authentication-operator-69f744f599-v9lxv"
Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.465554 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="authentication-operator" containerStatusID={"Type":"cri-o","ID":"04b36de235cccfc39b10d95f29d1b8eb5397d41b37472245250afa44100523ba"} pod="openshift-authentication-operator/authentication-operator-69f744f599-v9lxv" containerMessage="Container authentication-operator failed liveness probe, will be restarted"
Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.465644 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication-operator/authentication-operator-69f744f599-v9lxv" podUID="6f12557e-02f5-4445-988f-b19f16672e3b" containerName="authentication-operator" containerID="cri-o://04b36de235cccfc39b10d95f29d1b8eb5397d41b37472245250afa44100523ba" gracePeriod=30
Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.483720 4898 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Readiness probe status=failure output="" start-of-body=
Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.483745 4898 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Liveness probe status=failure output="" start-of-body=
Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.493095 4898 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Liveness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": read tcp 192.168.126.11:54530->192.168.126.11:10257: read: connection reset by peer" start-of-body=
Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.493116 4898 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Readiness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": read tcp 192.168.126.11:54520->192.168.126.11:10257: read: connection reset by peer" start-of-body=
Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.493131 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": read tcp 192.168.126.11:54530->192.168.126.11:10257: read: connection reset by peer"
Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.493165 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": read tcp 192.168.126.11:54520->192.168.126.11:10257: read: connection reset by peer"
Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.611096 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv" podUID="0ab852e1-fd26-4f76-b758-77896f8e236b" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.119:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.611171 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv" podUID="0ab852e1-fd26-4f76-b758-77896f8e236b" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.119:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.695963 4898 patch_prober.go:28] interesting pod/router-default-5444994796-6plhg container/router namespace/openshift-ingress: Liveness probe status=failure output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.696120 4898 patch_prober.go:28] interesting pod/monitoring-plugin-595dc77696-pft4c container/monitoring-plugin namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.90:9443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.696300 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-ingress/router-default-5444994796-6plhg" podUID="18e5c8bf-9fe0-465e-af8f-9e7ec7400be8" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.696346 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-ingress/router-default-5444994796-6plhg"
Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.696374 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/monitoring-plugin-595dc77696-pft4c" podUID="10c7ab08-2341-4e85-ad67-8495e038afa2" containerName="monitoring-plugin" probeResult="failure" output="Get \"https://10.217.0.90:9443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.696141 4898 patch_prober.go:28] interesting pod/console-operator-58897d9998-kgnxj container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.696440 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-kgnxj" podUID="eedd2260-f339-4e2f-83e8-13a56cee2ce6" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.13:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.696156 4898 patch_prober.go:28] interesting pod/console-operator-58897d9998-kgnxj container/console-operator namespace/openshift-console-operator: Liveness probe status=failure output="Get \"https://10.217.0.13:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.696468 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console-operator/console-operator-58897d9998-kgnxj" podUID="eedd2260-f339-4e2f-83e8-13a56cee2ce6" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.13:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.696169 4898 patch_prober.go:28] interesting pod/router-default-5444994796-6plhg container/router namespace/openshift-ingress: Readiness probe status=failure output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.696505 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ingress/router-default-5444994796-6plhg" podUID="18e5c8bf-9fe0-465e-af8f-9e7ec7400be8" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.696560 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-6plhg"
Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.698135 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="router" containerStatusID={"Type":"cri-o","ID":"c62d0b0db8a023eca0369719b9dd81ab2eadb339e637fa7223f0ac36593bb07b"} pod="openshift-ingress/router-default-5444994796-6plhg" containerMessage="Container router failed liveness probe, will be restarted"
Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.698207 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ingress/router-default-5444994796-6plhg" podUID="18e5c8bf-9fe0-465e-af8f-9e7ec7400be8" containerName="router" containerID="cri-o://c62d0b0db8a023eca0369719b9dd81ab2eadb339e637fa7223f0ac36593bb07b" gracePeriod=10
Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.830980 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="e5d53cf3-113e-4391-b3a9-4e1f81836e26" containerName="galera" probeResult="failure" output="command timed out"
Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.831103 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-cell1-galera-0" podUID="6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f" containerName="galera" probeResult="failure" output="command timed out"
Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.831167 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/openstack-cell1-galera-0"
Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.831512 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f" containerName="galera" probeResult="failure" output="command timed out"
Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.831580 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0"
Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.832563 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="galera" containerStatusID={"Type":"cri-o","ID":"5603fe1e36b001c885a705c95d9e330d44a889a46589d0677416ea907bf21af1"} pod="openstack/openstack-cell1-galera-0" containerMessage="Container galera failed liveness probe, will be restarted"
Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.838094 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-operator-index-9k7p6" podUID="478795f5-c2f6-4e9b-9ed6-e2c743c3f3b8" containerName="registry-server" probeResult="failure" output="command timed out"
Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.838254 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-index-9k7p6" podUID="478795f5-c2f6-4e9b-9ed6-e2c743c3f3b8" containerName="registry-server" probeResult="failure" output="command timed out"
Mar 13 15:16:21 crc kubenswrapper[4898]: I0313 15:16:21.182076 4898 patch_prober.go:28] interesting pod/downloads-7954f5f757-cx59b container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.10:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 15:16:21 crc kubenswrapper[4898]: I0313 15:16:21.182375 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-cx59b" podUID="f4f26c0f-992a-4eb4-86d2-58e42a5b2b68" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:21 crc kubenswrapper[4898]: I0313 15:16:21.182123 4898 patch_prober.go:28] interesting pod/downloads-7954f5f757-cx59b container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 15:16:21 crc kubenswrapper[4898]: I0313 15:16:21.182435 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-cx59b" podUID="f4f26c0f-992a-4eb4-86d2-58e42a5b2b68" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:21 crc kubenswrapper[4898]: I0313 15:16:21.196778 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/2.log"
Mar 13 15:16:21 crc kubenswrapper[4898]: I0313 15:16:21.200083 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log"
Mar 13 15:16:21 crc kubenswrapper[4898]: I0313 15:16:21.202936 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log"
Mar 13 15:16:21 crc kubenswrapper[4898]: I0313 15:16:21.205104 4898 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="f3f7856b47908c691e419ef35da06276497b17de1d07f34af58226c123a50471" exitCode=1
Mar 13 15:16:21 crc kubenswrapper[4898]: I0313 15:16:21.205208 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"f3f7856b47908c691e419ef35da06276497b17de1d07f34af58226c123a50471"}
Mar 13 15:16:21 crc kubenswrapper[4898]: I0313 15:16:21.205313 4898 scope.go:117] "RemoveContainer" containerID="b65cd02a017de53599582fdc93495c1971ff933ecacdf6af0171bad6070bff66"
Mar 13 15:16:21 crc kubenswrapper[4898]: I0313 15:16:21.206072 4898 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-x85vd container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.29:5443/healthz\": dial tcp 10.217.0.29:5443: connect: connection refused" start-of-body=
Mar 13 15:16:21 crc kubenswrapper[4898]: I0313 15:16:21.206109 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd" podUID="7667c5a1-aecb-4ccd-b8fd-e20c2c049472" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.29:5443/healthz\": dial tcp 10.217.0.29:5443: connect: connection refused"
Mar 13 15:16:21 crc kubenswrapper[4898]: I0313 15:16:21.208572 4898 scope.go:117] "RemoveContainer" containerID="f3f7856b47908c691e419ef35da06276497b17de1d07f34af58226c123a50471"
Mar 13 15:16:21 crc kubenswrapper[4898]: I0313 15:16:21.412481 4898 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-qtdtw container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 15:16:21 crc kubenswrapper[4898]: I0313 15:16:21.412537 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qtdtw" podUID="7d27543e-df10-41f7-be85-dfe319aaec8a" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.20:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:21 crc kubenswrapper[4898]: I0313 15:16:21.412599 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qtdtw"
Mar 13 15:16:21 crc kubenswrapper[4898]: I0313 15:16:21.417570 4898 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-qtdtw container/catalog-operator namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.20:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 15:16:21 crc kubenswrapper[4898]: I0313 15:16:21.417633 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qtdtw" podUID="7d27543e-df10-41f7-be85-dfe319aaec8a" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.20:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:21 crc kubenswrapper[4898]: I0313 15:16:21.417682 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qtdtw"
Mar 13 15:16:21 crc kubenswrapper[4898]: I0313 15:16:21.466888 4898 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-k6lrz container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.21:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 15:16:21 crc kubenswrapper[4898]: I0313 15:16:21.467193 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6lrz" podUID="c8b0b1cf-022c-4181-a957-2f7e172a3294" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.21:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:21 crc kubenswrapper[4898]: I0313 15:16:21.467272 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6lrz"
Mar 13 15:16:21 crc kubenswrapper[4898]: I0313 15:16:21.466982 4898 patch_prober.go:28] interesting pod/perses-operator-5bf474d74f-nkt76 container/perses-operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.28:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 15:16:21 crc kubenswrapper[4898]: I0313 15:16:21.467816 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/perses-operator-5bf474d74f-nkt76" podUID="79ead8ee-67ba-4831-b5d4-a1f128e94334" containerName="perses-operator" probeResult="failure" output="Get \"http://10.217.0.28:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:21 crc kubenswrapper[4898]: I0313 15:16:21.467026 4898 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-k6lrz container/olm-operator namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.21:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 15:16:21 crc kubenswrapper[4898]: I0313 15:16:21.467854 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6lrz" podUID="c8b0b1cf-022c-4181-a957-2f7e172a3294" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.21:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:21 crc kubenswrapper[4898]: I0313 15:16:21.467877 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6lrz"
Mar 13 15:16:21 crc kubenswrapper[4898]: I0313 15:16:21.474002 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="olm-operator" containerStatusID={"Type":"cri-o","ID":"533f9c0d5d010c2b5018f71aab66fa3025d2e9a7d36d33fe0f8634b0b2bbb7ec"} pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6lrz" containerMessage="Container olm-operator failed liveness probe, will be restarted"
Mar 13 15:16:21 crc kubenswrapper[4898]: I0313 15:16:21.474084 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6lrz" podUID="c8b0b1cf-022c-4181-a957-2f7e172a3294" containerName="olm-operator" containerID="cri-o://533f9c0d5d010c2b5018f71aab66fa3025d2e9a7d36d33fe0f8634b0b2bbb7ec" gracePeriod=30
Mar 13 15:16:21 crc kubenswrapper[4898]: I0313 15:16:21.804859 4898 patch_prober.go:28] interesting pod/package-server-manager-789f6589d5-x4nxn container/package-server-manager namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"http://10.217.0.39:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 15:16:21 crc kubenswrapper[4898]: I0313 15:16:21.804988 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x4nxn" podUID="8675f0f0-7d3b-41d9-959e-e73f78f32c5c" containerName="package-server-manager" probeResult="failure" output="Get \"http://10.217.0.39:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:21 crc kubenswrapper[4898]: I0313 15:16:21.805304 4898 patch_prober.go:28] interesting pod/package-server-manager-789f6589d5-x4nxn container/package-server-manager namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"http://10.217.0.39:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 15:16:21 crc kubenswrapper[4898]: I0313 15:16:21.805362 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x4nxn" podUID="8675f0f0-7d3b-41d9-959e-e73f78f32c5c" containerName="package-server-manager" probeResult="failure" output="Get \"http://10.217.0.39:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:21 crc kubenswrapper[4898]: I0313 15:16:21.834583 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="e5d53cf3-113e-4391-b3a9-4e1f81836e26" containerName="galera" probeResult="failure" output="command timed out"
Mar 13 15:16:21 crc kubenswrapper[4898]: I0313 15:16:21.845599 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-k8s-0" podUID="1b691eb6-70f2-4fce-b18a-1d7712fddcac" containerName="prometheus" probeResult="failure" output="command timed out"
Mar 13 15:16:22 crc kubenswrapper[4898]: I0313 15:16:22.098082 4898 prober.go:107] "Probe failed" probeType="Startup" pod="metallb-system/frr-k8s-bqmxg" podUID="1a7fcb96-7168-4049-8c28-d3f740599e48" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:22 crc kubenswrapper[4898]: I0313 15:16:22.220567 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/2.log"
Mar 13 15:16:22 crc kubenswrapper[4898]: I0313 15:16:22.221715 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log"
Mar 13 15:16:22 crc kubenswrapper[4898]: I0313 15:16:22.226023 4898 generic.go:334] "Generic (PLEG): container finished" podID="3bfc0332-bb59-42bf-bb70-462efa225c81" containerID="00f83bd17f1c6c8f7be365f8c10dd0d1fa4e2714540181da78b69f463649b54d" exitCode=0
Mar 13 15:16:22 crc kubenswrapper[4898]: I0313 15:16:22.226109 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-ljrtz" event={"ID":"3bfc0332-bb59-42bf-bb70-462efa225c81","Type":"ContainerDied","Data":"00f83bd17f1c6c8f7be365f8c10dd0d1fa4e2714540181da78b69f463649b54d"}
Mar 13 15:16:22 crc kubenswrapper[4898]: I0313 15:16:22.229650 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-smdkt" event={"ID":"19a0f4de-5258-4f2b-9587-71293459378e","Type":"ContainerStarted","Data":"a21c77ffcf35fcf200cb633fb54eeac6d8564a2b9803bb1375581cf30c3e1d15"}
Mar 13 15:16:22 crc kubenswrapper[4898]: I0313 15:16:22.229883 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-smdkt"
Mar 13 15:16:22 crc kubenswrapper[4898]: I0313 15:16:22.231508 4898 generic.go:334] "Generic (PLEG): container finished" podID="0376a3d3-f3a2-4674-a7f9-b06a9e62836e" containerID="0292f4db3076b00a4ff8a067c3c4184374170598a35c90418442c7305e929d8e" exitCode=0
Mar 13 15:16:22 crc kubenswrapper[4898]: I0313 15:16:22.231564 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7b756f97f-wjsf2" event={"ID":"0376a3d3-f3a2-4674-a7f9-b06a9e62836e","Type":"ContainerDied","Data":"0292f4db3076b00a4ff8a067c3c4184374170598a35c90418442c7305e929d8e"}
Mar 13 15:16:22 crc kubenswrapper[4898]: I0313 15:16:22.235586 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-v99bm" event={"ID":"32b5ebfd-38d9-456e-bb21-7332323239d1","Type":"ContainerStarted","Data":"b1a0ce8e62f99617a982cb589ae1a57ff6616fa5fa8223b227e7b7590e8cd06d"}
Mar 13 15:16:22 crc kubenswrapper[4898]: I0313 15:16:22.236157 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-v99bm"
Mar 13 15:16:22 crc kubenswrapper[4898]: I0313 15:16:22.242806 4898 generic.go:334] "Generic (PLEG): container finished" podID="e000d86e-e7a8-49ed-9184-fdd67dfe797d" containerID="b90a231f04c7fa997bdb6c4af06b79391dc9d6f78d2d6a60c3d05ec12274e55f" exitCode=1
Mar 13 15:16:22 crc kubenswrapper[4898]: I0313 15:16:22.243010 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-cf7c75c99-qxdbx" event={"ID":"e000d86e-e7a8-49ed-9184-fdd67dfe797d","Type":"ContainerDied","Data":"b90a231f04c7fa997bdb6c4af06b79391dc9d6f78d2d6a60c3d05ec12274e55f"}
Mar 13 15:16:22 crc kubenswrapper[4898]: I0313 15:16:22.244359 4898 scope.go:117] "RemoveContainer" containerID="b90a231f04c7fa997bdb6c4af06b79391dc9d6f78d2d6a60c3d05ec12274e55f"
Mar 13 15:16:22 crc kubenswrapper[4898]: I0313 15:16:22.244591 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="catalog-operator" containerStatusID={"Type":"cri-o","ID":"f6246f6fbdbf725301f1e7d37222d66624c9ce34cacd75b24a3bb142ed798d64"} pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qtdtw" containerMessage="Container catalog-operator failed liveness probe, will be restarted"
Mar 13 15:16:22 crc kubenswrapper[4898]: I0313 15:16:22.244650 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qtdtw" podUID="7d27543e-df10-41f7-be85-dfe319aaec8a" containerName="catalog-operator" containerID="cri-o://f6246f6fbdbf725301f1e7d37222d66624c9ce34cacd75b24a3bb142ed798d64" gracePeriod=30
Mar 13 15:16:22 crc kubenswrapper[4898]: I0313 15:16:22.266317 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qtdtw"
Mar 13 15:16:22 crc kubenswrapper[4898]: I0313 15:16:22.296719 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556916-dvhfq"]
Mar 13 15:16:22 crc kubenswrapper[4898]: I0313 15:16:22.308818 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556916-dvhfq"
Mar 13 15:16:22 crc kubenswrapper[4898]: I0313 15:16:22.425580 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6895\" (UniqueName: \"kubernetes.io/projected/bb6b061a-b0db-4b84-bfc7-08238f699132-kube-api-access-d6895\") pod \"auto-csr-approver-29556916-dvhfq\" (UID: \"bb6b061a-b0db-4b84-bfc7-08238f699132\") " pod="openshift-infra/auto-csr-approver-29556916-dvhfq"
Mar 13 15:16:22 crc kubenswrapper[4898]: I0313 15:16:22.471041 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Mar 13 15:16:22 crc kubenswrapper[4898]: I0313 15:16:22.482995 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 13 15:16:22 crc kubenswrapper[4898]: I0313 15:16:22.483419 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 13 15:16:22 crc kubenswrapper[4898]: I0313 15:16:22.512403 4898 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-k6lrz container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.21:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 15:16:22 crc kubenswrapper[4898]: I0313 15:16:22.512466 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6lrz" podUID="c8b0b1cf-022c-4181-a957-2f7e172a3294" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.21:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:22 crc kubenswrapper[4898]: I0313 15:16:22.529871 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume
\"kube-api-access-d6895\" (UniqueName: \"kubernetes.io/projected/bb6b061a-b0db-4b84-bfc7-08238f699132-kube-api-access-d6895\") pod \"auto-csr-approver-29556916-dvhfq\" (UID: \"bb6b061a-b0db-4b84-bfc7-08238f699132\") " pod="openshift-infra/auto-csr-approver-29556916-dvhfq" Mar 13 15:16:22 crc kubenswrapper[4898]: I0313 15:16:22.551323 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps" Mar 13 15:16:22 crc kubenswrapper[4898]: I0313 15:16:22.809331 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6895\" (UniqueName: \"kubernetes.io/projected/bb6b061a-b0db-4b84-bfc7-08238f699132-kube-api-access-d6895\") pod \"auto-csr-approver-29556916-dvhfq\" (UID: \"bb6b061a-b0db-4b84-bfc7-08238f699132\") " pod="openshift-infra/auto-csr-approver-29556916-dvhfq" Mar 13 15:16:22 crc kubenswrapper[4898]: I0313 15:16:22.832804 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556916-dvhfq" Mar 13 15:16:23 crc kubenswrapper[4898]: I0313 15:16:23.174486 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 15:16:23 crc kubenswrapper[4898]: I0313 15:16:23.208323 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-jngrl" podUID="a80d01d5-0201-4b2e-974c-ac5b42ac8df4" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.110:8081/healthz\": dial tcp 10.217.0.110:8081: connect: connection refused" Mar 13 15:16:23 crc kubenswrapper[4898]: I0313 15:16:23.208323 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-jngrl" podUID="a80d01d5-0201-4b2e-974c-ac5b42ac8df4" containerName="manager" probeResult="failure" output="Get 
\"http://10.217.0.110:8081/readyz\": dial tcp 10.217.0.110:8081: connect: connection refused" Mar 13 15:16:23 crc kubenswrapper[4898]: I0313 15:16:23.208444 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-jngrl" Mar 13 15:16:23 crc kubenswrapper[4898]: I0313 15:16:23.208999 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-jngrl" podUID="a80d01d5-0201-4b2e-974c-ac5b42ac8df4" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.110:8081/readyz\": dial tcp 10.217.0.110:8081: connect: connection refused" Mar 13 15:16:23 crc kubenswrapper[4898]: I0313 15:16:23.285585 4898 generic.go:334] "Generic (PLEG): container finished" podID="0ab852e1-fd26-4f76-b758-77896f8e236b" containerID="5517ef51ccfa8b2748099ff5aaad46312299246fde947335af38f7f6b1623c69" exitCode=1 Mar 13 15:16:23 crc kubenswrapper[4898]: I0313 15:16:23.285694 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv" event={"ID":"0ab852e1-fd26-4f76-b758-77896f8e236b","Type":"ContainerDied","Data":"5517ef51ccfa8b2748099ff5aaad46312299246fde947335af38f7f6b1623c69"} Mar 13 15:16:23 crc kubenswrapper[4898]: I0313 15:16:23.288043 4898 scope.go:117] "RemoveContainer" containerID="5517ef51ccfa8b2748099ff5aaad46312299246fde947335af38f7f6b1623c69" Mar 13 15:16:23 crc kubenswrapper[4898]: I0313 15:16:23.294859 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7b756f97f-wjsf2" event={"ID":"0376a3d3-f3a2-4674-a7f9-b06a9e62836e","Type":"ContainerStarted","Data":"fe9cb4f935782030682843d4f6ff1e103b91fd35cda27d7fe8cbd8f803b88c10"} Mar 13 15:16:23 crc kubenswrapper[4898]: I0313 15:16:23.295317 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-route-controller-manager/route-controller-manager-7b756f97f-wjsf2" Mar 13 15:16:23 crc kubenswrapper[4898]: I0313 15:16:23.295477 4898 patch_prober.go:28] interesting pod/route-controller-manager-7b756f97f-wjsf2 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.74:8443/healthz\": dial tcp 10.217.0.74:8443: connect: connection refused" start-of-body= Mar 13 15:16:23 crc kubenswrapper[4898]: I0313 15:16:23.295518 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-7b756f97f-wjsf2" podUID="0376a3d3-f3a2-4674-a7f9-b06a9e62836e" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.74:8443/healthz\": dial tcp 10.217.0.74:8443: connect: connection refused" Mar 13 15:16:23 crc kubenswrapper[4898]: I0313 15:16:23.297287 4898 generic.go:334] "Generic (PLEG): container finished" podID="d29ce3ee-3d5a-4801-abf9-dfef5b641a74" containerID="f59d39668fd5f01a9f140fcfa1fd92ce8c2ea3f200013f1b1e08b5b0e679790d" exitCode=1 Mar 13 15:16:23 crc kubenswrapper[4898]: I0313 15:16:23.297333 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-s2rdh" event={"ID":"d29ce3ee-3d5a-4801-abf9-dfef5b641a74","Type":"ContainerDied","Data":"f59d39668fd5f01a9f140fcfa1fd92ce8c2ea3f200013f1b1e08b5b0e679790d"} Mar 13 15:16:23 crc kubenswrapper[4898]: I0313 15:16:23.298383 4898 scope.go:117] "RemoveContainer" containerID="f59d39668fd5f01a9f140fcfa1fd92ce8c2ea3f200013f1b1e08b5b0e679790d" Mar 13 15:16:23 crc kubenswrapper[4898]: I0313 15:16:23.304492 4898 generic.go:334] "Generic (PLEG): container finished" podID="7d27543e-df10-41f7-be85-dfe319aaec8a" containerID="f6246f6fbdbf725301f1e7d37222d66624c9ce34cacd75b24a3bb142ed798d64" exitCode=0 Mar 13 15:16:23 crc kubenswrapper[4898]: I0313 15:16:23.304550 4898 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qtdtw" event={"ID":"7d27543e-df10-41f7-be85-dfe319aaec8a","Type":"ContainerDied","Data":"f6246f6fbdbf725301f1e7d37222d66624c9ce34cacd75b24a3bb142ed798d64"} Mar 13 15:16:23 crc kubenswrapper[4898]: I0313 15:16:23.311164 4898 generic.go:334] "Generic (PLEG): container finished" podID="6f12557e-02f5-4445-988f-b19f16672e3b" containerID="04b36de235cccfc39b10d95f29d1b8eb5397d41b37472245250afa44100523ba" exitCode=0 Mar 13 15:16:23 crc kubenswrapper[4898]: I0313 15:16:23.311245 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-v9lxv" event={"ID":"6f12557e-02f5-4445-988f-b19f16672e3b","Type":"ContainerDied","Data":"04b36de235cccfc39b10d95f29d1b8eb5397d41b37472245250afa44100523ba"} Mar 13 15:16:23 crc kubenswrapper[4898]: I0313 15:16:23.320711 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/2.log" Mar 13 15:16:23 crc kubenswrapper[4898]: I0313 15:16:23.322782 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 13 15:16:23 crc kubenswrapper[4898]: I0313 15:16:23.323607 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"2f1658da3ae2f145888c3de5fc2918adc223e0f1999ca74ae5bb585f272622e4"} Mar 13 15:16:23 crc kubenswrapper[4898]: I0313 15:16:23.330601 4898 generic.go:334] "Generic (PLEG): container finished" podID="02f7d483-aecb-4a39-babc-6d9598090c4b" containerID="7ce3438fd9d4e3db5da0a65bc2744dcbc537f4c6c98b27b1433e8ed964fd1ed3" exitCode=0 Mar 13 15:16:23 crc 
kubenswrapper[4898]: I0313 15:16:23.330660 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"02f7d483-aecb-4a39-babc-6d9598090c4b","Type":"ContainerDied","Data":"7ce3438fd9d4e3db5da0a65bc2744dcbc537f4c6c98b27b1433e8ed964fd1ed3"} Mar 13 15:16:23 crc kubenswrapper[4898]: I0313 15:16:23.334963 4898 generic.go:334] "Generic (PLEG): container finished" podID="c8b0b1cf-022c-4181-a957-2f7e172a3294" containerID="533f9c0d5d010c2b5018f71aab66fa3025d2e9a7d36d33fe0f8634b0b2bbb7ec" exitCode=0 Mar 13 15:16:23 crc kubenswrapper[4898]: I0313 15:16:23.335025 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6lrz" event={"ID":"c8b0b1cf-022c-4181-a957-2f7e172a3294","Type":"ContainerDied","Data":"533f9c0d5d010c2b5018f71aab66fa3025d2e9a7d36d33fe0f8634b0b2bbb7ec"} Mar 13 15:16:23 crc kubenswrapper[4898]: I0313 15:16:23.338023 4898 generic.go:334] "Generic (PLEG): container finished" podID="a80d01d5-0201-4b2e-974c-ac5b42ac8df4" containerID="d2bed079a6a5a2e515d93d89a9a982d254f51045181d7bcb6277083a331b61e4" exitCode=1 Mar 13 15:16:23 crc kubenswrapper[4898]: I0313 15:16:23.338579 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-jngrl" event={"ID":"a80d01d5-0201-4b2e-974c-ac5b42ac8df4","Type":"ContainerDied","Data":"d2bed079a6a5a2e515d93d89a9a982d254f51045181d7bcb6277083a331b61e4"} Mar 13 15:16:23 crc kubenswrapper[4898]: I0313 15:16:23.339552 4898 scope.go:117] "RemoveContainer" containerID="d2bed079a6a5a2e515d93d89a9a982d254f51045181d7bcb6277083a331b61e4" Mar 13 15:16:23 crc kubenswrapper[4898]: I0313 15:16:23.508265 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556916-dvhfq"] Mar 13 15:16:23 crc kubenswrapper[4898]: I0313 15:16:23.881459 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-s2rdh" Mar 13 15:16:23 crc kubenswrapper[4898]: I0313 15:16:23.881793 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-s2rdh" Mar 13 15:16:24 crc kubenswrapper[4898]: I0313 15:16:24.170579 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-jndb5" podUID="2157e8bf-88a5-4e48-b621-1744dcf0fcdb" containerName="registry-server" probeResult="failure" output=< Mar 13 15:16:24 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 15:16:24 crc kubenswrapper[4898]: > Mar 13 15:16:24 crc kubenswrapper[4898]: I0313 15:16:24.351378 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-jngrl" event={"ID":"a80d01d5-0201-4b2e-974c-ac5b42ac8df4","Type":"ContainerStarted","Data":"57dd0f00abea15d0102c1f6686fe41b17154ca27e77edac4d3f07ccf4e0e9a30"} Mar 13 15:16:24 crc kubenswrapper[4898]: I0313 15:16:24.351677 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-jngrl" Mar 13 15:16:24 crc kubenswrapper[4898]: I0313 15:16:24.358673 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qtdtw" event={"ID":"7d27543e-df10-41f7-be85-dfe319aaec8a","Type":"ContainerStarted","Data":"97f9af41b669994e028cd2913281b09704a6416bf06091e5bbc3ac6ac3674dc6"} Mar 13 15:16:24 crc kubenswrapper[4898]: I0313 15:16:24.359421 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qtdtw" Mar 13 15:16:24 crc kubenswrapper[4898]: I0313 15:16:24.361485 4898 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-qtdtw container/catalog-operator 
namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Mar 13 15:16:24 crc kubenswrapper[4898]: I0313 15:16:24.361560 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qtdtw" podUID="7d27543e-df10-41f7-be85-dfe319aaec8a" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" Mar 13 15:16:24 crc kubenswrapper[4898]: I0313 15:16:24.373365 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-cf7c75c99-qxdbx" event={"ID":"e000d86e-e7a8-49ed-9184-fdd67dfe797d","Type":"ContainerStarted","Data":"2cf4d6c27dfc0f01d733dd2ec60bf38c3ede882f93d64c0e31fc8677931d4396"} Mar 13 15:16:24 crc kubenswrapper[4898]: I0313 15:16:24.374195 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-cf7c75c99-qxdbx" Mar 13 15:16:24 crc kubenswrapper[4898]: I0313 15:16:24.377384 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-v9lxv" event={"ID":"6f12557e-02f5-4445-988f-b19f16672e3b","Type":"ContainerStarted","Data":"b31d17031b09e08e43b60bd3c1e2de112528a6f18a6b414ec43a7e632a1eb2a8"} Mar 13 15:16:24 crc kubenswrapper[4898]: I0313 15:16:24.381008 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6lrz" event={"ID":"c8b0b1cf-022c-4181-a957-2f7e172a3294","Type":"ContainerStarted","Data":"e071b990987923ab86abcea4120d15a5572916972d1bc0013951d7f1eb0b37a1"} Mar 13 15:16:24 crc kubenswrapper[4898]: I0313 15:16:24.381370 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6lrz" Mar 13 15:16:24 crc kubenswrapper[4898]: I0313 15:16:24.381558 4898 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-k6lrz container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.21:8443/healthz\": dial tcp 10.217.0.21:8443: connect: connection refused" start-of-body= Mar 13 15:16:24 crc kubenswrapper[4898]: I0313 15:16:24.381603 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6lrz" podUID="c8b0b1cf-022c-4181-a957-2f7e172a3294" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.21:8443/healthz\": dial tcp 10.217.0.21:8443: connect: connection refused" Mar 13 15:16:24 crc kubenswrapper[4898]: I0313 15:16:24.383043 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv" event={"ID":"0ab852e1-fd26-4f76-b758-77896f8e236b","Type":"ContainerStarted","Data":"7b52a84dc05610996a0e88a7808021878a6fff6c922305317623864c3cc9491b"} Mar 13 15:16:24 crc kubenswrapper[4898]: I0313 15:16:24.383873 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv" Mar 13 15:16:24 crc kubenswrapper[4898]: I0313 15:16:24.390195 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-s2rdh" event={"ID":"d29ce3ee-3d5a-4801-abf9-dfef5b641a74","Type":"ContainerStarted","Data":"068eab642ca7d18702fbe49598e8364e743f264af5b01ef0310ed4c5d5f66912"} Mar 13 15:16:24 crc kubenswrapper[4898]: I0313 15:16:24.390753 4898 patch_prober.go:28] interesting pod/route-controller-manager-7b756f97f-wjsf2 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe 
status=failure output="Get \"https://10.217.0.74:8443/healthz\": dial tcp 10.217.0.74:8443: connect: connection refused" start-of-body= Mar 13 15:16:24 crc kubenswrapper[4898]: I0313 15:16:24.390798 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-7b756f97f-wjsf2" podUID="0376a3d3-f3a2-4674-a7f9-b06a9e62836e" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.74:8443/healthz\": dial tcp 10.217.0.74:8443: connect: connection refused" Mar 13 15:16:24 crc kubenswrapper[4898]: I0313 15:16:24.603146 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f" containerName="galera" containerID="cri-o://5603fe1e36b001c885a705c95d9e330d44a889a46589d0677416ea907bf21af1" gracePeriod=27 Mar 13 15:16:24 crc kubenswrapper[4898]: I0313 15:16:24.617280 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="e5d53cf3-113e-4391-b3a9-4e1f81836e26" containerName="galera" containerID="cri-o://754d039d0115d17f3272d5d60732b4f70ed1599a25b6aa63fdbeafe8a92c023c" gracePeriod=26 Mar 13 15:16:25 crc kubenswrapper[4898]: I0313 15:16:25.084055 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-scheduler-0" podUID="a9a7064c-4ed5-4948-9e7e-7d40794e371e" containerName="cinder-scheduler" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 15:16:25 crc kubenswrapper[4898]: I0313 15:16:25.400378 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-ljrtz" event={"ID":"3bfc0332-bb59-42bf-bb70-462efa225c81","Type":"ContainerStarted","Data":"d02edb2f50df8c11d1dd354b0da5f4dcf8478960efc66d23360e9ccc032a7106"} Mar 13 15:16:25 crc kubenswrapper[4898]: I0313 15:16:25.400756 4898 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-ljrtz" Mar 13 15:16:25 crc kubenswrapper[4898]: I0313 15:16:25.400888 4898 patch_prober.go:28] interesting pod/observability-operator-59bdc8b94-ljrtz container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.16:8081/healthz\": dial tcp 10.217.0.16:8081: connect: connection refused" start-of-body= Mar 13 15:16:25 crc kubenswrapper[4898]: I0313 15:16:25.400960 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-59bdc8b94-ljrtz" podUID="3bfc0332-bb59-42bf-bb70-462efa225c81" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.16:8081/healthz\": dial tcp 10.217.0.16:8081: connect: connection refused" Mar 13 15:16:25 crc kubenswrapper[4898]: I0313 15:16:25.402836 4898 patch_prober.go:28] interesting pod/route-controller-manager-7b756f97f-wjsf2 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.74:8443/healthz\": dial tcp 10.217.0.74:8443: connect: connection refused" start-of-body= Mar 13 15:16:25 crc kubenswrapper[4898]: I0313 15:16:25.402880 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-7b756f97f-wjsf2" podUID="0376a3d3-f3a2-4674-a7f9-b06a9e62836e" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.74:8443/healthz\": dial tcp 10.217.0.74:8443: connect: connection refused" Mar 13 15:16:25 crc kubenswrapper[4898]: I0313 15:16:25.404101 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"02f7d483-aecb-4a39-babc-6d9598090c4b","Type":"ContainerStarted","Data":"b895aa795c06776bb698b15151bf2914299b97db6c949697d18679898bb8b085"} Mar 13 15:16:25 crc kubenswrapper[4898]: I0313 15:16:25.404591 4898 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-s2rdh" Mar 13 15:16:25 crc kubenswrapper[4898]: I0313 15:16:25.405644 4898 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-k6lrz container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.21:8443/healthz\": dial tcp 10.217.0.21:8443: connect: connection refused" start-of-body= Mar 13 15:16:25 crc kubenswrapper[4898]: I0313 15:16:25.405656 4898 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-qtdtw container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Mar 13 15:16:25 crc kubenswrapper[4898]: I0313 15:16:25.405697 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6lrz" podUID="c8b0b1cf-022c-4181-a957-2f7e172a3294" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.21:8443/healthz\": dial tcp 10.217.0.21:8443: connect: connection refused" Mar 13 15:16:25 crc kubenswrapper[4898]: I0313 15:16:25.405697 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qtdtw" podUID="7d27543e-df10-41f7-be85-dfe319aaec8a" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" Mar 13 15:16:26 crc kubenswrapper[4898]: I0313 15:16:26.421474 4898 patch_prober.go:28] interesting pod/observability-operator-59bdc8b94-ljrtz container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.16:8081/healthz\": dial tcp 10.217.0.16:8081: connect: connection refused" start-of-body= Mar 13 
15:16:26 crc kubenswrapper[4898]: I0313 15:16:26.422317 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-59bdc8b94-ljrtz" podUID="3bfc0332-bb59-42bf-bb70-462efa225c81" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.16:8081/healthz\": dial tcp 10.217.0.16:8081: connect: connection refused" Mar 13 15:16:26 crc kubenswrapper[4898]: I0313 15:16:26.627631 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-bqmxg" Mar 13 15:16:27 crc kubenswrapper[4898]: I0313 15:16:27.435565 4898 generic.go:334] "Generic (PLEG): container finished" podID="6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f" containerID="5603fe1e36b001c885a705c95d9e330d44a889a46589d0677416ea907bf21af1" exitCode=0 Mar 13 15:16:27 crc kubenswrapper[4898]: I0313 15:16:27.435681 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f","Type":"ContainerDied","Data":"5603fe1e36b001c885a705c95d9e330d44a889a46589d0677416ea907bf21af1"} Mar 13 15:16:27 crc kubenswrapper[4898]: I0313 15:16:27.788698 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-scheduler-0" podUID="a9a7064c-4ed5-4948-9e7e-7d40794e371e" containerName="cinder-scheduler" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 15:16:28 crc kubenswrapper[4898]: I0313 15:16:28.347701 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 15:16:28 crc kubenswrapper[4898]: I0313 15:16:28.462730 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f","Type":"ContainerStarted","Data":"df908f851dea57055c3789e36a48d0a1ca0c8b55e694ae40d9f47ed21423202a"} Mar 13 15:16:28 crc kubenswrapper[4898]: I0313 15:16:28.474457 4898 
generic.go:334] "Generic (PLEG): container finished" podID="e5d53cf3-113e-4391-b3a9-4e1f81836e26" containerID="754d039d0115d17f3272d5d60732b4f70ed1599a25b6aa63fdbeafe8a92c023c" exitCode=0 Mar 13 15:16:28 crc kubenswrapper[4898]: I0313 15:16:28.474503 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e5d53cf3-113e-4391-b3a9-4e1f81836e26","Type":"ContainerDied","Data":"754d039d0115d17f3272d5d60732b4f70ed1599a25b6aa63fdbeafe8a92c023c"} Mar 13 15:16:28 crc kubenswrapper[4898]: I0313 15:16:28.506989 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556916-dvhfq"] Mar 13 15:16:28 crc kubenswrapper[4898]: W0313 15:16:28.570537 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb6b061a_b0db_4b84_bfc7_08238f699132.slice/crio-7309bc6a6f15fd15c242b030d405b5478267924b192042e934072170e618ae49 WatchSource:0}: Error finding container 7309bc6a6f15fd15c242b030d405b5478267924b192042e934072170e618ae49: Status 404 returned error can't find the container with id 7309bc6a6f15fd15c242b030d405b5478267924b192042e934072170e618ae49 Mar 13 15:16:28 crc kubenswrapper[4898]: E0313 15:16:28.618977 4898 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 754d039d0115d17f3272d5d60732b4f70ed1599a25b6aa63fdbeafe8a92c023c is running failed: container process not found" containerID="754d039d0115d17f3272d5d60732b4f70ed1599a25b6aa63fdbeafe8a92c023c" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Mar 13 15:16:28 crc kubenswrapper[4898]: E0313 15:16:28.620357 4898 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 754d039d0115d17f3272d5d60732b4f70ed1599a25b6aa63fdbeafe8a92c023c is running failed: container 
process not found" containerID="754d039d0115d17f3272d5d60732b4f70ed1599a25b6aa63fdbeafe8a92c023c" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Mar 13 15:16:28 crc kubenswrapper[4898]: E0313 15:16:28.620724 4898 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 754d039d0115d17f3272d5d60732b4f70ed1599a25b6aa63fdbeafe8a92c023c is running failed: container process not found" containerID="754d039d0115d17f3272d5d60732b4f70ed1599a25b6aa63fdbeafe8a92c023c" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Mar 13 15:16:28 crc kubenswrapper[4898]: E0313 15:16:28.620784 4898 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 754d039d0115d17f3272d5d60732b4f70ed1599a25b6aa63fdbeafe8a92c023c is running failed: container process not found" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="e5d53cf3-113e-4391-b3a9-4e1f81836e26" containerName="galera" Mar 13 15:16:29 crc kubenswrapper[4898]: I0313 15:16:29.231227 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-ljrtz" Mar 13 15:16:29 crc kubenswrapper[4898]: I0313 15:16:29.231948 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 15:16:29 crc kubenswrapper[4898]: I0313 15:16:29.342505 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-nkt76" Mar 13 15:16:29 crc kubenswrapper[4898]: I0313 15:16:29.451926 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 15:16:29 crc kubenswrapper[4898]: I0313 15:16:29.494186 4898 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-infra/auto-csr-approver-29556916-dvhfq" event={"ID":"bb6b061a-b0db-4b84-bfc7-08238f699132","Type":"ContainerStarted","Data":"7309bc6a6f15fd15c242b030d405b5478267924b192042e934072170e618ae49"} Mar 13 15:16:29 crc kubenswrapper[4898]: I0313 15:16:29.511406 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e5d53cf3-113e-4391-b3a9-4e1f81836e26","Type":"ContainerStarted","Data":"4d25705f3ea5640e3bd4269670bc56e2661a3c0a18a96ecdd41b77899dbed12d"} Mar 13 15:16:29 crc kubenswrapper[4898]: I0313 15:16:29.555116 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv" Mar 13 15:16:29 crc kubenswrapper[4898]: I0313 15:16:29.624313 4898 patch_prober.go:28] interesting pod/router-default-5444994796-6plhg container/router namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]backend-http ok Mar 13 15:16:29 crc kubenswrapper[4898]: [+]has-synced ok Mar 13 15:16:29 crc kubenswrapper[4898]: [-]process-running failed: reason withheld Mar 13 15:16:29 crc kubenswrapper[4898]: healthz check failed Mar 13 15:16:29 crc kubenswrapper[4898]: I0313 15:16:29.624374 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ingress/router-default-5444994796-6plhg" podUID="18e5c8bf-9fe0-465e-af8f-9e7ec7400be8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 15:16:29 crc kubenswrapper[4898]: I0313 15:16:29.668176 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-595dc77696-pft4c" Mar 13 15:16:29 crc kubenswrapper[4898]: I0313 15:16:29.846521 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 13 15:16:29 crc kubenswrapper[4898]: I0313 
15:16:29.846580 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 13 15:16:30 crc kubenswrapper[4898]: I0313 15:16:30.409294 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qtdtw" Mar 13 15:16:30 crc kubenswrapper[4898]: I0313 15:16:30.436962 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6lrz" Mar 13 15:16:30 crc kubenswrapper[4898]: I0313 15:16:30.445401 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd" Mar 13 15:16:30 crc kubenswrapper[4898]: I0313 15:16:30.524425 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556916-dvhfq" event={"ID":"bb6b061a-b0db-4b84-bfc7-08238f699132","Type":"ContainerStarted","Data":"bf083e1d2202ec3f40b443b0422cd0440a225764d3bb5e0e49d48d4861f197f0"} Mar 13 15:16:30 crc kubenswrapper[4898]: I0313 15:16:30.528440 4898 generic.go:334] "Generic (PLEG): container finished" podID="cb46c8b0-a6a9-4b6d-86a1-8408793887e5" containerID="aebe307d7887ba7796f4fad329402cc88b0d4332f82fab0c321325f23bf1adea" exitCode=0 Mar 13 15:16:30 crc kubenswrapper[4898]: I0313 15:16:30.528524 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dnh48" event={"ID":"cb46c8b0-a6a9-4b6d-86a1-8408793887e5","Type":"ContainerDied","Data":"aebe307d7887ba7796f4fad329402cc88b0d4332f82fab0c321325f23bf1adea"} Mar 13 15:16:30 crc kubenswrapper[4898]: I0313 15:16:30.550165 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556916-dvhfq" podStartSLOduration=21.695304056 podStartE2EDuration="22.550145652s" podCreationTimestamp="2026-03-13 15:16:08 +0000 UTC" 
firstStartedPulling="2026-03-13 15:16:28.579979315 +0000 UTC m=+4823.581567554" lastFinishedPulling="2026-03-13 15:16:29.434820921 +0000 UTC m=+4824.436409150" observedRunningTime="2026-03-13 15:16:30.542332455 +0000 UTC m=+4825.543920704" watchObservedRunningTime="2026-03-13 15:16:30.550145652 +0000 UTC m=+4825.551733891" Mar 13 15:16:30 crc kubenswrapper[4898]: I0313 15:16:30.623265 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-scheduler-0" podUID="a9a7064c-4ed5-4948-9e7e-7d40794e371e" containerName="cinder-scheduler" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 15:16:30 crc kubenswrapper[4898]: I0313 15:16:30.623400 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 13 15:16:30 crc kubenswrapper[4898]: I0313 15:16:30.624686 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cinder-scheduler" containerStatusID={"Type":"cri-o","ID":"6b88f7ff52506e35b8cbb7e231ccb1285ae276c1a9d69c02b14ca5282f74157d"} pod="openstack/cinder-scheduler-0" containerMessage="Container cinder-scheduler failed liveness probe, will be restarted" Mar 13 15:16:30 crc kubenswrapper[4898]: I0313 15:16:30.624749 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="a9a7064c-4ed5-4948-9e7e-7d40794e371e" containerName="cinder-scheduler" containerID="cri-o://6b88f7ff52506e35b8cbb7e231ccb1285ae276c1a9d69c02b14ca5282f74157d" gracePeriod=30 Mar 13 15:16:31 crc kubenswrapper[4898]: I0313 15:16:31.232079 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-jndb5" podUID="2157e8bf-88a5-4e48-b621-1744dcf0fcdb" containerName="registry-server" probeResult="failure" output=< Mar 13 15:16:31 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 15:16:31 crc kubenswrapper[4898]: > Mar 13 15:16:31 crc 
kubenswrapper[4898]: I0313 15:16:31.561056 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-5444994796-6plhg_18e5c8bf-9fe0-465e-af8f-9e7ec7400be8/router/0.log" Mar 13 15:16:31 crc kubenswrapper[4898]: I0313 15:16:31.561426 4898 generic.go:334] "Generic (PLEG): container finished" podID="18e5c8bf-9fe0-465e-af8f-9e7ec7400be8" containerID="c62d0b0db8a023eca0369719b9dd81ab2eadb339e637fa7223f0ac36593bb07b" exitCode=137 Mar 13 15:16:31 crc kubenswrapper[4898]: I0313 15:16:31.561528 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-6plhg" event={"ID":"18e5c8bf-9fe0-465e-af8f-9e7ec7400be8","Type":"ContainerDied","Data":"c62d0b0db8a023eca0369719b9dd81ab2eadb339e637fa7223f0ac36593bb07b"} Mar 13 15:16:31 crc kubenswrapper[4898]: I0313 15:16:31.561624 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-6plhg" event={"ID":"18e5c8bf-9fe0-465e-af8f-9e7ec7400be8","Type":"ContainerStarted","Data":"11acc6c7309a70dd02a02a8791913b33319c75a2620dd25f8f262fb055feaf2d"} Mar 13 15:16:31 crc kubenswrapper[4898]: I0313 15:16:31.610710 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-6plhg" Mar 13 15:16:31 crc kubenswrapper[4898]: I0313 15:16:31.612240 4898 patch_prober.go:28] interesting pod/router-default-5444994796-6plhg container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Mar 13 15:16:31 crc kubenswrapper[4898]: I0313 15:16:31.612293 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6plhg" podUID="18e5c8bf-9fe0-465e-af8f-9e7ec7400be8" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection 
refused" Mar 13 15:16:32 crc kubenswrapper[4898]: I0313 15:16:32.574754 4898 generic.go:334] "Generic (PLEG): container finished" podID="bb6b061a-b0db-4b84-bfc7-08238f699132" containerID="bf083e1d2202ec3f40b443b0422cd0440a225764d3bb5e0e49d48d4861f197f0" exitCode=0 Mar 13 15:16:32 crc kubenswrapper[4898]: I0313 15:16:32.574826 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556916-dvhfq" event={"ID":"bb6b061a-b0db-4b84-bfc7-08238f699132","Type":"ContainerDied","Data":"bf083e1d2202ec3f40b443b0422cd0440a225764d3bb5e0e49d48d4861f197f0"} Mar 13 15:16:32 crc kubenswrapper[4898]: I0313 15:16:32.616659 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-6plhg" Mar 13 15:16:33 crc kubenswrapper[4898]: I0313 15:16:33.210015 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-jngrl" Mar 13 15:16:33 crc kubenswrapper[4898]: I0313 15:16:33.500235 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-v99bm" Mar 13 15:16:33 crc kubenswrapper[4898]: I0313 15:16:33.617841 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dnh48" event={"ID":"cb46c8b0-a6a9-4b6d-86a1-8408793887e5","Type":"ContainerStarted","Data":"e863d80604f47eb74bf43cf0b02ad573ba531995e209d5a2d54eb9c4f9740dde"} Mar 13 15:16:33 crc kubenswrapper[4898]: I0313 15:16:33.618635 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-6plhg" Mar 13 15:16:33 crc kubenswrapper[4898]: I0313 15:16:33.623131 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-6plhg" Mar 13 15:16:33 crc kubenswrapper[4898]: I0313 15:16:33.647148 4898 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dnh48" podStartSLOduration=6.05816232 podStartE2EDuration="1m11.647128771s" podCreationTimestamp="2026-03-13 15:15:22 +0000 UTC" firstStartedPulling="2026-03-13 15:15:26.668082389 +0000 UTC m=+4761.669670628" lastFinishedPulling="2026-03-13 15:16:32.25704884 +0000 UTC m=+4827.258637079" observedRunningTime="2026-03-13 15:16:33.64178903 +0000 UTC m=+4828.643377289" watchObservedRunningTime="2026-03-13 15:16:33.647128771 +0000 UTC m=+4828.648717010" Mar 13 15:16:33 crc kubenswrapper[4898]: I0313 15:16:33.882753 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dnh48" Mar 13 15:16:33 crc kubenswrapper[4898]: I0313 15:16:33.882865 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dnh48" Mar 13 15:16:33 crc kubenswrapper[4898]: I0313 15:16:33.885110 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-s2rdh" Mar 13 15:16:34 crc kubenswrapper[4898]: I0313 15:16:34.029000 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-smdkt" Mar 13 15:16:34 crc kubenswrapper[4898]: I0313 15:16:34.490540 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556916-dvhfq" Mar 13 15:16:34 crc kubenswrapper[4898]: I0313 15:16:34.637803 4898 generic.go:334] "Generic (PLEG): container finished" podID="a9a7064c-4ed5-4948-9e7e-7d40794e371e" containerID="6b88f7ff52506e35b8cbb7e231ccb1285ae276c1a9d69c02b14ca5282f74157d" exitCode=0 Mar 13 15:16:34 crc kubenswrapper[4898]: I0313 15:16:34.637921 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a9a7064c-4ed5-4948-9e7e-7d40794e371e","Type":"ContainerDied","Data":"6b88f7ff52506e35b8cbb7e231ccb1285ae276c1a9d69c02b14ca5282f74157d"} Mar 13 15:16:34 crc kubenswrapper[4898]: I0313 15:16:34.640206 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556916-dvhfq" event={"ID":"bb6b061a-b0db-4b84-bfc7-08238f699132","Type":"ContainerDied","Data":"7309bc6a6f15fd15c242b030d405b5478267924b192042e934072170e618ae49"} Mar 13 15:16:34 crc kubenswrapper[4898]: I0313 15:16:34.640399 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556916-dvhfq" Mar 13 15:16:34 crc kubenswrapper[4898]: I0313 15:16:34.640770 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7309bc6a6f15fd15c242b030d405b5478267924b192042e934072170e618ae49" Mar 13 15:16:34 crc kubenswrapper[4898]: I0313 15:16:34.658974 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6895\" (UniqueName: \"kubernetes.io/projected/bb6b061a-b0db-4b84-bfc7-08238f699132-kube-api-access-d6895\") pod \"bb6b061a-b0db-4b84-bfc7-08238f699132\" (UID: \"bb6b061a-b0db-4b84-bfc7-08238f699132\") " Mar 13 15:16:34 crc kubenswrapper[4898]: I0313 15:16:34.684306 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb6b061a-b0db-4b84-bfc7-08238f699132-kube-api-access-d6895" (OuterVolumeSpecName: "kube-api-access-d6895") pod "bb6b061a-b0db-4b84-bfc7-08238f699132" (UID: "bb6b061a-b0db-4b84-bfc7-08238f699132"). InnerVolumeSpecName "kube-api-access-d6895". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:16:34 crc kubenswrapper[4898]: I0313 15:16:34.767320 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6895\" (UniqueName: \"kubernetes.io/projected/bb6b061a-b0db-4b84-bfc7-08238f699132-kube-api-access-d6895\") on node \"crc\" DevicePath \"\"" Mar 13 15:16:34 crc kubenswrapper[4898]: I0313 15:16:34.959546 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dnh48" podUID="cb46c8b0-a6a9-4b6d-86a1-8408793887e5" containerName="registry-server" probeResult="failure" output=< Mar 13 15:16:34 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 15:16:34 crc kubenswrapper[4898]: > Mar 13 15:16:35 crc kubenswrapper[4898]: I0313 15:16:35.408741 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7b756f97f-wjsf2" Mar 13 15:16:36 crc kubenswrapper[4898]: I0313 15:16:36.669920 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a9a7064c-4ed5-4948-9e7e-7d40794e371e","Type":"ContainerStarted","Data":"73adc53aef32c0108da19cf6e8c79fa9d99ea5ab3af2ff7e8bfab2d2ff3c28cc"} Mar 13 15:16:38 crc kubenswrapper[4898]: I0313 15:16:38.350872 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 15:16:38 crc kubenswrapper[4898]: I0313 15:16:38.618444 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 13 15:16:38 crc kubenswrapper[4898]: I0313 15:16:38.620155 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 13 15:16:39 crc kubenswrapper[4898]: I0313 15:16:39.608118 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" 
Mar 13 15:16:41 crc kubenswrapper[4898]: I0313 15:16:41.127457 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-jndb5" podUID="2157e8bf-88a5-4e48-b621-1744dcf0fcdb" containerName="registry-server" probeResult="failure" output=< Mar 13 15:16:41 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 15:16:41 crc kubenswrapper[4898]: > Mar 13 15:16:44 crc kubenswrapper[4898]: I0313 15:16:44.631763 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/cinder-scheduler-0" podUID="a9a7064c-4ed5-4948-9e7e-7d40794e371e" containerName="cinder-scheduler" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 15:16:45 crc kubenswrapper[4898]: I0313 15:16:45.372654 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dnh48" podUID="cb46c8b0-a6a9-4b6d-86a1-8408793887e5" containerName="registry-server" probeResult="failure" output=< Mar 13 15:16:45 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 15:16:45 crc kubenswrapper[4898]: > Mar 13 15:16:47 crc kubenswrapper[4898]: I0313 15:16:47.070583 4898 scope.go:117] "RemoveContainer" containerID="ef58acf8d4ae6204788d3fdc59aecb3462e5ceb48474e990c3e92ca73408a0f8" Mar 13 15:16:47 crc kubenswrapper[4898]: I0313 15:16:47.125632 4898 scope.go:117] "RemoveContainer" containerID="26a0efcf86b49360d3ca0f6db51f8be8241064695e81797fdbbea93b417ae346" Mar 13 15:16:47 crc kubenswrapper[4898]: I0313 15:16:47.213445 4898 scope.go:117] "RemoveContainer" containerID="1131b0202620c67ed5c6fc2a5f10369017ac4f0c291aa0a69276d20706cb8627" Mar 13 15:16:49 crc kubenswrapper[4898]: I0313 15:16:49.133861 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 13 15:16:49 crc kubenswrapper[4898]: I0313 15:16:49.134197 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 15:16:49 crc kubenswrapper[4898]: I0313 15:16:49.627647 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/cinder-scheduler-0" podUID="a9a7064c-4ed5-4948-9e7e-7d40794e371e" containerName="cinder-scheduler" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 15:16:50 crc kubenswrapper[4898]: I0313 15:16:50.292053 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 13 15:16:50 crc kubenswrapper[4898]: I0313 15:16:50.513553 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 13 15:16:50 crc kubenswrapper[4898]: I0313 15:16:50.754948 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 13 15:16:50 crc kubenswrapper[4898]: I0313 15:16:50.998059 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 13 15:16:51 crc kubenswrapper[4898]: I0313 15:16:51.185658 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-jndb5" podUID="2157e8bf-88a5-4e48-b621-1744dcf0fcdb" containerName="registry-server" probeResult="failure" output=< Mar 13 15:16:51 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 15:16:51 crc kubenswrapper[4898]: > Mar 13 15:16:54 crc kubenswrapper[4898]: I0313 15:16:54.680623 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/cinder-scheduler-0" Mar 13 15:16:54 crc kubenswrapper[4898]: I0313 15:16:54.805650 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-cf7c75c99-qxdbx" Mar 13 15:16:54 crc kubenswrapper[4898]: I0313 15:16:54.954070 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dnh48" podUID="cb46c8b0-a6a9-4b6d-86a1-8408793887e5" containerName="registry-server" probeResult="failure" output=< Mar 13 15:16:54 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 15:16:54 crc kubenswrapper[4898]: > Mar 13 15:16:55 crc kubenswrapper[4898]: I0313 15:16:55.077489 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556910-lcc5d"] Mar 13 15:16:55 crc kubenswrapper[4898]: I0313 15:16:55.091814 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556910-lcc5d"] Mar 13 15:16:55 crc kubenswrapper[4898]: I0313 15:16:55.760875 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a36f55a-ce22-4339-967f-906f473ddad5" path="/var/lib/kubelet/pods/0a36f55a-ce22-4339-967f-906f473ddad5/volumes" Mar 13 15:16:55 crc kubenswrapper[4898]: I0313 15:16:55.934728 4898 generic.go:334] "Generic (PLEG): container finished" podID="d19e8770-f0c1-491e-96c9-f737386ab3b0" containerID="f6562a9a91d72757d77dbf107969b6ab33c4a3a7219f9b57d0df0ee10184af60" exitCode=1 Mar 13 15:16:55 crc kubenswrapper[4898]: I0313 15:16:55.934932 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"d19e8770-f0c1-491e-96c9-f737386ab3b0","Type":"ContainerDied","Data":"f6562a9a91d72757d77dbf107969b6ab33c4a3a7219f9b57d0df0ee10184af60"} Mar 13 15:16:57 crc kubenswrapper[4898]: I0313 15:16:57.829728 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 13 15:16:57 crc kubenswrapper[4898]: I0313 15:16:57.853927 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/d19e8770-f0c1-491e-96c9-f737386ab3b0-test-operator-ephemeral-workdir\") pod \"d19e8770-f0c1-491e-96c9-f737386ab3b0\" (UID: \"d19e8770-f0c1-491e-96c9-f737386ab3b0\") " Mar 13 15:16:57 crc kubenswrapper[4898]: I0313 15:16:57.853984 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d19e8770-f0c1-491e-96c9-f737386ab3b0-ssh-key\") pod \"d19e8770-f0c1-491e-96c9-f737386ab3b0\" (UID: \"d19e8770-f0c1-491e-96c9-f737386ab3b0\") " Mar 13 15:16:57 crc kubenswrapper[4898]: I0313 15:16:57.854050 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/d19e8770-f0c1-491e-96c9-f737386ab3b0-ca-certs\") pod \"d19e8770-f0c1-491e-96c9-f737386ab3b0\" (UID: \"d19e8770-f0c1-491e-96c9-f737386ab3b0\") " Mar 13 15:16:57 crc kubenswrapper[4898]: I0313 15:16:57.854098 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"d19e8770-f0c1-491e-96c9-f737386ab3b0\" (UID: \"d19e8770-f0c1-491e-96c9-f737386ab3b0\") " Mar 13 15:16:57 crc kubenswrapper[4898]: I0313 15:16:57.854329 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqc45\" (UniqueName: \"kubernetes.io/projected/d19e8770-f0c1-491e-96c9-f737386ab3b0-kube-api-access-mqc45\") pod \"d19e8770-f0c1-491e-96c9-f737386ab3b0\" (UID: \"d19e8770-f0c1-491e-96c9-f737386ab3b0\") " Mar 13 15:16:57 crc kubenswrapper[4898]: I0313 15:16:57.854371 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/d19e8770-f0c1-491e-96c9-f737386ab3b0-config-data\") pod \"d19e8770-f0c1-491e-96c9-f737386ab3b0\" (UID: \"d19e8770-f0c1-491e-96c9-f737386ab3b0\") " Mar 13 15:16:57 crc kubenswrapper[4898]: I0313 15:16:57.854396 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/d19e8770-f0c1-491e-96c9-f737386ab3b0-test-operator-ephemeral-temporary\") pod \"d19e8770-f0c1-491e-96c9-f737386ab3b0\" (UID: \"d19e8770-f0c1-491e-96c9-f737386ab3b0\") " Mar 13 15:16:57 crc kubenswrapper[4898]: I0313 15:16:57.854454 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d19e8770-f0c1-491e-96c9-f737386ab3b0-openstack-config\") pod \"d19e8770-f0c1-491e-96c9-f737386ab3b0\" (UID: \"d19e8770-f0c1-491e-96c9-f737386ab3b0\") " Mar 13 15:16:57 crc kubenswrapper[4898]: I0313 15:16:57.854508 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d19e8770-f0c1-491e-96c9-f737386ab3b0-openstack-config-secret\") pod \"d19e8770-f0c1-491e-96c9-f737386ab3b0\" (UID: \"d19e8770-f0c1-491e-96c9-f737386ab3b0\") " Mar 13 15:16:57 crc kubenswrapper[4898]: I0313 15:16:57.858458 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d19e8770-f0c1-491e-96c9-f737386ab3b0-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "d19e8770-f0c1-491e-96c9-f737386ab3b0" (UID: "d19e8770-f0c1-491e-96c9-f737386ab3b0"). InnerVolumeSpecName "test-operator-ephemeral-temporary". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:16:57 crc kubenswrapper[4898]: I0313 15:16:57.858660 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d19e8770-f0c1-491e-96c9-f737386ab3b0-config-data" (OuterVolumeSpecName: "config-data") pod "d19e8770-f0c1-491e-96c9-f737386ab3b0" (UID: "d19e8770-f0c1-491e-96c9-f737386ab3b0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:16:57 crc kubenswrapper[4898]: I0313 15:16:57.865704 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d19e8770-f0c1-491e-96c9-f737386ab3b0-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "d19e8770-f0c1-491e-96c9-f737386ab3b0" (UID: "d19e8770-f0c1-491e-96c9-f737386ab3b0"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:16:57 crc kubenswrapper[4898]: I0313 15:16:57.874926 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "test-operator-logs") pod "d19e8770-f0c1-491e-96c9-f737386ab3b0" (UID: "d19e8770-f0c1-491e-96c9-f737386ab3b0"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 13 15:16:57 crc kubenswrapper[4898]: I0313 15:16:57.892681 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d19e8770-f0c1-491e-96c9-f737386ab3b0-kube-api-access-mqc45" (OuterVolumeSpecName: "kube-api-access-mqc45") pod "d19e8770-f0c1-491e-96c9-f737386ab3b0" (UID: "d19e8770-f0c1-491e-96c9-f737386ab3b0"). InnerVolumeSpecName "kube-api-access-mqc45". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:16:57 crc kubenswrapper[4898]: I0313 15:16:57.916947 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d19e8770-f0c1-491e-96c9-f737386ab3b0-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d19e8770-f0c1-491e-96c9-f737386ab3b0" (UID: "d19e8770-f0c1-491e-96c9-f737386ab3b0"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:16:57 crc kubenswrapper[4898]: I0313 15:16:57.941654 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d19e8770-f0c1-491e-96c9-f737386ab3b0-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "d19e8770-f0c1-491e-96c9-f737386ab3b0" (UID: "d19e8770-f0c1-491e-96c9-f737386ab3b0"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:16:57 crc kubenswrapper[4898]: I0313 15:16:57.963164 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"d19e8770-f0c1-491e-96c9-f737386ab3b0","Type":"ContainerDied","Data":"afe41cfa21ca0ff15752a7557715bbecbe54855edbe2061b3b072582d6fab3b3"} Mar 13 15:16:57 crc kubenswrapper[4898]: I0313 15:16:57.963204 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="afe41cfa21ca0ff15752a7557715bbecbe54855edbe2061b3b072582d6fab3b3" Mar 13 15:16:57 crc kubenswrapper[4898]: I0313 15:16:57.963246 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 13 15:16:57 crc kubenswrapper[4898]: I0313 15:16:57.994507 4898 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d19e8770-f0c1-491e-96c9-f737386ab3b0-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 13 15:16:57 crc kubenswrapper[4898]: I0313 15:16:57.994538 4898 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/d19e8770-f0c1-491e-96c9-f737386ab3b0-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Mar 13 15:16:57 crc kubenswrapper[4898]: I0313 15:16:57.994548 4898 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d19e8770-f0c1-491e-96c9-f737386ab3b0-ssh-key\") on node \"crc\" DevicePath \"\"" Mar 13 15:16:57 crc kubenswrapper[4898]: I0313 15:16:57.999373 4898 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Mar 13 15:16:57 crc kubenswrapper[4898]: I0313 15:16:57.999413 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqc45\" (UniqueName: \"kubernetes.io/projected/d19e8770-f0c1-491e-96c9-f737386ab3b0-kube-api-access-mqc45\") on node \"crc\" DevicePath \"\"" Mar 13 15:16:57 crc kubenswrapper[4898]: I0313 15:16:57.999430 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d19e8770-f0c1-491e-96c9-f737386ab3b0-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 15:16:57 crc kubenswrapper[4898]: I0313 15:16:57.999445 4898 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/d19e8770-f0c1-491e-96c9-f737386ab3b0-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath 
\"\"" Mar 13 15:16:58 crc kubenswrapper[4898]: I0313 15:16:58.002238 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d19e8770-f0c1-491e-96c9-f737386ab3b0-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "d19e8770-f0c1-491e-96c9-f737386ab3b0" (UID: "d19e8770-f0c1-491e-96c9-f737386ab3b0"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:16:58 crc kubenswrapper[4898]: I0313 15:16:58.021107 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d19e8770-f0c1-491e-96c9-f737386ab3b0-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "d19e8770-f0c1-491e-96c9-f737386ab3b0" (UID: "d19e8770-f0c1-491e-96c9-f737386ab3b0"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:16:58 crc kubenswrapper[4898]: I0313 15:16:58.034075 4898 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Mar 13 15:16:58 crc kubenswrapper[4898]: I0313 15:16:58.102459 4898 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d19e8770-f0c1-491e-96c9-f737386ab3b0-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 13 15:16:58 crc kubenswrapper[4898]: I0313 15:16:58.102492 4898 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/d19e8770-f0c1-491e-96c9-f737386ab3b0-ca-certs\") on node \"crc\" DevicePath \"\"" Mar 13 15:16:58 crc kubenswrapper[4898]: I0313 15:16:58.102502 4898 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Mar 13 15:17:00 crc kubenswrapper[4898]: I0313 15:17:00.156423 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-marketplace/community-operators-jndb5" Mar 13 15:17:00 crc kubenswrapper[4898]: I0313 15:17:00.222227 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jndb5" Mar 13 15:17:00 crc kubenswrapper[4898]: I0313 15:17:00.377910 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 13 15:17:00 crc kubenswrapper[4898]: E0313 15:17:00.379839 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d19e8770-f0c1-491e-96c9-f737386ab3b0" containerName="tempest-tests-tempest-tests-runner" Mar 13 15:17:00 crc kubenswrapper[4898]: I0313 15:17:00.380491 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="d19e8770-f0c1-491e-96c9-f737386ab3b0" containerName="tempest-tests-tempest-tests-runner" Mar 13 15:17:00 crc kubenswrapper[4898]: E0313 15:17:00.380534 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb6b061a-b0db-4b84-bfc7-08238f699132" containerName="oc" Mar 13 15:17:00 crc kubenswrapper[4898]: I0313 15:17:00.380543 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb6b061a-b0db-4b84-bfc7-08238f699132" containerName="oc" Mar 13 15:17:00 crc kubenswrapper[4898]: I0313 15:17:00.383375 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="d19e8770-f0c1-491e-96c9-f737386ab3b0" containerName="tempest-tests-tempest-tests-runner" Mar 13 15:17:00 crc kubenswrapper[4898]: I0313 15:17:00.383426 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb6b061a-b0db-4b84-bfc7-08238f699132" containerName="oc" Mar 13 15:17:00 crc kubenswrapper[4898]: I0313 15:17:00.387821 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 13 15:17:00 crc kubenswrapper[4898]: I0313 15:17:00.406403 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jndb5"] Mar 13 15:17:00 crc kubenswrapper[4898]: I0313 15:17:00.448575 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-8v5gs" Mar 13 15:17:00 crc kubenswrapper[4898]: I0313 15:17:00.461637 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8t4g\" (UniqueName: \"kubernetes.io/projected/382b2b09-8110-411f-9d86-53e73df67fe6-kube-api-access-t8t4g\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"382b2b09-8110-411f-9d86-53e73df67fe6\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 13 15:17:00 crc kubenswrapper[4898]: I0313 15:17:00.462155 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"382b2b09-8110-411f-9d86-53e73df67fe6\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 13 15:17:00 crc kubenswrapper[4898]: I0313 15:17:00.484860 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 13 15:17:00 crc kubenswrapper[4898]: I0313 15:17:00.565106 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8t4g\" (UniqueName: \"kubernetes.io/projected/382b2b09-8110-411f-9d86-53e73df67fe6-kube-api-access-t8t4g\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"382b2b09-8110-411f-9d86-53e73df67fe6\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 13 15:17:00 crc 
kubenswrapper[4898]: I0313 15:17:00.565265 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"382b2b09-8110-411f-9d86-53e73df67fe6\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 13 15:17:00 crc kubenswrapper[4898]: I0313 15:17:00.567158 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"382b2b09-8110-411f-9d86-53e73df67fe6\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 13 15:17:00 crc kubenswrapper[4898]: I0313 15:17:00.614034 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"382b2b09-8110-411f-9d86-53e73df67fe6\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 13 15:17:00 crc kubenswrapper[4898]: I0313 15:17:00.614779 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8t4g\" (UniqueName: \"kubernetes.io/projected/382b2b09-8110-411f-9d86-53e73df67fe6-kube-api-access-t8t4g\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"382b2b09-8110-411f-9d86-53e73df67fe6\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 13 15:17:00 crc kubenswrapper[4898]: I0313 15:17:00.756323 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 13 15:17:02 crc kubenswrapper[4898]: I0313 15:17:02.019459 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-jndb5" podUID="2157e8bf-88a5-4e48-b621-1744dcf0fcdb" containerName="registry-server" containerID="cri-o://d653e06a922f0f9a98f01d37cd3c761f9af6e7d070c7637908c781d22c20c1cd" gracePeriod=2 Mar 13 15:17:02 crc kubenswrapper[4898]: I0313 15:17:02.245134 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 13 15:17:02 crc kubenswrapper[4898]: I0313 15:17:02.905686 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jndb5" Mar 13 15:17:02 crc kubenswrapper[4898]: I0313 15:17:02.931225 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2157e8bf-88a5-4e48-b621-1744dcf0fcdb-catalog-content\") pod \"2157e8bf-88a5-4e48-b621-1744dcf0fcdb\" (UID: \"2157e8bf-88a5-4e48-b621-1744dcf0fcdb\") " Mar 13 15:17:02 crc kubenswrapper[4898]: I0313 15:17:02.931522 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2157e8bf-88a5-4e48-b621-1744dcf0fcdb-utilities\") pod \"2157e8bf-88a5-4e48-b621-1744dcf0fcdb\" (UID: \"2157e8bf-88a5-4e48-b621-1744dcf0fcdb\") " Mar 13 15:17:02 crc kubenswrapper[4898]: I0313 15:17:02.931553 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jdgf\" (UniqueName: \"kubernetes.io/projected/2157e8bf-88a5-4e48-b621-1744dcf0fcdb-kube-api-access-9jdgf\") pod \"2157e8bf-88a5-4e48-b621-1744dcf0fcdb\" (UID: \"2157e8bf-88a5-4e48-b621-1744dcf0fcdb\") " Mar 13 15:17:02 crc kubenswrapper[4898]: I0313 15:17:02.932179 4898 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2157e8bf-88a5-4e48-b621-1744dcf0fcdb-utilities" (OuterVolumeSpecName: "utilities") pod "2157e8bf-88a5-4e48-b621-1744dcf0fcdb" (UID: "2157e8bf-88a5-4e48-b621-1744dcf0fcdb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:17:02 crc kubenswrapper[4898]: I0313 15:17:02.938886 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2157e8bf-88a5-4e48-b621-1744dcf0fcdb-kube-api-access-9jdgf" (OuterVolumeSpecName: "kube-api-access-9jdgf") pod "2157e8bf-88a5-4e48-b621-1744dcf0fcdb" (UID: "2157e8bf-88a5-4e48-b621-1744dcf0fcdb"). InnerVolumeSpecName "kube-api-access-9jdgf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:17:03 crc kubenswrapper[4898]: I0313 15:17:03.033891 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2157e8bf-88a5-4e48-b621-1744dcf0fcdb-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 15:17:03 crc kubenswrapper[4898]: I0313 15:17:03.033941 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jdgf\" (UniqueName: \"kubernetes.io/projected/2157e8bf-88a5-4e48-b621-1744dcf0fcdb-kube-api-access-9jdgf\") on node \"crc\" DevicePath \"\"" Mar 13 15:17:03 crc kubenswrapper[4898]: I0313 15:17:03.035887 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"382b2b09-8110-411f-9d86-53e73df67fe6","Type":"ContainerStarted","Data":"0957c13dd699676e302e317d3b318e4868bbf1a229575490427f1b196e9804a6"} Mar 13 15:17:03 crc kubenswrapper[4898]: I0313 15:17:03.038974 4898 generic.go:334] "Generic (PLEG): container finished" podID="2157e8bf-88a5-4e48-b621-1744dcf0fcdb" containerID="d653e06a922f0f9a98f01d37cd3c761f9af6e7d070c7637908c781d22c20c1cd" exitCode=0 Mar 13 15:17:03 crc kubenswrapper[4898]: 
I0313 15:17:03.039010 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jndb5" event={"ID":"2157e8bf-88a5-4e48-b621-1744dcf0fcdb","Type":"ContainerDied","Data":"d653e06a922f0f9a98f01d37cd3c761f9af6e7d070c7637908c781d22c20c1cd"} Mar 13 15:17:03 crc kubenswrapper[4898]: I0313 15:17:03.039036 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jndb5" event={"ID":"2157e8bf-88a5-4e48-b621-1744dcf0fcdb","Type":"ContainerDied","Data":"ad1f1863b9e188c98c1dfb8a88d0df8d02f680dd682644703708970a9e0dc172"} Mar 13 15:17:03 crc kubenswrapper[4898]: I0313 15:17:03.039055 4898 scope.go:117] "RemoveContainer" containerID="d653e06a922f0f9a98f01d37cd3c761f9af6e7d070c7637908c781d22c20c1cd" Mar 13 15:17:03 crc kubenswrapper[4898]: I0313 15:17:03.039153 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jndb5" Mar 13 15:17:03 crc kubenswrapper[4898]: I0313 15:17:03.066597 4898 scope.go:117] "RemoveContainer" containerID="3efffca3639dac775d206dcb096dfcdfa29405be3f4a7da8f99d544be88ffc43" Mar 13 15:17:03 crc kubenswrapper[4898]: I0313 15:17:03.077358 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2157e8bf-88a5-4e48-b621-1744dcf0fcdb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2157e8bf-88a5-4e48-b621-1744dcf0fcdb" (UID: "2157e8bf-88a5-4e48-b621-1744dcf0fcdb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:17:03 crc kubenswrapper[4898]: I0313 15:17:03.104316 4898 scope.go:117] "RemoveContainer" containerID="8fdeec86e0943d0ef27683a7197e100bc92b99eb865ac5c8b1d099f233220e22" Mar 13 15:17:03 crc kubenswrapper[4898]: I0313 15:17:03.136664 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2157e8bf-88a5-4e48-b621-1744dcf0fcdb-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 15:17:03 crc kubenswrapper[4898]: I0313 15:17:03.141984 4898 scope.go:117] "RemoveContainer" containerID="d653e06a922f0f9a98f01d37cd3c761f9af6e7d070c7637908c781d22c20c1cd" Mar 13 15:17:03 crc kubenswrapper[4898]: E0313 15:17:03.142467 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d653e06a922f0f9a98f01d37cd3c761f9af6e7d070c7637908c781d22c20c1cd\": container with ID starting with d653e06a922f0f9a98f01d37cd3c761f9af6e7d070c7637908c781d22c20c1cd not found: ID does not exist" containerID="d653e06a922f0f9a98f01d37cd3c761f9af6e7d070c7637908c781d22c20c1cd" Mar 13 15:17:03 crc kubenswrapper[4898]: I0313 15:17:03.142504 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d653e06a922f0f9a98f01d37cd3c761f9af6e7d070c7637908c781d22c20c1cd"} err="failed to get container status \"d653e06a922f0f9a98f01d37cd3c761f9af6e7d070c7637908c781d22c20c1cd\": rpc error: code = NotFound desc = could not find container \"d653e06a922f0f9a98f01d37cd3c761f9af6e7d070c7637908c781d22c20c1cd\": container with ID starting with d653e06a922f0f9a98f01d37cd3c761f9af6e7d070c7637908c781d22c20c1cd not found: ID does not exist" Mar 13 15:17:03 crc kubenswrapper[4898]: I0313 15:17:03.142529 4898 scope.go:117] "RemoveContainer" containerID="3efffca3639dac775d206dcb096dfcdfa29405be3f4a7da8f99d544be88ffc43" Mar 13 15:17:03 crc kubenswrapper[4898]: E0313 15:17:03.142840 4898 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3efffca3639dac775d206dcb096dfcdfa29405be3f4a7da8f99d544be88ffc43\": container with ID starting with 3efffca3639dac775d206dcb096dfcdfa29405be3f4a7da8f99d544be88ffc43 not found: ID does not exist" containerID="3efffca3639dac775d206dcb096dfcdfa29405be3f4a7da8f99d544be88ffc43" Mar 13 15:17:03 crc kubenswrapper[4898]: I0313 15:17:03.142879 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3efffca3639dac775d206dcb096dfcdfa29405be3f4a7da8f99d544be88ffc43"} err="failed to get container status \"3efffca3639dac775d206dcb096dfcdfa29405be3f4a7da8f99d544be88ffc43\": rpc error: code = NotFound desc = could not find container \"3efffca3639dac775d206dcb096dfcdfa29405be3f4a7da8f99d544be88ffc43\": container with ID starting with 3efffca3639dac775d206dcb096dfcdfa29405be3f4a7da8f99d544be88ffc43 not found: ID does not exist" Mar 13 15:17:03 crc kubenswrapper[4898]: I0313 15:17:03.142921 4898 scope.go:117] "RemoveContainer" containerID="8fdeec86e0943d0ef27683a7197e100bc92b99eb865ac5c8b1d099f233220e22" Mar 13 15:17:03 crc kubenswrapper[4898]: E0313 15:17:03.143160 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8fdeec86e0943d0ef27683a7197e100bc92b99eb865ac5c8b1d099f233220e22\": container with ID starting with 8fdeec86e0943d0ef27683a7197e100bc92b99eb865ac5c8b1d099f233220e22 not found: ID does not exist" containerID="8fdeec86e0943d0ef27683a7197e100bc92b99eb865ac5c8b1d099f233220e22" Mar 13 15:17:03 crc kubenswrapper[4898]: I0313 15:17:03.143182 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fdeec86e0943d0ef27683a7197e100bc92b99eb865ac5c8b1d099f233220e22"} err="failed to get container status \"8fdeec86e0943d0ef27683a7197e100bc92b99eb865ac5c8b1d099f233220e22\": rpc error: code = NotFound desc = could 
not find container \"8fdeec86e0943d0ef27683a7197e100bc92b99eb865ac5c8b1d099f233220e22\": container with ID starting with 8fdeec86e0943d0ef27683a7197e100bc92b99eb865ac5c8b1d099f233220e22 not found: ID does not exist" Mar 13 15:17:03 crc kubenswrapper[4898]: I0313 15:17:03.401978 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jndb5"] Mar 13 15:17:03 crc kubenswrapper[4898]: I0313 15:17:03.411889 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-jndb5"] Mar 13 15:17:03 crc kubenswrapper[4898]: I0313 15:17:03.757457 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2157e8bf-88a5-4e48-b621-1744dcf0fcdb" path="/var/lib/kubelet/pods/2157e8bf-88a5-4e48-b621-1744dcf0fcdb/volumes" Mar 13 15:17:04 crc kubenswrapper[4898]: I0313 15:17:04.939728 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dnh48" podUID="cb46c8b0-a6a9-4b6d-86a1-8408793887e5" containerName="registry-server" probeResult="failure" output=< Mar 13 15:17:04 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 15:17:04 crc kubenswrapper[4898]: > Mar 13 15:17:09 crc kubenswrapper[4898]: I0313 15:17:09.127044 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"382b2b09-8110-411f-9d86-53e73df67fe6","Type":"ContainerStarted","Data":"9cb6f19deab8d2857d19039413245405ed7b87fda2a496b02c59f548e5f3e635"} Mar 13 15:17:09 crc kubenswrapper[4898]: I0313 15:17:09.144661 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=4.326225526 podStartE2EDuration="9.144642843s" podCreationTimestamp="2026-03-13 15:17:00 +0000 UTC" firstStartedPulling="2026-03-13 15:17:02.277809149 +0000 UTC m=+4857.279397388" 
lastFinishedPulling="2026-03-13 15:17:07.096226466 +0000 UTC m=+4862.097814705" observedRunningTime="2026-03-13 15:17:09.1409627 +0000 UTC m=+4864.142550959" watchObservedRunningTime="2026-03-13 15:17:09.144642843 +0000 UTC m=+4864.146231082" Mar 13 15:17:14 crc kubenswrapper[4898]: I0313 15:17:14.942921 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dnh48" podUID="cb46c8b0-a6a9-4b6d-86a1-8408793887e5" containerName="registry-server" probeResult="failure" output=< Mar 13 15:17:14 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 15:17:14 crc kubenswrapper[4898]: > Mar 13 15:17:19 crc kubenswrapper[4898]: I0313 15:17:19.133876 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 15:17:19 crc kubenswrapper[4898]: I0313 15:17:19.134496 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 15:17:19 crc kubenswrapper[4898]: I0313 15:17:19.134542 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" Mar 13 15:17:19 crc kubenswrapper[4898]: I0313 15:17:19.135544 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bf9fa5bb76f8bd5a010d026caf62189a87b342669ddb0345c62f785750fd30c1"} pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" containerMessage="Container 
machine-config-daemon failed liveness probe, will be restarted" Mar 13 15:17:19 crc kubenswrapper[4898]: I0313 15:17:19.135592 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" containerID="cri-o://bf9fa5bb76f8bd5a010d026caf62189a87b342669ddb0345c62f785750fd30c1" gracePeriod=600 Mar 13 15:17:20 crc kubenswrapper[4898]: I0313 15:17:20.249745 4898 generic.go:334] "Generic (PLEG): container finished" podID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerID="bf9fa5bb76f8bd5a010d026caf62189a87b342669ddb0345c62f785750fd30c1" exitCode=0 Mar 13 15:17:20 crc kubenswrapper[4898]: I0313 15:17:20.249825 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerDied","Data":"bf9fa5bb76f8bd5a010d026caf62189a87b342669ddb0345c62f785750fd30c1"} Mar 13 15:17:20 crc kubenswrapper[4898]: I0313 15:17:20.250403 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerStarted","Data":"e40bd227e881ba459a425e50fe1c2e2837377f3121675fc9f55de4fe34577668"} Mar 13 15:17:20 crc kubenswrapper[4898]: I0313 15:17:20.250429 4898 scope.go:117] "RemoveContainer" containerID="23592f71b1b6a0588b20dac6d06bd2510a7ea0a477247e02fd4eeae75f83ee8a" Mar 13 15:17:24 crc kubenswrapper[4898]: I0313 15:17:24.948303 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dnh48" podUID="cb46c8b0-a6a9-4b6d-86a1-8408793887e5" containerName="registry-server" probeResult="failure" output=< Mar 13 15:17:24 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 15:17:24 crc kubenswrapper[4898]: > Mar 13 15:17:34 
crc kubenswrapper[4898]: I0313 15:17:34.935302 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dnh48" podUID="cb46c8b0-a6a9-4b6d-86a1-8408793887e5" containerName="registry-server" probeResult="failure" output=< Mar 13 15:17:34 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 15:17:34 crc kubenswrapper[4898]: > Mar 13 15:17:40 crc kubenswrapper[4898]: I0313 15:17:40.832462 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-cell1-galera-0" podUID="6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f" containerName="galera" probeResult="failure" output="command timed out" Mar 13 15:17:40 crc kubenswrapper[4898]: I0313 15:17:40.833192 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f" containerName="galera" probeResult="failure" output="command timed out" Mar 13 15:17:44 crc kubenswrapper[4898]: I0313 15:17:44.934109 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dnh48" podUID="cb46c8b0-a6a9-4b6d-86a1-8408793887e5" containerName="registry-server" probeResult="failure" output=< Mar 13 15:17:44 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 15:17:44 crc kubenswrapper[4898]: > Mar 13 15:17:47 crc kubenswrapper[4898]: I0313 15:17:47.565664 4898 scope.go:117] "RemoveContainer" containerID="ba349dae26dd37d5b178e79f9ed4076346dab25c90569fb520b86b35b588e387" Mar 13 15:17:49 crc kubenswrapper[4898]: I0313 15:17:49.669051 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-b6mrl/must-gather-cklv9"] Mar 13 15:17:49 crc kubenswrapper[4898]: E0313 15:17:49.673209 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2157e8bf-88a5-4e48-b621-1744dcf0fcdb" containerName="extract-content" Mar 13 15:17:49 crc kubenswrapper[4898]: I0313 15:17:49.673236 4898 
state_mem.go:107] "Deleted CPUSet assignment" podUID="2157e8bf-88a5-4e48-b621-1744dcf0fcdb" containerName="extract-content" Mar 13 15:17:49 crc kubenswrapper[4898]: E0313 15:17:49.673280 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2157e8bf-88a5-4e48-b621-1744dcf0fcdb" containerName="extract-utilities" Mar 13 15:17:49 crc kubenswrapper[4898]: I0313 15:17:49.673287 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="2157e8bf-88a5-4e48-b621-1744dcf0fcdb" containerName="extract-utilities" Mar 13 15:17:49 crc kubenswrapper[4898]: E0313 15:17:49.673304 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2157e8bf-88a5-4e48-b621-1744dcf0fcdb" containerName="registry-server" Mar 13 15:17:49 crc kubenswrapper[4898]: I0313 15:17:49.673311 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="2157e8bf-88a5-4e48-b621-1744dcf0fcdb" containerName="registry-server" Mar 13 15:17:49 crc kubenswrapper[4898]: I0313 15:17:49.673708 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="2157e8bf-88a5-4e48-b621-1744dcf0fcdb" containerName="registry-server" Mar 13 15:17:49 crc kubenswrapper[4898]: I0313 15:17:49.675040 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-b6mrl/must-gather-cklv9" Mar 13 15:17:49 crc kubenswrapper[4898]: I0313 15:17:49.677273 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-b6mrl"/"default-dockercfg-bppfx" Mar 13 15:17:49 crc kubenswrapper[4898]: I0313 15:17:49.683706 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-b6mrl"/"openshift-service-ca.crt" Mar 13 15:17:49 crc kubenswrapper[4898]: I0313 15:17:49.686585 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-b6mrl"/"kube-root-ca.crt" Mar 13 15:17:49 crc kubenswrapper[4898]: I0313 15:17:49.697658 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-b6mrl/must-gather-cklv9"] Mar 13 15:17:49 crc kubenswrapper[4898]: I0313 15:17:49.761148 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfhwg\" (UniqueName: \"kubernetes.io/projected/9ef69d80-7edf-459b-a521-b45bc90a18df-kube-api-access-cfhwg\") pod \"must-gather-cklv9\" (UID: \"9ef69d80-7edf-459b-a521-b45bc90a18df\") " pod="openshift-must-gather-b6mrl/must-gather-cklv9" Mar 13 15:17:49 crc kubenswrapper[4898]: I0313 15:17:49.761259 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9ef69d80-7edf-459b-a521-b45bc90a18df-must-gather-output\") pod \"must-gather-cklv9\" (UID: \"9ef69d80-7edf-459b-a521-b45bc90a18df\") " pod="openshift-must-gather-b6mrl/must-gather-cklv9" Mar 13 15:17:49 crc kubenswrapper[4898]: I0313 15:17:49.863600 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfhwg\" (UniqueName: \"kubernetes.io/projected/9ef69d80-7edf-459b-a521-b45bc90a18df-kube-api-access-cfhwg\") pod \"must-gather-cklv9\" (UID: \"9ef69d80-7edf-459b-a521-b45bc90a18df\") " 
pod="openshift-must-gather-b6mrl/must-gather-cklv9" Mar 13 15:17:49 crc kubenswrapper[4898]: I0313 15:17:49.863718 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9ef69d80-7edf-459b-a521-b45bc90a18df-must-gather-output\") pod \"must-gather-cklv9\" (UID: \"9ef69d80-7edf-459b-a521-b45bc90a18df\") " pod="openshift-must-gather-b6mrl/must-gather-cklv9" Mar 13 15:17:49 crc kubenswrapper[4898]: I0313 15:17:49.864840 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9ef69d80-7edf-459b-a521-b45bc90a18df-must-gather-output\") pod \"must-gather-cklv9\" (UID: \"9ef69d80-7edf-459b-a521-b45bc90a18df\") " pod="openshift-must-gather-b6mrl/must-gather-cklv9" Mar 13 15:17:49 crc kubenswrapper[4898]: I0313 15:17:49.892067 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfhwg\" (UniqueName: \"kubernetes.io/projected/9ef69d80-7edf-459b-a521-b45bc90a18df-kube-api-access-cfhwg\") pod \"must-gather-cklv9\" (UID: \"9ef69d80-7edf-459b-a521-b45bc90a18df\") " pod="openshift-must-gather-b6mrl/must-gather-cklv9" Mar 13 15:17:50 crc kubenswrapper[4898]: I0313 15:17:50.008536 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-b6mrl/must-gather-cklv9" Mar 13 15:17:51 crc kubenswrapper[4898]: I0313 15:17:51.041036 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-b6mrl/must-gather-cklv9"] Mar 13 15:17:51 crc kubenswrapper[4898]: I0313 15:17:51.644287 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b6mrl/must-gather-cklv9" event={"ID":"9ef69d80-7edf-459b-a521-b45bc90a18df","Type":"ContainerStarted","Data":"197380797ad59a26356f5d68d34c739e92ec17b270095e2411ef7c9eee557eaa"} Mar 13 15:17:54 crc kubenswrapper[4898]: I0313 15:17:54.034007 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dnh48" Mar 13 15:17:54 crc kubenswrapper[4898]: I0313 15:17:54.103444 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dnh48" Mar 13 15:17:54 crc kubenswrapper[4898]: I0313 15:17:54.286878 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dnh48"] Mar 13 15:17:55 crc kubenswrapper[4898]: I0313 15:17:55.695564 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dnh48" podUID="cb46c8b0-a6a9-4b6d-86a1-8408793887e5" containerName="registry-server" containerID="cri-o://e863d80604f47eb74bf43cf0b02ad573ba531995e209d5a2d54eb9c4f9740dde" gracePeriod=2 Mar 13 15:17:56 crc kubenswrapper[4898]: I0313 15:17:56.720009 4898 generic.go:334] "Generic (PLEG): container finished" podID="cb46c8b0-a6a9-4b6d-86a1-8408793887e5" containerID="e863d80604f47eb74bf43cf0b02ad573ba531995e209d5a2d54eb9c4f9740dde" exitCode=0 Mar 13 15:17:56 crc kubenswrapper[4898]: I0313 15:17:56.720077 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dnh48" 
event={"ID":"cb46c8b0-a6a9-4b6d-86a1-8408793887e5","Type":"ContainerDied","Data":"e863d80604f47eb74bf43cf0b02ad573ba531995e209d5a2d54eb9c4f9740dde"} Mar 13 15:18:00 crc kubenswrapper[4898]: I0313 15:18:00.221533 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556918-gcw8c"] Mar 13 15:18:00 crc kubenswrapper[4898]: I0313 15:18:00.225175 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556918-gcw8c" Mar 13 15:18:00 crc kubenswrapper[4898]: I0313 15:18:00.228982 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps" Mar 13 15:18:00 crc kubenswrapper[4898]: I0313 15:18:00.229455 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 15:18:00 crc kubenswrapper[4898]: I0313 15:18:00.229599 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 15:18:00 crc kubenswrapper[4898]: I0313 15:18:00.241547 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556918-gcw8c"] Mar 13 15:18:00 crc kubenswrapper[4898]: I0313 15:18:00.393257 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lr25w\" (UniqueName: \"kubernetes.io/projected/85e19347-9341-49c0-9195-97e383796cb3-kube-api-access-lr25w\") pod \"auto-csr-approver-29556918-gcw8c\" (UID: \"85e19347-9341-49c0-9195-97e383796cb3\") " pod="openshift-infra/auto-csr-approver-29556918-gcw8c" Mar 13 15:18:00 crc kubenswrapper[4898]: I0313 15:18:00.496108 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lr25w\" (UniqueName: \"kubernetes.io/projected/85e19347-9341-49c0-9195-97e383796cb3-kube-api-access-lr25w\") pod \"auto-csr-approver-29556918-gcw8c\" (UID: 
\"85e19347-9341-49c0-9195-97e383796cb3\") " pod="openshift-infra/auto-csr-approver-29556918-gcw8c" Mar 13 15:18:00 crc kubenswrapper[4898]: I0313 15:18:00.522045 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lr25w\" (UniqueName: \"kubernetes.io/projected/85e19347-9341-49c0-9195-97e383796cb3-kube-api-access-lr25w\") pod \"auto-csr-approver-29556918-gcw8c\" (UID: \"85e19347-9341-49c0-9195-97e383796cb3\") " pod="openshift-infra/auto-csr-approver-29556918-gcw8c" Mar 13 15:18:00 crc kubenswrapper[4898]: I0313 15:18:00.550573 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556918-gcw8c" Mar 13 15:18:01 crc kubenswrapper[4898]: I0313 15:18:01.624655 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dnh48" Mar 13 15:18:01 crc kubenswrapper[4898]: W0313 15:18:01.703235 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85e19347_9341_49c0_9195_97e383796cb3.slice/crio-6cd199ad916b31e7f29a9349f409a7ea25e42056bbd3d20e1aa0b88f98b05ced WatchSource:0}: Error finding container 6cd199ad916b31e7f29a9349f409a7ea25e42056bbd3d20e1aa0b88f98b05ced: Status 404 returned error can't find the container with id 6cd199ad916b31e7f29a9349f409a7ea25e42056bbd3d20e1aa0b88f98b05ced Mar 13 15:18:01 crc kubenswrapper[4898]: I0313 15:18:01.723404 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnffg\" (UniqueName: \"kubernetes.io/projected/cb46c8b0-a6a9-4b6d-86a1-8408793887e5-kube-api-access-mnffg\") pod \"cb46c8b0-a6a9-4b6d-86a1-8408793887e5\" (UID: \"cb46c8b0-a6a9-4b6d-86a1-8408793887e5\") " Mar 13 15:18:01 crc kubenswrapper[4898]: I0313 15:18:01.724215 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/cb46c8b0-a6a9-4b6d-86a1-8408793887e5-utilities\") pod \"cb46c8b0-a6a9-4b6d-86a1-8408793887e5\" (UID: \"cb46c8b0-a6a9-4b6d-86a1-8408793887e5\") " Mar 13 15:18:01 crc kubenswrapper[4898]: I0313 15:18:01.724261 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb46c8b0-a6a9-4b6d-86a1-8408793887e5-catalog-content\") pod \"cb46c8b0-a6a9-4b6d-86a1-8408793887e5\" (UID: \"cb46c8b0-a6a9-4b6d-86a1-8408793887e5\") " Mar 13 15:18:01 crc kubenswrapper[4898]: I0313 15:18:01.724881 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb46c8b0-a6a9-4b6d-86a1-8408793887e5-utilities" (OuterVolumeSpecName: "utilities") pod "cb46c8b0-a6a9-4b6d-86a1-8408793887e5" (UID: "cb46c8b0-a6a9-4b6d-86a1-8408793887e5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:18:01 crc kubenswrapper[4898]: I0313 15:18:01.725254 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb46c8b0-a6a9-4b6d-86a1-8408793887e5-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 15:18:01 crc kubenswrapper[4898]: I0313 15:18:01.730692 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556918-gcw8c"] Mar 13 15:18:01 crc kubenswrapper[4898]: I0313 15:18:01.731257 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb46c8b0-a6a9-4b6d-86a1-8408793887e5-kube-api-access-mnffg" (OuterVolumeSpecName: "kube-api-access-mnffg") pod "cb46c8b0-a6a9-4b6d-86a1-8408793887e5" (UID: "cb46c8b0-a6a9-4b6d-86a1-8408793887e5"). InnerVolumeSpecName "kube-api-access-mnffg". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 15:18:01 crc kubenswrapper[4898]: I0313 15:18:01.817513 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b6mrl/must-gather-cklv9" event={"ID":"9ef69d80-7edf-459b-a521-b45bc90a18df","Type":"ContainerStarted","Data":"0cf38cc33a2de3371ea65a40a67cc8243723533a2159a61bfc4ad7a679d1eafb"}
Mar 13 15:18:01 crc kubenswrapper[4898]: I0313 15:18:01.817670 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b6mrl/must-gather-cklv9" event={"ID":"9ef69d80-7edf-459b-a521-b45bc90a18df","Type":"ContainerStarted","Data":"2ef5e9977f8ef2b175ed18b48018fc8614ba4246f0d291287b1a2a287a12ca83"}
Mar 13 15:18:01 crc kubenswrapper[4898]: I0313 15:18:01.828275 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dnh48"
Mar 13 15:18:01 crc kubenswrapper[4898]: I0313 15:18:01.828210 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dnh48" event={"ID":"cb46c8b0-a6a9-4b6d-86a1-8408793887e5","Type":"ContainerDied","Data":"ef80ef6757d050f773b7a3c8ba863e9ba495da2f0964e3cb0f243834ede62c6d"}
Mar 13 15:18:01 crc kubenswrapper[4898]: I0313 15:18:01.832451 4898 scope.go:117] "RemoveContainer" containerID="e863d80604f47eb74bf43cf0b02ad573ba531995e209d5a2d54eb9c4f9740dde"
Mar 13 15:18:01 crc kubenswrapper[4898]: I0313 15:18:01.833313 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-b6mrl/must-gather-cklv9" podStartSLOduration=2.805858121 podStartE2EDuration="12.833296111s" podCreationTimestamp="2026-03-13 15:17:49 +0000 UTC" firstStartedPulling="2026-03-13 15:17:51.084394649 +0000 UTC m=+4906.085982888" lastFinishedPulling="2026-03-13 15:18:01.111832629 +0000 UTC m=+4916.113420878" observedRunningTime="2026-03-13 15:18:01.821687217 +0000 UTC m=+4916.823275466" watchObservedRunningTime="2026-03-13 15:18:01.833296111
+0000 UTC m=+4916.834884350"
Mar 13 15:18:01 crc kubenswrapper[4898]: I0313 15:18:01.835286 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556918-gcw8c" event={"ID":"85e19347-9341-49c0-9195-97e383796cb3","Type":"ContainerStarted","Data":"6cd199ad916b31e7f29a9349f409a7ea25e42056bbd3d20e1aa0b88f98b05ced"}
Mar 13 15:18:01 crc kubenswrapper[4898]: I0313 15:18:01.842236 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnffg\" (UniqueName: \"kubernetes.io/projected/cb46c8b0-a6a9-4b6d-86a1-8408793887e5-kube-api-access-mnffg\") on node \"crc\" DevicePath \"\""
Mar 13 15:18:01 crc kubenswrapper[4898]: I0313 15:18:01.865228 4898 scope.go:117] "RemoveContainer" containerID="aebe307d7887ba7796f4fad329402cc88b0d4332f82fab0c321325f23bf1adea"
Mar 13 15:18:01 crc kubenswrapper[4898]: I0313 15:18:01.893288 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb46c8b0-a6a9-4b6d-86a1-8408793887e5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cb46c8b0-a6a9-4b6d-86a1-8408793887e5" (UID: "cb46c8b0-a6a9-4b6d-86a1-8408793887e5"). InnerVolumeSpecName "catalog-content".
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 15:18:01 crc kubenswrapper[4898]: I0313 15:18:01.900128 4898 scope.go:117] "RemoveContainer" containerID="044e1dbede2644956e1b9f4f9606342f16b9eba6e865673613710b9380c20e93"
Mar 13 15:18:01 crc kubenswrapper[4898]: I0313 15:18:01.950186 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb46c8b0-a6a9-4b6d-86a1-8408793887e5-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 13 15:18:02 crc kubenswrapper[4898]: I0313 15:18:02.171207 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dnh48"]
Mar 13 15:18:02 crc kubenswrapper[4898]: I0313 15:18:02.181736 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dnh48"]
Mar 13 15:18:03 crc kubenswrapper[4898]: I0313 15:18:03.757011 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb46c8b0-a6a9-4b6d-86a1-8408793887e5" path="/var/lib/kubelet/pods/cb46c8b0-a6a9-4b6d-86a1-8408793887e5/volumes"
Mar 13 15:18:04 crc kubenswrapper[4898]: I0313 15:18:04.872867 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556918-gcw8c" event={"ID":"85e19347-9341-49c0-9195-97e383796cb3","Type":"ContainerStarted","Data":"41c4a045c361afa3d6c1c9f58f41c2c132e20bbf6ef35d1cfbb029e2412c57e3"}
Mar 13 15:18:04 crc kubenswrapper[4898]: I0313 15:18:04.895785 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556918-gcw8c" podStartSLOduration=4.092667867 podStartE2EDuration="4.895763725s" podCreationTimestamp="2026-03-13 15:18:00 +0000 UTC" firstStartedPulling="2026-03-13 15:18:01.706413295 +0000 UTC m=+4916.708001534" lastFinishedPulling="2026-03-13 15:18:02.509509153 +0000 UTC m=+4917.511097392" observedRunningTime="2026-03-13 15:18:04.886071184 +0000 UTC m=+4919.887659443"
watchObservedRunningTime="2026-03-13 15:18:04.895763725 +0000 UTC m=+4919.897351974"
Mar 13 15:18:05 crc kubenswrapper[4898]: I0313 15:18:05.888943 4898 generic.go:334] "Generic (PLEG): container finished" podID="85e19347-9341-49c0-9195-97e383796cb3" containerID="41c4a045c361afa3d6c1c9f58f41c2c132e20bbf6ef35d1cfbb029e2412c57e3" exitCode=0
Mar 13 15:18:05 crc kubenswrapper[4898]: I0313 15:18:05.889031 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556918-gcw8c" event={"ID":"85e19347-9341-49c0-9195-97e383796cb3","Type":"ContainerDied","Data":"41c4a045c361afa3d6c1c9f58f41c2c132e20bbf6ef35d1cfbb029e2412c57e3"}
Mar 13 15:18:07 crc kubenswrapper[4898]: E0313 15:18:07.287520 4898 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.201:53544->38.102.83.201:43395: write tcp 38.102.83.201:53544->38.102.83.201:43395: write: broken pipe
Mar 13 15:18:07 crc kubenswrapper[4898]: I0313 15:18:07.329531 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556918-gcw8c"
Mar 13 15:18:07 crc kubenswrapper[4898]: I0313 15:18:07.352139 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lr25w\" (UniqueName: \"kubernetes.io/projected/85e19347-9341-49c0-9195-97e383796cb3-kube-api-access-lr25w\") pod \"85e19347-9341-49c0-9195-97e383796cb3\" (UID: \"85e19347-9341-49c0-9195-97e383796cb3\") "
Mar 13 15:18:07 crc kubenswrapper[4898]: I0313 15:18:07.358393 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85e19347-9341-49c0-9195-97e383796cb3-kube-api-access-lr25w" (OuterVolumeSpecName: "kube-api-access-lr25w") pod "85e19347-9341-49c0-9195-97e383796cb3" (UID: "85e19347-9341-49c0-9195-97e383796cb3"). InnerVolumeSpecName "kube-api-access-lr25w".
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 15:18:07 crc kubenswrapper[4898]: I0313 15:18:07.455371 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lr25w\" (UniqueName: \"kubernetes.io/projected/85e19347-9341-49c0-9195-97e383796cb3-kube-api-access-lr25w\") on node \"crc\" DevicePath \"\""
Mar 13 15:18:07 crc kubenswrapper[4898]: I0313 15:18:07.914844 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556918-gcw8c" event={"ID":"85e19347-9341-49c0-9195-97e383796cb3","Type":"ContainerDied","Data":"6cd199ad916b31e7f29a9349f409a7ea25e42056bbd3d20e1aa0b88f98b05ced"}
Mar 13 15:18:07 crc kubenswrapper[4898]: I0313 15:18:07.914892 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6cd199ad916b31e7f29a9349f409a7ea25e42056bbd3d20e1aa0b88f98b05ced"
Mar 13 15:18:07 crc kubenswrapper[4898]: I0313 15:18:07.914968 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556918-gcw8c"
Mar 13 15:18:07 crc kubenswrapper[4898]: I0313 15:18:07.968838 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556912-5fmmb"]
Mar 13 15:18:08 crc kubenswrapper[4898]: I0313 15:18:08.005232 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556912-5fmmb"]
Mar 13 15:18:08 crc kubenswrapper[4898]: I0313 15:18:08.102992 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-b6mrl/crc-debug-nj5tr"]
Mar 13 15:18:08 crc kubenswrapper[4898]: E0313 15:18:08.103743 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb46c8b0-a6a9-4b6d-86a1-8408793887e5" containerName="extract-content"
Mar 13 15:18:08 crc kubenswrapper[4898]: I0313 15:18:08.103768 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb46c8b0-a6a9-4b6d-86a1-8408793887e5" containerName="extract-content"
Mar 13 15:18:08 crc kubenswrapper[4898]: E0313 15:18:08.103805 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb46c8b0-a6a9-4b6d-86a1-8408793887e5" containerName="extract-utilities"
Mar 13 15:18:08 crc kubenswrapper[4898]: I0313 15:18:08.103814 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb46c8b0-a6a9-4b6d-86a1-8408793887e5" containerName="extract-utilities"
Mar 13 15:18:08 crc kubenswrapper[4898]: E0313 15:18:08.103847 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb46c8b0-a6a9-4b6d-86a1-8408793887e5" containerName="registry-server"
Mar 13 15:18:08 crc kubenswrapper[4898]: I0313 15:18:08.103855 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb46c8b0-a6a9-4b6d-86a1-8408793887e5" containerName="registry-server"
Mar 13 15:18:08 crc kubenswrapper[4898]: E0313 15:18:08.103883 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85e19347-9341-49c0-9195-97e383796cb3" containerName="oc"
Mar 13 15:18:08 crc kubenswrapper[4898]: I0313 15:18:08.103890 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="85e19347-9341-49c0-9195-97e383796cb3" containerName="oc"
Mar 13 15:18:08 crc kubenswrapper[4898]: I0313 15:18:08.104183 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="85e19347-9341-49c0-9195-97e383796cb3" containerName="oc"
Mar 13 15:18:08 crc kubenswrapper[4898]: I0313 15:18:08.104215 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb46c8b0-a6a9-4b6d-86a1-8408793887e5" containerName="registry-server"
Mar 13 15:18:08 crc kubenswrapper[4898]: I0313 15:18:08.105159 4898 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-must-gather-b6mrl/crc-debug-nj5tr"
Mar 13 15:18:08 crc kubenswrapper[4898]: I0313 15:18:08.170853 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7t9jw\" (UniqueName: \"kubernetes.io/projected/e373e338-4b01-4034-a9a8-186e53b74e76-kube-api-access-7t9jw\") pod \"crc-debug-nj5tr\" (UID: \"e373e338-4b01-4034-a9a8-186e53b74e76\") " pod="openshift-must-gather-b6mrl/crc-debug-nj5tr"
Mar 13 15:18:08 crc kubenswrapper[4898]: I0313 15:18:08.171323 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e373e338-4b01-4034-a9a8-186e53b74e76-host\") pod \"crc-debug-nj5tr\" (UID: \"e373e338-4b01-4034-a9a8-186e53b74e76\") " pod="openshift-must-gather-b6mrl/crc-debug-nj5tr"
Mar 13 15:18:08 crc kubenswrapper[4898]: I0313 15:18:08.273303 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7t9jw\" (UniqueName: \"kubernetes.io/projected/e373e338-4b01-4034-a9a8-186e53b74e76-kube-api-access-7t9jw\") pod \"crc-debug-nj5tr\" (UID: \"e373e338-4b01-4034-a9a8-186e53b74e76\") " pod="openshift-must-gather-b6mrl/crc-debug-nj5tr"
Mar 13 15:18:08 crc kubenswrapper[4898]: I0313 15:18:08.273728 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e373e338-4b01-4034-a9a8-186e53b74e76-host\") pod \"crc-debug-nj5tr\" (UID: \"e373e338-4b01-4034-a9a8-186e53b74e76\") " pod="openshift-must-gather-b6mrl/crc-debug-nj5tr"
Mar 13 15:18:08 crc kubenswrapper[4898]: I0313 15:18:08.275573 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e373e338-4b01-4034-a9a8-186e53b74e76-host\") pod \"crc-debug-nj5tr\" (UID: \"e373e338-4b01-4034-a9a8-186e53b74e76\") " pod="openshift-must-gather-b6mrl/crc-debug-nj5tr"
Mar 13 15:18:08 crc kubenswrapper[4898]: I0313 15:18:08.290426 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7t9jw\" (UniqueName: \"kubernetes.io/projected/e373e338-4b01-4034-a9a8-186e53b74e76-kube-api-access-7t9jw\") pod \"crc-debug-nj5tr\" (UID: \"e373e338-4b01-4034-a9a8-186e53b74e76\") " pod="openshift-must-gather-b6mrl/crc-debug-nj5tr"
Mar 13 15:18:08 crc kubenswrapper[4898]: I0313 15:18:08.423632 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b6mrl/crc-debug-nj5tr"
Mar 13 15:18:08 crc kubenswrapper[4898]: I0313 15:18:08.936590 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b6mrl/crc-debug-nj5tr" event={"ID":"e373e338-4b01-4034-a9a8-186e53b74e76","Type":"ContainerStarted","Data":"d463fc3e5f4f8fa2a117295e0dae7a098377d85b8f032ed128d245e63f4a7497"}
Mar 13 15:18:09 crc kubenswrapper[4898]: I0313 15:18:09.761528 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ada9e0ac-777e-4e64-aade-d729b4481edf" path="/var/lib/kubelet/pods/ada9e0ac-777e-4e64-aade-d729b4481edf/volumes"
Mar 13 15:18:22 crc kubenswrapper[4898]: I0313 15:18:22.078057 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b6mrl/crc-debug-nj5tr" event={"ID":"e373e338-4b01-4034-a9a8-186e53b74e76","Type":"ContainerStarted","Data":"2124311011b6d2dbdb56bd6bbe60e2fefb84036c80bad5c23f6cbdc48089d7e2"}
Mar 13 15:18:22 crc kubenswrapper[4898]: I0313 15:18:22.100554 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-b6mrl/crc-debug-nj5tr" podStartSLOduration=1.6922150729999998 podStartE2EDuration="14.100529222s" podCreationTimestamp="2026-03-13 15:18:08 +0000 UTC" firstStartedPulling="2026-03-13 15:18:08.517806048 +0000 UTC m=+4923.519394287" lastFinishedPulling="2026-03-13 15:18:20.926120197 +0000 UTC m=+4935.927708436" observedRunningTime="2026-03-13 15:18:22.092367026 +0000 UTC m=+4937.093955285"
watchObservedRunningTime="2026-03-13 15:18:22.100529222 +0000 UTC m=+4937.102117461"
Mar 13 15:18:47 crc kubenswrapper[4898]: I0313 15:18:47.861989 4898 scope.go:117] "RemoveContainer" containerID="ac390920dd30738022bc9982651d5e7fa6b628c845272ba7744c86bf9f8444e9"
Mar 13 15:18:51 crc kubenswrapper[4898]: I0313 15:18:51.408507 4898 generic.go:334] "Generic (PLEG): container finished" podID="823ccfb8-89eb-409e-9c6c-579bacb35ea1" containerID="7046ebdd84c06a54a3ad07946403d79aec9442bfc8bcd95a1021d0c00e48bec6" exitCode=0
Mar 13 15:18:51 crc kubenswrapper[4898]: I0313 15:18:51.408639 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" event={"ID":"823ccfb8-89eb-409e-9c6c-579bacb35ea1","Type":"ContainerDied","Data":"7046ebdd84c06a54a3ad07946403d79aec9442bfc8bcd95a1021d0c00e48bec6"}
Mar 13 15:18:51 crc kubenswrapper[4898]: I0313 15:18:51.408984 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" event={"ID":"823ccfb8-89eb-409e-9c6c-579bacb35ea1","Type":"ContainerStarted","Data":"6f1ae72cefa3d468668ba3267a70b8013c186367da3751e35d5758d5c1261a0a"}
Mar 13 15:19:09 crc kubenswrapper[4898]: I0313 15:19:09.081741 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr"
Mar 13 15:19:09 crc kubenswrapper[4898]: I0313 15:19:09.082349 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr"
Mar 13 15:19:10 crc kubenswrapper[4898]: I0313 15:19:10.916848 4898 generic.go:334] "Generic (PLEG): container finished" podID="e373e338-4b01-4034-a9a8-186e53b74e76" containerID="2124311011b6d2dbdb56bd6bbe60e2fefb84036c80bad5c23f6cbdc48089d7e2" exitCode=0
Mar 13 15:19:10 crc kubenswrapper[4898]: I0313 15:19:10.916896 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b6mrl/crc-debug-nj5tr"
event={"ID":"e373e338-4b01-4034-a9a8-186e53b74e76","Type":"ContainerDied","Data":"2124311011b6d2dbdb56bd6bbe60e2fefb84036c80bad5c23f6cbdc48089d7e2"}
Mar 13 15:19:12 crc kubenswrapper[4898]: I0313 15:19:12.100757 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b6mrl/crc-debug-nj5tr"
Mar 13 15:19:12 crc kubenswrapper[4898]: I0313 15:19:12.146729 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-b6mrl/crc-debug-nj5tr"]
Mar 13 15:19:12 crc kubenswrapper[4898]: I0313 15:19:12.156946 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-b6mrl/crc-debug-nj5tr"]
Mar 13 15:19:12 crc kubenswrapper[4898]: I0313 15:19:12.211925 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7t9jw\" (UniqueName: \"kubernetes.io/projected/e373e338-4b01-4034-a9a8-186e53b74e76-kube-api-access-7t9jw\") pod \"e373e338-4b01-4034-a9a8-186e53b74e76\" (UID: \"e373e338-4b01-4034-a9a8-186e53b74e76\") "
Mar 13 15:19:12 crc kubenswrapper[4898]: I0313 15:19:12.212225 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e373e338-4b01-4034-a9a8-186e53b74e76-host\") pod \"e373e338-4b01-4034-a9a8-186e53b74e76\" (UID: \"e373e338-4b01-4034-a9a8-186e53b74e76\") "
Mar 13 15:19:12 crc kubenswrapper[4898]: I0313 15:19:12.212284 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e373e338-4b01-4034-a9a8-186e53b74e76-host" (OuterVolumeSpecName: "host") pod "e373e338-4b01-4034-a9a8-186e53b74e76" (UID: "e373e338-4b01-4034-a9a8-186e53b74e76"). InnerVolumeSpecName "host".
PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 15:19:12 crc kubenswrapper[4898]: I0313 15:19:12.213078 4898 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e373e338-4b01-4034-a9a8-186e53b74e76-host\") on node \"crc\" DevicePath \"\""
Mar 13 15:19:12 crc kubenswrapper[4898]: I0313 15:19:12.218423 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e373e338-4b01-4034-a9a8-186e53b74e76-kube-api-access-7t9jw" (OuterVolumeSpecName: "kube-api-access-7t9jw") pod "e373e338-4b01-4034-a9a8-186e53b74e76" (UID: "e373e338-4b01-4034-a9a8-186e53b74e76"). InnerVolumeSpecName "kube-api-access-7t9jw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 15:19:12 crc kubenswrapper[4898]: I0313 15:19:12.315198 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7t9jw\" (UniqueName: \"kubernetes.io/projected/e373e338-4b01-4034-a9a8-186e53b74e76-kube-api-access-7t9jw\") on node \"crc\" DevicePath \"\""
Mar 13 15:19:12 crc kubenswrapper[4898]: I0313 15:19:12.954551 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d463fc3e5f4f8fa2a117295e0dae7a098377d85b8f032ed128d245e63f4a7497"
Mar 13 15:19:12 crc kubenswrapper[4898]: I0313 15:19:12.954653 4898 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-must-gather-b6mrl/crc-debug-nj5tr"
Mar 13 15:19:13 crc kubenswrapper[4898]: I0313 15:19:13.382857 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-b6mrl/crc-debug-2x48p"]
Mar 13 15:19:13 crc kubenswrapper[4898]: E0313 15:19:13.383415 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e373e338-4b01-4034-a9a8-186e53b74e76" containerName="container-00"
Mar 13 15:19:13 crc kubenswrapper[4898]: I0313 15:19:13.383431 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="e373e338-4b01-4034-a9a8-186e53b74e76" containerName="container-00"
Mar 13 15:19:13 crc kubenswrapper[4898]: I0313 15:19:13.383689 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="e373e338-4b01-4034-a9a8-186e53b74e76" containerName="container-00"
Mar 13 15:19:13 crc kubenswrapper[4898]: I0313 15:19:13.384775 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b6mrl/crc-debug-2x48p"
Mar 13 15:19:13 crc kubenswrapper[4898]: I0313 15:19:13.544422 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjg4v\" (UniqueName: \"kubernetes.io/projected/41d1fb57-3717-403b-a93e-b29818cd7698-kube-api-access-rjg4v\") pod \"crc-debug-2x48p\" (UID: \"41d1fb57-3717-403b-a93e-b29818cd7698\") " pod="openshift-must-gather-b6mrl/crc-debug-2x48p"
Mar 13 15:19:13 crc kubenswrapper[4898]: I0313 15:19:13.544593 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/41d1fb57-3717-403b-a93e-b29818cd7698-host\") pod \"crc-debug-2x48p\" (UID: \"41d1fb57-3717-403b-a93e-b29818cd7698\") " pod="openshift-must-gather-b6mrl/crc-debug-2x48p"
Mar 13 15:19:13 crc kubenswrapper[4898]: I0313 15:19:13.649438 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjg4v\" (UniqueName:
\"kubernetes.io/projected/41d1fb57-3717-403b-a93e-b29818cd7698-kube-api-access-rjg4v\") pod \"crc-debug-2x48p\" (UID: \"41d1fb57-3717-403b-a93e-b29818cd7698\") " pod="openshift-must-gather-b6mrl/crc-debug-2x48p"
Mar 13 15:19:13 crc kubenswrapper[4898]: I0313 15:19:13.649847 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/41d1fb57-3717-403b-a93e-b29818cd7698-host\") pod \"crc-debug-2x48p\" (UID: \"41d1fb57-3717-403b-a93e-b29818cd7698\") " pod="openshift-must-gather-b6mrl/crc-debug-2x48p"
Mar 13 15:19:13 crc kubenswrapper[4898]: I0313 15:19:13.649979 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/41d1fb57-3717-403b-a93e-b29818cd7698-host\") pod \"crc-debug-2x48p\" (UID: \"41d1fb57-3717-403b-a93e-b29818cd7698\") " pod="openshift-must-gather-b6mrl/crc-debug-2x48p"
Mar 13 15:19:13 crc kubenswrapper[4898]: I0313 15:19:13.674326 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjg4v\" (UniqueName: \"kubernetes.io/projected/41d1fb57-3717-403b-a93e-b29818cd7698-kube-api-access-rjg4v\") pod \"crc-debug-2x48p\" (UID: \"41d1fb57-3717-403b-a93e-b29818cd7698\") " pod="openshift-must-gather-b6mrl/crc-debug-2x48p"
Mar 13 15:19:13 crc kubenswrapper[4898]: I0313 15:19:13.703072 4898 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-must-gather-b6mrl/crc-debug-2x48p"
Mar 13 15:19:13 crc kubenswrapper[4898]: I0313 15:19:13.754338 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e373e338-4b01-4034-a9a8-186e53b74e76" path="/var/lib/kubelet/pods/e373e338-4b01-4034-a9a8-186e53b74e76/volumes"
Mar 13 15:19:13 crc kubenswrapper[4898]: W0313 15:19:13.765514 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod41d1fb57_3717_403b_a93e_b29818cd7698.slice/crio-3eaeea731b6d7736d11e4c22a781a0337eb9d1c5ac72176d1a8a73e7e358a5ed WatchSource:0}: Error finding container 3eaeea731b6d7736d11e4c22a781a0337eb9d1c5ac72176d1a8a73e7e358a5ed: Status 404 returned error can't find the container with id 3eaeea731b6d7736d11e4c22a781a0337eb9d1c5ac72176d1a8a73e7e358a5ed
Mar 13 15:19:13 crc kubenswrapper[4898]: I0313 15:19:13.981360 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b6mrl/crc-debug-2x48p" event={"ID":"41d1fb57-3717-403b-a93e-b29818cd7698","Type":"ContainerStarted","Data":"3eaeea731b6d7736d11e4c22a781a0337eb9d1c5ac72176d1a8a73e7e358a5ed"}
Mar 13 15:19:15 crc kubenswrapper[4898]: I0313 15:19:15.000794 4898 generic.go:334] "Generic (PLEG): container finished" podID="41d1fb57-3717-403b-a93e-b29818cd7698" containerID="16ab5a558dadc1a752795f9720b3efadfbd0fc9afffa700ac13e16901fd9b881" exitCode=0
Mar 13 15:19:15 crc kubenswrapper[4898]: I0313 15:19:15.000932 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b6mrl/crc-debug-2x48p" event={"ID":"41d1fb57-3717-403b-a93e-b29818cd7698","Type":"ContainerDied","Data":"16ab5a558dadc1a752795f9720b3efadfbd0fc9afffa700ac13e16901fd9b881"}
Mar 13 15:19:16 crc kubenswrapper[4898]: I0313 15:19:16.160460 4898 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-must-gather-b6mrl/crc-debug-2x48p"
Mar 13 15:19:16 crc kubenswrapper[4898]: I0313 15:19:16.213047 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-b6mrl/crc-debug-2x48p"]
Mar 13 15:19:16 crc kubenswrapper[4898]: I0313 15:19:16.249238 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-b6mrl/crc-debug-2x48p"]
Mar 13 15:19:16 crc kubenswrapper[4898]: I0313 15:19:16.318297 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/41d1fb57-3717-403b-a93e-b29818cd7698-host\") pod \"41d1fb57-3717-403b-a93e-b29818cd7698\" (UID: \"41d1fb57-3717-403b-a93e-b29818cd7698\") "
Mar 13 15:19:16 crc kubenswrapper[4898]: I0313 15:19:16.318467 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/41d1fb57-3717-403b-a93e-b29818cd7698-host" (OuterVolumeSpecName: "host") pod "41d1fb57-3717-403b-a93e-b29818cd7698" (UID: "41d1fb57-3717-403b-a93e-b29818cd7698"). InnerVolumeSpecName "host".
PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 15:19:16 crc kubenswrapper[4898]: I0313 15:19:16.318521 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjg4v\" (UniqueName: \"kubernetes.io/projected/41d1fb57-3717-403b-a93e-b29818cd7698-kube-api-access-rjg4v\") pod \"41d1fb57-3717-403b-a93e-b29818cd7698\" (UID: \"41d1fb57-3717-403b-a93e-b29818cd7698\") "
Mar 13 15:19:16 crc kubenswrapper[4898]: I0313 15:19:16.319114 4898 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/41d1fb57-3717-403b-a93e-b29818cd7698-host\") on node \"crc\" DevicePath \"\""
Mar 13 15:19:16 crc kubenswrapper[4898]: I0313 15:19:16.324534 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41d1fb57-3717-403b-a93e-b29818cd7698-kube-api-access-rjg4v" (OuterVolumeSpecName: "kube-api-access-rjg4v") pod "41d1fb57-3717-403b-a93e-b29818cd7698" (UID: "41d1fb57-3717-403b-a93e-b29818cd7698"). InnerVolumeSpecName "kube-api-access-rjg4v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 15:19:16 crc kubenswrapper[4898]: I0313 15:19:16.422569 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjg4v\" (UniqueName: \"kubernetes.io/projected/41d1fb57-3717-403b-a93e-b29818cd7698-kube-api-access-rjg4v\") on node \"crc\" DevicePath \"\""
Mar 13 15:19:17 crc kubenswrapper[4898]: I0313 15:19:17.028209 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3eaeea731b6d7736d11e4c22a781a0337eb9d1c5ac72176d1a8a73e7e358a5ed"
Mar 13 15:19:17 crc kubenswrapper[4898]: I0313 15:19:17.028708 4898 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-must-gather-b6mrl/crc-debug-2x48p"
Mar 13 15:19:17 crc kubenswrapper[4898]: I0313 15:19:17.428535 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-b6mrl/crc-debug-mwtj6"]
Mar 13 15:19:17 crc kubenswrapper[4898]: E0313 15:19:17.429099 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41d1fb57-3717-403b-a93e-b29818cd7698" containerName="container-00"
Mar 13 15:19:17 crc kubenswrapper[4898]: I0313 15:19:17.429117 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="41d1fb57-3717-403b-a93e-b29818cd7698" containerName="container-00"
Mar 13 15:19:17 crc kubenswrapper[4898]: I0313 15:19:17.429428 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="41d1fb57-3717-403b-a93e-b29818cd7698" containerName="container-00"
Mar 13 15:19:17 crc kubenswrapper[4898]: I0313 15:19:17.430430 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b6mrl/crc-debug-mwtj6"
Mar 13 15:19:17 crc kubenswrapper[4898]: I0313 15:19:17.550131 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjkst\" (UniqueName: \"kubernetes.io/projected/36269787-31ca-4f5e-9044-edab989fec71-kube-api-access-hjkst\") pod \"crc-debug-mwtj6\" (UID: \"36269787-31ca-4f5e-9044-edab989fec71\") " pod="openshift-must-gather-b6mrl/crc-debug-mwtj6"
Mar 13 15:19:17 crc kubenswrapper[4898]: I0313 15:19:17.550738 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/36269787-31ca-4f5e-9044-edab989fec71-host\") pod \"crc-debug-mwtj6\" (UID: \"36269787-31ca-4f5e-9044-edab989fec71\") " pod="openshift-must-gather-b6mrl/crc-debug-mwtj6"
Mar 13 15:19:17 crc kubenswrapper[4898]: I0313 15:19:17.653570 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName:
\"kubernetes.io/host-path/36269787-31ca-4f5e-9044-edab989fec71-host\") pod \"crc-debug-mwtj6\" (UID: \"36269787-31ca-4f5e-9044-edab989fec71\") " pod="openshift-must-gather-b6mrl/crc-debug-mwtj6"
Mar 13 15:19:17 crc kubenswrapper[4898]: I0313 15:19:17.653716 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjkst\" (UniqueName: \"kubernetes.io/projected/36269787-31ca-4f5e-9044-edab989fec71-kube-api-access-hjkst\") pod \"crc-debug-mwtj6\" (UID: \"36269787-31ca-4f5e-9044-edab989fec71\") " pod="openshift-must-gather-b6mrl/crc-debug-mwtj6"
Mar 13 15:19:17 crc kubenswrapper[4898]: I0313 15:19:17.653760 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/36269787-31ca-4f5e-9044-edab989fec71-host\") pod \"crc-debug-mwtj6\" (UID: \"36269787-31ca-4f5e-9044-edab989fec71\") " pod="openshift-must-gather-b6mrl/crc-debug-mwtj6"
Mar 13 15:19:17 crc kubenswrapper[4898]: I0313 15:19:17.755794 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41d1fb57-3717-403b-a93e-b29818cd7698" path="/var/lib/kubelet/pods/41d1fb57-3717-403b-a93e-b29818cd7698/volumes"
Mar 13 15:19:18 crc kubenswrapper[4898]: I0313 15:19:18.131261 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjkst\" (UniqueName: \"kubernetes.io/projected/36269787-31ca-4f5e-9044-edab989fec71-kube-api-access-hjkst\") pod \"crc-debug-mwtj6\" (UID: \"36269787-31ca-4f5e-9044-edab989fec71\") " pod="openshift-must-gather-b6mrl/crc-debug-mwtj6"
Mar 13 15:19:18 crc kubenswrapper[4898]: I0313 15:19:18.350877 4898 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-must-gather-b6mrl/crc-debug-mwtj6"
Mar 13 15:19:18 crc kubenswrapper[4898]: W0313 15:19:18.380546 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36269787_31ca_4f5e_9044_edab989fec71.slice/crio-82b69365c504b9163413bfdb884a969cd476d34070236577bb1192adf7f82a8e WatchSource:0}: Error finding container 82b69365c504b9163413bfdb884a969cd476d34070236577bb1192adf7f82a8e: Status 404 returned error can't find the container with id 82b69365c504b9163413bfdb884a969cd476d34070236577bb1192adf7f82a8e
Mar 13 15:19:19 crc kubenswrapper[4898]: I0313 15:19:19.050049 4898 generic.go:334] "Generic (PLEG): container finished" podID="36269787-31ca-4f5e-9044-edab989fec71" containerID="574ee975060e59643e7318bfce9cb30a5cb62d4a53140e192038f6b50584150c" exitCode=0
Mar 13 15:19:19 crc kubenswrapper[4898]: I0313 15:19:19.050160 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b6mrl/crc-debug-mwtj6" event={"ID":"36269787-31ca-4f5e-9044-edab989fec71","Type":"ContainerDied","Data":"574ee975060e59643e7318bfce9cb30a5cb62d4a53140e192038f6b50584150c"}
Mar 13 15:19:19 crc kubenswrapper[4898]: I0313 15:19:19.050410 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b6mrl/crc-debug-mwtj6" event={"ID":"36269787-31ca-4f5e-9044-edab989fec71","Type":"ContainerStarted","Data":"82b69365c504b9163413bfdb884a969cd476d34070236577bb1192adf7f82a8e"}
Mar 13 15:19:19 crc kubenswrapper[4898]: I0313 15:19:19.103756 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-b6mrl/crc-debug-mwtj6"]
Mar 13 15:19:19 crc kubenswrapper[4898]: I0313 15:19:19.120416 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-b6mrl/crc-debug-mwtj6"]
Mar 13 15:19:20 crc kubenswrapper[4898]: I0313 15:19:20.236160 4898 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-must-gather-b6mrl/crc-debug-mwtj6" Mar 13 15:19:20 crc kubenswrapper[4898]: I0313 15:19:20.317784 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjkst\" (UniqueName: \"kubernetes.io/projected/36269787-31ca-4f5e-9044-edab989fec71-kube-api-access-hjkst\") pod \"36269787-31ca-4f5e-9044-edab989fec71\" (UID: \"36269787-31ca-4f5e-9044-edab989fec71\") " Mar 13 15:19:20 crc kubenswrapper[4898]: I0313 15:19:20.317957 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/36269787-31ca-4f5e-9044-edab989fec71-host\") pod \"36269787-31ca-4f5e-9044-edab989fec71\" (UID: \"36269787-31ca-4f5e-9044-edab989fec71\") " Mar 13 15:19:20 crc kubenswrapper[4898]: I0313 15:19:20.318577 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/36269787-31ca-4f5e-9044-edab989fec71-host" (OuterVolumeSpecName: "host") pod "36269787-31ca-4f5e-9044-edab989fec71" (UID: "36269787-31ca-4f5e-9044-edab989fec71"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 15:19:20 crc kubenswrapper[4898]: I0313 15:19:20.324432 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36269787-31ca-4f5e-9044-edab989fec71-kube-api-access-hjkst" (OuterVolumeSpecName: "kube-api-access-hjkst") pod "36269787-31ca-4f5e-9044-edab989fec71" (UID: "36269787-31ca-4f5e-9044-edab989fec71"). InnerVolumeSpecName "kube-api-access-hjkst". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:19:20 crc kubenswrapper[4898]: I0313 15:19:20.421997 4898 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/36269787-31ca-4f5e-9044-edab989fec71-host\") on node \"crc\" DevicePath \"\"" Mar 13 15:19:20 crc kubenswrapper[4898]: I0313 15:19:20.422039 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hjkst\" (UniqueName: \"kubernetes.io/projected/36269787-31ca-4f5e-9044-edab989fec71-kube-api-access-hjkst\") on node \"crc\" DevicePath \"\"" Mar 13 15:19:21 crc kubenswrapper[4898]: I0313 15:19:21.082589 4898 scope.go:117] "RemoveContainer" containerID="574ee975060e59643e7318bfce9cb30a5cb62d4a53140e192038f6b50584150c" Mar 13 15:19:21 crc kubenswrapper[4898]: I0313 15:19:21.082614 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b6mrl/crc-debug-mwtj6" Mar 13 15:19:21 crc kubenswrapper[4898]: I0313 15:19:21.752836 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36269787-31ca-4f5e-9044-edab989fec71" path="/var/lib/kubelet/pods/36269787-31ca-4f5e-9044-edab989fec71/volumes" Mar 13 15:19:29 crc kubenswrapper[4898]: I0313 15:19:29.087365 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" Mar 13 15:19:29 crc kubenswrapper[4898]: I0313 15:19:29.093047 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" Mar 13 15:19:49 crc kubenswrapper[4898]: I0313 15:19:49.055445 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_a27645af-4d4a-4a73-ba8a-488a9ae199ac/aodh-api/0.log" Mar 13 15:19:49 crc kubenswrapper[4898]: I0313 15:19:49.133985 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 15:19:49 crc kubenswrapper[4898]: I0313 15:19:49.134055 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 15:19:49 crc kubenswrapper[4898]: I0313 15:19:49.343762 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_a27645af-4d4a-4a73-ba8a-488a9ae199ac/aodh-listener/0.log" Mar 13 15:19:49 crc kubenswrapper[4898]: I0313 15:19:49.381606 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_a27645af-4d4a-4a73-ba8a-488a9ae199ac/aodh-evaluator/0.log" Mar 13 15:19:49 crc kubenswrapper[4898]: I0313 15:19:49.405131 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_a27645af-4d4a-4a73-ba8a-488a9ae199ac/aodh-notifier/0.log" Mar 13 15:19:49 crc kubenswrapper[4898]: I0313 15:19:49.555627 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-b9dc95d4b-bvhlz_fd4bf680-c8b7-4721-9595-9a8ed40410d2/barbican-api/0.log" Mar 13 15:19:49 crc kubenswrapper[4898]: I0313 15:19:49.627396 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-b9dc95d4b-bvhlz_fd4bf680-c8b7-4721-9595-9a8ed40410d2/barbican-api-log/0.log" Mar 13 15:19:49 crc kubenswrapper[4898]: I0313 15:19:49.705505 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-fcdc98bd8-xdl6x_272aa2e8-f1ed-4a08-b5a3-aecd06c4c6d2/barbican-keystone-listener/0.log" Mar 13 15:19:49 crc kubenswrapper[4898]: I0313 15:19:49.862682 4898 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-keystone-listener-fcdc98bd8-xdl6x_272aa2e8-f1ed-4a08-b5a3-aecd06c4c6d2/barbican-keystone-listener-log/0.log" Mar 13 15:19:50 crc kubenswrapper[4898]: I0313 15:19:50.564934 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-795749dc8c-sm2hl_8b16e588-d353-4100-b143-b84420c42e30/barbican-worker/0.log" Mar 13 15:19:50 crc kubenswrapper[4898]: I0313 15:19:50.629811 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-795749dc8c-sm2hl_8b16e588-d353-4100-b143-b84420c42e30/barbican-worker-log/0.log" Mar 13 15:19:50 crc kubenswrapper[4898]: I0313 15:19:50.839684 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-xphxf_6d8bbc5a-39da-48b8-82d1-6df496fda612/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 15:19:50 crc kubenswrapper[4898]: I0313 15:19:50.864596 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_02f7d483-aecb-4a39-babc-6d9598090c4b/ceilometer-central-agent/1.log" Mar 13 15:19:51 crc kubenswrapper[4898]: I0313 15:19:51.015194 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_02f7d483-aecb-4a39-babc-6d9598090c4b/ceilometer-central-agent/0.log" Mar 13 15:19:51 crc kubenswrapper[4898]: I0313 15:19:51.065971 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_02f7d483-aecb-4a39-babc-6d9598090c4b/ceilometer-notification-agent/0.log" Mar 13 15:19:51 crc kubenswrapper[4898]: I0313 15:19:51.107109 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_02f7d483-aecb-4a39-babc-6d9598090c4b/proxy-httpd/0.log" Mar 13 15:19:51 crc kubenswrapper[4898]: I0313 15:19:51.127783 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_02f7d483-aecb-4a39-babc-6d9598090c4b/sg-core/0.log" Mar 13 15:19:51 crc 
kubenswrapper[4898]: I0313 15:19:51.312033 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_bda33d23-490a-4099-954b-c613ab5d5c73/cinder-api-log/0.log" Mar 13 15:19:51 crc kubenswrapper[4898]: I0313 15:19:51.356919 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_bda33d23-490a-4099-954b-c613ab5d5c73/cinder-api/0.log" Mar 13 15:19:51 crc kubenswrapper[4898]: I0313 15:19:51.563615 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_a9a7064c-4ed5-4948-9e7e-7d40794e371e/cinder-scheduler/1.log" Mar 13 15:19:51 crc kubenswrapper[4898]: I0313 15:19:51.603432 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_a9a7064c-4ed5-4948-9e7e-7d40794e371e/cinder-scheduler/0.log" Mar 13 15:19:51 crc kubenswrapper[4898]: I0313 15:19:51.650521 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_a9a7064c-4ed5-4948-9e7e-7d40794e371e/probe/0.log" Mar 13 15:19:51 crc kubenswrapper[4898]: I0313 15:19:51.786704 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-57hwn_295e7c32-75f1-4eee-a126-2d4547c56f24/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 15:19:52 crc kubenswrapper[4898]: I0313 15:19:52.528478 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-5s7mz_ac094822-6272-4730-ab0b-16f0116426b5/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 15:19:52 crc kubenswrapper[4898]: I0313 15:19:52.592066 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6f6df4f56c-k4ntr_dd51a575-1651-4891-941f-3e0fe447e81d/init/0.log" Mar 13 15:19:52 crc kubenswrapper[4898]: I0313 15:19:52.805471 4898 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_dnsmasq-dns-6f6df4f56c-k4ntr_dd51a575-1651-4891-941f-3e0fe447e81d/init/0.log" Mar 13 15:19:52 crc kubenswrapper[4898]: I0313 15:19:52.886926 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6f6df4f56c-k4ntr_dd51a575-1651-4891-941f-3e0fe447e81d/dnsmasq-dns/0.log" Mar 13 15:19:52 crc kubenswrapper[4898]: I0313 15:19:52.906594 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-rbttg_05e315eb-34b1-4099-b676-b0238f3cb5c5/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 15:19:53 crc kubenswrapper[4898]: I0313 15:19:53.103235 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_a7cdbc1c-79cc-441b-a08c-c61b717d82c9/glance-log/0.log" Mar 13 15:19:53 crc kubenswrapper[4898]: I0313 15:19:53.128880 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_a7cdbc1c-79cc-441b-a08c-c61b717d82c9/glance-httpd/0.log" Mar 13 15:19:53 crc kubenswrapper[4898]: I0313 15:19:53.309847 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_f666d519-2c39-4e93-823d-e5a3fcfd0d5a/glance-log/0.log" Mar 13 15:19:53 crc kubenswrapper[4898]: I0313 15:19:53.418556 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_f666d519-2c39-4e93-823d-e5a3fcfd0d5a/glance-httpd/0.log" Mar 13 15:19:53 crc kubenswrapper[4898]: I0313 15:19:53.934247 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-648cbb8b5f-4kb5b_739e9c4a-9843-4edf-a045-2f7ef8d15b5e/heat-api/0.log" Mar 13 15:19:53 crc kubenswrapper[4898]: I0313 15:19:53.962301 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-6b446d7755-5724r_0f20ec1d-823e-4695-859e-bdc538e602d9/heat-engine/0.log" Mar 13 15:19:54 crc kubenswrapper[4898]: I0313 
15:19:54.007529 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-v8np4_8c6bec5a-faac-4793-8c18-9f5b2faf2c95/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 15:19:54 crc kubenswrapper[4898]: I0313 15:19:54.159282 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-5df9b5999-7tt4b_03c552ae-5860-4468-a612-7af3d3587df4/heat-cfnapi/0.log" Mar 13 15:19:54 crc kubenswrapper[4898]: I0313 15:19:54.230763 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-tr644_8b4abb6a-5797-47be-96a0-69173649e5fa/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 15:19:54 crc kubenswrapper[4898]: I0313 15:19:54.442617 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29556901-6pnrd_3d71da57-c929-47d7-89bd-8e4e3c7f3ca0/keystone-cron/0.log" Mar 13 15:19:54 crc kubenswrapper[4898]: I0313 15:19:54.496571 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_b7452a36-0169-4cfe-9ede-ef4d0ef072d9/kube-state-metrics/0.log" Mar 13 15:19:54 crc kubenswrapper[4898]: I0313 15:19:54.794420 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_logging-edpm-deployment-openstack-edpm-ipam-tw885_efff948d-3073-4635-bc2c-2a8fc746c6b8/logging-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 15:19:54 crc kubenswrapper[4898]: I0313 15:19:54.825474 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z_226c01c4-d0f3-4784-8e93-36d1de6d593f/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 15:19:55 crc kubenswrapper[4898]: I0313 15:19:55.067240 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mysqld-exporter-0_8a198c14-e13f-4858-87c4-de6be0fa8d0c/mysqld-exporter/0.log" Mar 13 15:19:55 crc kubenswrapper[4898]: I0313 
15:19:55.140102 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-87574c74-kqmjb_d149c7e3-df46-44b5-8a66-8a0fbb5a8554/keystone-api/0.log" Mar 13 15:19:55 crc kubenswrapper[4898]: I0313 15:19:55.412297 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-776df44c77-g64lv_4a679fb4-8d85-4835-a048-08c4b61aa158/neutron-api/0.log" Mar 13 15:19:55 crc kubenswrapper[4898]: I0313 15:19:55.500829 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2_abb37cb2-ec06-4c96-882f-7781fbe053e0/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 15:19:55 crc kubenswrapper[4898]: I0313 15:19:55.526826 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-776df44c77-g64lv_4a679fb4-8d85-4835-a048-08c4b61aa158/neutron-httpd/0.log" Mar 13 15:19:56 crc kubenswrapper[4898]: I0313 15:19:56.209546 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_ef7dd576-1005-4fdb-95c1-e5da9f04b177/nova-api-log/0.log" Mar 13 15:19:56 crc kubenswrapper[4898]: I0313 15:19:56.218188 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_9796fb40-37f0-4d8a-929f-4bb6295388a4/nova-cell0-conductor-conductor/0.log" Mar 13 15:19:56 crc kubenswrapper[4898]: I0313 15:19:56.489046 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_50cbae0e-4bf9-41b0-8c87-b551f782aecf/nova-cell1-conductor-conductor/0.log" Mar 13 15:19:56 crc kubenswrapper[4898]: I0313 15:19:56.580077 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_041221f0-b346-4310-ab8e-a8f2440c6034/nova-cell1-novncproxy-novncproxy/0.log" Mar 13 15:19:56 crc kubenswrapper[4898]: I0313 15:19:56.678836 4898 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-api-0_ef7dd576-1005-4fdb-95c1-e5da9f04b177/nova-api-api/0.log" Mar 13 15:19:56 crc kubenswrapper[4898]: I0313 15:19:56.851717 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-28xpg_acaa3912-3e27-4272-8e4a-3ab67fd34b92/nova-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 15:19:56 crc kubenswrapper[4898]: I0313 15:19:56.956406 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_8a0631b3-ea9b-4e0a-bb3e-35b283f1ad17/nova-metadata-log/0.log" Mar 13 15:19:57 crc kubenswrapper[4898]: I0313 15:19:57.394459 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_97d388e1-b1b3-409d-b7c5-38b37734a8e6/nova-scheduler-scheduler/0.log" Mar 13 15:19:57 crc kubenswrapper[4898]: I0313 15:19:57.611701 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_8a0631b3-ea9b-4e0a-bb3e-35b283f1ad17/nova-metadata-metadata/0.log" Mar 13 15:19:57 crc kubenswrapper[4898]: I0313 15:19:57.662524 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f/mysql-bootstrap/0.log" Mar 13 15:19:57 crc kubenswrapper[4898]: I0313 15:19:57.966652 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f/galera/0.log" Mar 13 15:19:57 crc kubenswrapper[4898]: I0313 15:19:57.979214 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f/mysql-bootstrap/0.log" Mar 13 15:19:58 crc kubenswrapper[4898]: I0313 15:19:58.108717 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f/galera/1.log" Mar 13 15:19:58 crc kubenswrapper[4898]: I0313 15:19:58.198669 4898 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_openstack-galera-0_e5d53cf3-113e-4391-b3a9-4e1f81836e26/mysql-bootstrap/0.log" Mar 13 15:19:58 crc kubenswrapper[4898]: I0313 15:19:58.490755 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e5d53cf3-113e-4391-b3a9-4e1f81836e26/mysql-bootstrap/0.log" Mar 13 15:19:58 crc kubenswrapper[4898]: I0313 15:19:58.546772 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e5d53cf3-113e-4391-b3a9-4e1f81836e26/galera/0.log" Mar 13 15:19:58 crc kubenswrapper[4898]: I0313 15:19:58.630695 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e5d53cf3-113e-4391-b3a9-4e1f81836e26/galera/1.log" Mar 13 15:19:58 crc kubenswrapper[4898]: I0313 15:19:58.793286 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_124bd4ee-d9f0-408f-a46e-4d143e8ab02a/openstackclient/0.log" Mar 13 15:19:58 crc kubenswrapper[4898]: I0313 15:19:58.811219 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-j79bj_a506ef1a-354a-49c8-b63d-4db4b9ecdcfe/ovn-controller/0.log" Mar 13 15:19:59 crc kubenswrapper[4898]: I0313 15:19:59.132691 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-8mxxb_515cda05-1d7b-4252-94fc-056b38ec502a/openstack-network-exporter/0.log" Mar 13 15:19:59 crc kubenswrapper[4898]: I0313 15:19:59.170609 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-r9tmf_f71b72a8-f179-454c-8d2e-4ac829842622/ovsdb-server-init/0.log" Mar 13 15:19:59 crc kubenswrapper[4898]: I0313 15:19:59.338711 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-r9tmf_f71b72a8-f179-454c-8d2e-4ac829842622/ovsdb-server-init/0.log" Mar 13 15:19:59 crc kubenswrapper[4898]: I0313 15:19:59.380806 4898 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-ovs-r9tmf_f71b72a8-f179-454c-8d2e-4ac829842622/ovs-vswitchd/0.log" Mar 13 15:19:59 crc kubenswrapper[4898]: I0313 15:19:59.489388 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-r9tmf_f71b72a8-f179-454c-8d2e-4ac829842622/ovsdb-server/0.log" Mar 13 15:19:59 crc kubenswrapper[4898]: I0313 15:19:59.567191 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-6xdgx_a9f7be15-746c-45be-92a1-2fa2a961f636/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 15:19:59 crc kubenswrapper[4898]: I0313 15:19:59.730885 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_902753c9-2101-4509-9283-55070ac3787e/openstack-network-exporter/0.log" Mar 13 15:19:59 crc kubenswrapper[4898]: I0313 15:19:59.735435 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_902753c9-2101-4509-9283-55070ac3787e/ovn-northd/0.log" Mar 13 15:19:59 crc kubenswrapper[4898]: I0313 15:19:59.926404 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10/openstack-network-exporter/0.log" Mar 13 15:19:59 crc kubenswrapper[4898]: I0313 15:19:59.982550 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10/ovsdbserver-nb/0.log" Mar 13 15:20:00 crc kubenswrapper[4898]: I0313 15:20:00.158270 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556920-jxmxv"] Mar 13 15:20:00 crc kubenswrapper[4898]: E0313 15:20:00.160088 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36269787-31ca-4f5e-9044-edab989fec71" containerName="container-00" Mar 13 15:20:00 crc kubenswrapper[4898]: I0313 15:20:00.160105 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="36269787-31ca-4f5e-9044-edab989fec71" 
containerName="container-00" Mar 13 15:20:00 crc kubenswrapper[4898]: I0313 15:20:00.160794 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="36269787-31ca-4f5e-9044-edab989fec71" containerName="container-00" Mar 13 15:20:00 crc kubenswrapper[4898]: I0313 15:20:00.163955 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556920-jxmxv" Mar 13 15:20:00 crc kubenswrapper[4898]: I0313 15:20:00.175416 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556920-jxmxv"] Mar 13 15:20:00 crc kubenswrapper[4898]: I0313 15:20:00.180870 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 15:20:00 crc kubenswrapper[4898]: I0313 15:20:00.181064 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 15:20:00 crc kubenswrapper[4898]: I0313 15:20:00.183002 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps" Mar 13 15:20:00 crc kubenswrapper[4898]: I0313 15:20:00.224300 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_111bf23f-be00-46ab-97fe-a36465735164/openstack-network-exporter/0.log" Mar 13 15:20:00 crc kubenswrapper[4898]: I0313 15:20:00.258315 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_111bf23f-be00-46ab-97fe-a36465735164/ovsdbserver-sb/0.log" Mar 13 15:20:00 crc kubenswrapper[4898]: I0313 15:20:00.302859 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vl5gv\" (UniqueName: \"kubernetes.io/projected/6caf987f-dbe2-48d2-8138-107de40fe224-kube-api-access-vl5gv\") pod \"auto-csr-approver-29556920-jxmxv\" (UID: \"6caf987f-dbe2-48d2-8138-107de40fe224\") " 
pod="openshift-infra/auto-csr-approver-29556920-jxmxv" Mar 13 15:20:00 crc kubenswrapper[4898]: I0313 15:20:00.404688 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vl5gv\" (UniqueName: \"kubernetes.io/projected/6caf987f-dbe2-48d2-8138-107de40fe224-kube-api-access-vl5gv\") pod \"auto-csr-approver-29556920-jxmxv\" (UID: \"6caf987f-dbe2-48d2-8138-107de40fe224\") " pod="openshift-infra/auto-csr-approver-29556920-jxmxv" Mar 13 15:20:00 crc kubenswrapper[4898]: I0313 15:20:00.440762 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vl5gv\" (UniqueName: \"kubernetes.io/projected/6caf987f-dbe2-48d2-8138-107de40fe224-kube-api-access-vl5gv\") pod \"auto-csr-approver-29556920-jxmxv\" (UID: \"6caf987f-dbe2-48d2-8138-107de40fe224\") " pod="openshift-infra/auto-csr-approver-29556920-jxmxv" Mar 13 15:20:00 crc kubenswrapper[4898]: I0313 15:20:00.448284 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-647f998784-xvcjw_fa7825b5-b19b-44bb-8d23-bb121e669780/placement-api/0.log" Mar 13 15:20:00 crc kubenswrapper[4898]: I0313 15:20:00.517571 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556920-jxmxv" Mar 13 15:20:00 crc kubenswrapper[4898]: I0313 15:20:00.631868 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_d555bd54-f4d5-4b06-9517-32b4fe687f4b/init-config-reloader/0.log" Mar 13 15:20:00 crc kubenswrapper[4898]: I0313 15:20:00.679440 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-647f998784-xvcjw_fa7825b5-b19b-44bb-8d23-bb121e669780/placement-log/0.log" Mar 13 15:20:00 crc kubenswrapper[4898]: I0313 15:20:00.926890 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_d555bd54-f4d5-4b06-9517-32b4fe687f4b/thanos-sidecar/0.log" Mar 13 15:20:00 crc kubenswrapper[4898]: I0313 15:20:00.939878 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_d555bd54-f4d5-4b06-9517-32b4fe687f4b/init-config-reloader/0.log" Mar 13 15:20:00 crc kubenswrapper[4898]: I0313 15:20:00.974559 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_d555bd54-f4d5-4b06-9517-32b4fe687f4b/prometheus/0.log" Mar 13 15:20:00 crc kubenswrapper[4898]: I0313 15:20:00.995159 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_d555bd54-f4d5-4b06-9517-32b4fe687f4b/config-reloader/0.log" Mar 13 15:20:01 crc kubenswrapper[4898]: I0313 15:20:01.164400 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556920-jxmxv"] Mar 13 15:20:01 crc kubenswrapper[4898]: I0313 15:20:01.191370 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835/setup-container/0.log" Mar 13 15:20:01 crc kubenswrapper[4898]: I0313 15:20:01.452107 4898 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_rabbitmq-cell1-server-0_6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835/setup-container/0.log" Mar 13 15:20:01 crc kubenswrapper[4898]: I0313 15:20:01.533108 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835/rabbitmq/0.log" Mar 13 15:20:01 crc kubenswrapper[4898]: I0313 15:20:01.572026 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_10c321a0-5ea5-4b5c-8695-1f7b2dcad32b/setup-container/0.log" Mar 13 15:20:01 crc kubenswrapper[4898]: I0313 15:20:01.664233 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556920-jxmxv" event={"ID":"6caf987f-dbe2-48d2-8138-107de40fe224","Type":"ContainerStarted","Data":"da6e0c31d7181dd8304aed4238096af773ce4f2619cef6ffbbe314456d9083a4"} Mar 13 15:20:01 crc kubenswrapper[4898]: I0313 15:20:01.850965 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_10c321a0-5ea5-4b5c-8695-1f7b2dcad32b/setup-container/0.log" Mar 13 15:20:01 crc kubenswrapper[4898]: I0313 15:20:01.918955 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_10c321a0-5ea5-4b5c-8695-1f7b2dcad32b/rabbitmq/0.log" Mar 13 15:20:01 crc kubenswrapper[4898]: I0313 15:20:01.922535 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-1_ec19264c-1313-492d-b59b-4e5916b988f5/setup-container/0.log" Mar 13 15:20:02 crc kubenswrapper[4898]: I0313 15:20:02.190446 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-1_ec19264c-1313-492d-b59b-4e5916b988f5/setup-container/0.log" Mar 13 15:20:02 crc kubenswrapper[4898]: I0313 15:20:02.218148 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-1_ec19264c-1313-492d-b59b-4e5916b988f5/rabbitmq/0.log" Mar 13 15:20:02 crc kubenswrapper[4898]: I0313 15:20:02.233097 4898 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-2_8d188301-848c-4cf6-a204-e1110714c1be/setup-container/0.log" Mar 13 15:20:02 crc kubenswrapper[4898]: I0313 15:20:02.409943 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-2_8d188301-848c-4cf6-a204-e1110714c1be/setup-container/0.log" Mar 13 15:20:02 crc kubenswrapper[4898]: I0313 15:20:02.501386 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-r7rs2_8a674c4a-b209-4ea0-83b0-c46f820a81ef/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 15:20:02 crc kubenswrapper[4898]: I0313 15:20:02.587455 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-2_8d188301-848c-4cf6-a204-e1110714c1be/rabbitmq/0.log" Mar 13 15:20:02 crc kubenswrapper[4898]: I0313 15:20:02.731351 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-g6vnq_6329b434-b1be-4490-9a50-351366b18d79/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 15:20:02 crc kubenswrapper[4898]: I0313 15:20:02.861978 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs_98336335-4b60-4ddf-8fe8-4ea6b69d47ef/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 15:20:03 crc kubenswrapper[4898]: I0313 15:20:03.006990 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-sjx4b_05d3f0e4-c029-4e2f-a3c1-471faa671767/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 15:20:03 crc kubenswrapper[4898]: I0313 15:20:03.141586 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-x8wvs_e7c70549-1fc7-42c2-8c81-075c611671ae/ssh-known-hosts-edpm-deployment/0.log" Mar 13 15:20:03 crc kubenswrapper[4898]: I0313 15:20:03.447669 4898 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7f9cbdc5df-5tx5z_1a57db04-0dc9-4d63-8d08-dd4309b19496/proxy-server/0.log" Mar 13 15:20:03 crc kubenswrapper[4898]: I0313 15:20:03.556987 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-ztbp9_4a6f0bfb-5db5-440c-a93f-0d6fe159401d/swift-ring-rebalance/0.log" Mar 13 15:20:03 crc kubenswrapper[4898]: I0313 15:20:03.583408 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7f9cbdc5df-5tx5z_1a57db04-0dc9-4d63-8d08-dd4309b19496/proxy-httpd/0.log" Mar 13 15:20:03 crc kubenswrapper[4898]: I0313 15:20:03.804693 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_794bd82b-e289-4b31-b0cf-f1285452e783/account-reaper/0.log" Mar 13 15:20:03 crc kubenswrapper[4898]: I0313 15:20:03.807187 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_794bd82b-e289-4b31-b0cf-f1285452e783/account-auditor/0.log" Mar 13 15:20:03 crc kubenswrapper[4898]: I0313 15:20:03.950733 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_794bd82b-e289-4b31-b0cf-f1285452e783/account-replicator/0.log" Mar 13 15:20:04 crc kubenswrapper[4898]: I0313 15:20:04.007601 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_794bd82b-e289-4b31-b0cf-f1285452e783/account-server/0.log" Mar 13 15:20:04 crc kubenswrapper[4898]: I0313 15:20:04.034325 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_794bd82b-e289-4b31-b0cf-f1285452e783/container-auditor/0.log" Mar 13 15:20:04 crc kubenswrapper[4898]: I0313 15:20:04.127736 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_794bd82b-e289-4b31-b0cf-f1285452e783/container-replicator/0.log" Mar 13 15:20:04 crc kubenswrapper[4898]: I0313 15:20:04.216336 4898 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_794bd82b-e289-4b31-b0cf-f1285452e783/container-server/0.log" Mar 13 15:20:04 crc kubenswrapper[4898]: I0313 15:20:04.236320 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_794bd82b-e289-4b31-b0cf-f1285452e783/container-updater/0.log" Mar 13 15:20:04 crc kubenswrapper[4898]: I0313 15:20:04.314555 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_794bd82b-e289-4b31-b0cf-f1285452e783/object-auditor/0.log" Mar 13 15:20:04 crc kubenswrapper[4898]: I0313 15:20:04.382646 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_794bd82b-e289-4b31-b0cf-f1285452e783/object-expirer/0.log" Mar 13 15:20:04 crc kubenswrapper[4898]: I0313 15:20:04.513640 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_794bd82b-e289-4b31-b0cf-f1285452e783/object-server/0.log" Mar 13 15:20:04 crc kubenswrapper[4898]: I0313 15:20:04.527805 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_794bd82b-e289-4b31-b0cf-f1285452e783/object-replicator/0.log" Mar 13 15:20:04 crc kubenswrapper[4898]: I0313 15:20:04.576854 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_794bd82b-e289-4b31-b0cf-f1285452e783/object-updater/0.log" Mar 13 15:20:04 crc kubenswrapper[4898]: I0313 15:20:04.676359 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_794bd82b-e289-4b31-b0cf-f1285452e783/rsync/0.log" Mar 13 15:20:04 crc kubenswrapper[4898]: I0313 15:20:04.713139 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556920-jxmxv" event={"ID":"6caf987f-dbe2-48d2-8138-107de40fe224","Type":"ContainerStarted","Data":"897e6256c632c42295b8912f57e8c6461493cccdecec91e50f43154fdcb913e4"} Mar 13 15:20:04 crc kubenswrapper[4898]: I0313 15:20:04.732532 4898 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556920-jxmxv" podStartSLOduration=2.72588178 podStartE2EDuration="4.732508737s" podCreationTimestamp="2026-03-13 15:20:00 +0000 UTC" firstStartedPulling="2026-03-13 15:20:01.181286201 +0000 UTC m=+5036.182874440" lastFinishedPulling="2026-03-13 15:20:03.187913158 +0000 UTC m=+5038.189501397" observedRunningTime="2026-03-13 15:20:04.725253395 +0000 UTC m=+5039.726841634" watchObservedRunningTime="2026-03-13 15:20:04.732508737 +0000 UTC m=+5039.734096986" Mar 13 15:20:04 crc kubenswrapper[4898]: I0313 15:20:04.757601 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_794bd82b-e289-4b31-b0cf-f1285452e783/swift-recon-cron/0.log" Mar 13 15:20:05 crc kubenswrapper[4898]: I0313 15:20:05.033040 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk_9a62fd58-a586-4473-abfe-4e227cad9900/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 15:20:05 crc kubenswrapper[4898]: I0313 15:20:05.116518 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6_5139c85e-1d3d-4fe7-94aa-8efde03b43e0/telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 15:20:05 crc kubenswrapper[4898]: I0313 15:20:05.343250 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_382b2b09-8110-411f-9d86-53e73df67fe6/test-operator-logs-container/0.log" Mar 13 15:20:05 crc kubenswrapper[4898]: I0313 15:20:05.596703 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-wxr7c_271e9163-4e9c-4c79-a0b4-be373e97956c/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 15:20:05 crc kubenswrapper[4898]: I0313 15:20:05.693542 4898 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_d19e8770-f0c1-491e-96c9-f737386ab3b0/tempest-tests-tempest-tests-runner/0.log" Mar 13 15:20:05 crc kubenswrapper[4898]: I0313 15:20:05.726067 4898 generic.go:334] "Generic (PLEG): container finished" podID="6caf987f-dbe2-48d2-8138-107de40fe224" containerID="897e6256c632c42295b8912f57e8c6461493cccdecec91e50f43154fdcb913e4" exitCode=0 Mar 13 15:20:05 crc kubenswrapper[4898]: I0313 15:20:05.726118 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556920-jxmxv" event={"ID":"6caf987f-dbe2-48d2-8138-107de40fe224","Type":"ContainerDied","Data":"897e6256c632c42295b8912f57e8c6461493cccdecec91e50f43154fdcb913e4"} Mar 13 15:20:07 crc kubenswrapper[4898]: I0313 15:20:07.176689 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556920-jxmxv" Mar 13 15:20:07 crc kubenswrapper[4898]: I0313 15:20:07.323410 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vl5gv\" (UniqueName: \"kubernetes.io/projected/6caf987f-dbe2-48d2-8138-107de40fe224-kube-api-access-vl5gv\") pod \"6caf987f-dbe2-48d2-8138-107de40fe224\" (UID: \"6caf987f-dbe2-48d2-8138-107de40fe224\") " Mar 13 15:20:07 crc kubenswrapper[4898]: I0313 15:20:07.330635 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6caf987f-dbe2-48d2-8138-107de40fe224-kube-api-access-vl5gv" (OuterVolumeSpecName: "kube-api-access-vl5gv") pod "6caf987f-dbe2-48d2-8138-107de40fe224" (UID: "6caf987f-dbe2-48d2-8138-107de40fe224"). InnerVolumeSpecName "kube-api-access-vl5gv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:20:07 crc kubenswrapper[4898]: I0313 15:20:07.426276 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vl5gv\" (UniqueName: \"kubernetes.io/projected/6caf987f-dbe2-48d2-8138-107de40fe224-kube-api-access-vl5gv\") on node \"crc\" DevicePath \"\"" Mar 13 15:20:07 crc kubenswrapper[4898]: I0313 15:20:07.757300 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556920-jxmxv" Mar 13 15:20:07 crc kubenswrapper[4898]: I0313 15:20:07.760867 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556920-jxmxv" event={"ID":"6caf987f-dbe2-48d2-8138-107de40fe224","Type":"ContainerDied","Data":"da6e0c31d7181dd8304aed4238096af773ce4f2619cef6ffbbe314456d9083a4"} Mar 13 15:20:07 crc kubenswrapper[4898]: I0313 15:20:07.760935 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da6e0c31d7181dd8304aed4238096af773ce4f2619cef6ffbbe314456d9083a4" Mar 13 15:20:07 crc kubenswrapper[4898]: I0313 15:20:07.841883 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556914-52q7k"] Mar 13 15:20:07 crc kubenswrapper[4898]: I0313 15:20:07.852940 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556914-52q7k"] Mar 13 15:20:09 crc kubenswrapper[4898]: I0313 15:20:09.757206 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad898ac1-9e95-4eb8-a88b-927e3d6364f6" path="/var/lib/kubelet/pods/ad898ac1-9e95-4eb8-a88b-927e3d6364f6/volumes" Mar 13 15:20:19 crc kubenswrapper[4898]: I0313 15:20:19.133883 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 13 15:20:19 crc kubenswrapper[4898]: I0313 15:20:19.134461 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 15:20:20 crc kubenswrapper[4898]: I0313 15:20:20.189233 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_67ef28b0-acc3-400e-8296-a541fc3b89f0/memcached/0.log" Mar 13 15:20:38 crc kubenswrapper[4898]: I0313 15:20:38.740292 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-d47688694-gtlps_45efd8ce-26db-4511-bd88-2e7467d02bbb/manager/0.log" Mar 13 15:20:38 crc kubenswrapper[4898]: I0313 15:20:38.952949 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-66d56f6ff4-4n5rx_3c955ebc-98fd-4921-9923-6151a50e8eec/manager/0.log" Mar 13 15:20:39 crc kubenswrapper[4898]: I0313 15:20:39.216630 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_fde5bd3580cf5bff9a9984f67cb14ffd9a5909a2de508af4ad53062896dgm57_a57932fc-ce83-4258-95a8-65f29c0cfd5a/util/0.log" Mar 13 15:20:39 crc kubenswrapper[4898]: I0313 15:20:39.452406 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_fde5bd3580cf5bff9a9984f67cb14ffd9a5909a2de508af4ad53062896dgm57_a57932fc-ce83-4258-95a8-65f29c0cfd5a/util/0.log" Mar 13 15:20:39 crc kubenswrapper[4898]: I0313 15:20:39.454302 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_fde5bd3580cf5bff9a9984f67cb14ffd9a5909a2de508af4ad53062896dgm57_a57932fc-ce83-4258-95a8-65f29c0cfd5a/pull/0.log" Mar 13 15:20:39 crc kubenswrapper[4898]: I0313 15:20:39.454952 4898 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_fde5bd3580cf5bff9a9984f67cb14ffd9a5909a2de508af4ad53062896dgm57_a57932fc-ce83-4258-95a8-65f29c0cfd5a/pull/0.log" Mar 13 15:20:39 crc kubenswrapper[4898]: I0313 15:20:39.722585 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_fde5bd3580cf5bff9a9984f67cb14ffd9a5909a2de508af4ad53062896dgm57_a57932fc-ce83-4258-95a8-65f29c0cfd5a/util/0.log" Mar 13 15:20:39 crc kubenswrapper[4898]: I0313 15:20:39.758535 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_fde5bd3580cf5bff9a9984f67cb14ffd9a5909a2de508af4ad53062896dgm57_a57932fc-ce83-4258-95a8-65f29c0cfd5a/pull/0.log" Mar 13 15:20:39 crc kubenswrapper[4898]: I0313 15:20:39.775718 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_fde5bd3580cf5bff9a9984f67cb14ffd9a5909a2de508af4ad53062896dgm57_a57932fc-ce83-4258-95a8-65f29c0cfd5a/extract/0.log" Mar 13 15:20:40 crc kubenswrapper[4898]: I0313 15:20:40.100074 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5964f64c48-mf8h6_fb7b2f97-fca8-41d2-9be7-d40fac94c171/manager/0.log" Mar 13 15:20:40 crc kubenswrapper[4898]: I0313 15:20:40.374891 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-77b6666d85-tqp4b_ea0ad033-9a48-4e42-a237-f27cacf03adc/manager/0.log" Mar 13 15:20:40 crc kubenswrapper[4898]: I0313 15:20:40.407618 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d9d6b584d-jngrl_a80d01d5-0201-4b2e-974c-ac5b42ac8df4/manager/1.log" Mar 13 15:20:40 crc kubenswrapper[4898]: I0313 15:20:40.720994 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-984cd4dcf-p9d5v_0d88a5d2-a852-409e-b4bd-939d1c2b9090/manager/0.log" Mar 13 15:20:40 crc 
kubenswrapper[4898]: I0313 15:20:40.728254 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d9d6b584d-jngrl_a80d01d5-0201-4b2e-974c-ac5b42ac8df4/manager/0.log" Mar 13 15:20:41 crc kubenswrapper[4898]: I0313 15:20:41.003532 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5bc894d9b-v99bm_32b5ebfd-38d9-456e-bb21-7332323239d1/manager/1.log" Mar 13 15:20:41 crc kubenswrapper[4898]: I0313 15:20:41.059122 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5bc894d9b-v99bm_32b5ebfd-38d9-456e-bb21-7332323239d1/manager/0.log" Mar 13 15:20:41 crc kubenswrapper[4898]: I0313 15:20:41.120205 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-54dc5b8f8d-8kcsw_c35de09d-7f21-47d3-aac5-a26b15b0a496/manager/0.log" Mar 13 15:20:41 crc kubenswrapper[4898]: I0313 15:20:41.366357 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-684f77d66d-s5zh6_d24bb749-0b71-456b-80e4-fdf6dd23ba30/manager/0.log" Mar 13 15:20:41 crc kubenswrapper[4898]: I0313 15:20:41.412268 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-57b484b4df-z2gd2_1df4a7d6-b0c2-4b00-b591-1a612bd319b6/manager/0.log" Mar 13 15:20:41 crc kubenswrapper[4898]: I0313 15:20:41.582006 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-5b6b6b4c9f-2lnlc_ba56f415-73d5-4301-a25d-0e5d1ba4e3b1/manager/0.log" Mar 13 15:20:41 crc kubenswrapper[4898]: I0313 15:20:41.695345 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-776c5696bf-ntlw6_d71982c0-a3d0-4da8-84cd-7494301f589f/manager/0.log" Mar 13 
15:20:41 crc kubenswrapper[4898]: I0313 15:20:41.881380 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5f4f55cb5c-s2rdh_d29ce3ee-3d5a-4801-abf9-dfef5b641a74/manager/1.log" Mar 13 15:20:41 crc kubenswrapper[4898]: I0313 15:20:41.923916 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-7f84474648-mr4wv_52959483-daae-423a-a3bf-8e3fa7810074/manager/0.log" Mar 13 15:20:41 crc kubenswrapper[4898]: I0313 15:20:41.961267 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5f4f55cb5c-s2rdh_d29ce3ee-3d5a-4801-abf9-dfef5b641a74/manager/0.log" Mar 13 15:20:42 crc kubenswrapper[4898]: I0313 15:20:42.089744 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv_0ab852e1-fd26-4f76-b758-77896f8e236b/manager/1.log" Mar 13 15:20:42 crc kubenswrapper[4898]: I0313 15:20:42.177018 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv_0ab852e1-fd26-4f76-b758-77896f8e236b/manager/0.log" Mar 13 15:20:42 crc kubenswrapper[4898]: I0313 15:20:42.453556 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-6b8c6b5df9-kk2gn_7bae49ab-1146-43a2-b436-69838c923f1a/operator/0.log" Mar 13 15:20:42 crc kubenswrapper[4898]: I0313 15:20:42.678375 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-9k7p6_478795f5-c2f6-4e9b-9ed6-e2c743c3f3b8/registry-server/0.log" Mar 13 15:20:42 crc kubenswrapper[4898]: I0313 15:20:42.860251 4898 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-bbc5b68f9-wdmrh_da3795a7-363f-4637-afe2-77cb77248f9a/manager/0.log" Mar 13 15:20:43 crc kubenswrapper[4898]: I0313 15:20:43.006924 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-574d45c66c-njsvh_0d7c657b-a701-41fe-9b23-d5bba3302c4f/manager/0.log" Mar 13 15:20:43 crc kubenswrapper[4898]: I0313 15:20:43.380679 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-82gtc_7b9c0413-5558-43c4-805b-7f035fded9b4/operator/0.log" Mar 13 15:20:43 crc kubenswrapper[4898]: I0313 15:20:43.438843 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-7f9cc5dd44-f2t6t_66a86c31-9ff3-439a-a0f8-96c981014b6f/manager/0.log" Mar 13 15:20:43 crc kubenswrapper[4898]: I0313 15:20:43.726395 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-smdkt_19a0f4de-5258-4f2b-9587-71293459378e/manager/1.log" Mar 13 15:20:43 crc kubenswrapper[4898]: I0313 15:20:43.867682 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-smdkt_19a0f4de-5258-4f2b-9587-71293459378e/manager/0.log" Mar 13 15:20:43 crc kubenswrapper[4898]: I0313 15:20:43.991732 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c4d75f7f9-jwrd2_919747b8-a031-4654-999f-3c3928f981b4/manager/0.log" Mar 13 15:20:44 crc kubenswrapper[4898]: I0313 15:20:44.048382 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5b9fbd87f-s2k96_9ff6f89a-7110-42fb-96b9-8611f280bebe/manager/0.log" Mar 13 15:20:44 crc kubenswrapper[4898]: I0313 15:20:44.236547 4898 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5f7dc44db6-9nsrh_3a26728d-85c2-465c-bce4-c74045ea9e0d/manager/0.log" Mar 13 15:20:48 crc kubenswrapper[4898]: I0313 15:20:48.407501 4898 scope.go:117] "RemoveContainer" containerID="7e1660e6d2126df6f52c14a1146a22533f711db09e61120239f48e0f06547dd2" Mar 13 15:20:49 crc kubenswrapper[4898]: I0313 15:20:49.134292 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 15:20:49 crc kubenswrapper[4898]: I0313 15:20:49.134663 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 15:20:49 crc kubenswrapper[4898]: I0313 15:20:49.134735 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" Mar 13 15:20:49 crc kubenswrapper[4898]: I0313 15:20:49.136269 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e40bd227e881ba459a425e50fe1c2e2837377f3121675fc9f55de4fe34577668"} pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 15:20:49 crc kubenswrapper[4898]: I0313 15:20:49.136374 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" 
containerName="machine-config-daemon" containerID="cri-o://e40bd227e881ba459a425e50fe1c2e2837377f3121675fc9f55de4fe34577668" gracePeriod=600 Mar 13 15:20:49 crc kubenswrapper[4898]: E0313 15:20:49.295801 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:20:50 crc kubenswrapper[4898]: I0313 15:20:50.283976 4898 generic.go:334] "Generic (PLEG): container finished" podID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerID="e40bd227e881ba459a425e50fe1c2e2837377f3121675fc9f55de4fe34577668" exitCode=0 Mar 13 15:20:50 crc kubenswrapper[4898]: I0313 15:20:50.284018 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerDied","Data":"e40bd227e881ba459a425e50fe1c2e2837377f3121675fc9f55de4fe34577668"} Mar 13 15:20:50 crc kubenswrapper[4898]: I0313 15:20:50.284277 4898 scope.go:117] "RemoveContainer" containerID="bf9fa5bb76f8bd5a010d026caf62189a87b342669ddb0345c62f785750fd30c1" Mar 13 15:20:50 crc kubenswrapper[4898]: I0313 15:20:50.285686 4898 scope.go:117] "RemoveContainer" containerID="e40bd227e881ba459a425e50fe1c2e2837377f3121675fc9f55de4fe34577668" Mar 13 15:20:50 crc kubenswrapper[4898]: E0313 15:20:50.286160 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:21:01 crc kubenswrapper[4898]: I0313 15:21:01.739733 4898 scope.go:117] "RemoveContainer" containerID="e40bd227e881ba459a425e50fe1c2e2837377f3121675fc9f55de4fe34577668" Mar 13 15:21:01 crc kubenswrapper[4898]: E0313 15:21:01.740790 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:21:06 crc kubenswrapper[4898]: I0313 15:21:06.473324 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-qt7gm_6444bf97-84ef-49df-afcd-4e939a5de2ad/control-plane-machine-set-operator/0.log" Mar 13 15:21:06 crc kubenswrapper[4898]: I0313 15:21:06.502763 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-whtgq_096d3786-85e8-4fe5-82b3-57cd1be251a1/kube-rbac-proxy/0.log" Mar 13 15:21:06 crc kubenswrapper[4898]: I0313 15:21:06.921093 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-whtgq_096d3786-85e8-4fe5-82b3-57cd1be251a1/machine-api-operator/0.log" Mar 13 15:21:14 crc kubenswrapper[4898]: I0313 15:21:14.739197 4898 scope.go:117] "RemoveContainer" containerID="e40bd227e881ba459a425e50fe1c2e2837377f3121675fc9f55de4fe34577668" Mar 13 15:21:14 crc kubenswrapper[4898]: E0313 15:21:14.739922 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:21:21 crc kubenswrapper[4898]: I0313 15:21:21.735775 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-fsdjh_b267a865-1a03-4f37-9d2a-83380d30da1d/cert-manager-controller/0.log" Mar 13 15:21:21 crc kubenswrapper[4898]: I0313 15:21:21.894938 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-krzxz_d00b7135-a080-4f0e-a23b-237ab821410f/cert-manager-cainjector/0.log" Mar 13 15:21:21 crc kubenswrapper[4898]: I0313 15:21:21.992940 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-cx9pw_7c1fa9c0-bb2e-4806-95fd-07fba426bdc8/cert-manager-webhook/0.log" Mar 13 15:21:29 crc kubenswrapper[4898]: I0313 15:21:29.740566 4898 scope.go:117] "RemoveContainer" containerID="e40bd227e881ba459a425e50fe1c2e2837377f3121675fc9f55de4fe34577668" Mar 13 15:21:29 crc kubenswrapper[4898]: E0313 15:21:29.742815 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:21:38 crc kubenswrapper[4898]: I0313 15:21:38.454867 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-9m8s6_b707c4ee-39e1-4fc6-812a-f61e722c1079/nmstate-console-plugin/0.log" Mar 13 15:21:38 crc kubenswrapper[4898]: I0313 15:21:38.673742 4898 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-fpgr7_e4761153-ed4e-4264-8f21-b4de31a4bbb8/nmstate-handler/0.log" Mar 13 15:21:38 crc kubenswrapper[4898]: I0313 15:21:38.753349 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-c8fgd_35105fc0-dff0-4480-8635-cbbeec82d124/nmstate-metrics/0.log" Mar 13 15:21:38 crc kubenswrapper[4898]: I0313 15:21:38.778756 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-c8fgd_35105fc0-dff0-4480-8635-cbbeec82d124/kube-rbac-proxy/0.log" Mar 13 15:21:39 crc kubenswrapper[4898]: I0313 15:21:39.321245 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-m8j8d_a9193e72-6911-4df4-8b26-04b2537f68a9/nmstate-webhook/0.log" Mar 13 15:21:39 crc kubenswrapper[4898]: I0313 15:21:39.372037 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-hmwt2_84d4e279-f74c-48fd-9514-1a697341ac6a/nmstate-operator/0.log" Mar 13 15:21:41 crc kubenswrapper[4898]: I0313 15:21:41.739824 4898 scope.go:117] "RemoveContainer" containerID="e40bd227e881ba459a425e50fe1c2e2837377f3121675fc9f55de4fe34577668" Mar 13 15:21:41 crc kubenswrapper[4898]: E0313 15:21:41.740541 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:21:53 crc kubenswrapper[4898]: I0313 15:21:53.739986 4898 scope.go:117] "RemoveContainer" containerID="e40bd227e881ba459a425e50fe1c2e2837377f3121675fc9f55de4fe34577668" Mar 13 15:21:53 crc kubenswrapper[4898]: E0313 
15:21:53.740929 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:21:55 crc kubenswrapper[4898]: I0313 15:21:55.558507 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-5fb555ff84-j52b8_2cd05b5b-32da-4560-a761-72221b99e2c6/kube-rbac-proxy/0.log" Mar 13 15:21:55 crc kubenswrapper[4898]: I0313 15:21:55.625686 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-5fb555ff84-j52b8_2cd05b5b-32da-4560-a761-72221b99e2c6/manager/0.log" Mar 13 15:22:00 crc kubenswrapper[4898]: I0313 15:22:00.146434 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556922-p8rbd"] Mar 13 15:22:00 crc kubenswrapper[4898]: E0313 15:22:00.148406 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6caf987f-dbe2-48d2-8138-107de40fe224" containerName="oc" Mar 13 15:22:00 crc kubenswrapper[4898]: I0313 15:22:00.148499 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="6caf987f-dbe2-48d2-8138-107de40fe224" containerName="oc" Mar 13 15:22:00 crc kubenswrapper[4898]: I0313 15:22:00.148833 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="6caf987f-dbe2-48d2-8138-107de40fe224" containerName="oc" Mar 13 15:22:00 crc kubenswrapper[4898]: I0313 15:22:00.149706 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556922-p8rbd" Mar 13 15:22:00 crc kubenswrapper[4898]: I0313 15:22:00.152777 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 15:22:00 crc kubenswrapper[4898]: I0313 15:22:00.152786 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 15:22:00 crc kubenswrapper[4898]: I0313 15:22:00.153755 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps" Mar 13 15:22:00 crc kubenswrapper[4898]: I0313 15:22:00.166777 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556922-p8rbd"] Mar 13 15:22:00 crc kubenswrapper[4898]: I0313 15:22:00.193867 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htlcd\" (UniqueName: \"kubernetes.io/projected/af39c392-cb6d-4afc-837c-9cbf245a9856-kube-api-access-htlcd\") pod \"auto-csr-approver-29556922-p8rbd\" (UID: \"af39c392-cb6d-4afc-837c-9cbf245a9856\") " pod="openshift-infra/auto-csr-approver-29556922-p8rbd" Mar 13 15:22:00 crc kubenswrapper[4898]: I0313 15:22:00.297845 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htlcd\" (UniqueName: \"kubernetes.io/projected/af39c392-cb6d-4afc-837c-9cbf245a9856-kube-api-access-htlcd\") pod \"auto-csr-approver-29556922-p8rbd\" (UID: \"af39c392-cb6d-4afc-837c-9cbf245a9856\") " pod="openshift-infra/auto-csr-approver-29556922-p8rbd" Mar 13 15:22:00 crc kubenswrapper[4898]: I0313 15:22:00.341591 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htlcd\" (UniqueName: \"kubernetes.io/projected/af39c392-cb6d-4afc-837c-9cbf245a9856-kube-api-access-htlcd\") pod \"auto-csr-approver-29556922-p8rbd\" (UID: \"af39c392-cb6d-4afc-837c-9cbf245a9856\") " 
pod="openshift-infra/auto-csr-approver-29556922-p8rbd" Mar 13 15:22:00 crc kubenswrapper[4898]: I0313 15:22:00.483579 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556922-p8rbd" Mar 13 15:22:01 crc kubenswrapper[4898]: I0313 15:22:01.856116 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556922-p8rbd"] Mar 13 15:22:02 crc kubenswrapper[4898]: W0313 15:22:02.140006 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf39c392_cb6d_4afc_837c_9cbf245a9856.slice/crio-d7da9596e6b5c1a4612480823cfd1e026f0f98cfef44ea0a09c535cd59f364ee WatchSource:0}: Error finding container d7da9596e6b5c1a4612480823cfd1e026f0f98cfef44ea0a09c535cd59f364ee: Status 404 returned error can't find the container with id d7da9596e6b5c1a4612480823cfd1e026f0f98cfef44ea0a09c535cd59f364ee Mar 13 15:22:02 crc kubenswrapper[4898]: I0313 15:22:02.148699 4898 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 15:22:03 crc kubenswrapper[4898]: I0313 15:22:03.106443 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556922-p8rbd" event={"ID":"af39c392-cb6d-4afc-837c-9cbf245a9856","Type":"ContainerStarted","Data":"d7da9596e6b5c1a4612480823cfd1e026f0f98cfef44ea0a09c535cd59f364ee"} Mar 13 15:22:04 crc kubenswrapper[4898]: I0313 15:22:04.119950 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556922-p8rbd" event={"ID":"af39c392-cb6d-4afc-837c-9cbf245a9856","Type":"ContainerStarted","Data":"038b2fc06591bca90016e0206f4da45f7623dddf7ed529280975231bc7adf587"} Mar 13 15:22:04 crc kubenswrapper[4898]: I0313 15:22:04.140073 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556922-p8rbd" 
podStartSLOduration=2.801319619 podStartE2EDuration="4.140050802s" podCreationTimestamp="2026-03-13 15:22:00 +0000 UTC" firstStartedPulling="2026-03-13 15:22:02.143002625 +0000 UTC m=+5157.144590864" lastFinishedPulling="2026-03-13 15:22:03.481733808 +0000 UTC m=+5158.483322047" observedRunningTime="2026-03-13 15:22:04.135741695 +0000 UTC m=+5159.137329954" watchObservedRunningTime="2026-03-13 15:22:04.140050802 +0000 UTC m=+5159.141639061" Mar 13 15:22:05 crc kubenswrapper[4898]: I0313 15:22:05.135030 4898 generic.go:334] "Generic (PLEG): container finished" podID="af39c392-cb6d-4afc-837c-9cbf245a9856" containerID="038b2fc06591bca90016e0206f4da45f7623dddf7ed529280975231bc7adf587" exitCode=0 Mar 13 15:22:05 crc kubenswrapper[4898]: I0313 15:22:05.135124 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556922-p8rbd" event={"ID":"af39c392-cb6d-4afc-837c-9cbf245a9856","Type":"ContainerDied","Data":"038b2fc06591bca90016e0206f4da45f7623dddf7ed529280975231bc7adf587"} Mar 13 15:22:06 crc kubenswrapper[4898]: I0313 15:22:06.573887 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556922-p8rbd" Mar 13 15:22:06 crc kubenswrapper[4898]: I0313 15:22:06.654991 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htlcd\" (UniqueName: \"kubernetes.io/projected/af39c392-cb6d-4afc-837c-9cbf245a9856-kube-api-access-htlcd\") pod \"af39c392-cb6d-4afc-837c-9cbf245a9856\" (UID: \"af39c392-cb6d-4afc-837c-9cbf245a9856\") " Mar 13 15:22:06 crc kubenswrapper[4898]: I0313 15:22:06.661090 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af39c392-cb6d-4afc-837c-9cbf245a9856-kube-api-access-htlcd" (OuterVolumeSpecName: "kube-api-access-htlcd") pod "af39c392-cb6d-4afc-837c-9cbf245a9856" (UID: "af39c392-cb6d-4afc-837c-9cbf245a9856"). InnerVolumeSpecName "kube-api-access-htlcd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:22:06 crc kubenswrapper[4898]: I0313 15:22:06.758522 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htlcd\" (UniqueName: \"kubernetes.io/projected/af39c392-cb6d-4afc-837c-9cbf245a9856-kube-api-access-htlcd\") on node \"crc\" DevicePath \"\"" Mar 13 15:22:07 crc kubenswrapper[4898]: I0313 15:22:07.674238 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556922-p8rbd" event={"ID":"af39c392-cb6d-4afc-837c-9cbf245a9856","Type":"ContainerDied","Data":"d7da9596e6b5c1a4612480823cfd1e026f0f98cfef44ea0a09c535cd59f364ee"} Mar 13 15:22:07 crc kubenswrapper[4898]: I0313 15:22:07.674278 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d7da9596e6b5c1a4612480823cfd1e026f0f98cfef44ea0a09c535cd59f364ee" Mar 13 15:22:07 crc kubenswrapper[4898]: I0313 15:22:07.674344 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556922-p8rbd" Mar 13 15:22:07 crc kubenswrapper[4898]: I0313 15:22:07.708416 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556916-dvhfq"] Mar 13 15:22:07 crc kubenswrapper[4898]: I0313 15:22:07.721667 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556916-dvhfq"] Mar 13 15:22:07 crc kubenswrapper[4898]: I0313 15:22:07.740271 4898 scope.go:117] "RemoveContainer" containerID="e40bd227e881ba459a425e50fe1c2e2837377f3121675fc9f55de4fe34577668" Mar 13 15:22:07 crc kubenswrapper[4898]: E0313 15:22:07.740682 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:22:07 crc kubenswrapper[4898]: I0313 15:22:07.757159 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb6b061a-b0db-4b84-bfc7-08238f699132" path="/var/lib/kubelet/pods/bb6b061a-b0db-4b84-bfc7-08238f699132/volumes" Mar 13 15:22:07 crc kubenswrapper[4898]: E0313 15:22:07.900985 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf39c392_cb6d_4afc_837c_9cbf245a9856.slice\": RecentStats: unable to find data in memory cache]" Mar 13 15:22:14 crc kubenswrapper[4898]: I0313 15:22:14.806691 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-5r9gm_30c06063-b926-4f2e-b8d1-8c530cc5b0a9/prometheus-operator/0.log" Mar 13 15:22:14 crc kubenswrapper[4898]: I0313 15:22:14.921929 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz_951cfcfc-3a8c-410e-a3f5-f5caa10511f5/prometheus-operator-admission-webhook/0.log" Mar 13 15:22:15 crc kubenswrapper[4898]: I0313 15:22:15.010197 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm_8c190eee-747b-4a45-905c-fa0235080305/prometheus-operator-admission-webhook/0.log" Mar 13 15:22:15 crc kubenswrapper[4898]: I0313 15:22:15.979173 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-ljrtz_3bfc0332-bb59-42bf-bb70-462efa225c81/operator/1.log" Mar 13 15:22:16 crc kubenswrapper[4898]: I0313 15:22:16.100349 4898 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-ljrtz_3bfc0332-bb59-42bf-bb70-462efa225c81/operator/0.log" Mar 13 15:22:16 crc kubenswrapper[4898]: I0313 15:22:16.109848 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-ui-dashboards-66cbf594b5-gpj8b_ad052248-8fcd-4ef6-9969-5023b87bbbf9/observability-ui-dashboards/0.log" Mar 13 15:22:16 crc kubenswrapper[4898]: I0313 15:22:16.284000 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-nkt76_79ead8ee-67ba-4831-b5d4-a1f128e94334/perses-operator/0.log" Mar 13 15:22:19 crc kubenswrapper[4898]: I0313 15:22:19.741539 4898 scope.go:117] "RemoveContainer" containerID="e40bd227e881ba459a425e50fe1c2e2837377f3121675fc9f55de4fe34577668" Mar 13 15:22:19 crc kubenswrapper[4898]: E0313 15:22:19.742304 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:22:33 crc kubenswrapper[4898]: I0313 15:22:33.604102 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_cluster-logging-operator-66689c4bbf-bcn59_c5cfd1be-ede5-4678-99c5-17f232b97d81/cluster-logging-operator/0.log" Mar 13 15:22:33 crc kubenswrapper[4898]: I0313 15:22:33.743824 4898 scope.go:117] "RemoveContainer" containerID="e40bd227e881ba459a425e50fe1c2e2837377f3121675fc9f55de4fe34577668" Mar 13 15:22:33 crc kubenswrapper[4898]: E0313 15:22:33.744096 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:22:34 crc kubenswrapper[4898]: I0313 15:22:34.867805 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-compactor-0_9c5fee8d-2246-4e34-8ddd-ce710e155d73/loki-compactor/0.log" Mar 13 15:22:34 crc kubenswrapper[4898]: I0313 15:22:34.910626 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_collector-xcq52_824d10e9-5cdc-4dc5-b9a8-b151c779b900/collector/0.log" Mar 13 15:22:35 crc kubenswrapper[4898]: I0313 15:22:35.092780 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-distributor-9c6b6d984-vvj56_510657b4-32e2-4fa5-9c09-17869a295736/loki-distributor/0.log" Mar 13 15:22:35 crc kubenswrapper[4898]: I0313 15:22:35.147475 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-c6d797ccf-8ng9x_13ee53e6-2549-4dd8-91ac-80e4ef2c9d99/gateway/0.log" Mar 13 15:22:35 crc kubenswrapper[4898]: I0313 15:22:35.235445 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-c6d797ccf-8ng9x_13ee53e6-2549-4dd8-91ac-80e4ef2c9d99/opa/0.log" Mar 13 15:22:35 crc kubenswrapper[4898]: I0313 15:22:35.371945 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-c6d797ccf-9qh4r_077fcbe8-c497-44b4-82f9-ff8e317cbe83/opa/0.log" Mar 13 15:22:35 crc kubenswrapper[4898]: I0313 15:22:35.411346 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-c6d797ccf-9qh4r_077fcbe8-c497-44b4-82f9-ff8e317cbe83/gateway/0.log" Mar 13 15:22:35 crc kubenswrapper[4898]: I0313 15:22:35.528792 4898 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-logging_logging-loki-index-gateway-0_6a1df267-1145-4fe1-9455-57df3d043e3a/loki-index-gateway/0.log" Mar 13 15:22:35 crc kubenswrapper[4898]: I0313 15:22:35.685939 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-ingester-0_2194d847-4858-4f46-ab8b-c2d78cf5677e/loki-ingester/0.log" Mar 13 15:22:35 crc kubenswrapper[4898]: I0313 15:22:35.783026 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-querier-6dcbdf8bb8-qr6bw_5e81d88f-c63b-4f0c-ba17-f1171350c28d/loki-querier/0.log" Mar 13 15:22:35 crc kubenswrapper[4898]: I0313 15:22:35.899476 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-query-frontend-ff66c4dc9-mwqzz_e519fed6-a687-4a01-a979-598e81122ad1/loki-query-frontend/0.log" Mar 13 15:22:45 crc kubenswrapper[4898]: I0313 15:22:45.748595 4898 scope.go:117] "RemoveContainer" containerID="e40bd227e881ba459a425e50fe1c2e2837377f3121675fc9f55de4fe34577668" Mar 13 15:22:45 crc kubenswrapper[4898]: E0313 15:22:45.749540 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:22:49 crc kubenswrapper[4898]: I0313 15:22:49.010270 4898 scope.go:117] "RemoveContainer" containerID="bf083e1d2202ec3f40b443b0422cd0440a225764d3bb5e0e49d48d4861f197f0" Mar 13 15:22:53 crc kubenswrapper[4898]: I0313 15:22:53.601028 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-cx422_b231c7db-5056-4ec6-a64c-0aa8bdff336b/kube-rbac-proxy/0.log" Mar 13 15:22:53 crc kubenswrapper[4898]: I0313 15:22:53.738370 4898 
log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-cx422_b231c7db-5056-4ec6-a64c-0aa8bdff336b/controller/0.log" Mar 13 15:22:53 crc kubenswrapper[4898]: I0313 15:22:53.849621 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bqmxg_1a7fcb96-7168-4049-8c28-d3f740599e48/cp-frr-files/0.log" Mar 13 15:22:54 crc kubenswrapper[4898]: I0313 15:22:54.061782 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bqmxg_1a7fcb96-7168-4049-8c28-d3f740599e48/cp-reloader/0.log" Mar 13 15:22:54 crc kubenswrapper[4898]: I0313 15:22:54.067747 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bqmxg_1a7fcb96-7168-4049-8c28-d3f740599e48/cp-frr-files/0.log" Mar 13 15:22:54 crc kubenswrapper[4898]: I0313 15:22:54.069634 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bqmxg_1a7fcb96-7168-4049-8c28-d3f740599e48/cp-reloader/0.log" Mar 13 15:22:54 crc kubenswrapper[4898]: I0313 15:22:54.110916 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bqmxg_1a7fcb96-7168-4049-8c28-d3f740599e48/cp-metrics/0.log" Mar 13 15:22:54 crc kubenswrapper[4898]: I0313 15:22:54.250388 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bqmxg_1a7fcb96-7168-4049-8c28-d3f740599e48/cp-reloader/0.log" Mar 13 15:22:54 crc kubenswrapper[4898]: I0313 15:22:54.274330 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bqmxg_1a7fcb96-7168-4049-8c28-d3f740599e48/cp-frr-files/0.log" Mar 13 15:22:54 crc kubenswrapper[4898]: I0313 15:22:54.281826 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bqmxg_1a7fcb96-7168-4049-8c28-d3f740599e48/cp-metrics/0.log" Mar 13 15:22:54 crc kubenswrapper[4898]: I0313 15:22:54.309497 4898 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-bqmxg_1a7fcb96-7168-4049-8c28-d3f740599e48/cp-metrics/0.log" Mar 13 15:22:54 crc kubenswrapper[4898]: I0313 15:22:54.445847 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bqmxg_1a7fcb96-7168-4049-8c28-d3f740599e48/cp-frr-files/0.log" Mar 13 15:22:54 crc kubenswrapper[4898]: I0313 15:22:54.481316 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bqmxg_1a7fcb96-7168-4049-8c28-d3f740599e48/cp-reloader/0.log" Mar 13 15:22:54 crc kubenswrapper[4898]: I0313 15:22:54.507214 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bqmxg_1a7fcb96-7168-4049-8c28-d3f740599e48/controller/0.log" Mar 13 15:22:54 crc kubenswrapper[4898]: I0313 15:22:54.532408 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bqmxg_1a7fcb96-7168-4049-8c28-d3f740599e48/cp-metrics/0.log" Mar 13 15:22:54 crc kubenswrapper[4898]: I0313 15:22:54.737132 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bqmxg_1a7fcb96-7168-4049-8c28-d3f740599e48/frr/1.log" Mar 13 15:22:54 crc kubenswrapper[4898]: I0313 15:22:54.792173 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bqmxg_1a7fcb96-7168-4049-8c28-d3f740599e48/frr-metrics/0.log" Mar 13 15:22:54 crc kubenswrapper[4898]: I0313 15:22:54.840780 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bqmxg_1a7fcb96-7168-4049-8c28-d3f740599e48/kube-rbac-proxy/0.log" Mar 13 15:22:55 crc kubenswrapper[4898]: I0313 15:22:55.032087 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bqmxg_1a7fcb96-7168-4049-8c28-d3f740599e48/kube-rbac-proxy-frr/0.log" Mar 13 15:22:55 crc kubenswrapper[4898]: I0313 15:22:55.144826 4898 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-bqmxg_1a7fcb96-7168-4049-8c28-d3f740599e48/reloader/0.log" Mar 13 15:22:55 crc kubenswrapper[4898]: I0313 15:22:55.286510 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-5p4w5_604b9c21-3e85-4c2e-9faf-962f44236911/frr-k8s-webhook-server/0.log" Mar 13 15:22:55 crc kubenswrapper[4898]: I0313 15:22:55.410729 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-cf7c75c99-qxdbx_e000d86e-e7a8-49ed-9184-fdd67dfe797d/manager/1.log" Mar 13 15:22:55 crc kubenswrapper[4898]: I0313 15:22:55.515420 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-cf7c75c99-qxdbx_e000d86e-e7a8-49ed-9184-fdd67dfe797d/manager/0.log" Mar 13 15:22:55 crc kubenswrapper[4898]: I0313 15:22:55.692830 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-67c6f6c5cb-d26qw_34b4f98c-a87c-4a97-9ac4-286afeb9e4bc/webhook-server/0.log" Mar 13 15:22:55 crc kubenswrapper[4898]: I0313 15:22:55.873587 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-g5gqr_edfd91ee-1246-43b2-84a0-95ea069de402/kube-rbac-proxy/0.log" Mar 13 15:22:56 crc kubenswrapper[4898]: I0313 15:22:56.638809 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-g5gqr_edfd91ee-1246-43b2-84a0-95ea069de402/speaker/0.log" Mar 13 15:22:56 crc kubenswrapper[4898]: I0313 15:22:56.739659 4898 scope.go:117] "RemoveContainer" containerID="e40bd227e881ba459a425e50fe1c2e2837377f3121675fc9f55de4fe34577668" Mar 13 15:22:56 crc kubenswrapper[4898]: E0313 15:22:56.740022 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:22:56 crc kubenswrapper[4898]: I0313 15:22:56.858002 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bqmxg_1a7fcb96-7168-4049-8c28-d3f740599e48/frr/0.log" Mar 13 15:23:10 crc kubenswrapper[4898]: I0313 15:23:10.740366 4898 scope.go:117] "RemoveContainer" containerID="e40bd227e881ba459a425e50fe1c2e2837377f3121675fc9f55de4fe34577668" Mar 13 15:23:10 crc kubenswrapper[4898]: E0313 15:23:10.741147 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:23:11 crc kubenswrapper[4898]: I0313 15:23:11.982746 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tx5wh_53800f20-93f5-4ab5-9feb-eb325fa0f945/util/0.log" Mar 13 15:23:12 crc kubenswrapper[4898]: I0313 15:23:12.166870 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tx5wh_53800f20-93f5-4ab5-9feb-eb325fa0f945/util/0.log" Mar 13 15:23:12 crc kubenswrapper[4898]: I0313 15:23:12.189580 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tx5wh_53800f20-93f5-4ab5-9feb-eb325fa0f945/pull/0.log" Mar 13 15:23:12 crc kubenswrapper[4898]: I0313 15:23:12.258210 4898 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tx5wh_53800f20-93f5-4ab5-9feb-eb325fa0f945/pull/0.log" Mar 13 15:23:12 crc kubenswrapper[4898]: I0313 15:23:12.439511 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tx5wh_53800f20-93f5-4ab5-9feb-eb325fa0f945/util/0.log" Mar 13 15:23:12 crc kubenswrapper[4898]: I0313 15:23:12.441474 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tx5wh_53800f20-93f5-4ab5-9feb-eb325fa0f945/pull/0.log" Mar 13 15:23:12 crc kubenswrapper[4898]: I0313 15:23:12.469939 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tx5wh_53800f20-93f5-4ab5-9feb-eb325fa0f945/extract/0.log" Mar 13 15:23:12 crc kubenswrapper[4898]: I0313 15:23:12.631126 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k5mm8_53fae31e-a97e-443d-88c2-fa38af842855/util/0.log" Mar 13 15:23:12 crc kubenswrapper[4898]: I0313 15:23:12.813067 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k5mm8_53fae31e-a97e-443d-88c2-fa38af842855/util/0.log" Mar 13 15:23:12 crc kubenswrapper[4898]: I0313 15:23:12.837327 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k5mm8_53fae31e-a97e-443d-88c2-fa38af842855/pull/0.log" Mar 13 15:23:12 crc kubenswrapper[4898]: I0313 15:23:12.838930 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k5mm8_53fae31e-a97e-443d-88c2-fa38af842855/pull/0.log" Mar 13 
15:23:13 crc kubenswrapper[4898]: I0313 15:23:13.017455 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k5mm8_53fae31e-a97e-443d-88c2-fa38af842855/pull/0.log" Mar 13 15:23:13 crc kubenswrapper[4898]: I0313 15:23:13.030914 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k5mm8_53fae31e-a97e-443d-88c2-fa38af842855/extract/0.log" Mar 13 15:23:13 crc kubenswrapper[4898]: I0313 15:23:13.057259 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k5mm8_53fae31e-a97e-443d-88c2-fa38af842855/util/0.log" Mar 13 15:23:13 crc kubenswrapper[4898]: I0313 15:23:13.225739 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d54qwls_964a321b-4be6-444e-8c20-3fc586008da7/util/0.log" Mar 13 15:23:13 crc kubenswrapper[4898]: I0313 15:23:13.362890 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d54qwls_964a321b-4be6-444e-8c20-3fc586008da7/util/0.log" Mar 13 15:23:13 crc kubenswrapper[4898]: I0313 15:23:13.375443 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d54qwls_964a321b-4be6-444e-8c20-3fc586008da7/pull/0.log" Mar 13 15:23:13 crc kubenswrapper[4898]: I0313 15:23:13.398579 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d54qwls_964a321b-4be6-444e-8c20-3fc586008da7/pull/0.log" Mar 13 15:23:13 crc kubenswrapper[4898]: I0313 15:23:13.582393 4898 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d54qwls_964a321b-4be6-444e-8c20-3fc586008da7/util/0.log" Mar 13 15:23:13 crc kubenswrapper[4898]: I0313 15:23:13.586778 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d54qwls_964a321b-4be6-444e-8c20-3fc586008da7/pull/0.log" Mar 13 15:23:13 crc kubenswrapper[4898]: I0313 15:23:13.632267 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d54qwls_964a321b-4be6-444e-8c20-3fc586008da7/extract/0.log" Mar 13 15:23:13 crc kubenswrapper[4898]: I0313 15:23:13.770971 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cstz4q_e0f45cf5-8d8f-472b-87f5-64e5c8192622/util/0.log" Mar 13 15:23:13 crc kubenswrapper[4898]: I0313 15:23:13.960535 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cstz4q_e0f45cf5-8d8f-472b-87f5-64e5c8192622/util/0.log" Mar 13 15:23:13 crc kubenswrapper[4898]: I0313 15:23:13.999178 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cstz4q_e0f45cf5-8d8f-472b-87f5-64e5c8192622/pull/0.log" Mar 13 15:23:14 crc kubenswrapper[4898]: I0313 15:23:14.026495 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cstz4q_e0f45cf5-8d8f-472b-87f5-64e5c8192622/pull/0.log" Mar 13 15:23:14 crc kubenswrapper[4898]: I0313 15:23:14.211090 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cstz4q_e0f45cf5-8d8f-472b-87f5-64e5c8192622/util/0.log" Mar 13 
15:23:14 crc kubenswrapper[4898]: I0313 15:23:14.297606 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cstz4q_e0f45cf5-8d8f-472b-87f5-64e5c8192622/pull/0.log" Mar 13 15:23:14 crc kubenswrapper[4898]: I0313 15:23:14.345293 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cstz4q_e0f45cf5-8d8f-472b-87f5-64e5c8192622/extract/0.log" Mar 13 15:23:14 crc kubenswrapper[4898]: I0313 15:23:14.574627 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0859mb4_dd46f989-e694-47a9-9b46-e96b7b47e403/util/0.log" Mar 13 15:23:14 crc kubenswrapper[4898]: I0313 15:23:14.721624 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0859mb4_dd46f989-e694-47a9-9b46-e96b7b47e403/util/0.log" Mar 13 15:23:14 crc kubenswrapper[4898]: I0313 15:23:14.726312 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0859mb4_dd46f989-e694-47a9-9b46-e96b7b47e403/pull/0.log" Mar 13 15:23:14 crc kubenswrapper[4898]: I0313 15:23:14.727170 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0859mb4_dd46f989-e694-47a9-9b46-e96b7b47e403/pull/0.log" Mar 13 15:23:14 crc kubenswrapper[4898]: I0313 15:23:14.941101 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0859mb4_dd46f989-e694-47a9-9b46-e96b7b47e403/extract/0.log" Mar 13 15:23:14 crc kubenswrapper[4898]: I0313 15:23:14.944509 4898 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0859mb4_dd46f989-e694-47a9-9b46-e96b7b47e403/pull/0.log" Mar 13 15:23:14 crc kubenswrapper[4898]: I0313 15:23:14.952617 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0859mb4_dd46f989-e694-47a9-9b46-e96b7b47e403/util/0.log" Mar 13 15:23:15 crc kubenswrapper[4898]: I0313 15:23:15.160166 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hkbng_89abe4ad-dd62-4a70-a1d1-fdf97448ada5/extract-utilities/0.log" Mar 13 15:23:15 crc kubenswrapper[4898]: I0313 15:23:15.327952 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hkbng_89abe4ad-dd62-4a70-a1d1-fdf97448ada5/extract-utilities/0.log" Mar 13 15:23:15 crc kubenswrapper[4898]: I0313 15:23:15.354996 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hkbng_89abe4ad-dd62-4a70-a1d1-fdf97448ada5/extract-content/0.log" Mar 13 15:23:15 crc kubenswrapper[4898]: I0313 15:23:15.355070 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hkbng_89abe4ad-dd62-4a70-a1d1-fdf97448ada5/extract-content/0.log" Mar 13 15:23:15 crc kubenswrapper[4898]: I0313 15:23:15.578476 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hkbng_89abe4ad-dd62-4a70-a1d1-fdf97448ada5/extract-content/0.log" Mar 13 15:23:15 crc kubenswrapper[4898]: I0313 15:23:15.596597 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hkbng_89abe4ad-dd62-4a70-a1d1-fdf97448ada5/extract-utilities/0.log" Mar 13 15:23:15 crc kubenswrapper[4898]: I0313 15:23:15.660109 4898 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-fcfmz_2cd5c4b0-1bfe-479d-b800-ad19c56f3c9e/extract-utilities/0.log" Mar 13 15:23:15 crc kubenswrapper[4898]: I0313 15:23:15.873836 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fcfmz_2cd5c4b0-1bfe-479d-b800-ad19c56f3c9e/extract-utilities/0.log" Mar 13 15:23:15 crc kubenswrapper[4898]: I0313 15:23:15.882787 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fcfmz_2cd5c4b0-1bfe-479d-b800-ad19c56f3c9e/extract-content/0.log" Mar 13 15:23:15 crc kubenswrapper[4898]: I0313 15:23:15.888156 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fcfmz_2cd5c4b0-1bfe-479d-b800-ad19c56f3c9e/extract-content/0.log" Mar 13 15:23:16 crc kubenswrapper[4898]: I0313 15:23:16.177496 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fcfmz_2cd5c4b0-1bfe-479d-b800-ad19c56f3c9e/extract-content/0.log" Mar 13 15:23:16 crc kubenswrapper[4898]: I0313 15:23:16.233566 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fcfmz_2cd5c4b0-1bfe-479d-b800-ad19c56f3c9e/extract-utilities/0.log" Mar 13 15:23:16 crc kubenswrapper[4898]: I0313 15:23:16.491193 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-z7ng7_b8942bb7-1cd2-49b9-8d98-5ba4c5f6c320/marketplace-operator/0.log" Mar 13 15:23:16 crc kubenswrapper[4898]: I0313 15:23:16.567228 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zs42q_0182307e-bc7f-415e-a0f9-0eff9902384c/extract-utilities/0.log" Mar 13 15:23:16 crc kubenswrapper[4898]: I0313 15:23:16.720264 4898 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-hkbng_89abe4ad-dd62-4a70-a1d1-fdf97448ada5/registry-server/0.log" Mar 13 15:23:16 crc kubenswrapper[4898]: I0313 15:23:16.765944 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zs42q_0182307e-bc7f-415e-a0f9-0eff9902384c/extract-utilities/0.log" Mar 13 15:23:16 crc kubenswrapper[4898]: I0313 15:23:16.825257 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zs42q_0182307e-bc7f-415e-a0f9-0eff9902384c/extract-content/0.log" Mar 13 15:23:16 crc kubenswrapper[4898]: I0313 15:23:16.826619 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zs42q_0182307e-bc7f-415e-a0f9-0eff9902384c/extract-content/0.log" Mar 13 15:23:16 crc kubenswrapper[4898]: I0313 15:23:16.872525 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fcfmz_2cd5c4b0-1bfe-479d-b800-ad19c56f3c9e/registry-server/0.log" Mar 13 15:23:17 crc kubenswrapper[4898]: I0313 15:23:17.025424 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zs42q_0182307e-bc7f-415e-a0f9-0eff9902384c/extract-utilities/0.log" Mar 13 15:23:17 crc kubenswrapper[4898]: I0313 15:23:17.032181 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zs42q_0182307e-bc7f-415e-a0f9-0eff9902384c/extract-content/0.log" Mar 13 15:23:17 crc kubenswrapper[4898]: I0313 15:23:17.087527 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zgjzn_cbb51f06-0778-4b18-82b5-c5ce91e0a613/extract-utilities/0.log" Mar 13 15:23:17 crc kubenswrapper[4898]: I0313 15:23:17.266358 4898 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-zgjzn_cbb51f06-0778-4b18-82b5-c5ce91e0a613/extract-utilities/0.log" Mar 13 15:23:17 crc kubenswrapper[4898]: I0313 15:23:17.300765 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zgjzn_cbb51f06-0778-4b18-82b5-c5ce91e0a613/extract-content/0.log" Mar 13 15:23:17 crc kubenswrapper[4898]: I0313 15:23:17.303328 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zs42q_0182307e-bc7f-415e-a0f9-0eff9902384c/registry-server/0.log" Mar 13 15:23:17 crc kubenswrapper[4898]: I0313 15:23:17.334304 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zgjzn_cbb51f06-0778-4b18-82b5-c5ce91e0a613/extract-content/0.log" Mar 13 15:23:17 crc kubenswrapper[4898]: I0313 15:23:17.584261 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zgjzn_cbb51f06-0778-4b18-82b5-c5ce91e0a613/extract-utilities/0.log" Mar 13 15:23:17 crc kubenswrapper[4898]: I0313 15:23:17.608686 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zgjzn_cbb51f06-0778-4b18-82b5-c5ce91e0a613/extract-content/0.log" Mar 13 15:23:18 crc kubenswrapper[4898]: I0313 15:23:18.380572 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zgjzn_cbb51f06-0778-4b18-82b5-c5ce91e0a613/registry-server/0.log" Mar 13 15:23:22 crc kubenswrapper[4898]: I0313 15:23:22.740720 4898 scope.go:117] "RemoveContainer" containerID="e40bd227e881ba459a425e50fe1c2e2837377f3121675fc9f55de4fe34577668" Mar 13 15:23:22 crc kubenswrapper[4898]: E0313 15:23:22.743107 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:23:31 crc kubenswrapper[4898]: I0313 15:23:31.833073 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz_951cfcfc-3a8c-410e-a3f5-f5caa10511f5/prometheus-operator-admission-webhook/0.log" Mar 13 15:23:31 crc kubenswrapper[4898]: I0313 15:23:31.842457 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-5r9gm_30c06063-b926-4f2e-b8d1-8c530cc5b0a9/prometheus-operator/0.log" Mar 13 15:23:31 crc kubenswrapper[4898]: I0313 15:23:31.877708 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm_8c190eee-747b-4a45-905c-fa0235080305/prometheus-operator-admission-webhook/0.log" Mar 13 15:23:32 crc kubenswrapper[4898]: I0313 15:23:32.036989 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-ljrtz_3bfc0332-bb59-42bf-bb70-462efa225c81/operator/1.log" Mar 13 15:23:32 crc kubenswrapper[4898]: I0313 15:23:32.088701 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-ljrtz_3bfc0332-bb59-42bf-bb70-462efa225c81/operator/0.log" Mar 13 15:23:32 crc kubenswrapper[4898]: I0313 15:23:32.100105 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-ui-dashboards-66cbf594b5-gpj8b_ad052248-8fcd-4ef6-9969-5023b87bbbf9/observability-ui-dashboards/0.log" Mar 13 15:23:32 crc kubenswrapper[4898]: I0313 15:23:32.150604 4898 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-nkt76_79ead8ee-67ba-4831-b5d4-a1f128e94334/perses-operator/0.log" Mar 13 15:23:36 crc kubenswrapper[4898]: I0313 15:23:36.740101 4898 scope.go:117] "RemoveContainer" containerID="e40bd227e881ba459a425e50fe1c2e2837377f3121675fc9f55de4fe34577668" Mar 13 15:23:36 crc kubenswrapper[4898]: E0313 15:23:36.741187 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:23:47 crc kubenswrapper[4898]: I0313 15:23:47.021875 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-5fb555ff84-j52b8_2cd05b5b-32da-4560-a761-72221b99e2c6/kube-rbac-proxy/0.log" Mar 13 15:23:47 crc kubenswrapper[4898]: I0313 15:23:47.097197 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-5fb555ff84-j52b8_2cd05b5b-32da-4560-a761-72221b99e2c6/manager/0.log" Mar 13 15:23:51 crc kubenswrapper[4898]: I0313 15:23:51.740081 4898 scope.go:117] "RemoveContainer" containerID="e40bd227e881ba459a425e50fe1c2e2837377f3121675fc9f55de4fe34577668" Mar 13 15:23:51 crc kubenswrapper[4898]: E0313 15:23:51.740978 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" 
Mar 13 15:24:00 crc kubenswrapper[4898]: I0313 15:24:00.158566 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556924-vg49w"] Mar 13 15:24:00 crc kubenswrapper[4898]: E0313 15:24:00.159629 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af39c392-cb6d-4afc-837c-9cbf245a9856" containerName="oc" Mar 13 15:24:00 crc kubenswrapper[4898]: I0313 15:24:00.159645 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="af39c392-cb6d-4afc-837c-9cbf245a9856" containerName="oc" Mar 13 15:24:00 crc kubenswrapper[4898]: I0313 15:24:00.159871 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="af39c392-cb6d-4afc-837c-9cbf245a9856" containerName="oc" Mar 13 15:24:00 crc kubenswrapper[4898]: I0313 15:24:00.161222 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556924-vg49w" Mar 13 15:24:00 crc kubenswrapper[4898]: I0313 15:24:00.164745 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 15:24:00 crc kubenswrapper[4898]: I0313 15:24:00.165042 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps" Mar 13 15:24:00 crc kubenswrapper[4898]: I0313 15:24:00.165215 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 15:24:00 crc kubenswrapper[4898]: I0313 15:24:00.175275 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556924-vg49w"] Mar 13 15:24:00 crc kubenswrapper[4898]: I0313 15:24:00.296594 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sh57t\" (UniqueName: \"kubernetes.io/projected/3b1a5763-109f-4888-97bb-eeb7cd25ff69-kube-api-access-sh57t\") pod \"auto-csr-approver-29556924-vg49w\" (UID: 
\"3b1a5763-109f-4888-97bb-eeb7cd25ff69\") " pod="openshift-infra/auto-csr-approver-29556924-vg49w" Mar 13 15:24:00 crc kubenswrapper[4898]: I0313 15:24:00.398733 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sh57t\" (UniqueName: \"kubernetes.io/projected/3b1a5763-109f-4888-97bb-eeb7cd25ff69-kube-api-access-sh57t\") pod \"auto-csr-approver-29556924-vg49w\" (UID: \"3b1a5763-109f-4888-97bb-eeb7cd25ff69\") " pod="openshift-infra/auto-csr-approver-29556924-vg49w" Mar 13 15:24:00 crc kubenswrapper[4898]: I0313 15:24:00.448435 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sh57t\" (UniqueName: \"kubernetes.io/projected/3b1a5763-109f-4888-97bb-eeb7cd25ff69-kube-api-access-sh57t\") pod \"auto-csr-approver-29556924-vg49w\" (UID: \"3b1a5763-109f-4888-97bb-eeb7cd25ff69\") " pod="openshift-infra/auto-csr-approver-29556924-vg49w" Mar 13 15:24:00 crc kubenswrapper[4898]: I0313 15:24:00.486918 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556924-vg49w" Mar 13 15:24:01 crc kubenswrapper[4898]: I0313 15:24:01.305434 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556924-vg49w"] Mar 13 15:24:01 crc kubenswrapper[4898]: W0313 15:24:01.310505 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b1a5763_109f_4888_97bb_eeb7cd25ff69.slice/crio-ee9adcc76a22529be0fdb51986d191444c5cef25a069c6809a38672b448765c3 WatchSource:0}: Error finding container ee9adcc76a22529be0fdb51986d191444c5cef25a069c6809a38672b448765c3: Status 404 returned error can't find the container with id ee9adcc76a22529be0fdb51986d191444c5cef25a069c6809a38672b448765c3 Mar 13 15:24:02 crc kubenswrapper[4898]: I0313 15:24:02.039046 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556924-vg49w" event={"ID":"3b1a5763-109f-4888-97bb-eeb7cd25ff69","Type":"ContainerStarted","Data":"ee9adcc76a22529be0fdb51986d191444c5cef25a069c6809a38672b448765c3"} Mar 13 15:24:03 crc kubenswrapper[4898]: I0313 15:24:03.741170 4898 scope.go:117] "RemoveContainer" containerID="e40bd227e881ba459a425e50fe1c2e2837377f3121675fc9f55de4fe34577668" Mar 13 15:24:03 crc kubenswrapper[4898]: E0313 15:24:03.742099 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:24:04 crc kubenswrapper[4898]: I0313 15:24:04.064764 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556924-vg49w" 
event={"ID":"3b1a5763-109f-4888-97bb-eeb7cd25ff69","Type":"ContainerStarted","Data":"c8ba8026ba786ec97dc1c956429b165e911caa22ae98facccf3eabf821d09223"} Mar 13 15:24:04 crc kubenswrapper[4898]: I0313 15:24:04.082468 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556924-vg49w" podStartSLOduration=2.935648285 podStartE2EDuration="4.082443859s" podCreationTimestamp="2026-03-13 15:24:00 +0000 UTC" firstStartedPulling="2026-03-13 15:24:01.314270087 +0000 UTC m=+5276.315858326" lastFinishedPulling="2026-03-13 15:24:02.461065661 +0000 UTC m=+5277.462653900" observedRunningTime="2026-03-13 15:24:04.07768723 +0000 UTC m=+5279.079275479" watchObservedRunningTime="2026-03-13 15:24:04.082443859 +0000 UTC m=+5279.084032088" Mar 13 15:24:07 crc kubenswrapper[4898]: I0313 15:24:07.108726 4898 generic.go:334] "Generic (PLEG): container finished" podID="3b1a5763-109f-4888-97bb-eeb7cd25ff69" containerID="c8ba8026ba786ec97dc1c956429b165e911caa22ae98facccf3eabf821d09223" exitCode=0 Mar 13 15:24:07 crc kubenswrapper[4898]: I0313 15:24:07.109239 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556924-vg49w" event={"ID":"3b1a5763-109f-4888-97bb-eeb7cd25ff69","Type":"ContainerDied","Data":"c8ba8026ba786ec97dc1c956429b165e911caa22ae98facccf3eabf821d09223"} Mar 13 15:24:08 crc kubenswrapper[4898]: I0313 15:24:08.591294 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556924-vg49w" Mar 13 15:24:08 crc kubenswrapper[4898]: I0313 15:24:08.689067 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sh57t\" (UniqueName: \"kubernetes.io/projected/3b1a5763-109f-4888-97bb-eeb7cd25ff69-kube-api-access-sh57t\") pod \"3b1a5763-109f-4888-97bb-eeb7cd25ff69\" (UID: \"3b1a5763-109f-4888-97bb-eeb7cd25ff69\") " Mar 13 15:24:08 crc kubenswrapper[4898]: I0313 15:24:08.717995 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b1a5763-109f-4888-97bb-eeb7cd25ff69-kube-api-access-sh57t" (OuterVolumeSpecName: "kube-api-access-sh57t") pod "3b1a5763-109f-4888-97bb-eeb7cd25ff69" (UID: "3b1a5763-109f-4888-97bb-eeb7cd25ff69"). InnerVolumeSpecName "kube-api-access-sh57t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:24:08 crc kubenswrapper[4898]: I0313 15:24:08.791736 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sh57t\" (UniqueName: \"kubernetes.io/projected/3b1a5763-109f-4888-97bb-eeb7cd25ff69-kube-api-access-sh57t\") on node \"crc\" DevicePath \"\"" Mar 13 15:24:09 crc kubenswrapper[4898]: I0313 15:24:09.136403 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556924-vg49w" event={"ID":"3b1a5763-109f-4888-97bb-eeb7cd25ff69","Type":"ContainerDied","Data":"ee9adcc76a22529be0fdb51986d191444c5cef25a069c6809a38672b448765c3"} Mar 13 15:24:09 crc kubenswrapper[4898]: I0313 15:24:09.136453 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee9adcc76a22529be0fdb51986d191444c5cef25a069c6809a38672b448765c3" Mar 13 15:24:09 crc kubenswrapper[4898]: I0313 15:24:09.136453 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556924-vg49w" Mar 13 15:24:09 crc kubenswrapper[4898]: I0313 15:24:09.222687 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556918-gcw8c"] Mar 13 15:24:09 crc kubenswrapper[4898]: I0313 15:24:09.238218 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556918-gcw8c"] Mar 13 15:24:09 crc kubenswrapper[4898]: I0313 15:24:09.756714 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85e19347-9341-49c0-9195-97e383796cb3" path="/var/lib/kubelet/pods/85e19347-9341-49c0-9195-97e383796cb3/volumes" Mar 13 15:24:16 crc kubenswrapper[4898]: I0313 15:24:16.739645 4898 scope.go:117] "RemoveContainer" containerID="e40bd227e881ba459a425e50fe1c2e2837377f3121675fc9f55de4fe34577668" Mar 13 15:24:16 crc kubenswrapper[4898]: E0313 15:24:16.740548 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:24:29 crc kubenswrapper[4898]: I0313 15:24:29.739443 4898 scope.go:117] "RemoveContainer" containerID="e40bd227e881ba459a425e50fe1c2e2837377f3121675fc9f55de4fe34577668" Mar 13 15:24:29 crc kubenswrapper[4898]: E0313 15:24:29.740219 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" 
podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:24:42 crc kubenswrapper[4898]: I0313 15:24:42.740296 4898 scope.go:117] "RemoveContainer" containerID="e40bd227e881ba459a425e50fe1c2e2837377f3121675fc9f55de4fe34577668" Mar 13 15:24:42 crc kubenswrapper[4898]: E0313 15:24:42.741216 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:24:49 crc kubenswrapper[4898]: I0313 15:24:49.197955 4898 scope.go:117] "RemoveContainer" containerID="41c4a045c361afa3d6c1c9f58f41c2c132e20bbf6ef35d1cfbb029e2412c57e3" Mar 13 15:24:49 crc kubenswrapper[4898]: I0313 15:24:49.271949 4898 scope.go:117] "RemoveContainer" containerID="2124311011b6d2dbdb56bd6bbe60e2fefb84036c80bad5c23f6cbdc48089d7e2" Mar 13 15:24:54 crc kubenswrapper[4898]: I0313 15:24:54.740342 4898 scope.go:117] "RemoveContainer" containerID="e40bd227e881ba459a425e50fe1c2e2837377f3121675fc9f55de4fe34577668" Mar 13 15:24:54 crc kubenswrapper[4898]: E0313 15:24:54.741966 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:25:09 crc kubenswrapper[4898]: I0313 15:25:09.740777 4898 scope.go:117] "RemoveContainer" containerID="e40bd227e881ba459a425e50fe1c2e2837377f3121675fc9f55de4fe34577668" Mar 13 15:25:09 crc kubenswrapper[4898]: E0313 15:25:09.741645 4898 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:25:23 crc kubenswrapper[4898]: I0313 15:25:23.740034 4898 scope.go:117] "RemoveContainer" containerID="e40bd227e881ba459a425e50fe1c2e2837377f3121675fc9f55de4fe34577668" Mar 13 15:25:23 crc kubenswrapper[4898]: E0313 15:25:23.740982 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:25:34 crc kubenswrapper[4898]: I0313 15:25:34.740599 4898 scope.go:117] "RemoveContainer" containerID="e40bd227e881ba459a425e50fe1c2e2837377f3121675fc9f55de4fe34577668" Mar 13 15:25:34 crc kubenswrapper[4898]: E0313 15:25:34.741886 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:25:41 crc kubenswrapper[4898]: I0313 15:25:41.465544 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gn5g7"] Mar 13 15:25:41 crc kubenswrapper[4898]: E0313 
15:25:41.467008 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b1a5763-109f-4888-97bb-eeb7cd25ff69" containerName="oc" Mar 13 15:25:41 crc kubenswrapper[4898]: I0313 15:25:41.467037 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b1a5763-109f-4888-97bb-eeb7cd25ff69" containerName="oc" Mar 13 15:25:41 crc kubenswrapper[4898]: I0313 15:25:41.468405 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b1a5763-109f-4888-97bb-eeb7cd25ff69" containerName="oc" Mar 13 15:25:41 crc kubenswrapper[4898]: I0313 15:25:41.476557 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gn5g7" Mar 13 15:25:41 crc kubenswrapper[4898]: I0313 15:25:41.483603 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gn5g7"] Mar 13 15:25:41 crc kubenswrapper[4898]: I0313 15:25:41.555295 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9c51c71-efe8-43ff-88ed-fb12b1121692-utilities\") pod \"certified-operators-gn5g7\" (UID: \"d9c51c71-efe8-43ff-88ed-fb12b1121692\") " pod="openshift-marketplace/certified-operators-gn5g7" Mar 13 15:25:41 crc kubenswrapper[4898]: I0313 15:25:41.555416 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqvdw\" (UniqueName: \"kubernetes.io/projected/d9c51c71-efe8-43ff-88ed-fb12b1121692-kube-api-access-qqvdw\") pod \"certified-operators-gn5g7\" (UID: \"d9c51c71-efe8-43ff-88ed-fb12b1121692\") " pod="openshift-marketplace/certified-operators-gn5g7" Mar 13 15:25:41 crc kubenswrapper[4898]: I0313 15:25:41.555808 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9c51c71-efe8-43ff-88ed-fb12b1121692-catalog-content\") pod 
\"certified-operators-gn5g7\" (UID: \"d9c51c71-efe8-43ff-88ed-fb12b1121692\") " pod="openshift-marketplace/certified-operators-gn5g7" Mar 13 15:25:41 crc kubenswrapper[4898]: I0313 15:25:41.658084 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9c51c71-efe8-43ff-88ed-fb12b1121692-catalog-content\") pod \"certified-operators-gn5g7\" (UID: \"d9c51c71-efe8-43ff-88ed-fb12b1121692\") " pod="openshift-marketplace/certified-operators-gn5g7" Mar 13 15:25:41 crc kubenswrapper[4898]: I0313 15:25:41.658269 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9c51c71-efe8-43ff-88ed-fb12b1121692-utilities\") pod \"certified-operators-gn5g7\" (UID: \"d9c51c71-efe8-43ff-88ed-fb12b1121692\") " pod="openshift-marketplace/certified-operators-gn5g7" Mar 13 15:25:41 crc kubenswrapper[4898]: I0313 15:25:41.658311 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqvdw\" (UniqueName: \"kubernetes.io/projected/d9c51c71-efe8-43ff-88ed-fb12b1121692-kube-api-access-qqvdw\") pod \"certified-operators-gn5g7\" (UID: \"d9c51c71-efe8-43ff-88ed-fb12b1121692\") " pod="openshift-marketplace/certified-operators-gn5g7" Mar 13 15:25:41 crc kubenswrapper[4898]: I0313 15:25:41.658772 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9c51c71-efe8-43ff-88ed-fb12b1121692-utilities\") pod \"certified-operators-gn5g7\" (UID: \"d9c51c71-efe8-43ff-88ed-fb12b1121692\") " pod="openshift-marketplace/certified-operators-gn5g7" Mar 13 15:25:41 crc kubenswrapper[4898]: I0313 15:25:41.658773 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9c51c71-efe8-43ff-88ed-fb12b1121692-catalog-content\") pod \"certified-operators-gn5g7\" (UID: 
\"d9c51c71-efe8-43ff-88ed-fb12b1121692\") " pod="openshift-marketplace/certified-operators-gn5g7" Mar 13 15:25:41 crc kubenswrapper[4898]: I0313 15:25:41.688200 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqvdw\" (UniqueName: \"kubernetes.io/projected/d9c51c71-efe8-43ff-88ed-fb12b1121692-kube-api-access-qqvdw\") pod \"certified-operators-gn5g7\" (UID: \"d9c51c71-efe8-43ff-88ed-fb12b1121692\") " pod="openshift-marketplace/certified-operators-gn5g7" Mar 13 15:25:41 crc kubenswrapper[4898]: I0313 15:25:41.813227 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gn5g7" Mar 13 15:25:42 crc kubenswrapper[4898]: I0313 15:25:42.296608 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gn5g7"] Mar 13 15:25:42 crc kubenswrapper[4898]: W0313 15:25:42.301485 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9c51c71_efe8_43ff_88ed_fb12b1121692.slice/crio-fb24a79a6bf951be31d041d3c4d4e9c588aa8c8a21fd22b6d401f9f1916f5390 WatchSource:0}: Error finding container fb24a79a6bf951be31d041d3c4d4e9c588aa8c8a21fd22b6d401f9f1916f5390: Status 404 returned error can't find the container with id fb24a79a6bf951be31d041d3c4d4e9c588aa8c8a21fd22b6d401f9f1916f5390 Mar 13 15:25:42 crc kubenswrapper[4898]: I0313 15:25:42.359867 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gn5g7" event={"ID":"d9c51c71-efe8-43ff-88ed-fb12b1121692","Type":"ContainerStarted","Data":"fb24a79a6bf951be31d041d3c4d4e9c588aa8c8a21fd22b6d401f9f1916f5390"} Mar 13 15:25:43 crc kubenswrapper[4898]: I0313 15:25:43.375782 4898 generic.go:334] "Generic (PLEG): container finished" podID="d9c51c71-efe8-43ff-88ed-fb12b1121692" containerID="e2c92886e7f1cfdd1bddd5db247f5bde333fba025a961a77f68a3d06b731aed7" exitCode=0 Mar 13 15:25:43 
crc kubenswrapper[4898]: I0313 15:25:43.375839 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gn5g7" event={"ID":"d9c51c71-efe8-43ff-88ed-fb12b1121692","Type":"ContainerDied","Data":"e2c92886e7f1cfdd1bddd5db247f5bde333fba025a961a77f68a3d06b731aed7"} Mar 13 15:25:45 crc kubenswrapper[4898]: I0313 15:25:45.409345 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gn5g7" event={"ID":"d9c51c71-efe8-43ff-88ed-fb12b1121692","Type":"ContainerStarted","Data":"549f23a7d00a31d7ea04ef20fbcf64834236aa87809459f07d2f89a5e0d047c7"} Mar 13 15:25:46 crc kubenswrapper[4898]: I0313 15:25:46.424011 4898 generic.go:334] "Generic (PLEG): container finished" podID="d9c51c71-efe8-43ff-88ed-fb12b1121692" containerID="549f23a7d00a31d7ea04ef20fbcf64834236aa87809459f07d2f89a5e0d047c7" exitCode=0 Mar 13 15:25:46 crc kubenswrapper[4898]: I0313 15:25:46.424108 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gn5g7" event={"ID":"d9c51c71-efe8-43ff-88ed-fb12b1121692","Type":"ContainerDied","Data":"549f23a7d00a31d7ea04ef20fbcf64834236aa87809459f07d2f89a5e0d047c7"} Mar 13 15:25:47 crc kubenswrapper[4898]: I0313 15:25:47.442047 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gn5g7" event={"ID":"d9c51c71-efe8-43ff-88ed-fb12b1121692","Type":"ContainerStarted","Data":"4c7f09252b43d299991670c1d10ac5f6fc47b8d7ac308b4e48773964aa6e0e5e"} Mar 13 15:25:47 crc kubenswrapper[4898]: I0313 15:25:47.478813 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gn5g7" podStartSLOduration=3.023432986 podStartE2EDuration="6.478787395s" podCreationTimestamp="2026-03-13 15:25:41 +0000 UTC" firstStartedPulling="2026-03-13 15:25:43.378665299 +0000 UTC m=+5378.380253538" lastFinishedPulling="2026-03-13 15:25:46.834019708 +0000 UTC m=+5381.835607947" 
observedRunningTime="2026-03-13 15:25:47.472046156 +0000 UTC m=+5382.473634405" watchObservedRunningTime="2026-03-13 15:25:47.478787395 +0000 UTC m=+5382.480375644" Mar 13 15:25:49 crc kubenswrapper[4898]: I0313 15:25:49.410739 4898 scope.go:117] "RemoveContainer" containerID="16ab5a558dadc1a752795f9720b3efadfbd0fc9afffa700ac13e16901fd9b881" Mar 13 15:25:49 crc kubenswrapper[4898]: I0313 15:25:49.739408 4898 scope.go:117] "RemoveContainer" containerID="e40bd227e881ba459a425e50fe1c2e2837377f3121675fc9f55de4fe34577668" Mar 13 15:25:50 crc kubenswrapper[4898]: I0313 15:25:50.505667 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerStarted","Data":"3f1eb02ceee77301060a512e8cdb70aa1cab3b74525898dd1df91250d09e1006"} Mar 13 15:25:51 crc kubenswrapper[4898]: I0313 15:25:51.522286 4898 generic.go:334] "Generic (PLEG): container finished" podID="9ef69d80-7edf-459b-a521-b45bc90a18df" containerID="2ef5e9977f8ef2b175ed18b48018fc8614ba4246f0d291287b1a2a287a12ca83" exitCode=0 Mar 13 15:25:51 crc kubenswrapper[4898]: I0313 15:25:51.522378 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b6mrl/must-gather-cklv9" event={"ID":"9ef69d80-7edf-459b-a521-b45bc90a18df","Type":"ContainerDied","Data":"2ef5e9977f8ef2b175ed18b48018fc8614ba4246f0d291287b1a2a287a12ca83"} Mar 13 15:25:51 crc kubenswrapper[4898]: I0313 15:25:51.523708 4898 scope.go:117] "RemoveContainer" containerID="2ef5e9977f8ef2b175ed18b48018fc8614ba4246f0d291287b1a2a287a12ca83" Mar 13 15:25:51 crc kubenswrapper[4898]: I0313 15:25:51.814138 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gn5g7" Mar 13 15:25:51 crc kubenswrapper[4898]: I0313 15:25:51.814232 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gn5g7" Mar 
13 15:25:51 crc kubenswrapper[4898]: I0313 15:25:51.885757 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gn5g7" Mar 13 15:25:52 crc kubenswrapper[4898]: I0313 15:25:52.362601 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-b6mrl_must-gather-cklv9_9ef69d80-7edf-459b-a521-b45bc90a18df/gather/0.log" Mar 13 15:25:52 crc kubenswrapper[4898]: I0313 15:25:52.424631 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pls2z"] Mar 13 15:25:52 crc kubenswrapper[4898]: I0313 15:25:52.427210 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pls2z" Mar 13 15:25:52 crc kubenswrapper[4898]: I0313 15:25:52.445755 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pls2z"] Mar 13 15:25:52 crc kubenswrapper[4898]: I0313 15:25:52.576754 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b7976f2-bd4a-4ee5-983a-d1a70216276a-utilities\") pod \"redhat-marketplace-pls2z\" (UID: \"5b7976f2-bd4a-4ee5-983a-d1a70216276a\") " pod="openshift-marketplace/redhat-marketplace-pls2z" Mar 13 15:25:52 crc kubenswrapper[4898]: I0313 15:25:52.576804 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b7976f2-bd4a-4ee5-983a-d1a70216276a-catalog-content\") pod \"redhat-marketplace-pls2z\" (UID: \"5b7976f2-bd4a-4ee5-983a-d1a70216276a\") " pod="openshift-marketplace/redhat-marketplace-pls2z" Mar 13 15:25:52 crc kubenswrapper[4898]: I0313 15:25:52.576870 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7zxr\" (UniqueName: 
\"kubernetes.io/projected/5b7976f2-bd4a-4ee5-983a-d1a70216276a-kube-api-access-k7zxr\") pod \"redhat-marketplace-pls2z\" (UID: \"5b7976f2-bd4a-4ee5-983a-d1a70216276a\") " pod="openshift-marketplace/redhat-marketplace-pls2z" Mar 13 15:25:52 crc kubenswrapper[4898]: I0313 15:25:52.599465 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gn5g7" Mar 13 15:25:52 crc kubenswrapper[4898]: I0313 15:25:52.679361 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b7976f2-bd4a-4ee5-983a-d1a70216276a-utilities\") pod \"redhat-marketplace-pls2z\" (UID: \"5b7976f2-bd4a-4ee5-983a-d1a70216276a\") " pod="openshift-marketplace/redhat-marketplace-pls2z" Mar 13 15:25:52 crc kubenswrapper[4898]: I0313 15:25:52.679413 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b7976f2-bd4a-4ee5-983a-d1a70216276a-catalog-content\") pod \"redhat-marketplace-pls2z\" (UID: \"5b7976f2-bd4a-4ee5-983a-d1a70216276a\") " pod="openshift-marketplace/redhat-marketplace-pls2z" Mar 13 15:25:52 crc kubenswrapper[4898]: I0313 15:25:52.679505 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7zxr\" (UniqueName: \"kubernetes.io/projected/5b7976f2-bd4a-4ee5-983a-d1a70216276a-kube-api-access-k7zxr\") pod \"redhat-marketplace-pls2z\" (UID: \"5b7976f2-bd4a-4ee5-983a-d1a70216276a\") " pod="openshift-marketplace/redhat-marketplace-pls2z" Mar 13 15:25:52 crc kubenswrapper[4898]: I0313 15:25:52.680020 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b7976f2-bd4a-4ee5-983a-d1a70216276a-utilities\") pod \"redhat-marketplace-pls2z\" (UID: \"5b7976f2-bd4a-4ee5-983a-d1a70216276a\") " pod="openshift-marketplace/redhat-marketplace-pls2z" Mar 13 15:25:52 crc 
kubenswrapper[4898]: I0313 15:25:52.680083 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b7976f2-bd4a-4ee5-983a-d1a70216276a-catalog-content\") pod \"redhat-marketplace-pls2z\" (UID: \"5b7976f2-bd4a-4ee5-983a-d1a70216276a\") " pod="openshift-marketplace/redhat-marketplace-pls2z" Mar 13 15:25:52 crc kubenswrapper[4898]: I0313 15:25:52.701835 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7zxr\" (UniqueName: \"kubernetes.io/projected/5b7976f2-bd4a-4ee5-983a-d1a70216276a-kube-api-access-k7zxr\") pod \"redhat-marketplace-pls2z\" (UID: \"5b7976f2-bd4a-4ee5-983a-d1a70216276a\") " pod="openshift-marketplace/redhat-marketplace-pls2z" Mar 13 15:25:52 crc kubenswrapper[4898]: I0313 15:25:52.756135 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pls2z" Mar 13 15:25:53 crc kubenswrapper[4898]: I0313 15:25:53.334031 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pls2z"] Mar 13 15:25:53 crc kubenswrapper[4898]: I0313 15:25:53.547579 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pls2z" event={"ID":"5b7976f2-bd4a-4ee5-983a-d1a70216276a","Type":"ContainerStarted","Data":"816344487fa42199348e57ae5e56299ab0a59791c53b5816fd0a0d3e54766c35"} Mar 13 15:25:54 crc kubenswrapper[4898]: I0313 15:25:54.562477 4898 generic.go:334] "Generic (PLEG): container finished" podID="5b7976f2-bd4a-4ee5-983a-d1a70216276a" containerID="3c65064ffc6c2566c5d496a3c0e222de310ed5233942c93a5f1d36321884cd5a" exitCode=0 Mar 13 15:25:54 crc kubenswrapper[4898]: I0313 15:25:54.563181 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pls2z" 
event={"ID":"5b7976f2-bd4a-4ee5-983a-d1a70216276a","Type":"ContainerDied","Data":"3c65064ffc6c2566c5d496a3c0e222de310ed5233942c93a5f1d36321884cd5a"} Mar 13 15:25:54 crc kubenswrapper[4898]: I0313 15:25:54.926024 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gn5g7"] Mar 13 15:25:54 crc kubenswrapper[4898]: I0313 15:25:54.926274 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gn5g7" podUID="d9c51c71-efe8-43ff-88ed-fb12b1121692" containerName="registry-server" containerID="cri-o://4c7f09252b43d299991670c1d10ac5f6fc47b8d7ac308b4e48773964aa6e0e5e" gracePeriod=2 Mar 13 15:25:55 crc kubenswrapper[4898]: I0313 15:25:55.486224 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gn5g7" Mar 13 15:25:55 crc kubenswrapper[4898]: I0313 15:25:55.583756 4898 generic.go:334] "Generic (PLEG): container finished" podID="d9c51c71-efe8-43ff-88ed-fb12b1121692" containerID="4c7f09252b43d299991670c1d10ac5f6fc47b8d7ac308b4e48773964aa6e0e5e" exitCode=0 Mar 13 15:25:55 crc kubenswrapper[4898]: I0313 15:25:55.583867 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gn5g7" event={"ID":"d9c51c71-efe8-43ff-88ed-fb12b1121692","Type":"ContainerDied","Data":"4c7f09252b43d299991670c1d10ac5f6fc47b8d7ac308b4e48773964aa6e0e5e"} Mar 13 15:25:55 crc kubenswrapper[4898]: I0313 15:25:55.583977 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gn5g7" Mar 13 15:25:55 crc kubenswrapper[4898]: I0313 15:25:55.584076 4898 scope.go:117] "RemoveContainer" containerID="4c7f09252b43d299991670c1d10ac5f6fc47b8d7ac308b4e48773964aa6e0e5e" Mar 13 15:25:55 crc kubenswrapper[4898]: I0313 15:25:55.584061 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gn5g7" event={"ID":"d9c51c71-efe8-43ff-88ed-fb12b1121692","Type":"ContainerDied","Data":"fb24a79a6bf951be31d041d3c4d4e9c588aa8c8a21fd22b6d401f9f1916f5390"} Mar 13 15:25:55 crc kubenswrapper[4898]: I0313 15:25:55.605252 4898 scope.go:117] "RemoveContainer" containerID="549f23a7d00a31d7ea04ef20fbcf64834236aa87809459f07d2f89a5e0d047c7" Mar 13 15:25:55 crc kubenswrapper[4898]: I0313 15:25:55.645155 4898 scope.go:117] "RemoveContainer" containerID="e2c92886e7f1cfdd1bddd5db247f5bde333fba025a961a77f68a3d06b731aed7" Mar 13 15:25:55 crc kubenswrapper[4898]: I0313 15:25:55.660202 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9c51c71-efe8-43ff-88ed-fb12b1121692-utilities\") pod \"d9c51c71-efe8-43ff-88ed-fb12b1121692\" (UID: \"d9c51c71-efe8-43ff-88ed-fb12b1121692\") " Mar 13 15:25:55 crc kubenswrapper[4898]: I0313 15:25:55.660367 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9c51c71-efe8-43ff-88ed-fb12b1121692-catalog-content\") pod \"d9c51c71-efe8-43ff-88ed-fb12b1121692\" (UID: \"d9c51c71-efe8-43ff-88ed-fb12b1121692\") " Mar 13 15:25:55 crc kubenswrapper[4898]: I0313 15:25:55.660425 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqvdw\" (UniqueName: \"kubernetes.io/projected/d9c51c71-efe8-43ff-88ed-fb12b1121692-kube-api-access-qqvdw\") pod \"d9c51c71-efe8-43ff-88ed-fb12b1121692\" (UID: 
\"d9c51c71-efe8-43ff-88ed-fb12b1121692\") " Mar 13 15:25:55 crc kubenswrapper[4898]: I0313 15:25:55.661229 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9c51c71-efe8-43ff-88ed-fb12b1121692-utilities" (OuterVolumeSpecName: "utilities") pod "d9c51c71-efe8-43ff-88ed-fb12b1121692" (UID: "d9c51c71-efe8-43ff-88ed-fb12b1121692"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:25:55 crc kubenswrapper[4898]: I0313 15:25:55.666920 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9c51c71-efe8-43ff-88ed-fb12b1121692-kube-api-access-qqvdw" (OuterVolumeSpecName: "kube-api-access-qqvdw") pod "d9c51c71-efe8-43ff-88ed-fb12b1121692" (UID: "d9c51c71-efe8-43ff-88ed-fb12b1121692"). InnerVolumeSpecName "kube-api-access-qqvdw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:25:55 crc kubenswrapper[4898]: I0313 15:25:55.726196 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9c51c71-efe8-43ff-88ed-fb12b1121692-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d9c51c71-efe8-43ff-88ed-fb12b1121692" (UID: "d9c51c71-efe8-43ff-88ed-fb12b1121692"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:25:55 crc kubenswrapper[4898]: I0313 15:25:55.761662 4898 scope.go:117] "RemoveContainer" containerID="4c7f09252b43d299991670c1d10ac5f6fc47b8d7ac308b4e48773964aa6e0e5e" Mar 13 15:25:55 crc kubenswrapper[4898]: E0313 15:25:55.762205 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c7f09252b43d299991670c1d10ac5f6fc47b8d7ac308b4e48773964aa6e0e5e\": container with ID starting with 4c7f09252b43d299991670c1d10ac5f6fc47b8d7ac308b4e48773964aa6e0e5e not found: ID does not exist" containerID="4c7f09252b43d299991670c1d10ac5f6fc47b8d7ac308b4e48773964aa6e0e5e" Mar 13 15:25:55 crc kubenswrapper[4898]: I0313 15:25:55.762247 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c7f09252b43d299991670c1d10ac5f6fc47b8d7ac308b4e48773964aa6e0e5e"} err="failed to get container status \"4c7f09252b43d299991670c1d10ac5f6fc47b8d7ac308b4e48773964aa6e0e5e\": rpc error: code = NotFound desc = could not find container \"4c7f09252b43d299991670c1d10ac5f6fc47b8d7ac308b4e48773964aa6e0e5e\": container with ID starting with 4c7f09252b43d299991670c1d10ac5f6fc47b8d7ac308b4e48773964aa6e0e5e not found: ID does not exist" Mar 13 15:25:55 crc kubenswrapper[4898]: I0313 15:25:55.762274 4898 scope.go:117] "RemoveContainer" containerID="549f23a7d00a31d7ea04ef20fbcf64834236aa87809459f07d2f89a5e0d047c7" Mar 13 15:25:55 crc kubenswrapper[4898]: E0313 15:25:55.762559 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"549f23a7d00a31d7ea04ef20fbcf64834236aa87809459f07d2f89a5e0d047c7\": container with ID starting with 549f23a7d00a31d7ea04ef20fbcf64834236aa87809459f07d2f89a5e0d047c7 not found: ID does not exist" containerID="549f23a7d00a31d7ea04ef20fbcf64834236aa87809459f07d2f89a5e0d047c7" Mar 13 15:25:55 crc kubenswrapper[4898]: I0313 15:25:55.762593 
4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"549f23a7d00a31d7ea04ef20fbcf64834236aa87809459f07d2f89a5e0d047c7"} err="failed to get container status \"549f23a7d00a31d7ea04ef20fbcf64834236aa87809459f07d2f89a5e0d047c7\": rpc error: code = NotFound desc = could not find container \"549f23a7d00a31d7ea04ef20fbcf64834236aa87809459f07d2f89a5e0d047c7\": container with ID starting with 549f23a7d00a31d7ea04ef20fbcf64834236aa87809459f07d2f89a5e0d047c7 not found: ID does not exist" Mar 13 15:25:55 crc kubenswrapper[4898]: I0313 15:25:55.762613 4898 scope.go:117] "RemoveContainer" containerID="e2c92886e7f1cfdd1bddd5db247f5bde333fba025a961a77f68a3d06b731aed7" Mar 13 15:25:55 crc kubenswrapper[4898]: E0313 15:25:55.762908 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2c92886e7f1cfdd1bddd5db247f5bde333fba025a961a77f68a3d06b731aed7\": container with ID starting with e2c92886e7f1cfdd1bddd5db247f5bde333fba025a961a77f68a3d06b731aed7 not found: ID does not exist" containerID="e2c92886e7f1cfdd1bddd5db247f5bde333fba025a961a77f68a3d06b731aed7" Mar 13 15:25:55 crc kubenswrapper[4898]: I0313 15:25:55.762926 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2c92886e7f1cfdd1bddd5db247f5bde333fba025a961a77f68a3d06b731aed7"} err="failed to get container status \"e2c92886e7f1cfdd1bddd5db247f5bde333fba025a961a77f68a3d06b731aed7\": rpc error: code = NotFound desc = could not find container \"e2c92886e7f1cfdd1bddd5db247f5bde333fba025a961a77f68a3d06b731aed7\": container with ID starting with e2c92886e7f1cfdd1bddd5db247f5bde333fba025a961a77f68a3d06b731aed7 not found: ID does not exist" Mar 13 15:25:55 crc kubenswrapper[4898]: I0313 15:25:55.763786 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9c51c71-efe8-43ff-88ed-fb12b1121692-utilities\") on node 
\"crc\" DevicePath \"\"" Mar 13 15:25:55 crc kubenswrapper[4898]: I0313 15:25:55.763824 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9c51c71-efe8-43ff-88ed-fb12b1121692-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:55 crc kubenswrapper[4898]: I0313 15:25:55.763839 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqvdw\" (UniqueName: \"kubernetes.io/projected/d9c51c71-efe8-43ff-88ed-fb12b1121692-kube-api-access-qqvdw\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:55 crc kubenswrapper[4898]: I0313 15:25:55.923137 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gn5g7"] Mar 13 15:25:55 crc kubenswrapper[4898]: I0313 15:25:55.947036 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gn5g7"] Mar 13 15:25:56 crc kubenswrapper[4898]: I0313 15:25:56.599318 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pls2z" event={"ID":"5b7976f2-bd4a-4ee5-983a-d1a70216276a","Type":"ContainerStarted","Data":"5ecab09bd2ab7b4b32fbe2e46ce657e55527cc6849c9ae5279f6c7c9af36dddd"} Mar 13 15:25:57 crc kubenswrapper[4898]: I0313 15:25:57.615259 4898 generic.go:334] "Generic (PLEG): container finished" podID="5b7976f2-bd4a-4ee5-983a-d1a70216276a" containerID="5ecab09bd2ab7b4b32fbe2e46ce657e55527cc6849c9ae5279f6c7c9af36dddd" exitCode=0 Mar 13 15:25:57 crc kubenswrapper[4898]: I0313 15:25:57.615300 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pls2z" event={"ID":"5b7976f2-bd4a-4ee5-983a-d1a70216276a","Type":"ContainerDied","Data":"5ecab09bd2ab7b4b32fbe2e46ce657e55527cc6849c9ae5279f6c7c9af36dddd"} Mar 13 15:25:57 crc kubenswrapper[4898]: I0313 15:25:57.752697 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9c51c71-efe8-43ff-88ed-fb12b1121692" 
path="/var/lib/kubelet/pods/d9c51c71-efe8-43ff-88ed-fb12b1121692/volumes" Mar 13 15:25:59 crc kubenswrapper[4898]: I0313 15:25:59.664234 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pls2z" event={"ID":"5b7976f2-bd4a-4ee5-983a-d1a70216276a","Type":"ContainerStarted","Data":"4ae368c42be13bda887861e91de282ddb8e20b18c6dba5950a3ac8b4ed0d117a"} Mar 13 15:25:59 crc kubenswrapper[4898]: I0313 15:25:59.698205 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pls2z" podStartSLOduration=3.785012205 podStartE2EDuration="7.698180838s" podCreationTimestamp="2026-03-13 15:25:52 +0000 UTC" firstStartedPulling="2026-03-13 15:25:54.566692732 +0000 UTC m=+5389.568280971" lastFinishedPulling="2026-03-13 15:25:58.479861365 +0000 UTC m=+5393.481449604" observedRunningTime="2026-03-13 15:25:59.689915461 +0000 UTC m=+5394.691503720" watchObservedRunningTime="2026-03-13 15:25:59.698180838 +0000 UTC m=+5394.699769077" Mar 13 15:26:00 crc kubenswrapper[4898]: I0313 15:26:00.147036 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556926-znpr2"] Mar 13 15:26:00 crc kubenswrapper[4898]: E0313 15:26:00.147826 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9c51c71-efe8-43ff-88ed-fb12b1121692" containerName="extract-utilities" Mar 13 15:26:00 crc kubenswrapper[4898]: I0313 15:26:00.147847 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9c51c71-efe8-43ff-88ed-fb12b1121692" containerName="extract-utilities" Mar 13 15:26:00 crc kubenswrapper[4898]: E0313 15:26:00.147882 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9c51c71-efe8-43ff-88ed-fb12b1121692" containerName="extract-content" Mar 13 15:26:00 crc kubenswrapper[4898]: I0313 15:26:00.147888 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9c51c71-efe8-43ff-88ed-fb12b1121692" containerName="extract-content" 
Mar 13 15:26:00 crc kubenswrapper[4898]: E0313 15:26:00.147934 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9c51c71-efe8-43ff-88ed-fb12b1121692" containerName="registry-server" Mar 13 15:26:00 crc kubenswrapper[4898]: I0313 15:26:00.147940 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9c51c71-efe8-43ff-88ed-fb12b1121692" containerName="registry-server" Mar 13 15:26:00 crc kubenswrapper[4898]: I0313 15:26:00.148200 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9c51c71-efe8-43ff-88ed-fb12b1121692" containerName="registry-server" Mar 13 15:26:00 crc kubenswrapper[4898]: I0313 15:26:00.149162 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556926-znpr2" Mar 13 15:26:00 crc kubenswrapper[4898]: I0313 15:26:00.153282 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 15:26:00 crc kubenswrapper[4898]: I0313 15:26:00.153309 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 15:26:00 crc kubenswrapper[4898]: I0313 15:26:00.153838 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps" Mar 13 15:26:00 crc kubenswrapper[4898]: I0313 15:26:00.159257 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556926-znpr2"] Mar 13 15:26:00 crc kubenswrapper[4898]: I0313 15:26:00.289347 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvlph\" (UniqueName: \"kubernetes.io/projected/fcf6d966-2758-453f-9308-fd452766462b-kube-api-access-hvlph\") pod \"auto-csr-approver-29556926-znpr2\" (UID: \"fcf6d966-2758-453f-9308-fd452766462b\") " pod="openshift-infra/auto-csr-approver-29556926-znpr2" Mar 13 15:26:00 crc kubenswrapper[4898]: I0313 15:26:00.393679 
4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvlph\" (UniqueName: \"kubernetes.io/projected/fcf6d966-2758-453f-9308-fd452766462b-kube-api-access-hvlph\") pod \"auto-csr-approver-29556926-znpr2\" (UID: \"fcf6d966-2758-453f-9308-fd452766462b\") " pod="openshift-infra/auto-csr-approver-29556926-znpr2" Mar 13 15:26:00 crc kubenswrapper[4898]: I0313 15:26:00.421357 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvlph\" (UniqueName: \"kubernetes.io/projected/fcf6d966-2758-453f-9308-fd452766462b-kube-api-access-hvlph\") pod \"auto-csr-approver-29556926-znpr2\" (UID: \"fcf6d966-2758-453f-9308-fd452766462b\") " pod="openshift-infra/auto-csr-approver-29556926-znpr2" Mar 13 15:26:00 crc kubenswrapper[4898]: I0313 15:26:00.472239 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556926-znpr2" Mar 13 15:26:01 crc kubenswrapper[4898]: W0313 15:26:01.027376 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfcf6d966_2758_453f_9308_fd452766462b.slice/crio-a61d000ce5c31fb2ba5ceef42b97a73083635ac5a421036e8e4ac49f91eecb08 WatchSource:0}: Error finding container a61d000ce5c31fb2ba5ceef42b97a73083635ac5a421036e8e4ac49f91eecb08: Status 404 returned error can't find the container with id a61d000ce5c31fb2ba5ceef42b97a73083635ac5a421036e8e4ac49f91eecb08 Mar 13 15:26:01 crc kubenswrapper[4898]: I0313 15:26:01.028719 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556926-znpr2"] Mar 13 15:26:01 crc kubenswrapper[4898]: I0313 15:26:01.736795 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556926-znpr2" event={"ID":"fcf6d966-2758-453f-9308-fd452766462b","Type":"ContainerStarted","Data":"a61d000ce5c31fb2ba5ceef42b97a73083635ac5a421036e8e4ac49f91eecb08"} 
Mar 13 15:26:02 crc kubenswrapper[4898]: I0313 15:26:02.756460 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pls2z" Mar 13 15:26:02 crc kubenswrapper[4898]: I0313 15:26:02.757776 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pls2z" Mar 13 15:26:03 crc kubenswrapper[4898]: I0313 15:26:03.759623 4898 generic.go:334] "Generic (PLEG): container finished" podID="fcf6d966-2758-453f-9308-fd452766462b" containerID="f74ca8934196198fdc7fe5e94130ffb287ea4038d28afe421add852586aae005" exitCode=0 Mar 13 15:26:03 crc kubenswrapper[4898]: I0313 15:26:03.759956 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556926-znpr2" event={"ID":"fcf6d966-2758-453f-9308-fd452766462b","Type":"ContainerDied","Data":"f74ca8934196198fdc7fe5e94130ffb287ea4038d28afe421add852586aae005"} Mar 13 15:26:03 crc kubenswrapper[4898]: I0313 15:26:03.811156 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-pls2z" podUID="5b7976f2-bd4a-4ee5-983a-d1a70216276a" containerName="registry-server" probeResult="failure" output=< Mar 13 15:26:03 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 15:26:03 crc kubenswrapper[4898]: > Mar 13 15:26:04 crc kubenswrapper[4898]: I0313 15:26:04.000027 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-b6mrl/must-gather-cklv9"] Mar 13 15:26:04 crc kubenswrapper[4898]: I0313 15:26:04.000291 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-b6mrl/must-gather-cklv9" podUID="9ef69d80-7edf-459b-a521-b45bc90a18df" containerName="copy" containerID="cri-o://0cf38cc33a2de3371ea65a40a67cc8243723533a2159a61bfc4ad7a679d1eafb" gracePeriod=2 Mar 13 15:26:04 crc kubenswrapper[4898]: I0313 15:26:04.013197 4898 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openshift-must-gather-b6mrl/must-gather-cklv9"] Mar 13 15:26:04 crc kubenswrapper[4898]: I0313 15:26:04.542985 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-b6mrl_must-gather-cklv9_9ef69d80-7edf-459b-a521-b45bc90a18df/copy/0.log" Mar 13 15:26:04 crc kubenswrapper[4898]: I0313 15:26:04.546334 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b6mrl/must-gather-cklv9" Mar 13 15:26:04 crc kubenswrapper[4898]: I0313 15:26:04.626102 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfhwg\" (UniqueName: \"kubernetes.io/projected/9ef69d80-7edf-459b-a521-b45bc90a18df-kube-api-access-cfhwg\") pod \"9ef69d80-7edf-459b-a521-b45bc90a18df\" (UID: \"9ef69d80-7edf-459b-a521-b45bc90a18df\") " Mar 13 15:26:04 crc kubenswrapper[4898]: I0313 15:26:04.626256 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9ef69d80-7edf-459b-a521-b45bc90a18df-must-gather-output\") pod \"9ef69d80-7edf-459b-a521-b45bc90a18df\" (UID: \"9ef69d80-7edf-459b-a521-b45bc90a18df\") " Mar 13 15:26:04 crc kubenswrapper[4898]: I0313 15:26:04.648728 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ef69d80-7edf-459b-a521-b45bc90a18df-kube-api-access-cfhwg" (OuterVolumeSpecName: "kube-api-access-cfhwg") pod "9ef69d80-7edf-459b-a521-b45bc90a18df" (UID: "9ef69d80-7edf-459b-a521-b45bc90a18df"). InnerVolumeSpecName "kube-api-access-cfhwg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:26:04 crc kubenswrapper[4898]: I0313 15:26:04.732912 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfhwg\" (UniqueName: \"kubernetes.io/projected/9ef69d80-7edf-459b-a521-b45bc90a18df-kube-api-access-cfhwg\") on node \"crc\" DevicePath \"\"" Mar 13 15:26:04 crc kubenswrapper[4898]: I0313 15:26:04.781111 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-b6mrl_must-gather-cklv9_9ef69d80-7edf-459b-a521-b45bc90a18df/copy/0.log" Mar 13 15:26:04 crc kubenswrapper[4898]: I0313 15:26:04.782853 4898 generic.go:334] "Generic (PLEG): container finished" podID="9ef69d80-7edf-459b-a521-b45bc90a18df" containerID="0cf38cc33a2de3371ea65a40a67cc8243723533a2159a61bfc4ad7a679d1eafb" exitCode=143 Mar 13 15:26:04 crc kubenswrapper[4898]: I0313 15:26:04.783098 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b6mrl/must-gather-cklv9" Mar 13 15:26:04 crc kubenswrapper[4898]: I0313 15:26:04.784044 4898 scope.go:117] "RemoveContainer" containerID="0cf38cc33a2de3371ea65a40a67cc8243723533a2159a61bfc4ad7a679d1eafb" Mar 13 15:26:04 crc kubenswrapper[4898]: I0313 15:26:04.816092 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ef69d80-7edf-459b-a521-b45bc90a18df-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "9ef69d80-7edf-459b-a521-b45bc90a18df" (UID: "9ef69d80-7edf-459b-a521-b45bc90a18df"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:26:04 crc kubenswrapper[4898]: I0313 15:26:04.835327 4898 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9ef69d80-7edf-459b-a521-b45bc90a18df-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 13 15:26:04 crc kubenswrapper[4898]: I0313 15:26:04.872388 4898 scope.go:117] "RemoveContainer" containerID="2ef5e9977f8ef2b175ed18b48018fc8614ba4246f0d291287b1a2a287a12ca83" Mar 13 15:26:04 crc kubenswrapper[4898]: I0313 15:26:04.958955 4898 scope.go:117] "RemoveContainer" containerID="0cf38cc33a2de3371ea65a40a67cc8243723533a2159a61bfc4ad7a679d1eafb" Mar 13 15:26:04 crc kubenswrapper[4898]: E0313 15:26:04.962317 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cf38cc33a2de3371ea65a40a67cc8243723533a2159a61bfc4ad7a679d1eafb\": container with ID starting with 0cf38cc33a2de3371ea65a40a67cc8243723533a2159a61bfc4ad7a679d1eafb not found: ID does not exist" containerID="0cf38cc33a2de3371ea65a40a67cc8243723533a2159a61bfc4ad7a679d1eafb" Mar 13 15:26:04 crc kubenswrapper[4898]: I0313 15:26:04.962354 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cf38cc33a2de3371ea65a40a67cc8243723533a2159a61bfc4ad7a679d1eafb"} err="failed to get container status \"0cf38cc33a2de3371ea65a40a67cc8243723533a2159a61bfc4ad7a679d1eafb\": rpc error: code = NotFound desc = could not find container \"0cf38cc33a2de3371ea65a40a67cc8243723533a2159a61bfc4ad7a679d1eafb\": container with ID starting with 0cf38cc33a2de3371ea65a40a67cc8243723533a2159a61bfc4ad7a679d1eafb not found: ID does not exist" Mar 13 15:26:04 crc kubenswrapper[4898]: I0313 15:26:04.962377 4898 scope.go:117] "RemoveContainer" containerID="2ef5e9977f8ef2b175ed18b48018fc8614ba4246f0d291287b1a2a287a12ca83" Mar 13 15:26:04 crc kubenswrapper[4898]: E0313 15:26:04.967432 4898 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ef5e9977f8ef2b175ed18b48018fc8614ba4246f0d291287b1a2a287a12ca83\": container with ID starting with 2ef5e9977f8ef2b175ed18b48018fc8614ba4246f0d291287b1a2a287a12ca83 not found: ID does not exist" containerID="2ef5e9977f8ef2b175ed18b48018fc8614ba4246f0d291287b1a2a287a12ca83" Mar 13 15:26:04 crc kubenswrapper[4898]: I0313 15:26:04.967459 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ef5e9977f8ef2b175ed18b48018fc8614ba4246f0d291287b1a2a287a12ca83"} err="failed to get container status \"2ef5e9977f8ef2b175ed18b48018fc8614ba4246f0d291287b1a2a287a12ca83\": rpc error: code = NotFound desc = could not find container \"2ef5e9977f8ef2b175ed18b48018fc8614ba4246f0d291287b1a2a287a12ca83\": container with ID starting with 2ef5e9977f8ef2b175ed18b48018fc8614ba4246f0d291287b1a2a287a12ca83 not found: ID does not exist" Mar 13 15:26:05 crc kubenswrapper[4898]: I0313 15:26:05.341181 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556926-znpr2" Mar 13 15:26:05 crc kubenswrapper[4898]: I0313 15:26:05.452028 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hvlph\" (UniqueName: \"kubernetes.io/projected/fcf6d966-2758-453f-9308-fd452766462b-kube-api-access-hvlph\") pod \"fcf6d966-2758-453f-9308-fd452766462b\" (UID: \"fcf6d966-2758-453f-9308-fd452766462b\") " Mar 13 15:26:05 crc kubenswrapper[4898]: I0313 15:26:05.480523 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcf6d966-2758-453f-9308-fd452766462b-kube-api-access-hvlph" (OuterVolumeSpecName: "kube-api-access-hvlph") pod "fcf6d966-2758-453f-9308-fd452766462b" (UID: "fcf6d966-2758-453f-9308-fd452766462b"). InnerVolumeSpecName "kube-api-access-hvlph". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:26:05 crc kubenswrapper[4898]: I0313 15:26:05.555001 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hvlph\" (UniqueName: \"kubernetes.io/projected/fcf6d966-2758-453f-9308-fd452766462b-kube-api-access-hvlph\") on node \"crc\" DevicePath \"\"" Mar 13 15:26:05 crc kubenswrapper[4898]: I0313 15:26:05.768672 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ef69d80-7edf-459b-a521-b45bc90a18df" path="/var/lib/kubelet/pods/9ef69d80-7edf-459b-a521-b45bc90a18df/volumes" Mar 13 15:26:05 crc kubenswrapper[4898]: I0313 15:26:05.798865 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556926-znpr2" event={"ID":"fcf6d966-2758-453f-9308-fd452766462b","Type":"ContainerDied","Data":"a61d000ce5c31fb2ba5ceef42b97a73083635ac5a421036e8e4ac49f91eecb08"} Mar 13 15:26:05 crc kubenswrapper[4898]: I0313 15:26:05.799184 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a61d000ce5c31fb2ba5ceef42b97a73083635ac5a421036e8e4ac49f91eecb08" Mar 13 15:26:05 crc kubenswrapper[4898]: I0313 15:26:05.799244 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556926-znpr2" Mar 13 15:26:06 crc kubenswrapper[4898]: I0313 15:26:06.427684 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556920-jxmxv"] Mar 13 15:26:06 crc kubenswrapper[4898]: I0313 15:26:06.441948 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556920-jxmxv"] Mar 13 15:26:07 crc kubenswrapper[4898]: I0313 15:26:07.752671 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6caf987f-dbe2-48d2-8138-107de40fe224" path="/var/lib/kubelet/pods/6caf987f-dbe2-48d2-8138-107de40fe224/volumes" Mar 13 15:26:12 crc kubenswrapper[4898]: I0313 15:26:12.806668 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pls2z" Mar 13 15:26:12 crc kubenswrapper[4898]: I0313 15:26:12.860920 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pls2z" Mar 13 15:26:13 crc kubenswrapper[4898]: I0313 15:26:13.052649 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pls2z"] Mar 13 15:26:13 crc kubenswrapper[4898]: I0313 15:26:13.886078 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pls2z" podUID="5b7976f2-bd4a-4ee5-983a-d1a70216276a" containerName="registry-server" containerID="cri-o://4ae368c42be13bda887861e91de282ddb8e20b18c6dba5950a3ac8b4ed0d117a" gracePeriod=2 Mar 13 15:26:14 crc kubenswrapper[4898]: I0313 15:26:14.482616 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pls2z" Mar 13 15:26:14 crc kubenswrapper[4898]: I0313 15:26:14.592303 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b7976f2-bd4a-4ee5-983a-d1a70216276a-utilities\") pod \"5b7976f2-bd4a-4ee5-983a-d1a70216276a\" (UID: \"5b7976f2-bd4a-4ee5-983a-d1a70216276a\") " Mar 13 15:26:14 crc kubenswrapper[4898]: I0313 15:26:14.593119 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b7976f2-bd4a-4ee5-983a-d1a70216276a-utilities" (OuterVolumeSpecName: "utilities") pod "5b7976f2-bd4a-4ee5-983a-d1a70216276a" (UID: "5b7976f2-bd4a-4ee5-983a-d1a70216276a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:26:14 crc kubenswrapper[4898]: I0313 15:26:14.593338 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b7976f2-bd4a-4ee5-983a-d1a70216276a-catalog-content\") pod \"5b7976f2-bd4a-4ee5-983a-d1a70216276a\" (UID: \"5b7976f2-bd4a-4ee5-983a-d1a70216276a\") " Mar 13 15:26:14 crc kubenswrapper[4898]: I0313 15:26:14.607265 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7zxr\" (UniqueName: \"kubernetes.io/projected/5b7976f2-bd4a-4ee5-983a-d1a70216276a-kube-api-access-k7zxr\") pod \"5b7976f2-bd4a-4ee5-983a-d1a70216276a\" (UID: \"5b7976f2-bd4a-4ee5-983a-d1a70216276a\") " Mar 13 15:26:14 crc kubenswrapper[4898]: I0313 15:26:14.608339 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b7976f2-bd4a-4ee5-983a-d1a70216276a-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 15:26:14 crc kubenswrapper[4898]: I0313 15:26:14.618374 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/5b7976f2-bd4a-4ee5-983a-d1a70216276a-kube-api-access-k7zxr" (OuterVolumeSpecName: "kube-api-access-k7zxr") pod "5b7976f2-bd4a-4ee5-983a-d1a70216276a" (UID: "5b7976f2-bd4a-4ee5-983a-d1a70216276a"). InnerVolumeSpecName "kube-api-access-k7zxr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:26:14 crc kubenswrapper[4898]: I0313 15:26:14.626096 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b7976f2-bd4a-4ee5-983a-d1a70216276a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5b7976f2-bd4a-4ee5-983a-d1a70216276a" (UID: "5b7976f2-bd4a-4ee5-983a-d1a70216276a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:26:14 crc kubenswrapper[4898]: I0313 15:26:14.710554 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b7976f2-bd4a-4ee5-983a-d1a70216276a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 15:26:14 crc kubenswrapper[4898]: I0313 15:26:14.710807 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7zxr\" (UniqueName: \"kubernetes.io/projected/5b7976f2-bd4a-4ee5-983a-d1a70216276a-kube-api-access-k7zxr\") on node \"crc\" DevicePath \"\"" Mar 13 15:26:14 crc kubenswrapper[4898]: I0313 15:26:14.910672 4898 generic.go:334] "Generic (PLEG): container finished" podID="5b7976f2-bd4a-4ee5-983a-d1a70216276a" containerID="4ae368c42be13bda887861e91de282ddb8e20b18c6dba5950a3ac8b4ed0d117a" exitCode=0 Mar 13 15:26:14 crc kubenswrapper[4898]: I0313 15:26:14.910723 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pls2z" event={"ID":"5b7976f2-bd4a-4ee5-983a-d1a70216276a","Type":"ContainerDied","Data":"4ae368c42be13bda887861e91de282ddb8e20b18c6dba5950a3ac8b4ed0d117a"} Mar 13 15:26:14 crc kubenswrapper[4898]: I0313 15:26:14.910748 4898 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pls2z" Mar 13 15:26:14 crc kubenswrapper[4898]: I0313 15:26:14.910754 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pls2z" event={"ID":"5b7976f2-bd4a-4ee5-983a-d1a70216276a","Type":"ContainerDied","Data":"816344487fa42199348e57ae5e56299ab0a59791c53b5816fd0a0d3e54766c35"} Mar 13 15:26:14 crc kubenswrapper[4898]: I0313 15:26:14.910782 4898 scope.go:117] "RemoveContainer" containerID="4ae368c42be13bda887861e91de282ddb8e20b18c6dba5950a3ac8b4ed0d117a" Mar 13 15:26:14 crc kubenswrapper[4898]: I0313 15:26:14.939851 4898 scope.go:117] "RemoveContainer" containerID="5ecab09bd2ab7b4b32fbe2e46ce657e55527cc6849c9ae5279f6c7c9af36dddd" Mar 13 15:26:14 crc kubenswrapper[4898]: I0313 15:26:14.957672 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pls2z"] Mar 13 15:26:14 crc kubenswrapper[4898]: I0313 15:26:14.968023 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pls2z"] Mar 13 15:26:14 crc kubenswrapper[4898]: I0313 15:26:14.994070 4898 scope.go:117] "RemoveContainer" containerID="3c65064ffc6c2566c5d496a3c0e222de310ed5233942c93a5f1d36321884cd5a" Mar 13 15:26:15 crc kubenswrapper[4898]: I0313 15:26:15.049385 4898 scope.go:117] "RemoveContainer" containerID="4ae368c42be13bda887861e91de282ddb8e20b18c6dba5950a3ac8b4ed0d117a" Mar 13 15:26:15 crc kubenswrapper[4898]: E0313 15:26:15.050066 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ae368c42be13bda887861e91de282ddb8e20b18c6dba5950a3ac8b4ed0d117a\": container with ID starting with 4ae368c42be13bda887861e91de282ddb8e20b18c6dba5950a3ac8b4ed0d117a not found: ID does not exist" containerID="4ae368c42be13bda887861e91de282ddb8e20b18c6dba5950a3ac8b4ed0d117a" Mar 13 15:26:15 crc kubenswrapper[4898]: I0313 
15:26:15.050097 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ae368c42be13bda887861e91de282ddb8e20b18c6dba5950a3ac8b4ed0d117a"} err="failed to get container status \"4ae368c42be13bda887861e91de282ddb8e20b18c6dba5950a3ac8b4ed0d117a\": rpc error: code = NotFound desc = could not find container \"4ae368c42be13bda887861e91de282ddb8e20b18c6dba5950a3ac8b4ed0d117a\": container with ID starting with 4ae368c42be13bda887861e91de282ddb8e20b18c6dba5950a3ac8b4ed0d117a not found: ID does not exist" Mar 13 15:26:15 crc kubenswrapper[4898]: I0313 15:26:15.050119 4898 scope.go:117] "RemoveContainer" containerID="5ecab09bd2ab7b4b32fbe2e46ce657e55527cc6849c9ae5279f6c7c9af36dddd" Mar 13 15:26:15 crc kubenswrapper[4898]: E0313 15:26:15.050871 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ecab09bd2ab7b4b32fbe2e46ce657e55527cc6849c9ae5279f6c7c9af36dddd\": container with ID starting with 5ecab09bd2ab7b4b32fbe2e46ce657e55527cc6849c9ae5279f6c7c9af36dddd not found: ID does not exist" containerID="5ecab09bd2ab7b4b32fbe2e46ce657e55527cc6849c9ae5279f6c7c9af36dddd" Mar 13 15:26:15 crc kubenswrapper[4898]: I0313 15:26:15.050888 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ecab09bd2ab7b4b32fbe2e46ce657e55527cc6849c9ae5279f6c7c9af36dddd"} err="failed to get container status \"5ecab09bd2ab7b4b32fbe2e46ce657e55527cc6849c9ae5279f6c7c9af36dddd\": rpc error: code = NotFound desc = could not find container \"5ecab09bd2ab7b4b32fbe2e46ce657e55527cc6849c9ae5279f6c7c9af36dddd\": container with ID starting with 5ecab09bd2ab7b4b32fbe2e46ce657e55527cc6849c9ae5279f6c7c9af36dddd not found: ID does not exist" Mar 13 15:26:15 crc kubenswrapper[4898]: I0313 15:26:15.050933 4898 scope.go:117] "RemoveContainer" containerID="3c65064ffc6c2566c5d496a3c0e222de310ed5233942c93a5f1d36321884cd5a" Mar 13 15:26:15 crc 
kubenswrapper[4898]: E0313 15:26:15.051268 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c65064ffc6c2566c5d496a3c0e222de310ed5233942c93a5f1d36321884cd5a\": container with ID starting with 3c65064ffc6c2566c5d496a3c0e222de310ed5233942c93a5f1d36321884cd5a not found: ID does not exist" containerID="3c65064ffc6c2566c5d496a3c0e222de310ed5233942c93a5f1d36321884cd5a" Mar 13 15:26:15 crc kubenswrapper[4898]: I0313 15:26:15.051307 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c65064ffc6c2566c5d496a3c0e222de310ed5233942c93a5f1d36321884cd5a"} err="failed to get container status \"3c65064ffc6c2566c5d496a3c0e222de310ed5233942c93a5f1d36321884cd5a\": rpc error: code = NotFound desc = could not find container \"3c65064ffc6c2566c5d496a3c0e222de310ed5233942c93a5f1d36321884cd5a\": container with ID starting with 3c65064ffc6c2566c5d496a3c0e222de310ed5233942c93a5f1d36321884cd5a not found: ID does not exist" Mar 13 15:26:15 crc kubenswrapper[4898]: I0313 15:26:15.761409 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b7976f2-bd4a-4ee5-983a-d1a70216276a" path="/var/lib/kubelet/pods/5b7976f2-bd4a-4ee5-983a-d1a70216276a/volumes" Mar 13 15:26:49 crc kubenswrapper[4898]: I0313 15:26:49.515662 4898 scope.go:117] "RemoveContainer" containerID="897e6256c632c42295b8912f57e8c6461493cccdecec91e50f43154fdcb913e4" Mar 13 15:27:17 crc kubenswrapper[4898]: I0313 15:27:17.305295 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2tg8f"] Mar 13 15:27:17 crc kubenswrapper[4898]: E0313 15:27:17.306225 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b7976f2-bd4a-4ee5-983a-d1a70216276a" containerName="extract-utilities" Mar 13 15:27:17 crc kubenswrapper[4898]: I0313 15:27:17.306238 4898 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="5b7976f2-bd4a-4ee5-983a-d1a70216276a" containerName="extract-utilities" Mar 13 15:27:17 crc kubenswrapper[4898]: E0313 15:27:17.306255 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ef69d80-7edf-459b-a521-b45bc90a18df" containerName="gather" Mar 13 15:27:17 crc kubenswrapper[4898]: I0313 15:27:17.306261 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ef69d80-7edf-459b-a521-b45bc90a18df" containerName="gather" Mar 13 15:27:17 crc kubenswrapper[4898]: E0313 15:27:17.306276 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ef69d80-7edf-459b-a521-b45bc90a18df" containerName="copy" Mar 13 15:27:17 crc kubenswrapper[4898]: I0313 15:27:17.306282 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ef69d80-7edf-459b-a521-b45bc90a18df" containerName="copy" Mar 13 15:27:17 crc kubenswrapper[4898]: E0313 15:27:17.306292 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcf6d966-2758-453f-9308-fd452766462b" containerName="oc" Mar 13 15:27:17 crc kubenswrapper[4898]: I0313 15:27:17.306298 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcf6d966-2758-453f-9308-fd452766462b" containerName="oc" Mar 13 15:27:17 crc kubenswrapper[4898]: E0313 15:27:17.306335 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b7976f2-bd4a-4ee5-983a-d1a70216276a" containerName="extract-content" Mar 13 15:27:17 crc kubenswrapper[4898]: I0313 15:27:17.306340 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b7976f2-bd4a-4ee5-983a-d1a70216276a" containerName="extract-content" Mar 13 15:27:17 crc kubenswrapper[4898]: E0313 15:27:17.306360 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b7976f2-bd4a-4ee5-983a-d1a70216276a" containerName="registry-server" Mar 13 15:27:17 crc kubenswrapper[4898]: I0313 15:27:17.306366 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b7976f2-bd4a-4ee5-983a-d1a70216276a" containerName="registry-server" 
Mar 13 15:27:17 crc kubenswrapper[4898]: I0313 15:27:17.306607 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b7976f2-bd4a-4ee5-983a-d1a70216276a" containerName="registry-server" Mar 13 15:27:17 crc kubenswrapper[4898]: I0313 15:27:17.306625 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ef69d80-7edf-459b-a521-b45bc90a18df" containerName="gather" Mar 13 15:27:17 crc kubenswrapper[4898]: I0313 15:27:17.306637 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcf6d966-2758-453f-9308-fd452766462b" containerName="oc" Mar 13 15:27:17 crc kubenswrapper[4898]: I0313 15:27:17.306648 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ef69d80-7edf-459b-a521-b45bc90a18df" containerName="copy" Mar 13 15:27:17 crc kubenswrapper[4898]: I0313 15:27:17.308306 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2tg8f" Mar 13 15:27:17 crc kubenswrapper[4898]: I0313 15:27:17.324006 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2tg8f"] Mar 13 15:27:17 crc kubenswrapper[4898]: I0313 15:27:17.450599 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7115acf9-09f0-41e8-995a-d6179b077f37-utilities\") pod \"community-operators-2tg8f\" (UID: \"7115acf9-09f0-41e8-995a-d6179b077f37\") " pod="openshift-marketplace/community-operators-2tg8f" Mar 13 15:27:17 crc kubenswrapper[4898]: I0313 15:27:17.450642 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8j8z\" (UniqueName: \"kubernetes.io/projected/7115acf9-09f0-41e8-995a-d6179b077f37-kube-api-access-x8j8z\") pod \"community-operators-2tg8f\" (UID: \"7115acf9-09f0-41e8-995a-d6179b077f37\") " pod="openshift-marketplace/community-operators-2tg8f" Mar 13 15:27:17 crc 
kubenswrapper[4898]: I0313 15:27:17.450667 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7115acf9-09f0-41e8-995a-d6179b077f37-catalog-content\") pod \"community-operators-2tg8f\" (UID: \"7115acf9-09f0-41e8-995a-d6179b077f37\") " pod="openshift-marketplace/community-operators-2tg8f" Mar 13 15:27:17 crc kubenswrapper[4898]: I0313 15:27:17.554303 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7115acf9-09f0-41e8-995a-d6179b077f37-utilities\") pod \"community-operators-2tg8f\" (UID: \"7115acf9-09f0-41e8-995a-d6179b077f37\") " pod="openshift-marketplace/community-operators-2tg8f" Mar 13 15:27:17 crc kubenswrapper[4898]: I0313 15:27:17.554350 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8j8z\" (UniqueName: \"kubernetes.io/projected/7115acf9-09f0-41e8-995a-d6179b077f37-kube-api-access-x8j8z\") pod \"community-operators-2tg8f\" (UID: \"7115acf9-09f0-41e8-995a-d6179b077f37\") " pod="openshift-marketplace/community-operators-2tg8f" Mar 13 15:27:17 crc kubenswrapper[4898]: I0313 15:27:17.554379 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7115acf9-09f0-41e8-995a-d6179b077f37-catalog-content\") pod \"community-operators-2tg8f\" (UID: \"7115acf9-09f0-41e8-995a-d6179b077f37\") " pod="openshift-marketplace/community-operators-2tg8f" Mar 13 15:27:17 crc kubenswrapper[4898]: I0313 15:27:17.554787 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7115acf9-09f0-41e8-995a-d6179b077f37-utilities\") pod \"community-operators-2tg8f\" (UID: \"7115acf9-09f0-41e8-995a-d6179b077f37\") " pod="openshift-marketplace/community-operators-2tg8f" Mar 13 15:27:17 crc 
kubenswrapper[4898]: I0313 15:27:17.554982 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7115acf9-09f0-41e8-995a-d6179b077f37-catalog-content\") pod \"community-operators-2tg8f\" (UID: \"7115acf9-09f0-41e8-995a-d6179b077f37\") " pod="openshift-marketplace/community-operators-2tg8f" Mar 13 15:27:17 crc kubenswrapper[4898]: I0313 15:27:17.579194 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8j8z\" (UniqueName: \"kubernetes.io/projected/7115acf9-09f0-41e8-995a-d6179b077f37-kube-api-access-x8j8z\") pod \"community-operators-2tg8f\" (UID: \"7115acf9-09f0-41e8-995a-d6179b077f37\") " pod="openshift-marketplace/community-operators-2tg8f" Mar 13 15:27:17 crc kubenswrapper[4898]: I0313 15:27:17.630407 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2tg8f" Mar 13 15:27:18 crc kubenswrapper[4898]: W0313 15:27:18.075083 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7115acf9_09f0_41e8_995a_d6179b077f37.slice/crio-e72ff055139d18ced04dc4a34b3ef184c3ee1412b5fe2f74149d60e4602be818 WatchSource:0}: Error finding container e72ff055139d18ced04dc4a34b3ef184c3ee1412b5fe2f74149d60e4602be818: Status 404 returned error can't find the container with id e72ff055139d18ced04dc4a34b3ef184c3ee1412b5fe2f74149d60e4602be818 Mar 13 15:27:18 crc kubenswrapper[4898]: I0313 15:27:18.087509 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2tg8f"] Mar 13 15:27:18 crc kubenswrapper[4898]: I0313 15:27:18.739225 4898 generic.go:334] "Generic (PLEG): container finished" podID="7115acf9-09f0-41e8-995a-d6179b077f37" containerID="0666265fc78810dbefb33bbd5683f8cd5f4ecd31ff868744637cab3d9d6e5c57" exitCode=0 Mar 13 15:27:18 crc kubenswrapper[4898]: I0313 15:27:18.739331 4898 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2tg8f" event={"ID":"7115acf9-09f0-41e8-995a-d6179b077f37","Type":"ContainerDied","Data":"0666265fc78810dbefb33bbd5683f8cd5f4ecd31ff868744637cab3d9d6e5c57"} Mar 13 15:27:18 crc kubenswrapper[4898]: I0313 15:27:18.739566 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2tg8f" event={"ID":"7115acf9-09f0-41e8-995a-d6179b077f37","Type":"ContainerStarted","Data":"e72ff055139d18ced04dc4a34b3ef184c3ee1412b5fe2f74149d60e4602be818"} Mar 13 15:27:18 crc kubenswrapper[4898]: I0313 15:27:18.741839 4898 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 15:27:20 crc kubenswrapper[4898]: I0313 15:27:20.765912 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2tg8f" event={"ID":"7115acf9-09f0-41e8-995a-d6179b077f37","Type":"ContainerStarted","Data":"f12cd3290c44cc1be0f9eb2d7479c249511acdb1fcd8803dde3c5b2a04e04188"} Mar 13 15:27:23 crc kubenswrapper[4898]: I0313 15:27:23.803783 4898 generic.go:334] "Generic (PLEG): container finished" podID="7115acf9-09f0-41e8-995a-d6179b077f37" containerID="f12cd3290c44cc1be0f9eb2d7479c249511acdb1fcd8803dde3c5b2a04e04188" exitCode=0 Mar 13 15:27:23 crc kubenswrapper[4898]: I0313 15:27:23.803830 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2tg8f" event={"ID":"7115acf9-09f0-41e8-995a-d6179b077f37","Type":"ContainerDied","Data":"f12cd3290c44cc1be0f9eb2d7479c249511acdb1fcd8803dde3c5b2a04e04188"} Mar 13 15:27:24 crc kubenswrapper[4898]: I0313 15:27:24.825662 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2tg8f" event={"ID":"7115acf9-09f0-41e8-995a-d6179b077f37","Type":"ContainerStarted","Data":"446a41b05d145e4e858c2243a1f4e5bd6917418930f03e0e56941e08e147a165"} Mar 13 15:27:24 crc 
kubenswrapper[4898]: I0313 15:27:24.865340 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2tg8f" podStartSLOduration=2.126632061 podStartE2EDuration="7.865317974s" podCreationTimestamp="2026-03-13 15:27:17 +0000 UTC" firstStartedPulling="2026-03-13 15:27:18.741632117 +0000 UTC m=+5473.743220356" lastFinishedPulling="2026-03-13 15:27:24.48031803 +0000 UTC m=+5479.481906269" observedRunningTime="2026-03-13 15:27:24.854892763 +0000 UTC m=+5479.856481032" watchObservedRunningTime="2026-03-13 15:27:24.865317974 +0000 UTC m=+5479.866906223" Mar 13 15:27:27 crc kubenswrapper[4898]: I0313 15:27:27.630893 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2tg8f" Mar 13 15:27:27 crc kubenswrapper[4898]: I0313 15:27:27.633315 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2tg8f" Mar 13 15:27:28 crc kubenswrapper[4898]: I0313 15:27:28.976190 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-2tg8f" podUID="7115acf9-09f0-41e8-995a-d6179b077f37" containerName="registry-server" probeResult="failure" output=< Mar 13 15:27:28 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 15:27:28 crc kubenswrapper[4898]: > Mar 13 15:27:37 crc kubenswrapper[4898]: I0313 15:27:37.707493 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2tg8f" Mar 13 15:27:37 crc kubenswrapper[4898]: I0313 15:27:37.786006 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2tg8f" Mar 13 15:27:37 crc kubenswrapper[4898]: I0313 15:27:37.971822 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2tg8f"] Mar 13 15:27:38 crc 
kubenswrapper[4898]: I0313 15:27:38.996292 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2tg8f" podUID="7115acf9-09f0-41e8-995a-d6179b077f37" containerName="registry-server" containerID="cri-o://446a41b05d145e4e858c2243a1f4e5bd6917418930f03e0e56941e08e147a165" gracePeriod=2 Mar 13 15:27:39 crc kubenswrapper[4898]: I0313 15:27:39.518420 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2tg8f" Mar 13 15:27:39 crc kubenswrapper[4898]: I0313 15:27:39.610782 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7115acf9-09f0-41e8-995a-d6179b077f37-utilities\") pod \"7115acf9-09f0-41e8-995a-d6179b077f37\" (UID: \"7115acf9-09f0-41e8-995a-d6179b077f37\") " Mar 13 15:27:39 crc kubenswrapper[4898]: I0313 15:27:39.611702 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8j8z\" (UniqueName: \"kubernetes.io/projected/7115acf9-09f0-41e8-995a-d6179b077f37-kube-api-access-x8j8z\") pod \"7115acf9-09f0-41e8-995a-d6179b077f37\" (UID: \"7115acf9-09f0-41e8-995a-d6179b077f37\") " Mar 13 15:27:39 crc kubenswrapper[4898]: I0313 15:27:39.611750 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7115acf9-09f0-41e8-995a-d6179b077f37-catalog-content\") pod \"7115acf9-09f0-41e8-995a-d6179b077f37\" (UID: \"7115acf9-09f0-41e8-995a-d6179b077f37\") " Mar 13 15:27:39 crc kubenswrapper[4898]: I0313 15:27:39.612746 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7115acf9-09f0-41e8-995a-d6179b077f37-utilities" (OuterVolumeSpecName: "utilities") pod "7115acf9-09f0-41e8-995a-d6179b077f37" (UID: "7115acf9-09f0-41e8-995a-d6179b077f37"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:27:39 crc kubenswrapper[4898]: I0313 15:27:39.619203 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7115acf9-09f0-41e8-995a-d6179b077f37-kube-api-access-x8j8z" (OuterVolumeSpecName: "kube-api-access-x8j8z") pod "7115acf9-09f0-41e8-995a-d6179b077f37" (UID: "7115acf9-09f0-41e8-995a-d6179b077f37"). InnerVolumeSpecName "kube-api-access-x8j8z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:27:39 crc kubenswrapper[4898]: I0313 15:27:39.665073 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7115acf9-09f0-41e8-995a-d6179b077f37-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7115acf9-09f0-41e8-995a-d6179b077f37" (UID: "7115acf9-09f0-41e8-995a-d6179b077f37"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:27:39 crc kubenswrapper[4898]: I0313 15:27:39.715076 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7115acf9-09f0-41e8-995a-d6179b077f37-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 15:27:39 crc kubenswrapper[4898]: I0313 15:27:39.715604 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x8j8z\" (UniqueName: \"kubernetes.io/projected/7115acf9-09f0-41e8-995a-d6179b077f37-kube-api-access-x8j8z\") on node \"crc\" DevicePath \"\"" Mar 13 15:27:39 crc kubenswrapper[4898]: I0313 15:27:39.715678 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7115acf9-09f0-41e8-995a-d6179b077f37-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 15:27:40 crc kubenswrapper[4898]: I0313 15:27:40.016104 4898 generic.go:334] "Generic (PLEG): container finished" podID="7115acf9-09f0-41e8-995a-d6179b077f37" 
containerID="446a41b05d145e4e858c2243a1f4e5bd6917418930f03e0e56941e08e147a165" exitCode=0 Mar 13 15:27:40 crc kubenswrapper[4898]: I0313 15:27:40.016199 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2tg8f" event={"ID":"7115acf9-09f0-41e8-995a-d6179b077f37","Type":"ContainerDied","Data":"446a41b05d145e4e858c2243a1f4e5bd6917418930f03e0e56941e08e147a165"} Mar 13 15:27:40 crc kubenswrapper[4898]: I0313 15:27:40.016253 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2tg8f" event={"ID":"7115acf9-09f0-41e8-995a-d6179b077f37","Type":"ContainerDied","Data":"e72ff055139d18ced04dc4a34b3ef184c3ee1412b5fe2f74149d60e4602be818"} Mar 13 15:27:40 crc kubenswrapper[4898]: I0313 15:27:40.016291 4898 scope.go:117] "RemoveContainer" containerID="446a41b05d145e4e858c2243a1f4e5bd6917418930f03e0e56941e08e147a165" Mar 13 15:27:40 crc kubenswrapper[4898]: I0313 15:27:40.016432 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2tg8f" Mar 13 15:27:40 crc kubenswrapper[4898]: I0313 15:27:40.047652 4898 scope.go:117] "RemoveContainer" containerID="f12cd3290c44cc1be0f9eb2d7479c249511acdb1fcd8803dde3c5b2a04e04188" Mar 13 15:27:40 crc kubenswrapper[4898]: I0313 15:27:40.065074 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2tg8f"] Mar 13 15:27:40 crc kubenswrapper[4898]: I0313 15:27:40.075038 4898 scope.go:117] "RemoveContainer" containerID="0666265fc78810dbefb33bbd5683f8cd5f4ecd31ff868744637cab3d9d6e5c57" Mar 13 15:27:40 crc kubenswrapper[4898]: I0313 15:27:40.087559 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2tg8f"] Mar 13 15:27:40 crc kubenswrapper[4898]: I0313 15:27:40.138094 4898 scope.go:117] "RemoveContainer" containerID="446a41b05d145e4e858c2243a1f4e5bd6917418930f03e0e56941e08e147a165" Mar 13 15:27:40 crc kubenswrapper[4898]: E0313 15:27:40.139710 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"446a41b05d145e4e858c2243a1f4e5bd6917418930f03e0e56941e08e147a165\": container with ID starting with 446a41b05d145e4e858c2243a1f4e5bd6917418930f03e0e56941e08e147a165 not found: ID does not exist" containerID="446a41b05d145e4e858c2243a1f4e5bd6917418930f03e0e56941e08e147a165" Mar 13 15:27:40 crc kubenswrapper[4898]: I0313 15:27:40.139743 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"446a41b05d145e4e858c2243a1f4e5bd6917418930f03e0e56941e08e147a165"} err="failed to get container status \"446a41b05d145e4e858c2243a1f4e5bd6917418930f03e0e56941e08e147a165\": rpc error: code = NotFound desc = could not find container \"446a41b05d145e4e858c2243a1f4e5bd6917418930f03e0e56941e08e147a165\": container with ID starting with 446a41b05d145e4e858c2243a1f4e5bd6917418930f03e0e56941e08e147a165 not 
found: ID does not exist" Mar 13 15:27:40 crc kubenswrapper[4898]: I0313 15:27:40.139783 4898 scope.go:117] "RemoveContainer" containerID="f12cd3290c44cc1be0f9eb2d7479c249511acdb1fcd8803dde3c5b2a04e04188" Mar 13 15:27:40 crc kubenswrapper[4898]: E0313 15:27:40.140444 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f12cd3290c44cc1be0f9eb2d7479c249511acdb1fcd8803dde3c5b2a04e04188\": container with ID starting with f12cd3290c44cc1be0f9eb2d7479c249511acdb1fcd8803dde3c5b2a04e04188 not found: ID does not exist" containerID="f12cd3290c44cc1be0f9eb2d7479c249511acdb1fcd8803dde3c5b2a04e04188" Mar 13 15:27:40 crc kubenswrapper[4898]: I0313 15:27:40.140505 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f12cd3290c44cc1be0f9eb2d7479c249511acdb1fcd8803dde3c5b2a04e04188"} err="failed to get container status \"f12cd3290c44cc1be0f9eb2d7479c249511acdb1fcd8803dde3c5b2a04e04188\": rpc error: code = NotFound desc = could not find container \"f12cd3290c44cc1be0f9eb2d7479c249511acdb1fcd8803dde3c5b2a04e04188\": container with ID starting with f12cd3290c44cc1be0f9eb2d7479c249511acdb1fcd8803dde3c5b2a04e04188 not found: ID does not exist" Mar 13 15:27:40 crc kubenswrapper[4898]: I0313 15:27:40.140540 4898 scope.go:117] "RemoveContainer" containerID="0666265fc78810dbefb33bbd5683f8cd5f4ecd31ff868744637cab3d9d6e5c57" Mar 13 15:27:40 crc kubenswrapper[4898]: E0313 15:27:40.140877 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0666265fc78810dbefb33bbd5683f8cd5f4ecd31ff868744637cab3d9d6e5c57\": container with ID starting with 0666265fc78810dbefb33bbd5683f8cd5f4ecd31ff868744637cab3d9d6e5c57 not found: ID does not exist" containerID="0666265fc78810dbefb33bbd5683f8cd5f4ecd31ff868744637cab3d9d6e5c57" Mar 13 15:27:40 crc kubenswrapper[4898]: I0313 15:27:40.140933 4898 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0666265fc78810dbefb33bbd5683f8cd5f4ecd31ff868744637cab3d9d6e5c57"} err="failed to get container status \"0666265fc78810dbefb33bbd5683f8cd5f4ecd31ff868744637cab3d9d6e5c57\": rpc error: code = NotFound desc = could not find container \"0666265fc78810dbefb33bbd5683f8cd5f4ecd31ff868744637cab3d9d6e5c57\": container with ID starting with 0666265fc78810dbefb33bbd5683f8cd5f4ecd31ff868744637cab3d9d6e5c57 not found: ID does not exist" Mar 13 15:27:41 crc kubenswrapper[4898]: I0313 15:27:41.755078 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7115acf9-09f0-41e8-995a-d6179b077f37" path="/var/lib/kubelet/pods/7115acf9-09f0-41e8-995a-d6179b077f37/volumes" Mar 13 15:27:49 crc kubenswrapper[4898]: I0313 15:27:49.134978 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 15:27:49 crc kubenswrapper[4898]: I0313 15:27:49.135499 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 15:27:57 crc kubenswrapper[4898]: I0313 15:27:57.228792 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dbz9v"] Mar 13 15:27:57 crc kubenswrapper[4898]: E0313 15:27:57.229745 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7115acf9-09f0-41e8-995a-d6179b077f37" containerName="registry-server" Mar 13 15:27:57 crc kubenswrapper[4898]: I0313 15:27:57.229757 4898 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="7115acf9-09f0-41e8-995a-d6179b077f37" containerName="registry-server" Mar 13 15:27:57 crc kubenswrapper[4898]: E0313 15:27:57.229792 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7115acf9-09f0-41e8-995a-d6179b077f37" containerName="extract-content" Mar 13 15:27:57 crc kubenswrapper[4898]: I0313 15:27:57.229798 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="7115acf9-09f0-41e8-995a-d6179b077f37" containerName="extract-content" Mar 13 15:27:57 crc kubenswrapper[4898]: E0313 15:27:57.229813 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7115acf9-09f0-41e8-995a-d6179b077f37" containerName="extract-utilities" Mar 13 15:27:57 crc kubenswrapper[4898]: I0313 15:27:57.229819 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="7115acf9-09f0-41e8-995a-d6179b077f37" containerName="extract-utilities" Mar 13 15:27:57 crc kubenswrapper[4898]: I0313 15:27:57.230077 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="7115acf9-09f0-41e8-995a-d6179b077f37" containerName="registry-server" Mar 13 15:27:57 crc kubenswrapper[4898]: I0313 15:27:57.231688 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dbz9v" Mar 13 15:27:57 crc kubenswrapper[4898]: I0313 15:27:57.248152 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dbz9v"] Mar 13 15:27:57 crc kubenswrapper[4898]: I0313 15:27:57.365831 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb8014b4-2211-4ce2-93ec-3a496a563a8c-catalog-content\") pod \"redhat-operators-dbz9v\" (UID: \"eb8014b4-2211-4ce2-93ec-3a496a563a8c\") " pod="openshift-marketplace/redhat-operators-dbz9v" Mar 13 15:27:57 crc kubenswrapper[4898]: I0313 15:27:57.366290 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7trwd\" (UniqueName: \"kubernetes.io/projected/eb8014b4-2211-4ce2-93ec-3a496a563a8c-kube-api-access-7trwd\") pod \"redhat-operators-dbz9v\" (UID: \"eb8014b4-2211-4ce2-93ec-3a496a563a8c\") " pod="openshift-marketplace/redhat-operators-dbz9v" Mar 13 15:27:57 crc kubenswrapper[4898]: I0313 15:27:57.366419 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb8014b4-2211-4ce2-93ec-3a496a563a8c-utilities\") pod \"redhat-operators-dbz9v\" (UID: \"eb8014b4-2211-4ce2-93ec-3a496a563a8c\") " pod="openshift-marketplace/redhat-operators-dbz9v" Mar 13 15:27:57 crc kubenswrapper[4898]: I0313 15:27:57.468574 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7trwd\" (UniqueName: \"kubernetes.io/projected/eb8014b4-2211-4ce2-93ec-3a496a563a8c-kube-api-access-7trwd\") pod \"redhat-operators-dbz9v\" (UID: \"eb8014b4-2211-4ce2-93ec-3a496a563a8c\") " pod="openshift-marketplace/redhat-operators-dbz9v" Mar 13 15:27:57 crc kubenswrapper[4898]: I0313 15:27:57.469015 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb8014b4-2211-4ce2-93ec-3a496a563a8c-utilities\") pod \"redhat-operators-dbz9v\" (UID: \"eb8014b4-2211-4ce2-93ec-3a496a563a8c\") " pod="openshift-marketplace/redhat-operators-dbz9v" Mar 13 15:27:57 crc kubenswrapper[4898]: I0313 15:27:57.469185 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb8014b4-2211-4ce2-93ec-3a496a563a8c-catalog-content\") pod \"redhat-operators-dbz9v\" (UID: \"eb8014b4-2211-4ce2-93ec-3a496a563a8c\") " pod="openshift-marketplace/redhat-operators-dbz9v" Mar 13 15:27:57 crc kubenswrapper[4898]: I0313 15:27:57.469619 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb8014b4-2211-4ce2-93ec-3a496a563a8c-utilities\") pod \"redhat-operators-dbz9v\" (UID: \"eb8014b4-2211-4ce2-93ec-3a496a563a8c\") " pod="openshift-marketplace/redhat-operators-dbz9v" Mar 13 15:27:57 crc kubenswrapper[4898]: I0313 15:27:57.469626 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb8014b4-2211-4ce2-93ec-3a496a563a8c-catalog-content\") pod \"redhat-operators-dbz9v\" (UID: \"eb8014b4-2211-4ce2-93ec-3a496a563a8c\") " pod="openshift-marketplace/redhat-operators-dbz9v" Mar 13 15:27:57 crc kubenswrapper[4898]: I0313 15:27:57.487279 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7trwd\" (UniqueName: \"kubernetes.io/projected/eb8014b4-2211-4ce2-93ec-3a496a563a8c-kube-api-access-7trwd\") pod \"redhat-operators-dbz9v\" (UID: \"eb8014b4-2211-4ce2-93ec-3a496a563a8c\") " pod="openshift-marketplace/redhat-operators-dbz9v" Mar 13 15:27:57 crc kubenswrapper[4898]: I0313 15:27:57.564516 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dbz9v" Mar 13 15:27:58 crc kubenswrapper[4898]: I0313 15:27:58.077129 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dbz9v"] Mar 13 15:27:58 crc kubenswrapper[4898]: I0313 15:27:58.228510 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dbz9v" event={"ID":"eb8014b4-2211-4ce2-93ec-3a496a563a8c","Type":"ContainerStarted","Data":"dfcf7ad7032f43188bd7aba9617ccc0074fb1fd0e3010930b63236240ebde712"} Mar 13 15:27:59 crc kubenswrapper[4898]: I0313 15:27:59.240847 4898 generic.go:334] "Generic (PLEG): container finished" podID="eb8014b4-2211-4ce2-93ec-3a496a563a8c" containerID="b1bdcb936ed1f82a907dadb3f319631253f8ad7fb9f25e178be06f7c87b0893e" exitCode=0 Mar 13 15:27:59 crc kubenswrapper[4898]: I0313 15:27:59.241004 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dbz9v" event={"ID":"eb8014b4-2211-4ce2-93ec-3a496a563a8c","Type":"ContainerDied","Data":"b1bdcb936ed1f82a907dadb3f319631253f8ad7fb9f25e178be06f7c87b0893e"} Mar 13 15:28:00 crc kubenswrapper[4898]: I0313 15:28:00.146345 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556928-wqxh8"] Mar 13 15:28:00 crc kubenswrapper[4898]: I0313 15:28:00.148524 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556928-wqxh8" Mar 13 15:28:00 crc kubenswrapper[4898]: I0313 15:28:00.151301 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 15:28:00 crc kubenswrapper[4898]: I0313 15:28:00.152226 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 15:28:00 crc kubenswrapper[4898]: I0313 15:28:00.154971 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps" Mar 13 15:28:00 crc kubenswrapper[4898]: I0313 15:28:00.167448 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556928-wqxh8"] Mar 13 15:28:00 crc kubenswrapper[4898]: I0313 15:28:00.239501 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sj97x\" (UniqueName: \"kubernetes.io/projected/9ff3a636-04a3-4ec5-8cd2-da3adf44d084-kube-api-access-sj97x\") pod \"auto-csr-approver-29556928-wqxh8\" (UID: \"9ff3a636-04a3-4ec5-8cd2-da3adf44d084\") " pod="openshift-infra/auto-csr-approver-29556928-wqxh8" Mar 13 15:28:00 crc kubenswrapper[4898]: I0313 15:28:00.341535 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sj97x\" (UniqueName: \"kubernetes.io/projected/9ff3a636-04a3-4ec5-8cd2-da3adf44d084-kube-api-access-sj97x\") pod \"auto-csr-approver-29556928-wqxh8\" (UID: \"9ff3a636-04a3-4ec5-8cd2-da3adf44d084\") " pod="openshift-infra/auto-csr-approver-29556928-wqxh8" Mar 13 15:28:00 crc kubenswrapper[4898]: I0313 15:28:00.365727 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sj97x\" (UniqueName: \"kubernetes.io/projected/9ff3a636-04a3-4ec5-8cd2-da3adf44d084-kube-api-access-sj97x\") pod \"auto-csr-approver-29556928-wqxh8\" (UID: \"9ff3a636-04a3-4ec5-8cd2-da3adf44d084\") " 
pod="openshift-infra/auto-csr-approver-29556928-wqxh8" Mar 13 15:28:00 crc kubenswrapper[4898]: I0313 15:28:00.477176 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556928-wqxh8" Mar 13 15:28:01 crc kubenswrapper[4898]: I0313 15:28:01.028278 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556928-wqxh8"] Mar 13 15:28:01 crc kubenswrapper[4898]: W0313 15:28:01.030450 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ff3a636_04a3_4ec5_8cd2_da3adf44d084.slice/crio-08e435b1c49eb97b8b6176e1d1c11c3f78d0f623f2acc8d03a798cc850afbfef WatchSource:0}: Error finding container 08e435b1c49eb97b8b6176e1d1c11c3f78d0f623f2acc8d03a798cc850afbfef: Status 404 returned error can't find the container with id 08e435b1c49eb97b8b6176e1d1c11c3f78d0f623f2acc8d03a798cc850afbfef Mar 13 15:28:01 crc kubenswrapper[4898]: I0313 15:28:01.266024 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556928-wqxh8" event={"ID":"9ff3a636-04a3-4ec5-8cd2-da3adf44d084","Type":"ContainerStarted","Data":"08e435b1c49eb97b8b6176e1d1c11c3f78d0f623f2acc8d03a798cc850afbfef"} Mar 13 15:28:01 crc kubenswrapper[4898]: I0313 15:28:01.268552 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dbz9v" event={"ID":"eb8014b4-2211-4ce2-93ec-3a496a563a8c","Type":"ContainerStarted","Data":"b15545ebc323988813650859c1417c533406bfededaf18c21507a8ec9723c2af"} Mar 13 15:28:03 crc kubenswrapper[4898]: I0313 15:28:03.295974 4898 generic.go:334] "Generic (PLEG): container finished" podID="9ff3a636-04a3-4ec5-8cd2-da3adf44d084" containerID="64afab621e156319ee19f6016510705ea0ece1835f0875cb7b018c10679eac40" exitCode=0 Mar 13 15:28:03 crc kubenswrapper[4898]: I0313 15:28:03.296379 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29556928-wqxh8" event={"ID":"9ff3a636-04a3-4ec5-8cd2-da3adf44d084","Type":"ContainerDied","Data":"64afab621e156319ee19f6016510705ea0ece1835f0875cb7b018c10679eac40"} Mar 13 15:28:04 crc kubenswrapper[4898]: I0313 15:28:04.868012 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556928-wqxh8" Mar 13 15:28:04 crc kubenswrapper[4898]: I0313 15:28:04.971768 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sj97x\" (UniqueName: \"kubernetes.io/projected/9ff3a636-04a3-4ec5-8cd2-da3adf44d084-kube-api-access-sj97x\") pod \"9ff3a636-04a3-4ec5-8cd2-da3adf44d084\" (UID: \"9ff3a636-04a3-4ec5-8cd2-da3adf44d084\") " Mar 13 15:28:04 crc kubenswrapper[4898]: I0313 15:28:04.980295 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ff3a636-04a3-4ec5-8cd2-da3adf44d084-kube-api-access-sj97x" (OuterVolumeSpecName: "kube-api-access-sj97x") pod "9ff3a636-04a3-4ec5-8cd2-da3adf44d084" (UID: "9ff3a636-04a3-4ec5-8cd2-da3adf44d084"). InnerVolumeSpecName "kube-api-access-sj97x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:28:05 crc kubenswrapper[4898]: I0313 15:28:05.075569 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sj97x\" (UniqueName: \"kubernetes.io/projected/9ff3a636-04a3-4ec5-8cd2-da3adf44d084-kube-api-access-sj97x\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:05 crc kubenswrapper[4898]: I0313 15:28:05.335132 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556928-wqxh8" event={"ID":"9ff3a636-04a3-4ec5-8cd2-da3adf44d084","Type":"ContainerDied","Data":"08e435b1c49eb97b8b6176e1d1c11c3f78d0f623f2acc8d03a798cc850afbfef"} Mar 13 15:28:05 crc kubenswrapper[4898]: I0313 15:28:05.335170 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="08e435b1c49eb97b8b6176e1d1c11c3f78d0f623f2acc8d03a798cc850afbfef" Mar 13 15:28:05 crc kubenswrapper[4898]: I0313 15:28:05.335205 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556928-wqxh8" Mar 13 15:28:05 crc kubenswrapper[4898]: I0313 15:28:05.943165 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556922-p8rbd"] Mar 13 15:28:05 crc kubenswrapper[4898]: I0313 15:28:05.954600 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556922-p8rbd"] Mar 13 15:28:07 crc kubenswrapper[4898]: I0313 15:28:07.751546 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af39c392-cb6d-4afc-837c-9cbf245a9856" path="/var/lib/kubelet/pods/af39c392-cb6d-4afc-837c-9cbf245a9856/volumes" Mar 13 15:28:08 crc kubenswrapper[4898]: I0313 15:28:08.375576 4898 generic.go:334] "Generic (PLEG): container finished" podID="eb8014b4-2211-4ce2-93ec-3a496a563a8c" containerID="b15545ebc323988813650859c1417c533406bfededaf18c21507a8ec9723c2af" exitCode=0 Mar 13 15:28:08 crc kubenswrapper[4898]: I0313 15:28:08.375660 4898 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dbz9v" event={"ID":"eb8014b4-2211-4ce2-93ec-3a496a563a8c","Type":"ContainerDied","Data":"b15545ebc323988813650859c1417c533406bfededaf18c21507a8ec9723c2af"} Mar 13 15:28:10 crc kubenswrapper[4898]: I0313 15:28:10.399518 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dbz9v" event={"ID":"eb8014b4-2211-4ce2-93ec-3a496a563a8c","Type":"ContainerStarted","Data":"501957783fc2b9643d6e59dc7c7a2674654e9caa4d3ce18aa694243b54940186"} Mar 13 15:28:10 crc kubenswrapper[4898]: I0313 15:28:10.428943 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dbz9v" podStartSLOduration=3.545501042 podStartE2EDuration="13.428911525s" podCreationTimestamp="2026-03-13 15:27:57 +0000 UTC" firstStartedPulling="2026-03-13 15:27:59.243574369 +0000 UTC m=+5514.245162608" lastFinishedPulling="2026-03-13 15:28:09.126984852 +0000 UTC m=+5524.128573091" observedRunningTime="2026-03-13 15:28:10.418841563 +0000 UTC m=+5525.420429802" watchObservedRunningTime="2026-03-13 15:28:10.428911525 +0000 UTC m=+5525.430499764" Mar 13 15:28:17 crc kubenswrapper[4898]: I0313 15:28:17.565521 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dbz9v" Mar 13 15:28:17 crc kubenswrapper[4898]: I0313 15:28:17.566161 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dbz9v" Mar 13 15:28:18 crc kubenswrapper[4898]: I0313 15:28:18.661969 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dbz9v" podUID="eb8014b4-2211-4ce2-93ec-3a496a563a8c" containerName="registry-server" probeResult="failure" output=< Mar 13 15:28:18 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 15:28:18 crc kubenswrapper[4898]: > Mar 13 
15:28:19 crc kubenswrapper[4898]: I0313 15:28:19.134237 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 15:28:19 crc kubenswrapper[4898]: I0313 15:28:19.134722 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 15:28:28 crc kubenswrapper[4898]: I0313 15:28:28.627033 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dbz9v" podUID="eb8014b4-2211-4ce2-93ec-3a496a563a8c" containerName="registry-server" probeResult="failure" output=< Mar 13 15:28:28 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 15:28:28 crc kubenswrapper[4898]: > Mar 13 15:28:38 crc kubenswrapper[4898]: I0313 15:28:38.624822 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dbz9v" podUID="eb8014b4-2211-4ce2-93ec-3a496a563a8c" containerName="registry-server" probeResult="failure" output=< Mar 13 15:28:38 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 15:28:38 crc kubenswrapper[4898]: > Mar 13 15:28:47 crc kubenswrapper[4898]: I0313 15:28:47.636050 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dbz9v" Mar 13 15:28:47 crc kubenswrapper[4898]: I0313 15:28:47.699004 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dbz9v" Mar 13 15:28:49 crc kubenswrapper[4898]: I0313 
15:28:49.134381 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 15:28:49 crc kubenswrapper[4898]: I0313 15:28:49.134687 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 15:28:49 crc kubenswrapper[4898]: I0313 15:28:49.134741 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" Mar 13 15:28:49 crc kubenswrapper[4898]: I0313 15:28:49.135725 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3f1eb02ceee77301060a512e8cdb70aa1cab3b74525898dd1df91250d09e1006"} pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 15:28:49 crc kubenswrapper[4898]: I0313 15:28:49.135795 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" containerID="cri-o://3f1eb02ceee77301060a512e8cdb70aa1cab3b74525898dd1df91250d09e1006" gracePeriod=600 Mar 13 15:28:49 crc kubenswrapper[4898]: I0313 15:28:49.695834 4898 scope.go:117] "RemoveContainer" containerID="038b2fc06591bca90016e0206f4da45f7623dddf7ed529280975231bc7adf587" Mar 13 15:28:49 crc kubenswrapper[4898]: I0313 15:28:49.858891 4898 generic.go:334] 
"Generic (PLEG): container finished" podID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerID="3f1eb02ceee77301060a512e8cdb70aa1cab3b74525898dd1df91250d09e1006" exitCode=0 Mar 13 15:28:49 crc kubenswrapper[4898]: I0313 15:28:49.858931 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerDied","Data":"3f1eb02ceee77301060a512e8cdb70aa1cab3b74525898dd1df91250d09e1006"} Mar 13 15:28:49 crc kubenswrapper[4898]: I0313 15:28:49.858984 4898 scope.go:117] "RemoveContainer" containerID="e40bd227e881ba459a425e50fe1c2e2837377f3121675fc9f55de4fe34577668" Mar 13 15:28:50 crc kubenswrapper[4898]: I0313 15:28:50.872786 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerStarted","Data":"cee6ecdff3780e9dd70ed8f417a9b0b2ef52de112ea7b7ca374180dcff44bba7"} Mar 13 15:28:52 crc kubenswrapper[4898]: I0313 15:28:52.661761 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dbz9v"] Mar 13 15:28:52 crc kubenswrapper[4898]: I0313 15:28:52.663434 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dbz9v" podUID="eb8014b4-2211-4ce2-93ec-3a496a563a8c" containerName="registry-server" containerID="cri-o://501957783fc2b9643d6e59dc7c7a2674654e9caa4d3ce18aa694243b54940186" gracePeriod=2 Mar 13 15:28:53 crc kubenswrapper[4898]: I0313 15:28:53.909834 4898 generic.go:334] "Generic (PLEG): container finished" podID="eb8014b4-2211-4ce2-93ec-3a496a563a8c" containerID="501957783fc2b9643d6e59dc7c7a2674654e9caa4d3ce18aa694243b54940186" exitCode=0 Mar 13 15:28:53 crc kubenswrapper[4898]: I0313 15:28:53.909974 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dbz9v" 
event={"ID":"eb8014b4-2211-4ce2-93ec-3a496a563a8c","Type":"ContainerDied","Data":"501957783fc2b9643d6e59dc7c7a2674654e9caa4d3ce18aa694243b54940186"} Mar 13 15:28:54 crc kubenswrapper[4898]: I0313 15:28:54.766210 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dbz9v" Mar 13 15:28:54 crc kubenswrapper[4898]: I0313 15:28:54.917870 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb8014b4-2211-4ce2-93ec-3a496a563a8c-utilities\") pod \"eb8014b4-2211-4ce2-93ec-3a496a563a8c\" (UID: \"eb8014b4-2211-4ce2-93ec-3a496a563a8c\") " Mar 13 15:28:54 crc kubenswrapper[4898]: I0313 15:28:54.918695 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb8014b4-2211-4ce2-93ec-3a496a563a8c-utilities" (OuterVolumeSpecName: "utilities") pod "eb8014b4-2211-4ce2-93ec-3a496a563a8c" (UID: "eb8014b4-2211-4ce2-93ec-3a496a563a8c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:28:54 crc kubenswrapper[4898]: I0313 15:28:54.919046 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb8014b4-2211-4ce2-93ec-3a496a563a8c-catalog-content\") pod \"eb8014b4-2211-4ce2-93ec-3a496a563a8c\" (UID: \"eb8014b4-2211-4ce2-93ec-3a496a563a8c\") " Mar 13 15:28:54 crc kubenswrapper[4898]: I0313 15:28:54.920136 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7trwd\" (UniqueName: \"kubernetes.io/projected/eb8014b4-2211-4ce2-93ec-3a496a563a8c-kube-api-access-7trwd\") pod \"eb8014b4-2211-4ce2-93ec-3a496a563a8c\" (UID: \"eb8014b4-2211-4ce2-93ec-3a496a563a8c\") " Mar 13 15:28:54 crc kubenswrapper[4898]: I0313 15:28:54.921723 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb8014b4-2211-4ce2-93ec-3a496a563a8c-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:54 crc kubenswrapper[4898]: I0313 15:28:54.923951 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dbz9v" event={"ID":"eb8014b4-2211-4ce2-93ec-3a496a563a8c","Type":"ContainerDied","Data":"dfcf7ad7032f43188bd7aba9617ccc0074fb1fd0e3010930b63236240ebde712"} Mar 13 15:28:54 crc kubenswrapper[4898]: I0313 15:28:54.924003 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dbz9v" Mar 13 15:28:54 crc kubenswrapper[4898]: I0313 15:28:54.924066 4898 scope.go:117] "RemoveContainer" containerID="501957783fc2b9643d6e59dc7c7a2674654e9caa4d3ce18aa694243b54940186" Mar 13 15:28:54 crc kubenswrapper[4898]: I0313 15:28:54.936446 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb8014b4-2211-4ce2-93ec-3a496a563a8c-kube-api-access-7trwd" (OuterVolumeSpecName: "kube-api-access-7trwd") pod "eb8014b4-2211-4ce2-93ec-3a496a563a8c" (UID: "eb8014b4-2211-4ce2-93ec-3a496a563a8c"). InnerVolumeSpecName "kube-api-access-7trwd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:28:54 crc kubenswrapper[4898]: I0313 15:28:54.946618 4898 scope.go:117] "RemoveContainer" containerID="b15545ebc323988813650859c1417c533406bfededaf18c21507a8ec9723c2af" Mar 13 15:28:55 crc kubenswrapper[4898]: I0313 15:28:55.003446 4898 scope.go:117] "RemoveContainer" containerID="b1bdcb936ed1f82a907dadb3f319631253f8ad7fb9f25e178be06f7c87b0893e" Mar 13 15:28:55 crc kubenswrapper[4898]: I0313 15:28:55.025279 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7trwd\" (UniqueName: \"kubernetes.io/projected/eb8014b4-2211-4ce2-93ec-3a496a563a8c-kube-api-access-7trwd\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:55 crc kubenswrapper[4898]: I0313 15:28:55.068362 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb8014b4-2211-4ce2-93ec-3a496a563a8c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eb8014b4-2211-4ce2-93ec-3a496a563a8c" (UID: "eb8014b4-2211-4ce2-93ec-3a496a563a8c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:28:55 crc kubenswrapper[4898]: I0313 15:28:55.128459 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb8014b4-2211-4ce2-93ec-3a496a563a8c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:55 crc kubenswrapper[4898]: I0313 15:28:55.270429 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dbz9v"] Mar 13 15:28:55 crc kubenswrapper[4898]: I0313 15:28:55.283580 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dbz9v"] Mar 13 15:28:55 crc kubenswrapper[4898]: I0313 15:28:55.753158 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb8014b4-2211-4ce2-93ec-3a496a563a8c" path="/var/lib/kubelet/pods/eb8014b4-2211-4ce2-93ec-3a496a563a8c/volumes" Mar 13 15:30:00 crc kubenswrapper[4898]: I0313 15:30:00.151869 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556930-f8ms7"] Mar 13 15:30:00 crc kubenswrapper[4898]: E0313 15:30:00.153076 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb8014b4-2211-4ce2-93ec-3a496a563a8c" containerName="extract-content" Mar 13 15:30:00 crc kubenswrapper[4898]: I0313 15:30:00.153095 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb8014b4-2211-4ce2-93ec-3a496a563a8c" containerName="extract-content" Mar 13 15:30:00 crc kubenswrapper[4898]: E0313 15:30:00.153128 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb8014b4-2211-4ce2-93ec-3a496a563a8c" containerName="registry-server" Mar 13 15:30:00 crc kubenswrapper[4898]: I0313 15:30:00.153137 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb8014b4-2211-4ce2-93ec-3a496a563a8c" containerName="registry-server" Mar 13 15:30:00 crc kubenswrapper[4898]: E0313 15:30:00.153165 4898 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="9ff3a636-04a3-4ec5-8cd2-da3adf44d084" containerName="oc" Mar 13 15:30:00 crc kubenswrapper[4898]: I0313 15:30:00.153174 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ff3a636-04a3-4ec5-8cd2-da3adf44d084" containerName="oc" Mar 13 15:30:00 crc kubenswrapper[4898]: E0313 15:30:00.153190 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb8014b4-2211-4ce2-93ec-3a496a563a8c" containerName="extract-utilities" Mar 13 15:30:00 crc kubenswrapper[4898]: I0313 15:30:00.153198 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb8014b4-2211-4ce2-93ec-3a496a563a8c" containerName="extract-utilities" Mar 13 15:30:00 crc kubenswrapper[4898]: I0313 15:30:00.153575 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb8014b4-2211-4ce2-93ec-3a496a563a8c" containerName="registry-server" Mar 13 15:30:00 crc kubenswrapper[4898]: I0313 15:30:00.153632 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ff3a636-04a3-4ec5-8cd2-da3adf44d084" containerName="oc" Mar 13 15:30:00 crc kubenswrapper[4898]: I0313 15:30:00.157279 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556930-f8ms7"
Mar 13 15:30:00 crc kubenswrapper[4898]: I0313 15:30:00.159910 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 13 15:30:00 crc kubenswrapper[4898]: I0313 15:30:00.160060 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 13 15:30:00 crc kubenswrapper[4898]: I0313 15:30:00.160199 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps"
Mar 13 15:30:00 crc kubenswrapper[4898]: I0313 15:30:00.167659 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556930-z4l5f"]
Mar 13 15:30:00 crc kubenswrapper[4898]: I0313 15:30:00.170495 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556930-z4l5f"
Mar 13 15:30:00 crc kubenswrapper[4898]: I0313 15:30:00.174363 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 13 15:30:00 crc kubenswrapper[4898]: I0313 15:30:00.174537 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 13 15:30:00 crc kubenswrapper[4898]: I0313 15:30:00.182157 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556930-f8ms7"]
Mar 13 15:30:00 crc kubenswrapper[4898]: I0313 15:30:00.190067 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556930-z4l5f"]
Mar 13 15:30:00 crc kubenswrapper[4898]: I0313 15:30:00.249699 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqq69\" (UniqueName: \"kubernetes.io/projected/98cb3d53-de77-4344-8045-41653ba912a9-kube-api-access-bqq69\") pod \"auto-csr-approver-29556930-f8ms7\" (UID: \"98cb3d53-de77-4344-8045-41653ba912a9\") " pod="openshift-infra/auto-csr-approver-29556930-f8ms7"
Mar 13 15:30:00 crc kubenswrapper[4898]: I0313 15:30:00.351386 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqq69\" (UniqueName: \"kubernetes.io/projected/98cb3d53-de77-4344-8045-41653ba912a9-kube-api-access-bqq69\") pod \"auto-csr-approver-29556930-f8ms7\" (UID: \"98cb3d53-de77-4344-8045-41653ba912a9\") " pod="openshift-infra/auto-csr-approver-29556930-f8ms7"
Mar 13 15:30:00 crc kubenswrapper[4898]: I0313 15:30:00.351517 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/11e4bb07-0526-42fe-80c5-6bed7db79d16-secret-volume\") pod \"collect-profiles-29556930-z4l5f\" (UID: \"11e4bb07-0526-42fe-80c5-6bed7db79d16\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556930-z4l5f"
Mar 13 15:30:00 crc kubenswrapper[4898]: I0313 15:30:00.351567 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwckx\" (UniqueName: \"kubernetes.io/projected/11e4bb07-0526-42fe-80c5-6bed7db79d16-kube-api-access-qwckx\") pod \"collect-profiles-29556930-z4l5f\" (UID: \"11e4bb07-0526-42fe-80c5-6bed7db79d16\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556930-z4l5f"
Mar 13 15:30:00 crc kubenswrapper[4898]: I0313 15:30:00.351691 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/11e4bb07-0526-42fe-80c5-6bed7db79d16-config-volume\") pod \"collect-profiles-29556930-z4l5f\" (UID: \"11e4bb07-0526-42fe-80c5-6bed7db79d16\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556930-z4l5f"
Mar 13 15:30:00 crc kubenswrapper[4898]: I0313 15:30:00.377015 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqq69\" (UniqueName: \"kubernetes.io/projected/98cb3d53-de77-4344-8045-41653ba912a9-kube-api-access-bqq69\") pod \"auto-csr-approver-29556930-f8ms7\" (UID: \"98cb3d53-de77-4344-8045-41653ba912a9\") " pod="openshift-infra/auto-csr-approver-29556930-f8ms7"
Mar 13 15:30:00 crc kubenswrapper[4898]: I0313 15:30:00.454417 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/11e4bb07-0526-42fe-80c5-6bed7db79d16-config-volume\") pod \"collect-profiles-29556930-z4l5f\" (UID: \"11e4bb07-0526-42fe-80c5-6bed7db79d16\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556930-z4l5f"
Mar 13 15:30:00 crc kubenswrapper[4898]: I0313 15:30:00.454655 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/11e4bb07-0526-42fe-80c5-6bed7db79d16-secret-volume\") pod \"collect-profiles-29556930-z4l5f\" (UID: \"11e4bb07-0526-42fe-80c5-6bed7db79d16\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556930-z4l5f"
Mar 13 15:30:00 crc kubenswrapper[4898]: I0313 15:30:00.454720 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwckx\" (UniqueName: \"kubernetes.io/projected/11e4bb07-0526-42fe-80c5-6bed7db79d16-kube-api-access-qwckx\") pod \"collect-profiles-29556930-z4l5f\" (UID: \"11e4bb07-0526-42fe-80c5-6bed7db79d16\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556930-z4l5f"
Mar 13 15:30:00 crc kubenswrapper[4898]: I0313 15:30:00.455246 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/11e4bb07-0526-42fe-80c5-6bed7db79d16-config-volume\") pod \"collect-profiles-29556930-z4l5f\" (UID: \"11e4bb07-0526-42fe-80c5-6bed7db79d16\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556930-z4l5f"
Mar 13 15:30:00 crc kubenswrapper[4898]: I0313 15:30:00.459961 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/11e4bb07-0526-42fe-80c5-6bed7db79d16-secret-volume\") pod \"collect-profiles-29556930-z4l5f\" (UID: \"11e4bb07-0526-42fe-80c5-6bed7db79d16\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556930-z4l5f"
Mar 13 15:30:00 crc kubenswrapper[4898]: I0313 15:30:00.470317 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwckx\" (UniqueName: \"kubernetes.io/projected/11e4bb07-0526-42fe-80c5-6bed7db79d16-kube-api-access-qwckx\") pod \"collect-profiles-29556930-z4l5f\" (UID: \"11e4bb07-0526-42fe-80c5-6bed7db79d16\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556930-z4l5f"
Mar 13 15:30:00 crc kubenswrapper[4898]: I0313 15:30:00.484266 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556930-f8ms7"
Mar 13 15:30:00 crc kubenswrapper[4898]: I0313 15:30:00.495420 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556930-z4l5f"
Mar 13 15:30:01 crc kubenswrapper[4898]: I0313 15:30:01.014244 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556930-f8ms7"]
Mar 13 15:30:01 crc kubenswrapper[4898]: W0313 15:30:01.014806 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11e4bb07_0526_42fe_80c5_6bed7db79d16.slice/crio-85621c637bdcd5e55220cee87918a755a34817d2327b4fcbb1244b7a5704b341 WatchSource:0}: Error finding container 85621c637bdcd5e55220cee87918a755a34817d2327b4fcbb1244b7a5704b341: Status 404 returned error can't find the container with id 85621c637bdcd5e55220cee87918a755a34817d2327b4fcbb1244b7a5704b341
Mar 13 15:30:01 crc kubenswrapper[4898]: I0313 15:30:01.028505 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556930-z4l5f"]
Mar 13 15:30:01 crc kubenswrapper[4898]: I0313 15:30:01.786244 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556930-z4l5f" event={"ID":"11e4bb07-0526-42fe-80c5-6bed7db79d16","Type":"ContainerStarted","Data":"a25345c405dfefa76e519a8784a708d6344c4490388c27312a458e7e659dc8c9"}
Mar 13 15:30:01 crc kubenswrapper[4898]: I0313 15:30:01.786525 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556930-z4l5f" event={"ID":"11e4bb07-0526-42fe-80c5-6bed7db79d16","Type":"ContainerStarted","Data":"85621c637bdcd5e55220cee87918a755a34817d2327b4fcbb1244b7a5704b341"}
Mar 13 15:30:01 crc kubenswrapper[4898]: I0313 15:30:01.787987 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556930-f8ms7" event={"ID":"98cb3d53-de77-4344-8045-41653ba912a9","Type":"ContainerStarted","Data":"cd8ee8aac0748f9e11c55221957bf5b61afe83bbf5d7802c2716a842c4b3f9e7"}
Mar 13 15:30:01 crc kubenswrapper[4898]: I0313 15:30:01.816139 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29556930-z4l5f" podStartSLOduration=1.816117204 podStartE2EDuration="1.816117204s" podCreationTimestamp="2026-03-13 15:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:30:01.806564856 +0000 UTC m=+5636.808153095" watchObservedRunningTime="2026-03-13 15:30:01.816117204 +0000 UTC m=+5636.817705463"
Mar 13 15:30:02 crc kubenswrapper[4898]: I0313 15:30:02.802041 4898 generic.go:334] "Generic (PLEG): container finished" podID="11e4bb07-0526-42fe-80c5-6bed7db79d16" containerID="a25345c405dfefa76e519a8784a708d6344c4490388c27312a458e7e659dc8c9" exitCode=0
Mar 13 15:30:02 crc kubenswrapper[4898]: I0313 15:30:02.802243 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556930-z4l5f" event={"ID":"11e4bb07-0526-42fe-80c5-6bed7db79d16","Type":"ContainerDied","Data":"a25345c405dfefa76e519a8784a708d6344c4490388c27312a458e7e659dc8c9"}
Mar 13 15:30:02 crc kubenswrapper[4898]: I0313 15:30:02.804356 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556930-f8ms7" event={"ID":"98cb3d53-de77-4344-8045-41653ba912a9","Type":"ContainerStarted","Data":"b77b5806113bddb5a814673aafb563d048919833f2e203dc225d6b5382a9eff2"}
Mar 13 15:30:02 crc kubenswrapper[4898]: I0313 15:30:02.849126 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556930-f8ms7" podStartSLOduration=1.605041651 podStartE2EDuration="2.849104505s" podCreationTimestamp="2026-03-13 15:30:00 +0000 UTC" firstStartedPulling="2026-03-13 15:30:01.015338931 +0000 UTC m=+5636.016927170" lastFinishedPulling="2026-03-13 15:30:02.259401785 +0000 UTC m=+5637.260990024" observedRunningTime="2026-03-13 15:30:02.838396888 +0000 UTC m=+5637.839985177" watchObservedRunningTime="2026-03-13 15:30:02.849104505 +0000 UTC m=+5637.850692744"
Mar 13 15:30:04 crc kubenswrapper[4898]: I0313 15:30:04.242244 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556930-z4l5f"
Mar 13 15:30:04 crc kubenswrapper[4898]: I0313 15:30:04.349442 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/11e4bb07-0526-42fe-80c5-6bed7db79d16-secret-volume\") pod \"11e4bb07-0526-42fe-80c5-6bed7db79d16\" (UID: \"11e4bb07-0526-42fe-80c5-6bed7db79d16\") "
Mar 13 15:30:04 crc kubenswrapper[4898]: I0313 15:30:04.349494 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwckx\" (UniqueName: \"kubernetes.io/projected/11e4bb07-0526-42fe-80c5-6bed7db79d16-kube-api-access-qwckx\") pod \"11e4bb07-0526-42fe-80c5-6bed7db79d16\" (UID: \"11e4bb07-0526-42fe-80c5-6bed7db79d16\") "
Mar 13 15:30:04 crc kubenswrapper[4898]: I0313 15:30:04.349607 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/11e4bb07-0526-42fe-80c5-6bed7db79d16-config-volume\") pod \"11e4bb07-0526-42fe-80c5-6bed7db79d16\" (UID: \"11e4bb07-0526-42fe-80c5-6bed7db79d16\") "
Mar 13 15:30:04 crc kubenswrapper[4898]: I0313 15:30:04.351164 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11e4bb07-0526-42fe-80c5-6bed7db79d16-config-volume" (OuterVolumeSpecName: "config-volume") pod "11e4bb07-0526-42fe-80c5-6bed7db79d16" (UID: "11e4bb07-0526-42fe-80c5-6bed7db79d16"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 15:30:04 crc kubenswrapper[4898]: I0313 15:30:04.356959 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11e4bb07-0526-42fe-80c5-6bed7db79d16-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "11e4bb07-0526-42fe-80c5-6bed7db79d16" (UID: "11e4bb07-0526-42fe-80c5-6bed7db79d16"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 15:30:04 crc kubenswrapper[4898]: I0313 15:30:04.358254 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11e4bb07-0526-42fe-80c5-6bed7db79d16-kube-api-access-qwckx" (OuterVolumeSpecName: "kube-api-access-qwckx") pod "11e4bb07-0526-42fe-80c5-6bed7db79d16" (UID: "11e4bb07-0526-42fe-80c5-6bed7db79d16"). InnerVolumeSpecName "kube-api-access-qwckx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 15:30:04 crc kubenswrapper[4898]: I0313 15:30:04.452974 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwckx\" (UniqueName: \"kubernetes.io/projected/11e4bb07-0526-42fe-80c5-6bed7db79d16-kube-api-access-qwckx\") on node \"crc\" DevicePath \"\""
Mar 13 15:30:04 crc kubenswrapper[4898]: I0313 15:30:04.453011 4898 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/11e4bb07-0526-42fe-80c5-6bed7db79d16-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 13 15:30:04 crc kubenswrapper[4898]: I0313 15:30:04.453021 4898 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/11e4bb07-0526-42fe-80c5-6bed7db79d16-config-volume\") on node \"crc\" DevicePath \"\""
Mar 13 15:30:04 crc kubenswrapper[4898]: I0313 15:30:04.830813 4898 generic.go:334] "Generic (PLEG): container finished" podID="98cb3d53-de77-4344-8045-41653ba912a9" containerID="b77b5806113bddb5a814673aafb563d048919833f2e203dc225d6b5382a9eff2" exitCode=0
Mar 13 15:30:04 crc kubenswrapper[4898]: I0313 15:30:04.830870 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556930-f8ms7" event={"ID":"98cb3d53-de77-4344-8045-41653ba912a9","Type":"ContainerDied","Data":"b77b5806113bddb5a814673aafb563d048919833f2e203dc225d6b5382a9eff2"}
Mar 13 15:30:04 crc kubenswrapper[4898]: I0313 15:30:04.841329 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556930-z4l5f" event={"ID":"11e4bb07-0526-42fe-80c5-6bed7db79d16","Type":"ContainerDied","Data":"85621c637bdcd5e55220cee87918a755a34817d2327b4fcbb1244b7a5704b341"}
Mar 13 15:30:04 crc kubenswrapper[4898]: I0313 15:30:04.841593 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85621c637bdcd5e55220cee87918a755a34817d2327b4fcbb1244b7a5704b341"
Mar 13 15:30:04 crc kubenswrapper[4898]: I0313 15:30:04.841389 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556930-z4l5f"
Mar 13 15:30:04 crc kubenswrapper[4898]: I0313 15:30:04.918415 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556885-ncmr6"]
Mar 13 15:30:04 crc kubenswrapper[4898]: I0313 15:30:04.932522 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556885-ncmr6"]
Mar 13 15:30:06 crc kubenswrapper[4898]: I0313 15:30:06.246642 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1711d9ce-262c-4c6c-930a-4148e62fae9e" path="/var/lib/kubelet/pods/1711d9ce-262c-4c6c-930a-4148e62fae9e/volumes"
Mar 13 15:30:06 crc kubenswrapper[4898]: I0313 15:30:06.639839 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556930-f8ms7"
Mar 13 15:30:06 crc kubenswrapper[4898]: I0313 15:30:06.789073 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqq69\" (UniqueName: \"kubernetes.io/projected/98cb3d53-de77-4344-8045-41653ba912a9-kube-api-access-bqq69\") pod \"98cb3d53-de77-4344-8045-41653ba912a9\" (UID: \"98cb3d53-de77-4344-8045-41653ba912a9\") "
Mar 13 15:30:06 crc kubenswrapper[4898]: I0313 15:30:06.798121 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98cb3d53-de77-4344-8045-41653ba912a9-kube-api-access-bqq69" (OuterVolumeSpecName: "kube-api-access-bqq69") pod "98cb3d53-de77-4344-8045-41653ba912a9" (UID: "98cb3d53-de77-4344-8045-41653ba912a9"). InnerVolumeSpecName "kube-api-access-bqq69". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 15:30:06 crc kubenswrapper[4898]: I0313 15:30:06.892348 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqq69\" (UniqueName: \"kubernetes.io/projected/98cb3d53-de77-4344-8045-41653ba912a9-kube-api-access-bqq69\") on node \"crc\" DevicePath \"\""
Mar 13 15:30:06 crc kubenswrapper[4898]: I0313 15:30:06.913489 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556924-vg49w"]
Mar 13 15:30:06 crc kubenswrapper[4898]: I0313 15:30:06.926376 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556924-vg49w"]
Mar 13 15:30:07 crc kubenswrapper[4898]: I0313 15:30:07.219296 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556930-f8ms7" event={"ID":"98cb3d53-de77-4344-8045-41653ba912a9","Type":"ContainerDied","Data":"cd8ee8aac0748f9e11c55221957bf5b61afe83bbf5d7802c2716a842c4b3f9e7"}
Mar 13 15:30:07 crc kubenswrapper[4898]: I0313 15:30:07.219344 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556930-f8ms7"
Mar 13 15:30:07 crc kubenswrapper[4898]: I0313 15:30:07.219363 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd8ee8aac0748f9e11c55221957bf5b61afe83bbf5d7802c2716a842c4b3f9e7"
Mar 13 15:30:07 crc kubenswrapper[4898]: I0313 15:30:07.765467 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b1a5763-109f-4888-97bb-eeb7cd25ff69" path="/var/lib/kubelet/pods/3b1a5763-109f-4888-97bb-eeb7cd25ff69/volumes"
Mar 13 15:30:49 crc kubenswrapper[4898]: I0313 15:30:49.955039 4898 scope.go:117] "RemoveContainer" containerID="c8ba8026ba786ec97dc1c956429b165e911caa22ae98facccf3eabf821d09223"
Mar 13 15:30:50 crc kubenswrapper[4898]: I0313 15:30:50.019111 4898 scope.go:117] "RemoveContainer" containerID="586dd830bc412bf8d165f328ec0120d6ccafcd1b6e8c6a0642a7f4464c15681b"
Mar 13 15:31:19 crc kubenswrapper[4898]: I0313 15:31:19.134259 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 15:31:19 crc kubenswrapper[4898]: I0313 15:31:19.134778 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 15:31:49 crc kubenswrapper[4898]: I0313 15:31:49.134947 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 15:31:49 crc kubenswrapper[4898]: I0313 15:31:49.136546 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 15:32:00 crc kubenswrapper[4898]: I0313 15:32:00.156960 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556932-tq9z4"]
Mar 13 15:32:00 crc kubenswrapper[4898]: E0313 15:32:00.158572 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11e4bb07-0526-42fe-80c5-6bed7db79d16" containerName="collect-profiles"
Mar 13 15:32:00 crc kubenswrapper[4898]: I0313 15:32:00.158595 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="11e4bb07-0526-42fe-80c5-6bed7db79d16" containerName="collect-profiles"
Mar 13 15:32:00 crc kubenswrapper[4898]: E0313 15:32:00.158685 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98cb3d53-de77-4344-8045-41653ba912a9" containerName="oc"
Mar 13 15:32:00 crc kubenswrapper[4898]: I0313 15:32:00.158695 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="98cb3d53-de77-4344-8045-41653ba912a9" containerName="oc"
Mar 13 15:32:00 crc kubenswrapper[4898]: I0313 15:32:00.158982 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="11e4bb07-0526-42fe-80c5-6bed7db79d16" containerName="collect-profiles"
Mar 13 15:32:00 crc kubenswrapper[4898]: I0313 15:32:00.159013 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="98cb3d53-de77-4344-8045-41653ba912a9" containerName="oc"
Mar 13 15:32:00 crc kubenswrapper[4898]: I0313 15:32:00.160304 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556932-tq9z4"
Mar 13 15:32:00 crc kubenswrapper[4898]: I0313 15:32:00.162323 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps"
Mar 13 15:32:00 crc kubenswrapper[4898]: I0313 15:32:00.164482 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 13 15:32:00 crc kubenswrapper[4898]: I0313 15:32:00.164723 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 13 15:32:00 crc kubenswrapper[4898]: I0313 15:32:00.187730 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556932-tq9z4"]
Mar 13 15:32:00 crc kubenswrapper[4898]: I0313 15:32:00.209832 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlrpj\" (UniqueName: \"kubernetes.io/projected/b9389a03-1af1-48d5-b5d3-0d8e886d5469-kube-api-access-mlrpj\") pod \"auto-csr-approver-29556932-tq9z4\" (UID: \"b9389a03-1af1-48d5-b5d3-0d8e886d5469\") " pod="openshift-infra/auto-csr-approver-29556932-tq9z4"
Mar 13 15:32:00 crc kubenswrapper[4898]: I0313 15:32:00.312305 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlrpj\" (UniqueName: \"kubernetes.io/projected/b9389a03-1af1-48d5-b5d3-0d8e886d5469-kube-api-access-mlrpj\") pod \"auto-csr-approver-29556932-tq9z4\" (UID: \"b9389a03-1af1-48d5-b5d3-0d8e886d5469\") " pod="openshift-infra/auto-csr-approver-29556932-tq9z4"
Mar 13 15:32:00 crc kubenswrapper[4898]: I0313 15:32:00.332500 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlrpj\" (UniqueName: \"kubernetes.io/projected/b9389a03-1af1-48d5-b5d3-0d8e886d5469-kube-api-access-mlrpj\") pod \"auto-csr-approver-29556932-tq9z4\" (UID: \"b9389a03-1af1-48d5-b5d3-0d8e886d5469\") " pod="openshift-infra/auto-csr-approver-29556932-tq9z4"
Mar 13 15:32:00 crc kubenswrapper[4898]: I0313 15:32:00.483703 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556932-tq9z4"
Mar 13 15:32:00 crc kubenswrapper[4898]: I0313 15:32:00.980256 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556932-tq9z4"]
Mar 13 15:32:00 crc kubenswrapper[4898]: W0313 15:32:00.986090 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb9389a03_1af1_48d5_b5d3_0d8e886d5469.slice/crio-f1d613b9130da4d268b8f6569b6227ccc4f9cce4fee9d288aed9dcd36b5f67dc WatchSource:0}: Error finding container f1d613b9130da4d268b8f6569b6227ccc4f9cce4fee9d288aed9dcd36b5f67dc: Status 404 returned error can't find the container with id f1d613b9130da4d268b8f6569b6227ccc4f9cce4fee9d288aed9dcd36b5f67dc
Mar 13 15:32:01 crc kubenswrapper[4898]: I0313 15:32:01.635057 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556932-tq9z4" event={"ID":"b9389a03-1af1-48d5-b5d3-0d8e886d5469","Type":"ContainerStarted","Data":"f1d613b9130da4d268b8f6569b6227ccc4f9cce4fee9d288aed9dcd36b5f67dc"}
Mar 13 15:32:03 crc kubenswrapper[4898]: I0313 15:32:03.672094 4898 generic.go:334] "Generic (PLEG): container finished" podID="b9389a03-1af1-48d5-b5d3-0d8e886d5469" containerID="35e5bf6a54e528120937c5fa4cd67e6484f1ca2384f69de7e5c4a635b6cdfdc6" exitCode=0
Mar 13 15:32:03 crc kubenswrapper[4898]: I0313 15:32:03.672195 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556932-tq9z4" event={"ID":"b9389a03-1af1-48d5-b5d3-0d8e886d5469","Type":"ContainerDied","Data":"35e5bf6a54e528120937c5fa4cd67e6484f1ca2384f69de7e5c4a635b6cdfdc6"}
Mar 13 15:32:05 crc kubenswrapper[4898]: I0313 15:32:05.316298 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556932-tq9z4"
Mar 13 15:32:05 crc kubenswrapper[4898]: I0313 15:32:05.440040 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlrpj\" (UniqueName: \"kubernetes.io/projected/b9389a03-1af1-48d5-b5d3-0d8e886d5469-kube-api-access-mlrpj\") pod \"b9389a03-1af1-48d5-b5d3-0d8e886d5469\" (UID: \"b9389a03-1af1-48d5-b5d3-0d8e886d5469\") "
Mar 13 15:32:05 crc kubenswrapper[4898]: I0313 15:32:05.457970 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9389a03-1af1-48d5-b5d3-0d8e886d5469-kube-api-access-mlrpj" (OuterVolumeSpecName: "kube-api-access-mlrpj") pod "b9389a03-1af1-48d5-b5d3-0d8e886d5469" (UID: "b9389a03-1af1-48d5-b5d3-0d8e886d5469"). InnerVolumeSpecName "kube-api-access-mlrpj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 15:32:05 crc kubenswrapper[4898]: I0313 15:32:05.542762 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mlrpj\" (UniqueName: \"kubernetes.io/projected/b9389a03-1af1-48d5-b5d3-0d8e886d5469-kube-api-access-mlrpj\") on node \"crc\" DevicePath \"\""
Mar 13 15:32:05 crc kubenswrapper[4898]: I0313 15:32:05.702992 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556932-tq9z4" event={"ID":"b9389a03-1af1-48d5-b5d3-0d8e886d5469","Type":"ContainerDied","Data":"f1d613b9130da4d268b8f6569b6227ccc4f9cce4fee9d288aed9dcd36b5f67dc"}
Mar 13 15:32:05 crc kubenswrapper[4898]: I0313 15:32:05.703331 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1d613b9130da4d268b8f6569b6227ccc4f9cce4fee9d288aed9dcd36b5f67dc"
Mar 13 15:32:05 crc kubenswrapper[4898]: I0313 15:32:05.703233 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556932-tq9z4"
Mar 13 15:32:06 crc kubenswrapper[4898]: I0313 15:32:06.408122 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556926-znpr2"]
Mar 13 15:32:06 crc kubenswrapper[4898]: I0313 15:32:06.421810 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556926-znpr2"]
Mar 13 15:32:07 crc kubenswrapper[4898]: I0313 15:32:07.753228 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcf6d966-2758-453f-9308-fd452766462b" path="/var/lib/kubelet/pods/fcf6d966-2758-453f-9308-fd452766462b/volumes"
Mar 13 15:32:19 crc kubenswrapper[4898]: I0313 15:32:19.134511 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 15:32:19 crc kubenswrapper[4898]: I0313 15:32:19.136638 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 15:32:19 crc kubenswrapper[4898]: I0313 15:32:19.136949 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj"
Mar 13 15:32:19 crc kubenswrapper[4898]: I0313 15:32:19.138647 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cee6ecdff3780e9dd70ed8f417a9b0b2ef52de112ea7b7ca374180dcff44bba7"} pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 13 15:32:19 crc kubenswrapper[4898]: I0313 15:32:19.138970 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" containerID="cri-o://cee6ecdff3780e9dd70ed8f417a9b0b2ef52de112ea7b7ca374180dcff44bba7" gracePeriod=600
Mar 13 15:32:19 crc kubenswrapper[4898]: E0313 15:32:19.274283 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d"
Mar 13 15:32:19 crc kubenswrapper[4898]: I0313 15:32:19.894012 4898 generic.go:334] "Generic (PLEG): container finished" podID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerID="cee6ecdff3780e9dd70ed8f417a9b0b2ef52de112ea7b7ca374180dcff44bba7" exitCode=0
Mar 13 15:32:19 crc kubenswrapper[4898]: I0313 15:32:19.894064 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerDied","Data":"cee6ecdff3780e9dd70ed8f417a9b0b2ef52de112ea7b7ca374180dcff44bba7"}
Mar 13 15:32:19 crc kubenswrapper[4898]: I0313 15:32:19.894102 4898 scope.go:117] "RemoveContainer" containerID="3f1eb02ceee77301060a512e8cdb70aa1cab3b74525898dd1df91250d09e1006"
Mar 13 15:32:19 crc kubenswrapper[4898]: I0313 15:32:19.894962 4898 scope.go:117] "RemoveContainer" containerID="cee6ecdff3780e9dd70ed8f417a9b0b2ef52de112ea7b7ca374180dcff44bba7"
Mar 13 15:32:19 crc kubenswrapper[4898]: E0313 15:32:19.895311 4898
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d"
Mar 13 15:32:30 crc kubenswrapper[4898]: I0313 15:32:30.739554 4898 scope.go:117] "RemoveContainer" containerID="cee6ecdff3780e9dd70ed8f417a9b0b2ef52de112ea7b7ca374180dcff44bba7"
Mar 13 15:32:30 crc kubenswrapper[4898]: E0313 15:32:30.740497 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d"